TL;DR
- Most analytics teams understand the three analytics tiers in theory, but stay stuck at descriptive reporting, unable to advance to predictive or prescriptive work.
- The gap between analytics leaders and laggards keeps widening, largely due to data quality issues and lengthy data preparation delays.
- Predictive and prescriptive analytics require clean, well-shaped data pipelines that analysts can iterate on quickly.
- AI agents are reshaping the analyst role by handling the heavy lifting of data preparation, allowing analysts to focus on analysis.
- Prophecy provides business analysts with AI-powered self-service analytics pipelines, so they can prepare data for analysis on their own without writing code.
Most analytics teams know the theory cold: descriptive analytics tells you what happened, predictive analytics tells you what might happen, and prescriptive analytics tells you what to do about it. The harder problem is that most organizations stay stuck at level one, where reporting and forecasting dominate daily work while predictive and prescriptive capabilities remain aspirational roadmap items.
This guide is written for business analysts and analytics leaders and focuses on the work analysts do to prepare data for analysis, rather than the upstream engineering work that loads raw data into the cloud data platform.
Descriptive analytics, or what happened
Descriptive analytics explains what happened. Analysts query prepared data from the cloud data platform to create reports, dashboards, or aggregated views, presenting historical data in understandable formats such as tables and charts, without predicting future events.
This is the tier most analytics teams live in daily, covering revenue dashboards, cost variance reports, customer segmentation, operational key performance indicators (KPIs), and more. It's foundational and is almost always delivered through Business Intelligence (BI) tools built on well-prepared datasets. Common examples include supply chain control towers that give operations teams granular visibility into material flows, financial planning and analysis (FP&A) reports on revenue trends and margin analysis, and customer segmentation that establishes the baseline for marketing and service strategy.
Descriptive analytics is the operational bedrock, and standardized reporting remains the dominant daily tool even in sophisticated organizations. BI tools work well in this tier as long as the datasets feeding them are trustworthy and well-shaped for the question being asked.
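The mechanics of this tier are straightforward aggregation over historical records. As an illustrative sketch only (the row shape and column names here are hypothetical, not tied to any particular platform), a revenue-by-segment rollup might look like:

```python
from collections import defaultdict

# Hypothetical prepared rows, as they might land after the engineering ETL step.
orders = [
    {"segment": "enterprise", "revenue": 120_000.0},
    {"segment": "enterprise", "revenue": 95_000.0},
    {"segment": "smb", "revenue": 18_000.0},
    {"segment": "smb", "revenue": 22_000.0},
]

def revenue_by_segment(rows):
    """Descriptive rollup: total historical revenue per customer segment."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["segment"]] += row["revenue"]
    return dict(totals)

print(revenue_by_segment(orders))
# {'enterprise': 215000.0, 'smb': 40000.0}
```

In practice this aggregation happens in SQL inside the cloud data platform and a BI tool renders the result; the point is that descriptive work summarizes what already happened and makes no claim about the future.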
Predictive analytics, or what might happen
Predictive analytics shifts the lens from backward-looking to forward-looking. It uses historical data and models to predict what might happen next, forecasting outcomes such as whether a customer will churn or a borrower will default. Supply chain teams use it for demand sensing, customer success teams for churn prediction, and finance teams for revenue forecasting, improving on spreadsheet-based methods without replacing them wholesale.
Predictive and prescriptive analytics require sufficient historical data and well-shaped, analysis-ready data to build accurate models. Data engineers deliver the clean, governed foundation through ETL, and analysts then perform the additional shaping that each analytical question demands, including joining domains, deriving features, and reconciling edge cases that only surface during analysis.
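To make that analyst-side shaping step concrete, here is a minimal Python sketch: joining a usage domain to a billing domain on a shared customer key, then deriving a feature a churn model might consume. The field names and the risk rule are illustrative assumptions, not a Prophecy API or a real model.

```python
# Hypothetical prepared tables keyed by customer id.
usage = {"c1": {"logins_30d": 2}, "c2": {"logins_30d": 41}}
billing = {"c1": {"mrr": 500.0, "late_payments": 2},
           "c2": {"mrr": 1200.0, "late_payments": 0}}

def build_features(usage, billing):
    """Join two domains on customer id and derive model-ready features."""
    features = {}
    for cid in usage.keys() & billing.keys():  # inner join on customer id
        features[cid] = {
            "logins_30d": usage[cid]["logins_30d"],
            "mrr": billing[cid]["mrr"],
            # Derived feature: a crude at-risk flag an analyst would iterate on.
            "at_risk": usage[cid]["logins_30d"] < 5
                       or billing[cid]["late_payments"] > 1,
        }
    return features

print(build_features(usage, billing)["c1"]["at_risk"])  # True
```

The edge cases the paragraph mentions show up exactly here: customers present in one table but not the other, inconsistent keys, and thresholds that only prove wrong once the analysis is underway, which is why fast iteration on this step matters.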
That last step is where most teams hit a wall. Analysts understand the business question and know exactly how the data needs to be shaped, but they often have to hand those requests back to engineering and wait. If this sounds familiar, sign up for a demo and see how Prophecy's AI agents can shorten the path from question to analysis-ready dataset.
Prescriptive analytics, or what should we do
Prescriptive analytics is the most valuable and least adopted tier. It recommends next steps and can execute actions to optimize outcomes through AI, machine learning, and optimization algorithms. Predict-then-optimize research offers a more technically precise framing: prescriptive analytics jointly performs prediction and prescription, yielding lower prescription error than handling them sequentially.
In production, this looks like supply chain optimization that evaluates available responses and selects the optimal one while balancing mitigation cost against benefit, automated offer optimization that tracks customer behavior and delivers personalized offers in real time, and the highest-maturity FP&A teams going beyond forecasting a budget shortfall to recommend the specific cost levers or revenue actions needed to close the gap.
At this tier, the analytics output stops being a report reviewed by a human and becomes an automated decision executed by an optimization algorithm. That's a different workflow, and one that's unforgiving when the data underneath is inconsistent.
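To illustrate the predict-then-optimize shape of that workflow, here is a toy sketch with made-up numbers: a predicted success probability per candidate response, and a prescriptive step that selects the action maximizing expected benefit net of mitigation cost.

```python
def expected_net_benefit(action):
    """Expected benefit of an action minus its mitigation cost."""
    return action["p_success"] * action["benefit"] - action["cost"]

def prescribe(actions):
    """Prescriptive step: pick the action with the best expected net benefit."""
    return max(actions, key=expected_net_benefit)

# Hypothetical supply chain responses to a predicted stockout;
# probabilities, benefits, and costs are invented for illustration.
actions = [
    {"name": "expedite_shipping", "p_success": 0.9, "benefit": 50_000, "cost": 18_000},
    {"name": "reroute_supplier", "p_success": 0.6, "benefit": 50_000, "cost": 5_000},
    {"name": "do_nothing", "p_success": 0.0, "benefit": 0, "cost": 0},
]

print(prescribe(actions)["name"])  # expedite_shipping
```

The toy version makes the fragility obvious: a small error in `p_success` or `cost`, traceable to inconsistent upstream data, silently flips which action an automated system executes.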
What's changing with AI agents and the democratization of analytics
The analytics landscape is shifting fast. Industry forecasts project that by 2027, 50% of business decisions will be augmented or automated by AI agents, shifting prescriptive analytics from recommending actions to generating and executing them within defined parameters.
By the same projections, 75% of analytics content will be contextualized through generative AI (GenAI), displacing static dashboards with dynamically generated narratives. GenAI handles the volume of routine descriptive production while analysts concentrate on judgment-intensive analysis.
The analyst role is evolving, not disappearing. Instead of manually producing every dataset and report, analysts are shifting toward designing, governing, and monitoring analytics pipelines that AI agents help them build. When done right, analysts move faster and stay focused on the analysis that actually drives decisions.
Raw, ungoverned AI code generation deserves some caution. What analytics teams need is AI acceleration combined with human review and governance: the speed of AI with the reliability the business actually trusts.
Self-service analytics pipelines with Prophecy
The maturity conversation eventually comes back to one bottleneck. Analysts can't iterate fast enough on the analytics pipelines that feed descriptive reports, predictive models, and prescriptive systems. Data engineers have already done their part, completing the ETL process and landing the prepared data in cloud data platforms. Analytics teams need their own path to take that prepared data the rest of the way to analysis-ready, without pulling engineers into every request.
Here's what makes Prophecy different:
- AI agents across the workflow: Several agents take care of workflow authoring, documentation, quality checks, and legacy migration, helping analysts ship faster while maintaining governance.
- Visual interface backed by code: Analysts can build workflows visually or work directly in code, and review the generated SQL from the same versioned artifact.
- Built-in governance: Lineage, RBAC, and version history come standard with the platform, giving the data team full oversight.
- Deployment to cloud data platforms: Prophecy operates on Databricks, Snowflake, and BigQuery, keeping compute and governance within the platform your team already relies on.
One data practitioner used Prophecy to combine product usage, billing, CRM, and support data into a single customer health pipeline. A single business-problem prompt generated entity resolution, behavioral cohorts, health scoring, and expansion-opportunity ranking in a single connected workflow.
Here's what the shift looks like:
- Before: Weeks spent on hand-written joins, feature engineering, and scoring functions across four separate systems.
- After: A clear prompt, followed by time spent validating logic and reasoning through outcomes.
- Result: Production-ready pipelines delivered in a sitting rather than a sprint.
Prophecy is built to complement the rest of your stack. BI tools such as Tableau, Power BI, or Looker continue doing what they do best, namely, visualization and analysis on top of well-prepared datasets, while Prophecy helps analytics teams get those datasets ready.
Whether your team is building descriptive dashboards, preparing data for predictive models, or feeding prescriptive optimization systems, the bottleneck stays the same: analysts need clean, well-shaped, analysis-ready data without waiting weeks. Prophecy makes analytics teams self-sufficient in data prep, so the only thing standing between a question and an answer is the analyst asking it.
Ready to see AI-powered self-service analytics pipelines in action? Book a demo to walk through your use case with our team, or sign up for free and start building analytics pipelines today.
Frequently asked questions
What's the difference between descriptive, predictive, and prescriptive analytics?
Descriptive analytics explains what happened using historical data in reports and dashboards. Predictive analytics forecasts what might happen next using statistical models and machine learning. Prescriptive analytics recommends or automates the best next action using optimization algorithms on top of predictions.
Why are most analytics teams stuck at descriptive analytics?
The common blocker is data preparation. Predictive and prescriptive work requires well-shaped, analysis-ready data, and analysts often can't get there quickly because every shaping request has to route through engineering. That's the gap Prophecy's AI agents close.
Do analysts need engineering skills to use Prophecy?
No. Prophecy is built for business analysts. You describe what you want to build in plain language, and Prophecy's AI agents generate the analytics pipeline visually, so you can review and refine without writing engineering code.
Does Prophecy replace BI tools like Tableau or Power BI?
BI tools remain the primary way teams visualize data and build dashboards. Prophecy focuses on the step before, preparing and shaping governed data into analysis-ready datasets so BI tools can work on trustworthy inputs.
Ready to see Prophecy in action?
Request a demo and we'll walk you through how Prophecy's AI-powered visual data pipelines and high-quality open source code empower everyone to speed up data transformation.

