TL;DR
- Data prep dominates analyst time: Analysts spend most of their time on data preparation, leaving little capacity for actual analysis.
- Category boundaries are dissolving: Business intelligence (BI) tools and data prep platforms are converging, blurring traditional category boundaries.
- Analyst firms are catching up: Major analyst firms are formally merging BI, augmented analytics, and data preparation into unified platform categories.
- Skipping data prep kills AI projects: Organizations that skip data prep and jump straight to last-mile AI risk project failure when the underlying data isn't ready.
- Prophecy bridges both: Prophecy's agentic data preparation delivers governed visual workflows and AI agents so analysts can manage both data prep and analysis in one place.
You need both, starting with data prep. AI-accelerated data preparation must come first because reliable last-mile analysis depends on governed, AI-ready data foundations. Without that foundation, AI-powered insights break down before they deliver value.
Your analysts spend most of their time discovering, transforming, and preparing data before any analysis begins. Meanwhile, stakeholders want faster answers, BI tools are absorbing data prep features, and data prep platforms are pushing into analytics delivery. The boundaries between these categories are dissolving fast.
The focus here is on what happens after data lands in the cloud data platform, not replacing data engineering or extract, transform, and load (ETL) pipelines. Analytics teams still need to transform, prepare, and shape that data for analysis, and that's where the bottleneck lives. At Prophecy, we build AI-accelerated data preparation that lays the governed, AI-ready foundation first so last-mile analysis can deliver trustworthy results.
The data prep burden is measurable
Analysts spend the majority of their time on work other than analysis, and engineering teams absorb the overflow. The numbers tell a consistent story.
- Discovery and preparation dominate analyst time: The bulk of analyst hours goes to finding, profiling, and preparing data before any analysis can begin.
- Most BI users can't self-serve: Only 20% of BI users are self-sufficient in the enterprise. The remaining 80% depend on that data-proficient minority to fulfill their analytical needs.
- The bottleneck compounds across teams: A small group of capable analysts absorbs the data prep burden for the entire organization. Consider a director managing 10 analysts serving five business units: the best people are stuck in data wrangling while stakeholder requests pile up.
- Engineering teams pay the hidden cost: Analytics data workflow requests can consume 10–30% of engineering time. For a team of 10 engineers, that's the equivalent of one to three full salaries spent on slow, ad hoc requests while the business is stuck with stale or untrusted data. If analysts could self-serve without opening a single engineering ticket, that capacity would go back to building product.
Data engineering teams own ETL pipelines, data ingestion, and governance, and that work continues. The problem is that analytics teams still need significant additional transformation to turn governed data into insights, and that demand is what overwhelms both groups.
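The hidden engineering cost above is easy to estimate for your own team. A minimal back-of-the-envelope sketch, assuming a hypothetical fully loaded cost per engineer (the salary figure and function name are illustrative, not from any benchmark):

```python
# Back-of-the-envelope cost of ad hoc analytics requests on an
# engineering team. All figures are illustrative assumptions.

def analytics_overhead_cost(team_size, avg_salary, low=0.10, high=0.30):
    """Return the (low, high) salary-equivalent annual cost when
    analytics requests consume 10-30% of engineering time."""
    return team_size * avg_salary * low, team_size * avg_salary * high

# A team of 10 engineers at a hypothetical $150k fully loaded cost:
low, high = analytics_overhead_cost(10, 150_000)
print(f"${low:,.0f} - ${high:,.0f} per year")  # $150,000 - $450,000 per year
```

Plugging in your own headcount and loaded cost makes the case for self-service concrete in budget terms.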
BI tools are swallowing data preparation
BI tools are powerful for visualization and analysis, but they depend on well-prepared datasets. Tableau and Power BI are both building AI-powered data preparation directly into their platforms, and the shift goes beyond incremental feature additions. It represents a structural repositioning of what a BI platform does.
Power BI introduced a suite of AI-powered data preparation capabilities that position enterprise analysts as semantic layer engineers inside the BI tool:
- Prep Data for AI: A dedicated capability launched in May 2025 that lets analysts curate schema, pre-build verified answers to common questions, and embed business logic directly into AI-assisted workflows.
- Copilot in Dataflows Gen2: Users can transform data with natural-language commands, filtering, grouping, and reshaping it conversationally.
- Power Query in the browser: The core transformation engine moved into the browser as of the June 2025 update, removing the requirement to use Power BI Desktop for data transformation tasks.
Tableau followed a parallel path with its own set of AI-powered prep features:
- Tableau Agent: Now handles natural language calculation writing and complete Prep flow construction as of the Tableau 2025.2 release.
- Multi-step AI planning: The Tableau AI product page describes an agent that delivers multi-step plans for complex data preparation, generates powerful calculations, and instantly pivots tables.
- Agentforce integration: Covers data preparation, visualization generation, and semantic model construction in a unified workflow.
Another repositioning signal stands out. Tableau Prep's product page headline is now "Is your data ready for AI?" This reframes data preparation from pre-visualization cleanup to AI-readiness enablement.
Data prep platforms are pushing into last-mile analysis
The convergence runs in both directions. Dedicated data preparation and ETL pipeline platforms are expanding up the stack toward analytics delivery and targeting the same business users that BI tools serve.
Prophecy's trajectory illustrates this clearly:
- AI-powered analytics preparation: In June 2024, Prophecy introduced AI-powered capabilities to transform and prepare data for analytics and AI applications on cloud data platforms such as Databricks, Snowflake, and BigQuery, explicitly repositioning toward agentic data preparation.
- Extending to business analysts: By March 2025, Prophecy 4.0 extended its target user from data engineers to business analysts with agentic data preparation and self-service visual workflows, designed for analysts to work within guardrails. Prophecy works after the data is already in the cloud data platform. ETL pipelines remain the primary way data enters the platform, and data engineering teams continue to own that process.
- Specialized AI agents: The v4 release introduced multiple AI agents that make AI-generated logic visual and reviewable, avoiding the slow, error-prone process of validating pages of AI-written SQL or Spark code. Work that previously took days or weeks now finishes in minutes with AI agents.
- Analytics use cases: Prophecy's Professional Edition now targets analytics data-preparation use cases such as marketing attribution, financial planning and analysis (FP&A), and product usage analysis. Analysts build transformation logic once and use AI to speed up preparation and decision-making rather than working in spreadsheets. Reporting and dashboards remain the domain of BI tools; Prophecy prepares the data those tools depend on.
Alteryx shows a similar pattern of category convergence. Gartner Peer Insights classifies the Alteryx platform across six market categories, including Data Preparation Tools, Analytics and Business Intelligence Platforms, and Data Science and Machine Learning Platforms. This formal recognition confirms that platforms increasingly span traditional category boundaries.
For teams evaluating their data prep and analytics stack, the question is whether their platform can run governed data workflows natively on the cloud data platforms they're investing in.
Analyst firm categories are merging
Major analyst firms are formally acknowledging convergence, confirming that this goes beyond vendor ambition. The platform category definitions are shifting in clear ways:
- Analytics and business intelligence (ABI) platform redefined: The ABI platform definition was updated last year to explicitly include data preparation as a core platform function, not an optional adjacent capability.
- Augmented analytics absorbed: The standalone augmented analytics category was merged into the broader ABI platforms category, signaling that AI-augmented analytics is now a baseline expectation rather than a differentiator.
- New converged categories emerging: A Data Lakehouse Platforms category is planned for the future, distinct from its BI and data integration predecessors.
- Unified data control plane: Generative AI (GenAI)-driven data intelligence and integration software are projected to merge into a new automated data control plane.
The convergence also shows up from the demand side. Serving the majority of BI users who lack self-sufficiency requires BI platforms to absorb data preparation capabilities. One BI platform's landscape characterizes BI tools as designed to enable last-mile intelligence for business professionals, which implicitly positions data preparation as a prerequisite that must be absorbed or bridged. Data integration is also positioned as a functional prerequisite embedded within analytics delivery rather than a separable upstream process.
The old category boundaries no longer fit the way these platforms are evolving.
Sequence still matters, and data prep AI comes first
The "just do last-mile analysis" argument breaks down because you can't build reliable AI-powered analysis on unreliable data foundations.
Through 2026, organizations are predicted to abandon 60% of AI projects unsupported by AI-ready data. Traditional data management operations are too slow, too structured, and too rigid for AI teams.
When teams skip the data prep layer and go straight to last-mile AI, they hit the same failure rate. Analysts lose trust in AI-generated outputs built on inconsistent or poorly structured data, and once that trust erodes, adoption stalls. The backlog gets worse because the team is debugging AI answers on top of the data prep work they were already behind on.
The same risk applies to teams considering raw AI coding assistants as a shortcut. Ask five engineers to build the same feature with no shared standards, style guide, or code review, and you'll end up with five incompatible implementations that nobody can maintain. Ungoverned AI-generated code creates the same problem at greater scale. Prophecy's AI-accelerated approach pairs AI speed with human review, standardization, and Git-based version control, so teams get the velocity of AI with the reliability of engineering. No separate code scanning tools are required.
Enterprises are also advised to build data pipelines, AI-driven insights, automation frameworks, and real-time decisioning engines as foundational infrastructure, explicitly listing data pipelines ahead of AI-driven insights in the dependency chain.
Teams need both, and data prep comes first.
How to decide where to start
The right starting point depends on your team's complexity and governance needs.
- BI-native prep features: If your team is already standardized on Tableau or Power BI and your prep needs are relatively simple (schema curation, light transformations, and verified answers for common questions), BI-native prep features may be the fastest path to value.
- Governed data workflow platform: If your analysts need production-grade, reusable analytics data workflows across cloud data platforms, or if your data platform team requires governance controls that a BI tool can't enforce, starting with a governed data workflow platform changes the equation.
- Migrating from legacy desktop tools: If your team has existing data workflows in tools like Alteryx Desktop and you're already running a platform with built-in transpilation, you can move those workflows to your cloud compute without a full rebuild, avoiding the cost and risk of starting from scratch. Engineering and platform teams tracking modernization progress can point to real, measurable momentum: workflows migrated, pipelines modernized, adoption numbers climbing. Every workflow built or transpiled in Prophecy serves as one more proof point for the platform they've invested in.
You don't need to overhaul everything in one cycle. The efficiency use case is where most teams start. Show your analysts a faster, better way to build and manage data workflows alongside their existing tools. When the value is clear, the migration follows naturally. Your team stays productive without betting everything on a big-bang rollout.
Organizations that redesign workflows deeply are nearly three times as likely to report AI outcomes that reshape the business. And with AI projects failing without AI-ready data, getting the foundation right is the prerequisite for everything else working. You can try Prophecy for free to see how it fits your team's workflow.
Bridge data prep and analysis with Prophecy
Prophecy's agentic data preparation breaks the cycle of analyst bottlenecks and stalled AI initiatives by providing analysts and data engineers with a unified platform to build, govern, and deploy analytics workflows on cloud data platforms such as Databricks, Snowflake, or BigQuery, without writing SQL.
The business wants fast, trusted, accurate data. Analysts want to deliver it without waiting on engineering. But most teams are still stuck: analysts spend most of their time preparing data, stakeholders wait for insights, engineering burns a fraction of its capacity on ad hoc analytics requests, and AI initiatives stall because the underlying data isn't ready. The categories are merging, but most teams still lack a single platform that handles governed analytics data preparation and last-mile analysis together.
Prophecy works after data is already in the platform. ETL pipelines and data governance remain with data engineering teams. Analysts build and run governed data workflows themselves, on your cloud platform, within your guardrails. The analyst becomes the one delivering what the business has been asking for, and engineering stops being the bottleneck.
The platform delivers this through several core capabilities:
- AI agents: Multiple specialized agents accelerate analytics data workflow development, making AI-generated logic visual and reviewable so analysts can build and iterate faster.
- Visual workflows: An intuitive visual interface lets analysts build and refine transformation logic without writing code, making data workflows accessible to both technical and business users.
- Built-in governance: Guardrails control what questions users can ask the agent, and all logic is version-controlled and auditable, giving data platform teams the oversight they require.
- Cloud deployment: Governed data workflows (sometimes also referred to as data pipelines) deploy as high-performance code on Databricks, Snowflake, or BigQuery. Compute, governance, and security all live in your stack, not ours, so your platform team stays in full control.
- Transpiler for migration: Teams with existing workflows in tools like Alteryx can migrate to governed, cloud-native data workflows without rebuilding from scratch, preserving logic while shifting execution to modern compute platforms.
Case study: HubSpot contact analysis
A marketing operations team needed to analyze HubSpot contact data to understand nurture campaign performance across lifecycle stages, sources, and engagement patterns. Using Prophecy's AI agents, they described their business goal in natural language, and the platform generated a visual data workflow preparing their HubSpot data for analysis.
The AI segmented marketing leads into three distinct groups, surfacing insights that would have required hours of manual CSV exports and spreadsheet manipulation. The entire analysis was completed without writing code, making it accessible to business users outside the data team while delivering the depth that trained analysts expect.
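To make the segmentation concrete, here is a hypothetical sketch of the kind of logic the AI workflow produced: grouping HubSpot-style contact records into three engagement tiers. The field names, sample records, and thresholds are all illustrative assumptions, not Prophecy's or HubSpot's actual schema.

```python
# Hypothetical segmentation of HubSpot-style contact records into
# three engagement tiers. Field names and thresholds are assumptions.
from collections import Counter

contacts = [
    {"email": "a@example.com", "lifecycle_stage": "lead", "opens_90d": 0},
    {"email": "b@example.com", "lifecycle_stage": "mql",  "opens_90d": 5},
    {"email": "c@example.com", "lifecycle_stage": "lead", "opens_90d": 12},
    {"email": "d@example.com", "lifecycle_stage": "sql",  "opens_90d": 25},
]

def segment(opens_90d: int) -> str:
    """Assign one of three engagement tiers from recent email opens."""
    if opens_90d == 0:
        return "dormant"
    return "engaged" if opens_90d < 10 else "highly_engaged"

counts = Counter(segment(c["opens_90d"]) for c in contacts)
print(counts)
```

In the scenario described above, the analyst never writes this code: the agent generates the equivalent transformation as a visual, reviewable workflow, which is what makes the result auditable by the data team.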
Analytics leaders see the productivity gap and want a better path. Data platform leaders want efficiency, data quality, and something their engineering team can trust and govern. Prophecy's AI-accelerated data preparation speaks to both, making analysts self-sufficient while giving platform teams full visibility and control.
Book a demo and bring the people who'll feel the difference: analysts and application teams who need to move faster, and the platform team that needs to trust what gets deployed. We show analysts how fast they can build. We show platform teams how governance and compute stay entirely in their control.
FAQ
Should teams start with AI for data prep or AI for analysis?
Start with data prep. Reliable last-mile analysis depends on AI-ready data, governed pipelines, and structured logic before conversational or insight-generation layers can deliver value.
Are BI tools replacing standalone data prep platforms?
Not entirely. BI tools are powerful for visualization and analysis, but they depend on well-prepared datasets. Data prep platforms are moving closer to analytics delivery. The categories are converging rather than one replacing the other.
When are BI-native prep features enough?
They're a reasonable fit when teams need relatively simple work like schema curation, light transformations, and verified answers for common questions inside an existing BI environment.
When does a governed data workflow platform make more sense?
When analysts need reusable, production-grade analytics workflows across cloud data platforms like Databricks, Snowflake, or BigQuery, or when the data platform team requires stronger governance controls than a BI tool can enforce.
Ready to see Prophecy in action?
Request a demo and we'll walk you through how Prophecy's AI-powered visual data pipelines and high-quality open source code empower everyone to speed up data transformation.

