TL;DR
- Data work is a team sport; data engineers own Extract, Transform, Load (ETL) pipelines and governance, while analytics teams build data workflows (sometimes also referred to as data pipelines) on top of governed data.
- Prophecy gives analytics teams agentic, AI-accelerated data preparation so they can prepare data for analysis, build data workflows, and run ad hoc queries without filing tickets to data engineering.
- Multiple AI agents generate, refine, and explain data processes, rendered as visual workflows with production-ready code underneath and every change retained in Git.
- Prophecy runs natively on cloud data platforms like Databricks, Snowflake, or BigQuery, and works alongside tools like Azure Data Factory (ADF) that data engineers use for ETL.
- Analytics teams ship governed data workflows in hours, while platform teams keep full visibility and control over compute, governance, and security in their own stack.
Analytics teams are stuck in ticket queues while business questions pile up. Data workflow requests can consume 10–30% of engineering time, which for a team of 10 engineers equals one to three full salaries spent on slow, ad hoc requests while the business waits on stale or untrusted data.
Azure Data Factory (ADF) handles orchestration and ETL work well, and data engineers rely on it to move data into cloud data platforms like Databricks, Snowflake, or BigQuery. Analytics teams pick up from there with a different kind of work that is exploratory, iterative, and tied to specific business questions. This discussion is about data analytics workflows specifically, meaning what analysts build on top of governed data, rather than the ETL pipelines that load data into the platform in the first place.
Data work is a team sport
Productive data organizations divide responsibilities clearly between two teams, and the handoff between them is where analytics teams need the right tools.
- Data engineers: They own ETL pipelines, data ingestion, data governance, and the modeled, trusted data sets that land in the cloud data platform. Tools like ADF are a strong fit for this work.
- Analytics teams: They pick up from there, building data workflows, performing the additional transformation their specific use cases require, running ad hoc queries, and turning governed data into insights during the analysis stage.
Data engineers perform significant transformation during ETL, but analysts still need additional transformation to prepare data sets for analysis: reshaping, enrichment, and data cleaning. Agentic, AI-accelerated data preparation lets analysts improve transformation quality, prepare data sets for analytics, and run ad hoc queries confidently, all within the guardrails that data engineering and platform teams define.
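To ground those three kinds of work, here is a minimal sketch in plain Python. The field names, normalization rules, and manager lookup are hypothetical, and a real workflow would run on the cloud data platform rather than over local lists; the point is only to show what reshaping, enrichment, and cleaning look like as concrete steps.

```python
# Illustrative sketch of the "last mile" preparation an analyst might do after
# ETL has landed governed data; field names and rules are hypothetical.
raw = [
    {"customer_id": "c1", "region": " east ", "spend": "120.50"},
    {"customer_id": "c2", "region": "WEST",   "spend": None},  # unusable value
    {"customer_id": "c3", "region": "west",   "spend": "88.00"},
]

# Cleaning: normalize region labels and drop rows with missing spend.
cleaned = [
    {**row, "region": row["region"].strip().title(), "spend": float(row["spend"])}
    for row in raw
    if row["spend"] is not None
]

# Enrichment: join in a reference attribute the analysis needs.
region_managers = {"East": "Kim", "West": "Ravi"}
enriched = [{**row, "manager": region_managers[row["region"]]} for row in cleaned]

# Reshaping: roll up from row-per-customer to a per-region summary.
summary = {}
for row in enriched:
    summary[row["region"]] = summary.get(row["region"], 0.0) + row["spend"]
print(summary)  # {'East': 120.5, 'West': 88.0}
```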
Closing the self-service gap for analytics teams
Analytics teams close the self-service gap when analysts can build and ship their own data workflows inside platform guardrails. Only about 20% of decision-makers are hands-on with analytics today, while the other 80% rely on that 20% for data sourcing, discovery, and integration; building metrics and KPIs; running analytics; and delivering insights. That dependence creates the ticket queue analytics leaders want to shorten, along with the engineering time sink data platform leaders want to reclaim.
Closing the gap means equipping analytics teams with agentic, AI-accelerated data preparation for the data workflows they already own, while data engineers continue to own ingestion, modeling, and governance upstream. The business wants fast, trusted, accurate data, and analysts want to deliver it without waiting on engineering. When analysts serve themselves within platform guardrails, the analyst becomes the hero, the business gets what it has been asking for, and engineering stops being the bottleneck.
What analytics teams need in a self-service surface
Analytics teams need a surface shaped around their day-to-day work. A few capabilities make self-service practical in analytics contexts:
- Agentic AI assistance for outcomes: Analysts describe an outcome, for example, "aggregate Q1 returns by region and flag outliers," and AI agents generate a data workflow that analysts can inspect and refine step-by-step.
- Approachable transformation logic: Something like a visual canvas for data workflows, with production-ready code available when an engineer wants to review it, makes transformation approachable for analysts.
- Analytics-level access control: Analytics teams might need granular edit rights on specific data workflows without broadening access across the whole environment.
- Lightweight testing and promotion: Built-in row-count checks, schema validation, and a promotion path let analysts ship their work, while data engineering teams keep full Continuous Integration/Continuous Deployment (CI/CD) control over the ETL layer.
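To make the first bullet concrete, here is a rough sketch of what a generated workflow for "aggregate Q1 returns by region and flag outliers" might compute under the hood. It uses an in-memory SQLite table; the schema, the sample values, and the outlier rule (total above 1.5x the cross-region mean) are all illustrative stand-ins, not Prophecy output.

```python
import sqlite3
import statistics

# Hypothetical returns table; schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE returns (region TEXT, return_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO returns VALUES (?, ?, ?)",
    [
        ("East", "2024-01-15", 120.0),
        ("East", "2024-02-03", 95.0),
        ("West", "2024-01-20", 88.0),
        ("West", "2024-03-11", 910.0),  # an unusually large return
        ("North", "2024-02-27", 101.0),
    ],
)

# Step 1: aggregate Q1 returns by region.
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS total
    FROM returns
    WHERE return_date BETWEEN '2024-01-01' AND '2024-03-31'
    GROUP BY region
    ORDER BY region
    """
).fetchall()

# Step 2: flag outliers. The rule here (total above 1.5x the cross-region
# mean) is a stand-in for whatever definition the analyst refines step-by-step.
mean_total = statistics.mean(total for _, total in rows)
flagged = [(region, total, total > 1.5 * mean_total) for region, total in rows]
for region, total, is_outlier in flagged:
    print(region, total, "OUTLIER" if is_outlier else "ok")
```

The two numbered steps mirror how an analyst would inspect an agent-generated workflow: one aggregation step, one flagging step, each reviewable on its own.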
These capabilities complement the ETL work that data engineers already do. ETL tools keep doing the orchestration and ingestion work they were designed for, while analytics teams get a surface designed for the data workflows they were hired to build.
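The lightweight row-count and schema checks described above can be sketched in a few lines of plain Python. The function names, thresholds, and sample output rows are illustrative, not Prophecy's built-in API; they show only the shape of the safety net an analyst gets before promoting a workflow.

```python
# Illustrative pre-promotion checks; names and rules are hypothetical.
def check_row_count(rows, minimum=1):
    """Fail fast if a workflow output is unexpectedly empty or truncated."""
    if len(rows) < minimum:
        raise ValueError(f"expected at least {minimum} rows, got {len(rows)}")
    return True

def check_schema(rows, expected_columns):
    """Validate that every output row carries the agreed-upon columns."""
    for i, row in enumerate(rows):
        missing = expected_columns - row.keys()
        if missing:
            raise ValueError(f"row {i} is missing columns: {sorted(missing)}")
    return True

# Sample workflow output to validate.
output = [
    {"region": "East", "total": 215.0},
    {"region": "West", "total": 998.0},
]
assert check_row_count(output, minimum=2)
assert check_schema(output, {"region", "total"})
```

Checks like these run before promotion, while the data engineering team's CI/CD pipeline continues to gate the ETL layer itself.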
Why not just use Claude Code or general-purpose AI directly?
General-purpose AI can generate code fast, but without a shared structure, each analyst ends up with a different output shape.
Agentic, AI-accelerated data preparation pairs AI speed with:
- Human review: Analysts inspect and validate every workflow the agents generate, so nothing ships unchecked.
- Standardization: Platform teams define the components, connectors, and patterns analysts use, so workflows look consistent across the team.
- Git retention: Every change is inspectable, reviewable, and retained as part of the platform team's source of truth, with no separate code scanning layer required.
Where Prophecy fits
Prophecy gives analytics teams agentic data preparation for the workflows ADF doesn't serve. Teams use it after data has been ingested, modeled, and governed in cloud data platforms like Databricks, Snowflake, or BigQuery. ETL pipelines remain the primary way data enters the platform, and pipelines stay with data engineering. Prophecy handles what comes next, so analysts prepare data for analysis, build visual workflows, run ad hoc queries, and transform data confidently without opening a ticket.
Teams often start small, alongside tools they already run for ETL. Most begin with the efficiency use case: showing the team a faster, better way to build and manage data workflows alongside what they already have. When the value is clear, broader migration follows naturally.
Here's what Prophecy offers analysts and the platform teams who support them:
- A team of AI agents behind every analyst: Prophecy's agentic AI features span data workflow generation, documentation creation, error explanation, and code summarization. An analyst describes an outcome, and multiple AI agents collaborate to generate a governed data workflow that the analyst can inspect, refine, and validate.
- Logic anyone on the team can validate: Prophecy renders AI-generated results as visual workflows, so analysts can inspect the logic step-by-step while platform teams review, standardize, and retain every change in Git. Users describe an "intuitive visual editor and elegant UI."
- Governed self-service on your platform: Prophecy 4.0 introduced fully governed self-service data preparation designed so analytics teams work within the guardrails that data engineering and platform teams define. It has been described as a rare combination of technical depth, user-centric design, and enterprise-grade controls.
- Your cloud, your control: Compute, governance, and security all live in your stack rather than a separate vendor's infrastructure. Platform teams stay in control of the cloud data platform they already manage, which creates a very different conversation than asking Information Technology (IT) to adopt someone else's governance model.
- Ready for the BI tools your teams already use: Business Intelligence (BI) tools are powerful for visualization and analysis, but they depend on well-prepared data sets. Analysts use Prophecy to shape and transform data so that datasets flowing into the BI layer are ready for analysis. Reporting and dashboards stay with your existing BI tools, and Prophecy makes the data workflows feeding them more reliable.
- A practical path from legacy tools: Teams carrying data workflows from tools like Alteryx can use Prophecy's transpiler to convert them into governed, cloud-native data workflows on Databricks, Snowflake, or BigQuery. Platform teams can then show real modernization progress, with workflows migrated and adoption climbing, without a multi-year rebuild.
Who should see Prophecy in action?
The people who need to see Prophecy are the analysts and application teams who will actually use it, along with the platform team who needs to trust it. A leadership deck doesn't capture the value. Analysts feel how fast they can move from question to governed data workflow, and platform teams see how governance and compute stay entirely in their control. Leadership sees the outcome, while these teams feel the difference day-to-day.
Analytics leaders identify the productivity gap and look for a better path. Data platform leaders are the decision-makers who want efficiency, data quality, and something their engineering team can trust and govern. AI-accelerated data preparation speaks to both groups because analysts become self-sufficient on the data workflows they own, and platform teams get full visibility and control.
Close the analytics self-service gap with Prophecy
Analytics teams are stuck waiting on engineering for analytics work they should be able to ship themselves, and leaders on both sides feel the drag. Analysts lose momentum, and platform teams lose visibility into what gets built outside the guardrails. Prophecy closes that gap with agentic data preparation, so analysts get visual workflows they can inspect and refine, while platform teams retain full control and governance over the cloud data platform they already manage.
Teams that adopt this approach see analytics move faster on the data workflows they already own, while platform teams keep the control and visibility they're accountable for. Book a demo to see Prophecy's AI agents and visual workflows running on your cloud data platform.
FAQ
How does Prophecy help analytics teams build data workflows?
Prophecy uses multiple AI agents to generate, refine, and explain data workflows, rendered as visual workflows that analysts can inspect and validate step-by-step. That approach gives analytics teams speed, independence, and efficiency on the data workflows they own, while data engineering continues to manage ETL upstream.
Does Prophecy replace ETL or data engineering tools?
No. ETL pipelines remain the primary way data enters the cloud data platform, and data engineers continue to own ingestion, orchestration, and governance using tools like ADF. Prophecy picks up after that data lands in the platform, giving analysts agentic, AI-accelerated data preparation for the data workflows they own. Teams typically use both.
How does Prophecy compare to running Claude Code or a general-purpose AI assistant directly?
General-purpose AI can generate code quickly, but without governance, each analyst ends up with a different output shape. Prophecy pairs AI acceleration with human review, standardization, and Git retention, so platform teams get consistent, reliable data workflows instead of a pile of ungoverned scripts to audit later.
What if our team is migrating from Alteryx or another legacy desktop tool?
Teams might start with the Prophecy transpiler, which converts existing data workflows into governed, cloud-native data workflows on Databricks, Snowflake, or BigQuery. The efficiency use case is usually where teams begin, proving the value alongside existing tools and then migrating at a pace that doesn't put the team's credibility on the line.
Which cloud data platforms does Prophecy support?
Prophecy runs natively on cloud data platforms like Databricks, Snowflake, and BigQuery. Data workflows execute on the platform your data engineering team already manages, so compute, governance, and security stay in your stack.
Ready to see Prophecy in action?
Request a demo and we'll walk you through how Prophecy's AI-powered visual data pipelines and high-quality open-source code empower everyone to accelerate data transformation.

