TL;DR
- Analytics backlogs stall the business: Requests grow faster than data engineering can service them alongside core ETL work, so analytics teams sit in the queue.
- Generative AI shifts the analyst's job: AI drafts workflow code, learns data quality patterns, and lets analysts prepare data confidently without filing tickets.
- Context-aware agents beat generic copilots: Agents grounded in your schemas and catalogs produce output that fits your environment, which is what makes agentic tooling genuinely useful for building and maintaining data workflows.
- Human review stays essential: Roughly one in five AI-generated queries returns wrong results even when the code runs cleanly, so analysts must validate the output.
- Prophecy delivers agentic data preparation: Prophecy gives analysts AI-accelerated visual workflows on Databricks, Snowflake, or BigQuery, so they ship faster while the data team keeps governance in place.
Every business wants faster answers from data, yet the people closest to the questions, like analysts and analytics engineers, spend their days waiting on someone else's queue. Requests pile up with data engineering, schemas shift upstream, and the backlog of "just one more transformation" keeps growing. Despite AI capabilities being embedded across every major cloud platform, most organizations still haven't turned those capabilities into real productivity for the people preparing data for analysis.
This article focuses specifically on analytics workflows: the additional transformation and preparation work analysts do on top of governed data already sitting in a cloud data platform. The goal is a faster, safer path from governed dataset to analysis-ready output, without sidestepping the governance the data team owns.
Why analytics capacity is the real bottleneck
Analytics capacity is the bottleneck because analytics work scales with every new business question, every new downstream consumer, and every upstream schema change. Responsibilities are typically split along clear lines:
- Data engineers own the platform: They build ETL pipelines, manage ingestion, and define governance across cloud data platforms like Databricks, Snowflake, or BigQuery, so governed data lands reliably.
- Analytics teams turn governed data into insights: They build data workflows (sometimes also referred to as data pipelines), perform additional transformation, run ad hoc queries, and analyze the data engineering team's output.
- The handoff is where things stall: Requests from analytics teams queue up with data engineering for routine transformation work that could be self-served with the right tooling.
The analyst's job is shifting from writing to directing
Analysts now spend more time directing and reviewing generated workflow code than writing it from scratch. Natural language workflow generation is one of the biggest shifts in day-to-day analytics work, and a few patterns illustrate it:
- Natural language workflow agents: Agents accept prompts like "load data from the customer_orders table, standardize dates, remove duplicates by order ID, and write it to an analytics table," then build the workflow and draft basic unit tests (a sketch of this kind of generated SQL follows this list).
- Built-in cloud assistants: Cloud-native assistants let practitioners author jobs and troubleshoot issues using natural language.
- Visual workflow designers: Visual interfaces let analytics teams build workflows graphically while generating the underlying code, which keeps output reviewable and version-controlled.
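To make that concrete, here's a minimal sketch of the SQL an agent might draft for the prompt above. The table and column names (raw.customer_orders, analytics.customer_orders_clean, order_total) are illustrative assumptions, not a real schema:

```sql
-- Illustrative sketch of agent-drafted output for the prompt above.
-- All table and column names are assumptions for the example.
CREATE OR REPLACE TABLE analytics.customer_orders_clean AS
WITH standardized AS (
    SELECT
        order_id,
        customer_id,
        -- Standardize the date column to a proper DATE type
        CAST(order_date AS DATE) AS order_date,
        order_total
    FROM raw.customer_orders
),
deduplicated AS (
    SELECT
        *,
        ROW_NUMBER() OVER (
            PARTITION BY order_id          -- dedupe key from the prompt
            ORDER BY order_date DESC       -- keep the most recent record
        ) AS row_num
    FROM standardized
)
SELECT order_id, customer_id, order_date, order_total
FROM deduplicated
WHERE row_num = 1;  -- exactly one row per order_id
```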
Across these tools, the primary activity moves from writing code to directing AI agents and validating their output. That changes what productivity means for the role.
Context-aware AI agents replace generic code completion
Context-aware AI agents understand your data catalog in ways generic code completion tools don't, and that changes what analytics teams can ship. The difference comes down to a few capabilities:
- Grounded in your catalog: Agents that are aware of your specific schemas, datasets, and workflow structures produce output that actually fits your environment.
- Protocol-aware integrations: Model Context Protocol (MCP) integrations keep agents aware of project resources, including data, compute, and code across many sources.
- Full-lifecycle awareness: MCP-connected agents can discover schemas, map sources to targets, and write and test transformation scripts across the entire workflow lifecycle.
Data quality monitoring gets smarter without manual rules
AI-driven monitoring learns data-quality patterns rather than relying on manually defined thresholds. Rule-based systems require practitioners to define expected thresholds per column, which becomes unmanageable as workflows multiply. AI monitoring learns historical trends and seasonal behaviors, like weekend volume dips or tax-season spikes, and detects anomalies across tables without manual rule writing. That approach addresses a known failure mode in which threshold-based monitoring fires false positives due to predictable periodic variation.
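To see why learned baselines matter, here's a simplified, hand-rolled version of the idea in SQL. The sketch assumes a hypothetical load_stats table of daily row counts and compares today's load against a same-weekday trailing baseline rather than a fixed threshold; a real learned monitor is more sophisticated, but the contrast with static rules is the same:

```sql
-- Sketch of a seasonality-aware volume check (load_stats and its
-- columns are assumptions). The baseline comes from history, not a
-- hand-set threshold, so weekend dips don't trip weekday alerts.
WITH baseline AS (
    SELECT
        table_name,
        AVG(row_count)    AS avg_rows,
        STDDEV(row_count) AS stddev_rows
    FROM load_stats
    WHERE load_date >= CURRENT_DATE - INTERVAL '84' DAY
      AND load_date <  CURRENT_DATE
      -- Compare like with like: same day of week only
      -- (date functions vary by SQL dialect)
      AND EXTRACT(DOW FROM load_date) = EXTRACT(DOW FROM CURRENT_DATE)
    GROUP BY table_name
)
SELECT s.table_name, s.row_count, b.avg_rows
FROM load_stats s
JOIN baseline b ON s.table_name = b.table_name
WHERE s.load_date = CURRENT_DATE
  -- Flag loads more than three standard deviations from baseline
  AND ABS(s.row_count - b.avg_rows) > 3 * b.stddev_rows;
```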
Analysts are shipping more workflows with less boilerplate
Analysts ship more workflows when they direct generative AI across the workflow lifecycle instead of hand-writing every transformation. Generated workflows use the same ANSI-standard SQL (Structured Query Language) and the same orchestration primitives as hand-written ones, so analytics output integrates cleanly with the data engineering team's governance and review practices. That pattern changes day-to-day delivery in a few specific ways:
- Analysts focus on analysis, not boilerplate: Scaffolding, unit tests, and routine transformations get drafted by AI agents, so analysts spend time on the questions the business is actually asking (a sketch of a drafted test follows this list).
- Workflow changes move faster: Teams respond more quickly when schemas shift or new governed sources arrive.
- Analysis lands sooner: Business consumers get the insights they need with less waiting on shared engineering queues.
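As an example of the drafted tests mentioned above, here's the shape such a test might take, following the common convention that a test query passes when it returns zero rows, so any row returned is a concrete failing record a reviewer can inspect. Table names carry over from the earlier sketch and are assumptions:

```sql
-- Sketch of agent-drafted data tests (names assumed).
-- Convention: zero rows returned means the test passes.

-- Test 1: order_id should be unique after deduplication
SELECT order_id, COUNT(*) AS occurrences
FROM analytics.customer_orders_clean
GROUP BY order_id
HAVING COUNT(*) > 1;

-- Test 2: standardized dates should be present and not in the future
SELECT order_id, order_date
FROM analytics.customer_orders_clean
WHERE order_date IS NULL
   OR order_date > CURRENT_DATE;
```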
Together, these shifts let analytics teams keep pace with business demand without renegotiating their handoff with data engineering.
Where Prophecy fits in this shift
Prophecy gives analytics teams an AI-accelerated, self-service path to build and maintain data workflows on governed data, without bypassing the data engineering team's standards. The pattern emerging across cloud platforms, where AI agents generate, humans review, and governance stays embedded, is the same workflow Prophecy was built around. Prophecy is agentic data preparation for analytics, so analysts can prepare data, transform it, and build data workflows confidently, without submitting tickets for every request.
Prophecy is used after data is already in the cloud data platform, so ETL pipelines remain the primary way data enters the platform and analytics teams pick up from there. A few things distinguish the approach:
- Runs on your cloud data platform: Prophecy works on top of Databricks, Snowflake, or BigQuery, using the compute and governance that already live there.
- Multiple AI agents, not one assistant: Agentic features cover workflow authoring, documentation, and migration from legacy analytics tools.
- Visual workflows backed by code: Analysts build visual workflows or write code directly, and the generated SQL is version-controlled in Git as the single artifact.
- Platform standards stay intact: Governance, lineage, and continuous integration and continuous delivery (CI/CD) practices that the data team already enforces extend to AI-generated workflows automatically.
- Prepared data for BI tools: Business intelligence (BI) tools depend on well-prepared datasets; Prophecy helps analysts prepare the data so BI tools can focus on reporting and dashboards.
Beyond new development, teams can also use Prophecy's transpiler to bring workloads from legacy ETL and analytics tools into their cloud data platform, so modernization happens step-by-step alongside new workflow development.
Governance has to scale with AI-generated code
Governance responsibility grows for the data team as generative AI accelerates analytics output. Generative AI introduces new risks, including logic errors in generated code, misconfigured access controls, and gaps in training data provenance.
To close those gaps, data teams need governance that extends to AI-generated workflows in several specific ways:
- Review rigor stays consistent: AI-generated code gets the same review as hand-written code, with analysts and data engineers collaborating on the same artifact.
- Provenance metadata is tracked: The AI agent, model version, and human reviewer are logged, keeping audits straightforward (see the sketch after this list).
- Existing policies extend automatically: Rules defined by the data team apply to AI-generated components without rework.
- RBAC and audit trails are built in: AI-generated workflows inherit the same role-based access control (RBAC) and audit trails as hand-written ones.
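To make the provenance idea concrete, here's a hedged sketch of what such a log could look like. The governance.workflow_provenance table, its columns, and the role name are hypothetical illustrations of the pattern, not Prophecy's actual metadata model:

```sql
-- Hypothetical provenance log for AI-generated workflow components;
-- the schema is illustrative, not a real product's metadata model.
CREATE TABLE IF NOT EXISTS governance.workflow_provenance (
    workflow_id   VARCHAR   NOT NULL,
    component_id  VARCHAR   NOT NULL,
    generated_by  VARCHAR   NOT NULL,  -- which AI agent drafted it
    model_version VARCHAR   NOT NULL,  -- model that produced the draft
    reviewed_by   VARCHAR   NOT NULL,  -- the human who approved it
    reviewed_at   TIMESTAMP NOT NULL,
    git_commit    VARCHAR   NOT NULL   -- ties the record to the artifact
);

-- Each approved, AI-drafted component gets one auditable row
INSERT INTO governance.workflow_provenance
VALUES ('customer_orders_clean', 'dedupe_step',
        'workflow-authoring-agent', 'model-2025-06',
        'a.analyst@example.com', CURRENT_TIMESTAMP, '9f3c2ab');

-- RBAC applies to generated and hand-written tables alike
-- (grant syntax varies by platform)
GRANT SELECT ON analytics.customer_orders_clean TO ROLE analyst_reader;
```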
Equipping analysts with governed, AI-powered tooling is safer than the alternative, where individuals paste code from chat-based assistants into workflows without lineage or review trails.
Human oversight is a permanent requirement
Generative AI for analytics work requires ongoing human validation, and that requirement will remain as models improve. A few findings underline why the review can't be optional:
- Debugging SQL is still hard: The leading reasoning model achieved only a 38.87% success rate on real-world SQL debugging tasks.
- "Runs clean" doesn't mean "correct": Research on structural SQL failures found roughly one in five generated queries contain errors that execution-level testing won't catch, because they run without errors but return wrong results.
- Enterprise schemas widen the gap: Multi-model evaluations show accuracy declines sharply in real enterprise scenarios, with missing internal organizational knowledge identified as a gap that model improvements alone can't close.
- Anchoring bias skews reviews: When AI output is incorrect, it can negatively affect human judgment because reviewers anchor to the model's output rather than independently evaluating its correctness.
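A classic instance of the "runs clean but wrong" failure is join fan-out: the query below executes without errors yet silently overstates revenue. Table names are assumptions for the example:

```sql
-- Runs without errors, returns wrong results (names assumed).
-- Each order joins to every one of its shipments, so orders with
-- multiple shipments are summed multiple times.
SELECT SUM(o.order_total) AS total_revenue    -- silently inflated
FROM orders o
JOIN shipments s ON s.order_id = o.order_id;  -- one-to-many fan-out

-- A correct version counts each shipped order exactly once
SELECT SUM(order_total) AS total_revenue
FROM orders
WHERE order_id IN (SELECT order_id FROM shipments);
```

Execution-level testing passes both queries; only a reviewer who knows the orders-to-shipments relationship catches the first one.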
Given current model capabilities, the Generate, Review, Refine, Deploy pattern, powered by multiple AI agents with analysts in the loop, is what holds up in production.
What this means for your analytics team
Analytics teams capture real productivity gains when they pair generative AI with consistent human review and the embedded governance that their data engineering partners maintain. Teams using AI save an average of 5.9 hours per week, and organizations are now redesigning their work around a simple principle: AI agents generate first drafts, humans refine and validate, and the governance the data team already owns stays automated throughout.
That approach respects the existing split of responsibilities (data engineers on ETL, ingestion, and governance; analytics teams on data workflows, transformation, and analysis) while accelerating the parts of the process that used to consume the most hours.
Accelerate self-service analytics with Prophecy
Analytics backlogs that keep growing create a productivity gap that widens over time, and business consumers end up getting answers more slowly than they need. Prophecy closes that gap with agentic data preparation, an AI-accelerated self-service path for analysts to prepare data and build data workflows on cloud data platforms, with the data team's governance built in. Analysts move from question to production-ready workflow without the usual boilerplate drag, and the data team keeps full oversight the whole way through.
Here's what sets Prophecy apart:
- AI agents across the workflow: Multiple agents handle workflow authoring, documentation, quality checks, and legacy migration, so analysts ship faster without losing governance.
- Visual interface backed by code: Analysts build visual workflows or work directly in code, and review the generated SQL from the same versioned artifact.
- Built-in governance: Lineage, RBAC, and version history are native to the platform, so the data team keeps full oversight.
- Deployment to cloud data platforms: Prophecy runs on Databricks, Snowflake, and BigQuery, so compute and governance stay inside the platform your team already trusts.
One data practitioner used this approach to unify product usage, billing, CRM, and support data into a single customer health pipeline. One business-problem prompt produced entity resolution, behavioral cohorts, health scoring, and expansion opportunity ranking as one connected workflow.
Here’s what the shift looks like:
- Before: Weeks of hand-written joins, feature engineering, and scoring functions across four systems.
- After: A clear prompt, then time spent validating logic and reasoning about outcomes.
- Result: Production-ready pipelines shipped in a sitting, not a sprint.
Ready to see agentic data preparation at work? Book a demo and see Prophecy's AI agents in action.
Frequently asked questions
How does generative AI change day-to-day analytics work?
Generative AI shifts analysts from hand-writing every transformation to directing AI agents and validating their output. Analysts still own analysis and the questions the business is asking, but routine scaffolding, documentation, and unit tests get drafted by AI, which frees time for drawing insights from the data.
Does Prophecy replace my existing ETL or BI tools?
No. Prophecy focuses on self-service data workflows on top of your cloud data platform, after governed data is already there. ETL pipelines remain the primary way data enters the platform, and BI tools remain the primary place for reporting and dashboards.
How do AI agents help analysts build data workflows?
AI agents draft workflows from natural language, author jobs, troubleshoot issues, and support transformation work with project-aware context across the lifecycle. Analysts review and refine the output in a visual or code interface, which gives them AI speed with the confidence of human-reviewed, version-controlled code.
Which cloud data platforms does Prophecy work with?
Prophecy runs on top of Databricks, Snowflake, and BigQuery. Compute, security, and governance stay inside your platform, so your data team keeps control of how data is stored, accessed, and audited across every workflow.
Ready to see Prophecy in action?
Request a demo and we'll walk you through how Prophecy's AI-powered visual data pipelines and high-quality open-source code empower everyone to speed up data transformation.

