TL;DR
- Data engineers own ETL pipelines and governance, while analytics teams build analytics workflows on top of trusted data.
- dbt serves analytics engineers who want code-first transformation models, built in SQL and versioned in Git.
- Prophecy is agentic data prep that gives analytics teams AI-accelerated, self-service analytics workflows on Databricks, Snowflake, or BigQuery. Multiple AI agents guide analysts through visual workflows while Prophecy generates reviewable SQL and Python code in Git.
- The two tools often coexist, with dbt supporting analytics engineering and Prophecy supporting analyst self-service.
Many analytics teams face the same pressure. The business wants answers fast, a handful of analysts have the domain knowledge to deliver them, and engineering is buried under a queue of data workflow requests. Hiring alone may not close the gap. A data talent shortage is one reason analytics teams need better tooling alongside any new headcount.
Data work is a team sport. Data engineers own ingestion, extract-transform-load (ETL) pipelines, and governance. They land trusted data in cloud data platforms like Databricks. Analytics teams pick up from there, shaping that governed data into analytics workflows, ad hoc queries, and analyses that answer business questions.
This article tackles one narrow question: once data lands in your cloud data platform, what's the best way for analytics teams to build analytics workflows on top of it? Our stance is that dbt and Prophecy solve different parts of that problem. Dbt handles code-first analytics engineering, while Prophecy provides agentic data preparation so analysts can build governed analytics workflows themselves, backed by real code in Git.
What dbt does well
Dbt fits analytics engineering teams. It has kept its strongest recommendation level, "Adopt," on the Thoughtworks Technology Radar across multiple consecutive editions, signaling that the tool belongs in production analytics engineering stacks.
The tool supports engineering rigor, enabling modularity, testability, and reusability in SQL-based transformations, and it integrates well with cloud data warehouses and lakehouses. For analytics engineers and SQL-fluent teams, that framework creates a consistent, reviewable way to ship transformation logic. A few capabilities stand out for teams evaluating it:
- Plain-text, reviewable code: SQL or Python files live in Git, so changes are reviewed exactly like software engineering changes
- Built-in testing: dbt test acts as a deployment gate, validating schema, referential integrity, and custom assertions before anything lands in production
- Automatic documentation: Documentation and lineage are generated from the model graph, so analytics engineers spend less time maintaining docs manually
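To make the "plain-text, reviewable code" point concrete, a dbt model is just a SQL file with Jinja references; the model and column names below are hypothetical, for illustration only:

```sql
-- models/marts/fct_orders_daily.sql (hypothetical model name)
-- The ref() call points at an upstream staging model, so dbt can
-- build the dependency graph and generate lineage automatically
select
    order_date,
    count(*)          as order_count,
    sum(order_total)  as revenue
from {{ ref('stg_orders') }}
group by order_date
```

Because this is an ordinary text file, changes go through the same Git pull-request review any software change would.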
Where analytics teams need something different
Analytics teams include more than analytics engineers. Financial analysts, marketing operations analysts, and domain specialists bring deep business knowledge but varying levels of SQL or Git fluency, and they still need to prepare data for analysis, build analytics workflows, and run ad hoc queries without waiting in a queue.
Code-first tools work well for analytics engineers, but they weren't designed for analysts who think in business logic first. CIOs increasingly plan to lean on in-house developers plus generative AI and low-code tools to build differentiation. Teams might look for ways to let analysts contribute to analytics workflows directly, inside the governance boundaries engineers already established.
How Prophecy fits into an analytics team
Prophecy is AI-accelerated data preparation that sits after ingestion and gives analytics teams a self-service way to build analytics workflows, sometimes also referred to as data pipelines, on trusted data. Data engineering teams keep ownership of the underlying platform, while analysts get a faster path from question to answer.
When building workflows, you iterate visually while Prophecy maintains the underlying codebase, which serves as a single source of truth. Analysts build visual workflows guided by agentic AI features, and Prophecy produces governed SQL and Python code that runs natively on the cloud data platform and is stored in Git.
How AI agents enable self-service analytics
Self-service analytics only works when the tooling keeps pace with the analyst, which is where AI agents come in. Prophecy's AI agents work on different parts of the analytics workflow, including understanding the data, suggesting transformations, creating documentation, generating visual workflows, and producing reviewable code. That structure changes what analysts can own end-to-end:
- Faster preparation: AI agents work alongside analysts to prepare data sets for analysis, suggest transformations such as cleaning, and accelerate ad hoc work.
- Confident transformation: Engineers perform significant transformations during ETL, but analysts often need additional transformations for specific analyses. AI agents guide analysts through those last-mile steps, improving quality along the way.
- Independence from the ticket queue: Analysts build and run governed analytics workflows themselves, on the cloud data platform, within the guardrails that data teams already defined.
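For illustration only, the reviewable output of such an analyst-built workflow is plain SQL; the schema, table, and column names below are hypothetical, not Prophecy's actual generated code:

```sql
-- Hypothetical example of analyst-built prep logic committed to Git:
-- standardize region codes, bucket deals by close month, and
-- aggregate into a tidy dataset for downstream BI use
select
    upper(trim(region_code))       as region,
    date_trunc('month', closed_at) as close_month,
    sum(deal_amount)               as total_bookings
from governed.sales_deals
where closed_at >= date '2024-01-01'
group by 1, 2
```

Because the output is ordinary SQL in Git, data engineers can review analyst-built work the same way they review their own.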
Working alongside business intelligence tools
Business intelligence (BI) tools like Power BI and Tableau handle visualization, dashboards, and analysis well when the underlying data is well prepared. Their performance depends on the quality and shape of the data sets feeding them, which means the preparation step often determines how useful the BI layer becomes.
Prophecy focuses on that preparation step. Reporting and dashboards remain the job of the BI layer, while Prophecy delivers the well-structured datasets that those tools rely on, so analysis is faster and more trustworthy.
Governance and platform fit
Because Prophecy generates code that runs on the cloud data platform, governance stays with the platform and the data engineering team. Compute runs on Databricks, Snowflake, or BigQuery, and security, access control, and lineage inherit from the platform's native controls.
Here's how the two tools line up for analytics teams evaluating both:
- Version control: dbt stores SQL, Python, and YAML natively in Git, and Prophecy generates SQL and Python code stored in Git. Both produce reviewable, diffable transformation code, and Prophecy's documentation describes standard continuous integration/continuous delivery (CI/CD) integration.
- Automated testing: dbt provides built-in schema tests, custom SQL assertions, and source freshness checks. Prophecy supports per-step unit testing.
- Platform integration: Prophecy runs compute on your Databricks, Snowflake, or BigQuery environment and integrates with controls like Unity Catalog, single sign-on (SSO), and role-based access control (RBAC). The dbt-Databricks connector also supports Unity Catalog access controls.
- Compliance: Prophecy maintains SOC 2 certification and supports General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) compliance, per its published security and compliance documentation.
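To make the testing comparison concrete, a dbt singular test is just a SQL file that selects rows violating an assertion; any returned row fails the test. The model and test names below are hypothetical:

```sql
-- tests/assert_no_negative_revenue.sql (hypothetical test name)
-- dbt treats every row this query returns as a test failure,
-- so an empty result set means the assertion holds
select
    order_date,
    revenue
from {{ ref('fct_orders') }}
where revenue < 0
```

Tests like this typically run in CI before a merge, which is how the "deployment gate" pattern works in practice.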
How to think about choosing
Both tools can live in the same organization. Analytics engineers may lean on dbt for the transformation models they own, while analytics teams use Prophecy to build workflows on top of governed data. A few questions teams might ask when deciding where each tool fits:
- Who is building the analytics workflow? For SQL-fluent analytics engineers working in Git, dbt is a strong fit; for a mix of analysts with varying SQL depth, Prophecy's agentic AI features let more people contribute.
- Where does the data engineering handoff happen? ETL pipelines remain the primary way data enters the platform, and both tools pick up after that point, though Prophecy is designed specifically for the analytics work that happens after ingestion.
- What does the analytics team need to deliver? For reviewable transformation code alongside analytics engineering culture, dbt works well; for AI-accelerated self-service that still produces governed code, Prophecy is the better match.
Build faster self-service analytics workflows with Prophecy
Analytics teams want to deliver trusted insights quickly, yet waiting on engineering tickets for every transformation or ad hoc dataset slows the whole process down. Prophecy is agentic data preparation that lets analysts build governed analytics workflows on top of the data engineering team's work, without sacrificing engineering standards.
Here's what that looks like in practice:
- AI agents: Multiple agentic AI features guide analysts step by step through data prep, transformation, and analytics workflow creation, so non-engineers can build confidently
- Visual interface: Analysts build visual workflows while Prophecy generates reviewable SQL and Python code in Git, keeping engineering standards intact
- Built-in governance: Analytics workflows deploy through standard CI/CD and inherit the platform's native access, lineage, and security controls, so there's no separate path to production for analyst-built work
- Deployment to cloud platforms: Compute runs on Databricks, Snowflake, or BigQuery, under the governance that the platform team already set up
With Prophecy, analytics teams move faster, data engineering teams stay focused on ingestion and governance, and the business gets answers sooner. Book a demo to see Prophecy's AI agents and visual workflows in action.
FAQ
How is Prophecy different from dbt?
Dbt is a code-first analytics engineering framework built for SQL-fluent analytics engineers working in Git. Prophecy is agentic data preparation for analytics teams, using multiple AI agents and visual workflows so analysts with varying technical skills can build governed analytics workflows that still produce reviewable code.
Does Prophecy replace my data engineering team?
No. Data engineering teams still own ingestion, ETL pipelines, and governance on the cloud data platform. Prophecy is used after data lands in the platform, so analytics teams build workflows on top of that governed data without adding tickets to the engineering backlog.
Does Prophecy work with Databricks, Snowflake, and BigQuery?
Yes. Prophecy runs compute on cloud data platforms like Databricks, Snowflake, or BigQuery and integrates with native controls such as Unity Catalog, SSO, and RBAC, so governance stays with the platform.
Does Prophecy replace BI tools?
No. BI tools like Power BI and Tableau remain the place for dashboards, reporting, and visualization-driven analysis. Prophecy prepares and transforms the data that those tools rely on, so analysts can deliver cleaner, better-structured datasets into the BI layer.
Ready to see Prophecy in action?
Request a demo and we'll walk you through how Prophecy's AI-powered visual data pipelines and high-quality open-source code empower everyone to speed up data transformation.

