Governed Self-Service for Databricks: Letting Analysts Build Workflows Safely

Learn how to let analysts build Databricks workflows without sacrificing governance, audit trails, or compliance.

Prophecy Team

March 9, 2026

TL;DR

Here's what you need to know:

  • Gatekeeping backfires: Blocking analysts from building data workflows doesn't eliminate risk. It pushes work into ungoverned tools where you lose audit trails, lineage, and access controls.
  • Unity Catalog keeps you protected: Governed self-service on Databricks can rely on Unity Catalog's layered controls, including attribute-based access control (ABAC), row filters, and column masks, to enforce policies automatically, regardless of who builds the workflow.
  • Prophecy encodes governance: Prophecy is an AI-accelerated data prep platform that lets analysts build data workflows on already-prepared data, generating native Databricks code, respecting Unity Catalog policies, and deploying through your existing Continuous Integration and Continuous Deployment (CI/CD) workflows.
  • Full auditability: Every analyst-built workflow produces version-controlled, auditable code with a complete chain of evidence covering who built it, what data was accessed, and when it ran.
  • You can start small: Not everyone wants to undertake a comprehensive migration in one big push. Consider starting with a focused pilot. Work with your platform team to identify high-demand datasets, create reusable Prophecy Packages, and expand access incrementally as you validate controls.

You've been there. You need data to answer a business question, but the request sits in a queue behind dozens of others. Your platform team has already done the hard work, building extract, transform, load (ETL) pipelines, setting up Unity Catalog, and preparing the data. But getting from "the data exists" to "I can actually use it" still takes too long.

Your instinct might be to export what you can to a spreadsheet or find a workaround. That's understandable, but it creates the exact compliance exposure your organization is trying to prevent. 

Prophecy's position is that self-service and governance aren't in conflict. They're complementary. By giving analysts an AI-accelerated data prep platform that works on top of your engineers' already-prepared data, you can build your own data workflows without waiting in a queue, and without weakening the controls your platform team has put in place. This post explains how that works in practice and how to get started.

Gatekeeping creates the chaos everyone's trying to prevent

Every analytics team knows the pattern. You submit a request. It sits in the backlog. You need the answer now, not in three weeks. So you export data to spreadsheets, build workarounds in ungoverned tools, and end up creating the exact problem everyone was trying to avoid, just without audit trails, lineage, or access controls.

Teams commonly report spending a significant share of their time on non-value-added work because data is hard to find, access, or trust. That's usually a governance architecture problem showing up as an analyst productivity problem.

Breach investigations reinforce this: shadow use of ungoverned tools and automation increases both the likelihood and the cost of incidents. Meanwhile, self-service analytics adoption has plateaued in many organizations. Desktop data prep tools like Alteryx don't produce accessible code and lack execution capabilities. Without real self-service capabilities, analysts are left dependent on the engineering team for every data request. That backlog reflects a structural delivery problem, not simply a staffing gap.

What does governed self-service actually mean on Databricks?

Governed self-service means analysts can build their own data workflows, with automated policy enforcement, access controls, and audit trails working behind the scenes to keep everything compliant.

Your data engineers have already done the foundational work: building the ETL pipelines that bring data in, clean it, and organize it in Unity Catalog. Governed self-service lets you work with the prepared data directly, without going back to engineering for every new question or report.

Unity Catalog is what makes this possible. It provides centralized access controls, audit logging, and a consistent security model. In practice, your platform team has set up layered controls that protect the data you're working with:

  • Workspace-level restrictions: These determine which environments you can access, ensuring that production and development data are properly separated.
  • Privileges and ownership controls: These determine which catalogs and schemas you can read from or write to, based on your role.
  • Attribute-based access control (ABAC): Dynamic policies enforce access rules automatically based on data classification and your user attributes. No manual approvals needed.
  • Table-level filtering and masking: Row filters and column masks control exactly what data you see within tables, so sensitive information stays protected even when you have table-level access.
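
How these layers compose can be sketched in plain Python. This is an illustrative simulation of the row-filter and column-mask concepts, not Unity Catalog's implementation; the data, user attributes, and masking rule are invented for the example.

```python
# Illustrative only: simulates how a row filter and a column mask
# compose on a governed read. Not Unity Catalog's implementation.

ROWS = [
    {"region": "EMEA", "customer": "Acme", "ssn": "123-45-6789"},
    {"region": "APAC", "customer": "Globex", "ssn": "987-65-4321"},
]

def row_filter(row, user_attrs):
    # ABAC-style rule: users only see rows for their assigned regions.
    return row["region"] in user_attrs["regions"]

def column_mask(row, user_attrs):
    # Mask the SSN column unless the user holds a (hypothetical)
    # 'pii_reader' role; rows that survive the filter are still redacted.
    if "pii_reader" not in user_attrs["roles"]:
        row = {**row, "ssn": "***-**-" + row["ssn"][-4:]}
    return row

def governed_read(rows, user_attrs):
    # Filter first, then mask what remains.
    return [column_mask(r, user_attrs) for r in rows if row_filter(r, user_attrs)]

analyst = {"regions": ["EMEA"], "roles": ["analyst"]}
print(governed_read(ROWS, analyst))
# -> [{'region': 'EMEA', 'customer': 'Acme', 'ssn': '***-**-6789'}]
```

The same table yields different views for different users, which is the point: the policy travels with the data, not with the tool that reads it.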

The end result is that you get access to the data you need without your platform team having to manually approve every request, and they get the confidence that policies are enforced consistently, in real time.

Prophecy doesn't bypass governance. It encodes it.

This is where things get practical. Prophecy is an agentic data prep platform that lets analysts build data workflows on top of already-prepared data, using visual workflows and AI agents instead of writing code. Instead of operating outside your organization's governance perimeter, Prophecy works within it:

  • Native code generation: Prophecy generates Databricks-native code that runs directly on your organization's infrastructure with no proprietary runtime.
  • Policy compliance: All Unity Catalog policies are respected automatically, including ABAC rules, row filters, and column masks.
  • CI/CD integration: Workflows deploy through your existing CI/CD workflows, following the same review and release process your engineering team already uses.

Unlike legacy tools, where you're locked into their governance model, Prophecy runs on your cloud data platform. Your platform team stays in control. Compute, governance, and security all live in your stack, not Prophecy's.

Your Unity Catalog policies stay intact

Prophecy integrates with Unity Catalog through single sign-on and consistent governance. When you build a workflow in Prophecy, the access controls your platform team has defined in Unity Catalog are honored automatically.

You read and write directly into Unity Catalog tables, selecting the catalog, schema, and table, while ABAC policies, row filters, and column masks control what you can see and touch. There's nothing extra to configure on your end.

AI agents help you prep data without engineering skills

This is Prophecy's key advantage for analysts. AI agents guide you through data preparation, helping you build workflows, apply transformations, and get data ready for analysis, all without writing code or waiting for engineering support.

Importantly, these agents work within your existing permission boundaries. As Prophecy puts it: "Guardrails are still in place. The agents are limited by the users' permissions. With our Databricks Unity Catalog integration, you can extend users' access to the power of agents, without creating security or governance risks."

If you don't have access to a table, the agent working on your behalf won't have access either. You get the speed of AI-accelerated data prep with the safety of your organization's governance controls.
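
That permission-inheritance principle can be sketched as follows. The names (`AgentSession`, `can_read`) and the grants are hypothetical, not Prophecy's actual API; the point is only that the agent checks the user's grants, never its own.

```python
# Hypothetical sketch of permission inheritance: an agent acting on a
# user's behalf carries that user's grants and nothing more. Names and
# grants are invented for illustration.

USER_GRANTS = {
    "analyst_a": {"sales.prepared.orders"},
}

def can_read(user, table):
    return table in USER_GRANTS.get(user, set())

class AgentSession:
    """An agent session has no identity of its own; it borrows the user's."""

    def __init__(self, user):
        self.user = user

    def read_table(self, table):
        # The check runs against the *user's* grants, so the agent can
        # never reach data the user could not reach directly.
        if not can_read(self.user, table):
            raise PermissionError(f"{self.user} lacks access to {table}")
        return f"rows from {table}"

agent = AgentSession("analyst_a")
print(agent.read_table("sales.prepared.orders"))  # allowed: user holds the grant
try:
    agent.read_table("hr.prepared.salaries")      # denied, exactly as for the user
except PermissionError as e:
    print(e)
```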

It generates reviewable code, not proprietary artifacts

Even though you're working in a visual interface, Prophecy generates production-ready SQL or Spark code behind the scenes. Every visual workflow maps directly to the underlying code, so your engineering team can review it and manage it in Git. Prophecy pairs AI acceleration with human review, standardization, and Git retention, giving you the speed of AI with the reliability of engineering. There's no proprietary runtime and no opaque black box, just standard code that runs on your existing infrastructure.
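
As a generic illustration of that idea, and not Prophecy's actual output, a visual "filter then aggregate" step might compile to something as plainly reviewable as the function below (written in Python for brevity; real generated code would be SQL or Spark targeting your tables).

```python
# Generic illustration: a visual step compiling to small, reviewable
# code. Not Prophecy-generated output; data and names are invented.

def filter_and_aggregate(orders):
    """Keep completed orders, then total revenue per region."""
    totals = {}
    for order in orders:
        if order["status"] == "completed":
            totals[order["region"]] = totals.get(order["region"], 0) + order["amount"]
    return totals

orders = [
    {"region": "EMEA", "status": "completed", "amount": 100},
    {"region": "EMEA", "status": "cancelled", "amount": 40},
    {"region": "APAC", "status": "completed", "amount": 75},
]
print(filter_and_aggregate(orders))  # -> {'EMEA': 100, 'APAC': 75}
```

Because the artifact is ordinary code, it can sit in a Git repository and go through the same pull-request review as anything your engineers write by hand.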

Learn more about how visual and AI interfaces solve pipeline transparency.

Reusable standards help you get started faster

Your platform team can create Prophecy Packages: reusable, standardized components that encode your organization's best practices for common data preparation tasks, shared through a searchable hub.

Instead of building transformations from scratch, you use pre-approved building blocks that your platform team has already vetted. This means faster time to insight and fewer mistakes, while keeping your engineering team focused on core ETL pipeline work rather than fielding one-off requests.

Enterprise-grade security and compliance

For regulated industries, Prophecy maintains enterprise compliance credentials, including SOC 2 certification and adherence to the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the EU-U.S. Data Privacy Framework.

The platform provides event and audit log management, automated compliance monitoring, and continuous monitoring with automated alert response. Authentication integrates through Databricks Partner Connect using OAuth, eliminating the need for Personal Access Tokens. Role-based access control (RBAC) enforces least privilege, layered on top of your existing Unity Catalog permissions.

Audit trails and lineage you can prove

Governance isn't just about preventing unauthorized access. It's about proving exactly what happened, when, and why. This matters to analytics leaders who need to demonstrate compliance to auditors, regulators, and executive leadership.

Every analyst-built workflow in Prophecy produces a complete chain of evidence, including who built the workflow, which code was generated, which datasets were accessed, which transformations were applied, and when the workflow ran. Version-controlled code lives in your Git repository, and Unity Catalog audit logging captures user-level data access events.
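
A minimal sketch of what such an evidence record could contain follows. The field names are hypothetical; in practice this information lives in your Git history and in Unity Catalog audit logs rather than a single record you assemble yourself.

```python
# Illustrative sketch of the evidence a governed workflow run can emit.
# Field names are invented; real records come from Git history and
# Unity Catalog audit logs.

import json
from datetime import datetime, timezone

def audit_record(user, workflow, commit, tables, ran_at):
    return {
        "who": user,                  # who built/ran the workflow
        "workflow": workflow,         # which workflow
        "code_version": commit,       # Git commit of the generated code
        "datasets_accessed": tables,  # what data was touched
        "ran_at": ran_at,             # when it ran (UTC, ISO 8601)
    }

rec = audit_record(
    user="analyst_a",
    workflow="churn_prep",
    commit="3f2a9c1",
    tables=["sales.prepared.orders"],
    ran_at=datetime(2026, 3, 9, tzinfo=timezone.utc).isoformat(),
)
print(json.dumps(rec, indent=2))
```

Having every field answerable from version control and catalog logs is what turns "we believe this was compliant" into a record an auditor can test.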

This directly addresses a primary concern for analytics leaders: answering "who accessed this data and what did they do with it?" with a definitive, testable record.

This auditability also shortens incident response. When teams can trace data movement end to end, investigations stay bounded rather than becoming open-ended forensic work. Basic log management practices help maintain that discipline.

Scale governed self-service for analysts with Prophecy

Analysts and analytics leaders face a persistent tension: you need timely access to data to drive decisions, but every ungoverned workaround increases compliance risk, erodes audit trails, and creates lineage gaps no one can trace. Prophecy, an agentic data prep platform, resolves this by giving analysts a governed channel to work with already-prepared data, build data workflows, and get to insights faster, all without engineering skills or ungoverned workarounds.

Prophecy provides:

  • AI agents: Prophecy's AI agents help you prep data for analysis without writing code or waiting for engineering support, while staying within your Unity Catalog permission boundaries so governance is never compromised.
  • Visual interface with open code: You build data workflows through an intuitive visual interface, and every workflow maps directly to production-ready SQL or Spark code that your engineering team can review, version, and manage in Git.
  • Pipeline automation: Reusable Prophecy Packages give you pre-approved building blocks for common data preparation tasks, so you can compose workflows quickly from standardized components your platform team has already vetted.
  • Cloud-native deployment: Prophecy generates native code for Databricks, BigQuery, and Snowflake, running on your organization's infrastructure and deploying through your existing CI/CD process with no proprietary runtime.

With Prophecy, analysts can go from data request to actionable insight faster, building governed data workflows on top of prepared data without waiting in an engineering queue.

See how Prophecy encodes your governance standards into analyst workflows. Request a demo and evaluate it against your Unity Catalog policies firsthand.

FAQ

Does Prophecy replace Unity Catalog's governance controls?

No. Prophecy works within your existing Unity Catalog policies, including ABAC, row filters, and column masks. It doesn't replace or override your controls. It ensures analyst-built workflows respect them automatically.

Can AI agents in Prophecy access data that the user can't?

No. Prophecy's AI agents inherit the permissions of the user they're acting on behalf of. If you don't have access to a table, the agent won't either.

Do I need engineering skills to use Prophecy?

No. Prophecy's visual interface and AI agents let you build data workflows on top of already-prepared data without writing code. Your data engineers handle the core ETL pipelines, and you use Prophecy to prep and analyze data from there.

How should our team get started with governed self-service?

Start with a focused pilot. Work with your platform team to identify high-demand datasets that are already prepared, create a small library of reusable Prophecy Packages, onboard a pilot group of analysts, and expand incrementally as you validate controls.

Does Prophecy replace tools like Alteryx?

Yes, Prophecy is a direct competitor to tools like Alteryx. Desktop data preparation was an important step forward in its day, but cloud-native, agentic data prep is the future of analytics. It's faster, better governed, and designed for production readiness.

© 2026 SimpleDataLabs, Inc. DBA Prophecy