One Data Engineer for Every Team: Bauplan's MCP and AI adoption at Moffin

Data democratization
Natural-language queries on production tables
AI-native operations
Full data lifecycle is managed in Claude Code
Prod is the new dev
Work safely on production data, ship results immediately
“Once we attached Claude to Bauplan, the first question our operations team asked was: which client is most likely to churn? Claude queried Bauplan and predicted a client that had churned literally one day before. That was the moment everyone got it.”
Carlos Leyson, Head of Data at Moffin

Pain points at a glance

  • No unified data estate: operational source data scattered across Postgres, Stripe, and Monday.com with no centralized management.
  • Analytics coupled to engineering: gold data assets are fixed, determined by pipelines and dashboards; no self-serve for exploration.
  • No AI-ready data context: without code-first schemas and their documented transformations, AI initiatives could not scale.
  • Idea-to-dashboard cycle measured in weeks: ad-hoc requests queued behind pipeline work; most requests never shipped.

About Moffin

Moffin is a FinTech platform that provides credit bureau data, identity verification, and risk workflows to financial institutions across Latin America: when lenders need to make a financial decision, they reach out to Moffin to get the relevant data.

Moffin’s data engineering team supports every function in the company: leadership needs data to plan company growth, the operations team wants to track customer health, marketing has to measure the ROI of campaigns, and finally the engineering org is constantly monitoring service reliability.

The challenge: ad-hoc data requests do not scale

Before Bauplan, every data question followed the same slow path: someone had an idea, asked the data team, waited for a query, got a one-off answer, and then decided if it was worth turning into a dashboard. The data team was swamped with ad-hoc, non-foundational work, and most ideas never made it past step two.

In other words, analytics intelligence (how to query and report on polished data) and data engineering intelligence (how to prepare polished data from raw signals and systems powering the business) were coupled inside the data team. Bauplan and Claude let them decouple: analytics requests are handled through AI, so the data engineering team is free to produce high-value, repeatable work.

The solution

The foundation: data-as-code

Carlos and the data engineering team leveraged Bauplan to easily build a lakehouse architecture. When raw files—customer credit bureau data, internal application metrics, Stripe billing, Monday.com project tracking, and service health logs—land in S3 through DLT connectors, a few lines of Bauplan code perform Quality-Gated Updates to create verified Apache Iceberg source tables.
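The quality-gate idea can be sketched in plain Python as a write-audit-publish flow: stage a batch, audit it, and publish only if the gate passes. This is a minimal, self-contained illustration of the pattern, not Bauplan's actual API; the function names, required fields, and failure threshold are all assumptions.

```python
def audit_batch(rows, required_fields):
    """Split a staged batch into rows that pass basic checks and rows that fail."""
    passed, failed = [], []
    for row in rows:
        ok = all(row.get(f) is not None for f in required_fields)
        (passed if ok else failed).append(row)
    return passed, failed

def quality_gated_update(table, rows, required_fields, max_fail_rate=0.01):
    """Audit a staged batch; publish to the verified table only if the gate passes."""
    passed, failed = audit_batch(rows, required_fields)
    fail_rate = len(failed) / max(len(rows), 1)
    if fail_rate > max_fail_rate:
        # Gate failed: nothing reaches the verified source table.
        raise ValueError(f"quality gate failed: {fail_rate:.1%} bad rows")
    table.extend(passed)  # in a real lakehouse this publish step is atomic
    return len(passed)

source_table = []
batch = [
    {"client_id": "c1", "bureau_score": 712},
    {"client_id": "c2", "bureau_score": 655},
]
quality_gated_update(source_table, batch, ["client_id", "bureau_score"])
```

In Bauplan the staging, audit, and atomic publish happen against Iceberg tables rather than in-memory lists, but the contract is the same: a batch that fails the gate never becomes part of a verified source table.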

Source tables are then refined into silver and gold layers through pipelines, producing the final assets for downstream systems: unlike files, tables have authors, governance, lineage, and versioning. Because building a data estate with Bauplan is “just code”, investment in good code pays high dividends across the full data chain: clear naming conventions, proper namespacing, good column definitions, and curated gold layers. What is easy for humans to reason about is also easy for AI to understand.
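A silver-to-gold step of this kind is just a function over tables. The sketch below aggregates hypothetical silver-layer billing events into a gold client-health table; in a Bauplan project this would be a pipeline model over Iceberg tables, and the table and column names (`billing_events`, `client_health`) are invented for illustration.

```python
from collections import defaultdict

def build_client_health(billing_events):
    """Aggregate silver-layer billing events into a gold client_health table."""
    totals = defaultdict(lambda: {"invoices": 0, "revenue": 0.0})
    for event in billing_events:
        agg = totals[event["client_id"]]
        agg["invoices"] += 1
        agg["revenue"] += event["amount"]
    # One well-named row per client: the kind of curated gold asset
    # that is easy for both humans and AI to reason about.
    return [
        {"client_id": cid, **agg, "avg_invoice": agg["revenue"] / agg["invoices"]}
        for cid, agg in sorted(totals.items())
    ]

gold = build_client_health([
    {"client_id": "c1", "amount": 100.0},
    {"client_id": "c1", "amount": 50.0},
    {"client_id": "c2", "amount": 80.0},
])
```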

Bauplan MCP: decoupling engineering from analytics intelligence

After the release of Bauplan AI integrations—MCP and Skills—Carlos connected Claude Desktop to Bauplan and set up a demo with the operations team. The very first question was about churn risk. Claude examined the gold layer, reasoned about which signals might indicate churn, built a final query, and returned a ranked list. The client at the top of that list had churned the day before.
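A toy version of that ranking makes the reasoning concrete: score each client on simple disengagement signals and sort. The signals (`days_since_last_call`, month-over-month usage decline) and the weights are invented for illustration; the actual query Claude built against Moffin's gold layer is not shown in this case study.

```python
def churn_score(client):
    """Combine two hypothetical disengagement signals into a single risk score."""
    # Recency: how long since the client last called the platform (capped at 30 days).
    recency = min(client["days_since_last_call"] / 30.0, 1.0)
    # Decline: how far usage has dropped versus the previous month.
    decline = max(0.0, 1.0 - client["calls_this_month"] / max(client["calls_last_month"], 1))
    return 0.6 * recency + 0.4 * decline

def rank_by_churn_risk(clients):
    """Return clients ordered from most to least likely to churn."""
    return sorted(clients, key=churn_score, reverse=True)

clients = [
    {"name": "acme", "days_since_last_call": 2, "calls_this_month": 900, "calls_last_month": 850},
    {"name": "globex", "days_since_last_call": 28, "calls_this_month": 40, "calls_last_month": 600},
]
ranked = rank_by_churn_risk(clients)  # "globex" ranks first: stale and declining
```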

That was the “aha” moment. Within days, both the leadership and operations teams were using Claude Desktop daily: the AI translates a high-level business requirement in English into a set of probing queries and analytics scripts that leverage Bauplan's APIs and concurrency model to answer promptly, at no additional data infrastructure cost. The first result is a principled division of labour: the data team focuses on reliable, scalable, general data models, so that Claude can use the polished data to answer business questions directly and reliably.

The new analytics loop not only freed the engineering team for scalable work, but also deepened non-technical teams' engagement with data-driven decision making: not only is the turnaround much quicker, but Claude's explicit reasoning traces inspire users to refine their questions, explore new directions, and make their requests more precise.

The setup works so well that it's changing how leadership thinks about making advanced data capabilities available more broadly across the organization.

Roadmap

The flows crystallized in Claude+Bauplan sessions are already precise enough to be turned into scalable pipelines producing assets on a schedule. Exploration becomes a draft spec, which engineering hardens into a scheduled pipeline.

What about closing the loop with Claude as well? Today, moving from Claude Desktop traces (exported by business users) to a new pipeline is done manually by an engineer, but the team is experimenting with new Claude Skills that could significantly speed up the process.

Since traces, skills, pipelines, infrastructure, and table descriptions are all “just code” in a Bauplan project, Claude Code has everything it needs to take a first stab at it: what used to be days of refactoring can become a focused code and data review, using abstractions such as data branches and tags.

Results

  • Teams self-serve daily: leadership, operations, and engineering use Claude Desktop + Bauplan MCP to explore data without filing engineering tickets.
  • Dashboard development accelerated: teams arrive with specific, data-informed requests instead of vague ideas, cutting the insight-to-dashboard cycle from weeks to days.
  • Code-first approach for AI-native operations: since the data platform is “just code”, Claude acts as a force multiplier across the entire company; polished, annotated tables simplify analysis, concise APIs and sandboxed execution enable safe development with coding agents on production data.
  • Customer impact: productivity gains allow the Moffin team to attend to its customers better, transferring internal efficiencies directly to customer experience.

Technology stack

Amazon S3
Apache Iceberg
Bauplan
DLT (data load tool)
Claude Desktop + Bauplan MCP server
Claude Code + Bauplan skills
Metabase
PostgreSQL, Stripe, Monday.com
Case Study