Lakehouse

The first lakehouse-native CRM.

Federated reads from Databricks and Snowflake. Outbound Delta Sharing in near real-time. Have your agent call ours — MCP that runs both ways. Zero data movement.

Federated reads
Outbound Delta Sharing
Bidirectional MCP

Architecture

Four streams. Zero ETL.

Your warehouse stays where it is. Factory speaks Delta Sharing, MCP, and SQL — natively, both ways.

Federated reads: live SQL on your warehouse, guarded, no copy.
Delta Sharing publish: CRM facts streamed to your lake as Delta tables.
Bidirectional MCP: Genie in, Claude / Cursor out, same guardrails.
Vector Search RAG: federated to your vector index, no re-embedding.

Your lakehouse: Databricks · Snowflake · Unity Catalog · Mosaic AI · Databricks Genie (open standards, BYO compute)

Factory: Factory CRM (pipeline · cases · omnichannel) · Factory PIM (catalog · attributes · channels) · Factory Marketing (email · SMS · WhatsApp). Assistant · workflows · API.

Capabilities

Four streams, one platform

Each stream ships behind the same governance: encrypted credentials, per-tenant key derivation, query-guard, and audit trails on every byte.

Federated reads

Query your warehouse from the AI Assistant.

Expose Databricks or Snowflake tables as governed Assistant skills. The Assistant writes the SQL; query-guard enforces single-SELECT, column allow-list, row filters, LIMIT clamps, and payload caps.

  • Zero data movement — your data stays put
  • Scales on the warehouse compute you already own
  • Pluggable adapters — Databricks and Snowflake today
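As a sketch, the query-guard described above might look like this in Python. Everything here is illustrative: the `sales.orders` allow-list and the 1,000-row clamp stand in for whatever a tenant actually configures, not Factory's real values.

```python
import re

# Hypothetical per-tenant configuration: table -> column allow-list.
ALLOWED = {"sales.orders": {"order_id", "amount", "status"}}
MAX_ROWS = 1000  # illustrative LIMIT clamp


def guard(sql: str, table: str, columns: set) -> str:
    """Reject anything but a single SELECT on allow-listed columns, then clamp LIMIT."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:
        raise ValueError("multiple statements rejected")
    if not re.match(r"(?is)^\s*select\b", stripped):
        raise ValueError("only SELECT is allowed")
    if table not in ALLOWED:
        raise ValueError(f"table {table} not allow-listed")
    if not columns <= ALLOWED[table]:
        raise ValueError("column outside allow-list")
    m = re.search(r"(?is)\blimit\s+(\d+)\s*$", stripped)
    if m and int(m.group(1)) <= MAX_ROWS:
        return stripped            # caller's LIMIT is already within the clamp
    if m:
        return stripped[: m.start()] + f"LIMIT {MAX_ROWS}"  # clamp oversized LIMIT
    return f"{stripped} LIMIT {MAX_ROWS}"                   # append missing LIMIT
```

Row-filter SQL and payload caps would bolt onto the same chokepoint; the key design property is that every Assistant-written query passes through one function before it reaches the warehouse.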
Delta Sharing publish

Stream CRM facts back to your lakehouse.

Accounts, opportunities, activities, messages, and orders published as Delta tables with proper transaction logs. Recipients consume in Databricks, Snowflake, or PyIceberg with one bearer token.

  • Near real-time — fresh shares in minutes, not days
  • Delta Sharing protocol v1.x + Iceberg REST conformance (read-only)
  • Recipient revocation in under a minute
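On the recipient side, consuming a share needs only the standard Delta Sharing profile file. A minimal sketch — the endpoint, token, and share path below are placeholders, not real Factory values:

```python
import json
import tempfile

# Recipient credentials per the Delta Sharing protocol ("share profile").
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://example.invalid/delta-sharing",  # placeholder endpoint
    "bearerToken": "<token-issued-by-factory>",           # placeholder token
}

# Write the profile to a .share file, as the delta-sharing clients expect.
with tempfile.NamedTemporaryFile("w", suffix=".share", delete=False) as f:
    json.dump(profile, f)
    path = f.name

# With the delta-sharing Python client installed, a recipient would then read:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(f"{path}#<share>.<schema>.<table>")
```

The same profile works from Databricks, Snowflake, or PyIceberg — that single bearer token is the whole onboarding surface.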
Bidirectional MCP

Have your agent call ours.

Inbound: register Databricks Genie or any MCP server — its tools become first-class Assistant skills. Outbound: expose /api/mcp so Claude, Cursor, and Mosaic AI can drive Factory with the same guardrails as the app.

  • Inbound MCP client with encrypted credentials (AES-256-GCM, per-tenant HKDF)
  • Outbound MCP server — 30+ typed tools, your tenant boundary intact
  • Same auth, same audit trail, both directions
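One way to picture "same guardrails, both directions" is a single tool registry that both the inbound MCP client and the outbound /api/mcp server route through. This is a hypothetical sketch, not Factory's actual code; the tool name and tenant IDs are made up:

```python
from datetime import datetime, timezone

AUDIT = []  # stand-in for a persistent audit trail


class ToolRegistry:
    """One registry serves both directions: inbound MCP servers register tools
    here, and the outbound endpoint exposes the same entries, so every call
    passes the identical tenant check and audit write."""

    def __init__(self):
        self.tools = {}

    def register(self, name, fn, tenants):
        self.tools[name] = (fn, set(tenants))

    def call(self, name, tenant, **kwargs):
        fn, allowed = self.tools[name]
        if tenant not in allowed:  # tenant boundary enforced before dispatch
            raise PermissionError(f"{tenant} may not call {name}")
        AUDIT.append({"tool": name, "tenant": tenant,
                      "at": datetime.now(timezone.utc).isoformat()})
        return fn(**kwargs)


registry = ToolRegistry()
# A toy tool standing in for a real CRM skill.
registry.register("crm.find_account", lambda q: {"q": q, "hits": []}, tenants={"acme"})
```

Whether the caller is Genie feeding a skill in or Claude driving the CRM out, it hits the same `call` path: same auth decision, same audit record.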
Vector Search RAG

Federated retrieval from your existing index.

Already invested in Databricks Vector Search or Mosaic AI embeddings? Register the index — the Assistant calls it as a RAG fallback when CRM context isn't enough. No re-embedding, no parallel store.

  • Brings your domain corpora — manuals, contracts, runbooks — into chat
  • Ranked alongside CRM hits with provenance
  • Per-index allow-list + token budget guard
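The ranked-with-provenance, token-budgeted behavior can be sketched as a merge function. The field names (`score`, `tokens`) and the score-then-budget rule are assumptions for illustration, not the actual ranking logic:

```python
def merge_hits(crm_hits, vector_hits, token_budget):
    """Tag each hit with its provenance, rank by score, and keep
    adding hits until the token budget is spent."""
    tagged = ([dict(h, source="crm") for h in crm_hits]
              + [dict(h, source="vector") for h in vector_hits])
    tagged.sort(key=lambda h: h["score"], reverse=True)
    out, spent = [], 0
    for h in tagged:
        if spent + h["tokens"] > token_budget:
            continue  # skip hits that would blow the budget
        spent += h["tokens"]
        out.append(h)
    return out
```

Because every hit carries a `source` tag, the Assistant can cite whether an answer came from CRM records or from your federated index.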

Fully integrated

Native to your lakehouse stack.

Databricks
Snowflake
Unity Catalog
Mosaic AI
Databricks Genie
Apache Iceberg
Trino
Apache Spark
PyIceberg
MCP

Built on open standards — Delta Lake protocol v1.x, Iceberg REST catalog (read-only conformance), Model Context Protocol. No proprietary connectors required.

Comparison

Why Salesforce and HubSpot can't ship this

Both vendors built their data layers before lakehouses existed. Adding federated reads + outbound Delta Sharing + bidirectional MCP isn't a feature flag — it's a re-architecture.

Federated SQL on customer warehouse (query Databricks / Snowflake from the AI Assistant, zero ETL)
  Salesforce: partial (paid Data Cloud add-on) · HubSpot: no · Factory: yes

Outbound Delta Sharing publish (CRM facts as Delta tables in your lake, near real-time, open protocol)
  Salesforce: no · HubSpot: no · Factory: yes

Inbound MCP from your agents (plug Genie or any MCP server in as first-class Assistant skills)
  Salesforce: no · HubSpot: no · Factory: yes

Outbound MCP to external agents (Claude, Cursor, and Mosaic AI drive your CRM with the same guardrails)
  Salesforce: no · HubSpot: no · Factory: yes

Federated vector search RAG (reuse your Databricks Vector Search or Mosaic AI index, no re-embedding)
  Salesforce: no · HubSpot: no · Factory: yes

Open standards, no proprietary connectors (Delta Sharing protocol · Iceberg REST · MCP, all open specs)
  Salesforce: no · HubSpot: no · Factory: yes

Scale on your own warehouse (heavy analytics on your warehouse compute, no Data Cloud, no add-on tier)
  Salesforce: no · HubSpot: no · Factory: yes

Based on vendor product docs as of 2026-Q1. Salesforce Data Cloud supports zero-copy federation with Databricks and Snowflake (paid Data Cloud add-on) and Agentforce action APIs, but no outbound Delta Sharing publish and no MCP-spec endpoints. HubSpot Operations Hub syncs into a hosted warehouse — neither federated nor bidirectional.

Governance

Open standards. Enterprise guardrails.

Federated reads and bidirectional MCP only ship if the security model holds. Here's what stands between an external agent and your warehouse.

Query-guard on every SQL

Single-SELECT enforcement, table & column allow-lists, row-filter SQL, LIMIT clamp, and payload caps — Assistants can't write a query that exfiltrates your warehouse.

Schema-per-tenant isolation

Every tenant gets a dedicated Postgres schema. No shared rows, no row-level-security retrofits, no cross-tenant query risk.

Encrypted credentials

Warehouse and MCP-client credentials encrypted at rest with AES-256-GCM. Per-tenant key derivation via HKDF — a leaked master key still can't decrypt other tenants' secrets.
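Per-tenant key derivation via HKDF is a standard construction (RFC 5869) and can be sketched with the Python standard library alone. The salt and info labels below are made up for illustration:

```python
import hashlib
import hmac


def hkdf_sha256(master: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF: extract a pseudorandom key, then expand it to `length` bytes."""
    prk = hmac.new(salt, master, hashlib.sha256).digest()  # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                               # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]


master = b"\x00" * 32  # stand-in for the real master key
key_a = hkdf_sha256(master, salt=b"factory", info=b"tenant:acme")
key_b = hkdf_sha256(master, salt=b"factory", info=b"tenant:globex")
```

Binding the tenant ID into the `info` parameter is what gives the stated property: even with the master key, an attacker still needs the per-tenant derivation to reach another tenant's AES-256-GCM key.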

Recipient revocation < 1 min

Delta Sharing bearer tokens are checked on every request — revoke a partner in the UI and their next read fails immediately.
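Checking tokens on every request is what makes revocation immediate. A minimal sketch, storing only token hashes so the secret never sits in the store:

```python
import hashlib
import secrets


class TokenStore:
    """Checked on every request, so revocation takes effect on the next read."""

    def __init__(self):
        self._hashes = set()

    def issue(self) -> str:
        token = secrets.token_urlsafe(32)
        self._hashes.add(hashlib.sha256(token.encode()).hexdigest())
        return token  # shown to the recipient once

    def revoke(self, token: str) -> None:
        self._hashes.discard(hashlib.sha256(token.encode()).hexdigest())

    def check(self, token: str) -> bool:
        return hashlib.sha256(token.encode()).hexdigest() in self._hashes


store = TokenStore()
token = store.issue()
```

Because there is no cached validity window, "revoke in the UI" and "next read fails" are the same event.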

Per-row, per-column policy

Hide PII columns from individual data sources, attach row-filter SQL per source, and audit every query in the warehouse query log.

Same auth on outbound MCP

External agents (Claude, Cursor, Mosaic AI) authenticate against the same OAuth surface as the app — your tenant boundary, your roles, your audit trail.

5-minute setup

From credentials to a queryable AI skill — in one wizard.

The wizards live at /settings/integrations/warehouse/databricks and /snowflake. Same shape, different auth.

  1. Step 01

    Drop in credentials

    OAuth M2M or PAT for Databricks; key-pair JWT for Snowflake. Secrets are encrypted with AES-256-GCM and a per-tenant HKDF key — never round-tripped to the browser.

    ~1 min
  2. Step 02

    Test the connection

    We round-trip a single SELECT through your warehouse, surface latency, and verify the catalog is reachable. Failed auth maps to actionable error codes.

    ~30 sec
  3. Step 03

    Pick tables and columns

    Browse your Unity Catalog or Snowflake catalog and choose what the Assistant can see. Column allow-list and optional row-filter SQL apply on every query — server-side.

    ~2 min
  4. Step 04

    Activate as Assistant skills

    Each table becomes a typed AI tool. Optionally accept the AI-suggested semantic link to join with CRM accounts/contacts. Done — chat queries the warehouse live.

    ~1 min
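The wizard's output can be pictured as a small per-table config that steps 03 and 04 produce. The class name, fields, and table below are hypothetical, not Factory's schema:

```python
from dataclasses import dataclass


@dataclass
class TableSkill:
    """One warehouse table exposed as a typed Assistant tool."""
    table: str
    columns: list          # column allow-list chosen in step 03
    row_filter: str = ""   # optional row-filter SQL, applied server-side
    max_rows: int = 1000   # illustrative LIMIT clamp

    def to_sql(self, where: str = "") -> str:
        # The tenant's row filter is ANDed with the Assistant's predicate,
        # so no query can escape the configured slice of the table.
        clauses = [c for c in (self.row_filter, where) if c]
        sql = f"SELECT {', '.join(self.columns)} FROM {self.table}"
        if clauses:
            sql += " WHERE " + " AND ".join(f"({c})" for c in clauses)
        return sql + f" LIMIT {self.max_rows}"


skill = TableSkill("main.sales.orders", ["order_id", "amount"],
                   row_filter="region = 'EMEA'")
```

Activation (step 04) then amounts to registering this config as a tool the Assistant can call, with every generated query funneled through `to_sql`.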

Ready to put ERP truth in front of every rep?

We onboard a limited number of teams each month for hands-on setup, migration support, and ERP connector configuration.