MCP Bridge — Inbound Clients
Register Databricks Genie or any MCP server as an Assistant skill. Inbound MCP turns your existing agents into first-class Factory tools.
The MCP bridge is bidirectional. This page covers the inbound direction: register a remote Model Context Protocol server (such as a Databricks Genie space) and Factory wires its tools into the AI Assistant as governed skills, alongside CRM, ERP, and warehouse tools.
Want the other direction? For outbound MCP — Claude, Cursor, or a Databricks Mosaic AI agent driving Factory — see Connect Databricks Mosaic AI Agents and MCP Overview.
What an inbound MCP client unlocks
If your team already runs:
- A Databricks Genie space with a published room
- A Mosaic AI Agent Framework server
- A custom in-house MCP server (warehouse-specific, CRM-adjacent, anything)
…you can register its endpoint with Factory in ~5 minutes. Every tool the remote server exposes becomes a typed AI Assistant tool keyed `mcp__<connId8>__<remoteToolName>`. The tools are tickable per agent in the Custom Agent builder, are subject to the same role checks as native tools, and their calls and results are logged in the same audit trail.
How it works
The Factory MCP client speaks Streamable HTTP with OAuth 2.0 Bearer auth — the same transport Mosaic AI uses on the other side. Tool discovery happens on registration and is re-validated whenever you click Test on the connection.
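On the wire, discovery is a standard JSON-RPC `tools/list` request POSTed over Streamable HTTP with a Bearer header. The sketch below shows the shape of that request; the function name and the exact internals of Factory's client are illustrative assumptions, not the real implementation.

```python
import json

def build_discovery_request(token: str) -> tuple[dict, str]:
    """Build headers + JSON-RPC body for an MCP tools/list discovery call.

    Hypothetical sketch of what an MCP client sends on registration and on
    every Test click; Factory's actual client internals may differ.
    """
    headers = {
        # Streamable HTTP transport: JSON-RPC over POST, Bearer auth,
        # and the client must accept both JSON and SSE responses.
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    }
    body = json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/list",
        "params": {},
    })
    return headers, body

headers, body = build_discovery_request("dapi-example-token")
# POST headers + body to the registered server URL with any HTTP client;
# the response's result.tools array is what populates last_tool_count.
```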
Step 1 — Open the wizard
Go to Settings → Integrations → Connect MCP Servers, or directly to `/settings/integrations/mcp-clients`. Click Add recipient (the UI is re-purposed from the share page — same shape).
Step 2 — Configure the connection
| Field | Value |
|---|---|
| Preset | Pick Databricks Genie, Mosaic AI, or Custom MCP |
| Name | A label you'll recognize (e.g. databricks-genie) |
| Server URL | The MCP endpoint URL (full path, including /mcp or /api/2.0/genie/…/mcp) |
| Bearer token | A token the remote server accepts (Databricks PAT, OAuth access token, etc.) |
For Databricks Genie, the URL looks like:

```
https://<workspace>.cloud.databricks.com/api/2.0/genie/spaces/<space_id>/mcp
```

The PAT must have access to the Genie space — typically SQL + Vector Search privileges plus membership in the space.
The bearer token is encrypted at rest with AES-256-GCM and a per-tenant HKDF-derived key — see Governance → Encrypted credentials.
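The per-tenant key derivation can be sketched with RFC 5869 HKDF using only the standard library. The master-secret handling, salt, and info labels below are illustrative assumptions; the real scheme feeds the derived 32-byte key into AES-256-GCM (which needs a crypto library and is omitted here).

```python
import hashlib
import hmac

def hkdf_sha256(master_secret: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF (Extract + Expand) with SHA-256."""
    # Extract: PRK = HMAC-SHA256(salt, input keying material)
    prk = hmac.new(salt, master_secret, hashlib.sha256).digest()
    # Expand: T(n) = HMAC-SHA256(PRK, T(n-1) || info || n)
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Hypothetical per-tenant AES-256-GCM keys: same master secret,
# different tenant label in `info` → independent keys.
key_a = hkdf_sha256(b"master-secret", salt=b"factory-mcp", info=b"tenant:acme")
key_b = hkdf_sha256(b"master-secret", salt=b"factory-mcp", info=b"tenant:globex")
assert key_a != key_b and len(key_a) == 32
```

Deriving rather than storing per-tenant keys means a leaked ciphertext from one tenant schema is useless without both the master secret and that tenant's derivation label.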
Step 3 — Test the connection
Click Test on the new row. Within ~30 seconds:

- Status flips from `pending` → `active`.
- The row shows `last_tool_count` (e.g. `4 tools`).
- A list of remote tools renders inline — typically `query`, `explain`, `describe`, plus any custom Genie skills the room exposes.

The DB row in `integration_connections` records `status='active'`, `last_tested_at`, and `last_tool_count`.
If the test fails:
| Code | Likely cause | Fix |
|---|---|---|
| `AUTH_FAILED` | PAT expired, wrong workspace, missing space membership | Generate a fresh PAT; verify space access in the Databricks UI |
| `ENDPOINT_UNREACHABLE` | Wrong URL or workspace firewall | Curl the discovery URL: `curl -H "Authorization: Bearer <pat>" <url>` should return JSON-RPC |
| `TRANSPORT_MISMATCH` | Server speaks SSE only, not Streamable HTTP | Genie spaces and Mosaic AI both use Streamable HTTP; custom servers must too |
| `NO_TOOLS` | Remote server reachable but exposes 0 tools | Publish at least one Genie room, or add a tool to your custom server |
Step 4 — Toggle individual tools
After a successful test, the row expands to show every tool the remote server advertised. Each has an on/off toggle:
- On — the tool is exposed to all default agents and tickable in custom agents.
- Off — the tool is excluded from `mcpClientTools()` output server-side. The Assistant can't see or invoke it, regardless of agent config.
Toggles persist across page reload. New tools that appear on a future re-test default to off so a remote server adding a destructive tool doesn't auto-grant it.
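The off toggle can be pictured as a server-side filter applied before tools ever reach the model. The function and field names below are assumptions for illustration, not Factory's actual code:

```python
def mcp_client_tools(connections: list[dict]) -> list[str]:
    """Return only globally-enabled remote tools, keyed mcp__<connId8>__<name>.

    Hypothetical sketch of the server-side filter: an 'off' toggle drops the
    tool here, so no agent configuration downstream can re-expose it.
    """
    exposed = []
    for conn in connections:
        conn_id8 = conn["id"][:8]  # first 8 chars of the connection id
        for tool in conn["tools"]:
            if tool["enabled"]:    # toggle persisted on the connection row
                exposed.append(f"mcp__{conn_id8}__{tool['name']}")
    return exposed

conns = [{
    "id": "abc12345-dead-beef",
    "tools": [
        {"name": "query", "enabled": True},
        {"name": "drop_table", "enabled": False},  # new tools default to off
    ],
}]
assert mcp_client_tools(conns) == ["mcp__abc12345__query"]
```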
Step 5 — Scope per agent
Open Settings → AI Agents → <your agent>. Under MCP client tools, the new tools appear keyed `mcp__<connId8>__<remoteToolName>`. Tick the ones this agent should use.
Default agents (Sales, Support, etc.) get every globally-on MCP tool by default. Custom agents start with no MCP tools — you tick them explicitly. This is the recommended path for production: one custom agent per use case, with a tight tool list.
Invoke from the Assistant
Open the Assistant. With an agent that has the Genie tool ticked:
"How many opportunities did we win last month? Use the Genie warehouse for the answer."
The Assistant:
- Sees the question needs warehouse facts and picks `mcp__abc123__query` (the Genie query tool).
- Calls Factory's MCP client → Factory decrypts the bearer token → POSTs to the Genie endpoint.
- Genie executes the query against your Databricks warehouse, returns rows + a citation snippet.
- The Assistant synthesizes a natural-language answer with the citation.
The agent_messages table records the tool args + result; the same data is available in the Assistant trace panel.
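On the wire, the second step is a JSON-RPC `tools/call` request. The sketch below shows its shape; the argument schema for Genie's `query` tool is an assumption, and the real payload may differ.

```python
import json

def build_tool_call(remote_tool: str, arguments: dict, call_id: int = 2) -> str:
    """JSON-RPC body for invoking one remote MCP tool.

    Note the connection prefix is stripped before the call: the remote
    server only knows the bare name, e.g. mcp__abc123__query → query.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": remote_tool, "arguments": arguments},
    })

# Hypothetical argument shape for a Genie query tool.
body = build_tool_call("query", {"question": "How many opportunities did we win last month?"})
```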
Error semantics
When a remote MCP call fails, Factory returns a structured tool result rather than crashing the chat:
```json
{
  "error": "Genie authentication failed (token expired)",
  "code": "AUTH_FAILED"
}
```

The Assistant interprets this gracefully — typically replying "I tried to query Genie, but the connection's authentication failed. An admin needs to refresh the token in Settings → Integrations → MCP Clients."
| Code | Meaning |
|---|---|
| `AUTH_FAILED` | Bearer token rejected by the remote server |
| `REMOTE_ERROR` | Remote returned a non-200 response (5xx, or 4xx other than 401) |
| `TIMEOUT` | Remote didn't respond within 60 s |
| `INVALID_RESPONSE` | Remote returned malformed JSON-RPC |
| `RATE_LIMITED` | Remote returned 429 (`Retry-After` echoed) |
All errors are logged to `mcp_client_call_log` for incident triage.
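A plausible mapping from call outcomes to these codes looks like the sketch below. This is an illustration under assumed status/timeout semantics, not Factory's actual classifier:

```python
def classify_mcp_failure(status, timed_out: bool = False,
                         parse_error: bool = False) -> str:
    """Map a remote MCP call outcome to a structured error code.

    Hypothetical sketch; the real classifier may inspect more signals
    (e.g. the JSON-RPC error object, connection resets).
    """
    if timed_out:
        return "TIMEOUT"           # no response within the 60 s budget
    if parse_error:
        return "INVALID_RESPONSE"  # body wasn't valid JSON-RPC
    if status == 401:
        return "AUTH_FAILED"       # bearer token rejected
    if status == 429:
        return "RATE_LIMITED"      # Retry-After echoed to the caller
    if status is not None and status != 200:
        return "REMOTE_ERROR"      # any other non-200
    return "OK"

assert classify_mcp_failure(401) == "AUTH_FAILED"
assert classify_mcp_failure(None, timed_out=True) == "TIMEOUT"
```

Returning a code instead of raising lets the Assistant turn the failure into a helpful reply rather than crashing the chat turn.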
Cross-tenant isolation
MCP client connections are strictly tenant-scoped. The Custom Agent builder in tenant B never sees connections registered in tenant A. The dynamic tool catalog (`GET /api/v1/admin/agents/tool-catalog`) returns only the tenant's own connections, and the `integration_connections` table lives in the per-tenant Postgres schema, so a cross-tenant SELECT isn't possible at the database level.
Use cases
| Customer profile | What they register | Why |
|---|---|---|
| Databricks-native distributor | Genie space pointed at their sales_analytics schema | Replace ad-hoc dashboards — sales reps ask Genie via the Assistant |
| Multi-tenant data platform team | Their own internal MCP server with custom tools | Expose proprietary metric definitions consistently across CRM + chat |
| Snowflake shop with a Cortex agent | Cortex Search / Cortex Analyst MCP endpoint | Federate Cortex semantic models into the Assistant |
| In-house RAG platform | Their own /mcp endpoint over Pinecone / Weaviate | Bridge to corpora that aren't in the lakehouse |
Troubleshooting
Tools list is empty after a successful test
The remote server returned tools/list with {"tools": []}. Verify in the remote system that at least one tool is published to the connected user. For Genie, ensure the space has at least one published room and the PAT user is a member.
Tool toggles don't persist

The browser tab has probably lost its session — Factory's tool-toggle endpoint requires the admin role. Sign in again and retry.

Assistant doesn't pick the MCP tool even though it's ticked

The model picks tools by description. If the Genie tool's remote description is generic (e.g. "Run a query"), tighten the agent system prompt to mention the connection explicitly: "For warehouse questions, prefer the Genie query tool over generic SQL skills."
AUTH_FAILED after PAT rotation
Edit the connection (pencil icon on the row) and paste the new bearer token. Re-test. The old encrypted blob is overwritten — no separate purge step needed.
Related guides
- Connect Databricks Mosaic AI Agents — the outbound direction (Mosaic agents calling Factory).
- MCP Overview — architecture, scopes, transport.
- Connect Databricks (warehouse) — alternative path for "Assistant calls warehouse" via SQL adapter rather than MCP.
- Governance — bearer token storage, per-tenant key derivation, audit.
Delta Sharing Publish
Stream CRM facts (accounts, contacts, leads, opportunities, activities, cases) back to your Databricks, Snowflake, or PyIceberg lake every 15 minutes as Delta tables.
Vector Search RAG (Federated)
Federate the AI Assistant to your Databricks Vector Search or Mosaic AI embeddings index — RAG fallback without re-embedding or maintaining a parallel store.