Vault.
Early access open

Your private data
never reaches
LLM pipelines.

A privacy layer between your AI agents and any LLM. Sensitive fields masked before they leave your server — restored in the response.

Prompt before vault

name:   "Sarah Chen"
card:   "4111 1111 1111 1111"
amount: "$8,400"
3 PII fields · <2 ms overhead · any LLM

WHAT WE DO

Four steps.
Zero exposure.

01

Intercept your prompt

The user's plain-text instruction is intercepted before it leaves your server. Vault. scans it and masks every piece of sensitive information inline.

02

Mask sensitive data

Names, account numbers, UPI IDs and amounts are replaced with stable tokens — right inside the sentence, with repeated values sharing the same token. The masked string is all the LLM ever sees.
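To make "stable tokens" concrete, here is a minimal sketch. It is illustrative only, not the Vault. API — `maskNames` and its token map are hypothetical, and a real masker detects values rather than taking them as input. The point it shows: each value maps to one token, so every mention collapses to the same placeholder.

```typescript
// Hypothetical sketch of inline stable-token masking (not the Vault. API).
// Each name gets one token; all occurrences are swapped in place.
function maskNames(prompt: string, names: string[]) {
  const vault = new Map<string, string>(); // token -> real value
  let masked = prompt;
  names.forEach((name, i) => {
    const token = `[NAME_${i}]`;
    vault.set(token, name);
    masked = masked.split(name).join(token); // replace every occurrence
  });
  return { masked, vault };
}

const { masked, vault } = maskNames(
  "Pay Sarah Chen $8,400. Confirm with Sarah Chen first.",
  ["Sarah Chen"]
);
console.log(masked);
// "Pay [NAME_0] $8,400. Confirm with [NAME_0] first."
```

Because the token is stable, the LLM can still reason about "the same person appearing twice" without ever seeing the name.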

03

LLM routes to an agent

The LLM receives the masked string, understands the intent, and returns structured JSON specifying which agent to invoke and which parameters to use. Tokens stay in place throughout.

04

Restore, then execute

Before the agent runs, Vault. swaps every token back to its real value. The agent receives clean, complete data — and executes with full fidelity.
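The restore step can be pictured as a recursive walk over the LLM's routing JSON, swapping each token back to its real value. The `restoreTokens` function below is a hypothetical sketch of that idea, not Vault.'s actual `restore()` implementation:

```typescript
// Hypothetical sketch of token restoration (not the Vault. API):
// walk the JSON tree and replace any token string found in the vault.
type Json = string | number | boolean | null | Json[] | { [k: string]: Json };

function restoreTokens(value: Json, vault: Map<string, string>): Json {
  if (typeof value === "string") return vault.get(value) ?? value;
  if (Array.isArray(value)) return value.map(v => restoreTokens(v, vault));
  if (value && typeof value === "object") {
    const out: { [k: string]: Json } = {};
    for (const [k, v] of Object.entries(value)) out[k] = restoreTokens(v, vault);
    return out;
  }
  return value; // numbers, booleans, null pass through unchanged
}

const vault = new Map([
  ["[NAME_0]", "Usha Sharma"],
  ["[UPI_0]", "usha.sharma@okicici"],
]);

const call = restoreTokens(
  { agent: "payment_agent", params: { recipient_name: "[NAME_0]", recipient_upi: "[UPI_0]" } },
  vault
);
// call.params.recipient_name is now "Usha Sharma"
```

Because only the token strings are rewritten, the routing structure the LLM produced — agent name, action, parameter keys — passes through untouched.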

vault.ts
// Raw user instruction — PII fully exposed
const userInput =
  `Send Usha Sharma 1000 Rs from my
   HDFC account 4081 2291 3301 4821
   to usha.sharma@okicici`

// Intercept before sending anywhere
const { masked, restore } = await vault.protect(userInput)
vault.ts
// vault.protect() masks PII inline in the string
console.log(masked)
// "Send [NAME_0] [AMOUNT_0] from my
//  HDFC account [ACCOUNT_0]
//  to [UPI_0]"

// Real values held in vault — never leave the server
// Safe to send to any external LLM
const llmResponse = await llm.complete(masked) // ✓ zero PII on the wire
vault.ts
// LLM returns structured routing JSON
console.log(llmResponse)
// {
//   agent:  "payment_agent",
//   action: "send_money",
//   params: {
//     recipient_name: "[NAME_0]",
//     recipient_upi:  "[UPI_0]",
//     from_account:   "[ACCOUNT_0]",
//     amount:         "[AMOUNT_0]"
//   }
// }
vault.ts
// Unmask the LLM's JSON before execution
const agentCall = await restore(llmResponse)

// {
//   agent:  "payment_agent",
//   action: "send_money",
//   params: {
//     recipient_name: "Usha Sharma",
//     recipient_upi:  "usha.sharma@okicici",
//     from_account:   "4081 2291 3301 4821",
//     amount:         1000
//   }
// }

// Dispatch to the correct agent
await agents[agentCall.agent].run(agentCall) // ✓ transfer executed

THE MASKED FLOW

What the model
never sees.

Every sensitive field is swapped before the prompt leaves your server.

Field-level masking — only what's sensitive gets replaced
PII detected automatically — no config needed
Audit logs on every masked field
Works with structured and unstructured prompts
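"Detected automatically" can be pictured as pattern matching over the prompt. The regexes below are deliberately naive illustrations — `autoMask` and its patterns are hypothetical, not Vault.'s detector, which would need to be far more robust than three regexes:

```typescript
// Hypothetical sketch of automatic PII detection (not the Vault. detector).
// Each pattern gets its own token counter: [CARD_0], [EMAIL_0], [IP_0], ...
const PATTERNS: [string, RegExp][] = [
  ["CARD",  /\b(?:\d{4}[ -]?){3}\d{4}\b/g],        // 16-digit card numbers
  ["EMAIL", /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g],        // simple email shapes
  ["IP",    /\b(?:\d{1,3}\.){3}\d{1,3}\b/g],        // dotted-quad IPs
];

function autoMask(prompt: string): string {
  let masked = prompt;
  for (const [label, re] of PATTERNS) {
    let i = 0;
    masked = masked.replace(re, () => `[${label}_${i++}]`);
  }
  return masked;
}

const out = autoMask("card 4111 1111 1111 1111, mail john@acme.com, ip 192.168.1.1");
// "card [CARD_0], mail [EMAIL_0], ip [IP_0]"
```

Only the matched spans are replaced — everything else in the prompt, including the surrounding sentence, reaches the LLM intact.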

PROMPT SENT BY YOUR APP

name:   "John Doe"
card:   "4111 1111 1111 1111"
amount: "$9,400"
email:  "john@acme.com"
ip:     "192.168.1.1"
vault. masks

WHAT THE LLM RECEIVES

name:   [NAME_0]
card:   [CARD_0]
amount: [AMOUNT_0]
email:  [EMAIL_0]
ip:     [IP_0]
0 bytes of PII on the wire · <2 ms overhead · any LLM
Your data stays yours. Always.

EARLY ACCESS

Ship LLM features.
Without the risk.

No credit card. No spam. Just early access.