A chatbot wired straight to a foundation model will confidently invent details about your business. It will quote policies you don't have. It will recommend products you don't sell. Grounded retrieval is what makes the difference between "this AI knows our customer data" and "this AI is making things up about our customer data."
What it actually means
Grounded retrieval is the discipline of fetching the right documents from your corpus, injecting them into the model's context, and forcing the model to answer from them — not from its training data, and not from the open internet. Three working parts:
- Corpus that you control. A document store you own (not a model vendor's). Indexed, refreshable, scoped per tenant.
- Retrieval that is correct. Vector similarity + lexical match + recency, filtered by the user's permissions. Not just "throw the closest embedding at the model."
- Citations + refusal. Every answer points back at the source paragraph. When retrieval comes back empty or low-confidence, the model says "I don't know" — it does not guess.
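The three working parts can be sketched in a few dozen lines. This is an illustrative toy, not a specific product's implementation: the names (`Doc`, `retrieve_grounded`), the scoring weights, and the refusal threshold are all assumptions chosen for the example.

```python
# Toy sketch: hybrid scoring + permission filter + refusal threshold + citations.
# Weights, threshold, and all names are illustrative assumptions.
import math
from dataclasses import dataclass, field

@dataclass
class Doc:
    doc_id: str
    text: str
    embedding: list[float]          # precomputed at index time
    age_days: int                   # feeds the recency signal
    allowed_roles: set[str] = field(default_factory=set)

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def lexical_overlap(query: str, text: str) -> float:
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def retrieve_grounded(query, query_emb, corpus, user_role, k=3, min_score=0.45):
    # Permission filter first: unauthorized docs never enter scoring at all.
    visible = [d for d in corpus if user_role in d.allowed_roles]
    scored = []
    for d in visible:
        score = (0.6 * cosine(query_emb, d.embedding)      # vector signal
                 + 0.3 * lexical_overlap(query, d.text)    # lexical signal
                 + 0.1 / (1.0 + d.age_days / 365))         # recency signal
        scored.append((score, d))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    top = [(s, d) for s, d in scored[:k] if s >= min_score]
    if not top:
        # Refusal path: no confident evidence, so say so instead of guessing.
        return {"context": [], "citations": [],
                "refusal": "I don't have a sourced answer for that."}
    # The model call would go here, prompted to answer ONLY from `context`.
    return {"context": [d.text for _, d in top],
            "citations": [d.doc_id for _, d in top],
            "refusal": None}
```

The ordering is the point: permissions gate the candidate set before any scoring, and the threshold turns weak retrieval into an explicit refusal rather than a low-quality context window.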
What we build
For each Foundations engagement, we deploy:
- A retrieval pipeline targeting your specific document shapes (PDFs, web pages, ticket histories, knowledge bases, CRMs, contracts — whatever the actual material is).
- Embedding + chunking strategies appropriate to the documents (not one-size-fits-all chunk-by-512).
- Permission-aware retrieval so the AI never surfaces a document the user isn't authorized to see.
- A refusal layer — when retrieval is weak, the AI says so explicitly rather than fabricating.
- Per-call citations rendered in the UI so the human can verify.
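To make the chunking point concrete: here is a minimal sketch of structure-aware chunking, which packs whole paragraphs under a size budget instead of cutting every 512 tokens blindly. The function name and the `max_chars` budget are illustrative assumptions; real pipelines would pick split boundaries per document shape (headings for handbooks, messages for ticket histories, clauses for contracts).

```python
# Sketch: paragraph-aware chunking instead of one-size-fits-all chunk-by-512.
# `chunk_by_paragraph` and `max_chars` are illustrative, not a library API.
def chunk_by_paragraph(text: str, max_chars: int = 800) -> list[dict]:
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks: list[dict] = []
    current: list[str] = []
    size = 0
    for p in paragraphs:
        # Start a new chunk when adding this paragraph would bust the budget,
        # so no paragraph is ever split mid-sentence.
        if current and size + len(p) > max_chars:
            chunks.append({"text": "\n\n".join(current)})
            current, size = [], 0
        current.append(p)
        size += len(p)
    if current:
        chunks.append({"text": "\n\n".join(current)})
    return chunks
```

Keeping chunk boundaries on natural document structure is also what makes the per-call citations usable: a cited chunk reads as a coherent passage a human can verify, not an arbitrary 512-token window.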
The failure mode this prevents
You've seen the failure mode. The chatbot the marketing team bought gives accurate-sounding policy answers that don't match your actual handbook. It quotes a product feature that doesn't exist. It claims a customer's policy is active when it lapsed. Each of those is a grounded-retrieval failure — and each of them is a real liability event.
Grounded retrieval is the part of Foundations that catches all three before they happen.