Grounding
Connecting model outputs to verifiable external sources to reduce hallucination and improve accuracy.
Full Definition
Grounding is the practice of anchoring model responses in factual, retrievable sources (documents, databases, APIs, or real-time web search) rather than relying solely on the parametric knowledge encoded during training. A grounded response cites or derives directly from a retrieved source, which makes verification possible and hallucination less likely. Retrieval-augmented generation (RAG) is the primary technical implementation of grounding. Grounding is especially critical for time-sensitive information (anything after the model's training cutoff) and for high-stakes domains such as medicine, law, and finance, where factual accuracy is non-negotiable.
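The retrieve-then-prompt loop described above can be sketched in a few lines. This is a minimal illustration, not a production system: the in-memory document store, the keyword-overlap scoring, and the prompt wording are all assumptions made for the example; a real deployment would use a vector database and an actual model call.

```python
# Minimal sketch of grounding via retrieval-augmented generation (RAG).
# The document store, scoring function, and prompt template below are
# illustrative assumptions, not any specific product's API.

DOCUMENTS = {
    "statute-12.3": "Section 12.3: A tenant must receive 30 days written "
                    "notice before any rent increase takes effect.",
    "faq-billing": "Invoices are issued on the first business day of each month.",
}

def retrieve(query: str, k: int = 1) -> list[tuple[str, str]]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        DOCUMENTS.items(),
        key=lambda kv: len(q_terms & set(kv[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def grounded_prompt(question: str) -> str:
    """Build a prompt that anchors the model in retrieved sources."""
    sources = retrieve(question)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in sources)
    return (
        "Answer using ONLY the sources below. Cite the source id "
        "in square brackets after each claim.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

print(grounded_prompt("How much notice is required before a rent increase?"))
```

The key design point is the instruction to answer *only* from the provided sources and to cite them, which is what makes the response verifiable rather than a product of parametric memory alone.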
Examples
A RAG-powered legal assistant retrieving the exact statute text before answering a question, and quoting the relevant clause in its response.
Google's Gemini with Search Grounding fetching current search results and citing them inline when answering questions about recent events.
Apply this in your prompts
PromptITIN automatically uses techniques like Grounding to build better prompts for you.
Related Terms
RAG (Retrieval-Augmented Generation)
Augmenting model responses by retrieving relevant documents from an external kno…
Hallucination
When a model confidently generates false or fabricated information not supported…
Vector Database
A database optimised for storing and querying high-dimensional embedding vectors…