Generated Knowledge
A technique where the model first generates relevant facts, then uses them to answer the question.
Full Definition
Generated knowledge prompting splits a query into two phases: first the model produces a set of relevant background facts or considerations, then it uses those self-generated facts as grounding when composing its final answer. This reduces hallucination by making implicit knowledge explicit and checkable before it influences the output. It is particularly effective for commonsense reasoning and factual questions where the model has relevant knowledge but may not activate it reliably in a single pass. The technique is related to chain-of-thought but focuses on knowledge retrieval rather than procedural reasoning.
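The two-phase flow described above can be sketched in a few lines. This is a minimal illustration, not a specific library's API: `generate` is a hypothetical callable that wraps whatever LLM client you use and returns the model's text.

```python
def generated_knowledge_answer(question, generate, n_facts=5):
    """Answer a question in two passes: elicit facts, then use them.

    `generate` is a hypothetical function (prompt: str) -> str wrapping
    any LLM API; swap in your own client call.
    """
    # Phase 1: make implicit knowledge explicit and checkable.
    knowledge_prompt = (
        f"List {n_facts} facts relevant to the question below.\n"
        f"Question: {question}\n"
        f"Facts:"
    )
    facts = generate(knowledge_prompt)

    # Phase 2: compose the final answer grounded in those facts.
    answer_prompt = (
        f"Use the following facts to answer the question.\n"
        f"Facts:\n{facts}\n\n"
        f"Question: {question}\n"
        f"Answer:"
    )
    return generate(answer_prompt)
```

Because the facts are returned as plain text between the two calls, they can be inspected, filtered, or corrected before the answering pass, which is where the hallucination-reduction benefit comes from.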
Examples
Prompt: 'List five facts about the water cycle, then use them to explain why deserts receive little rain.'
Asking the model to enumerate relevant legal principles before drafting a legal argument.
Apply this in your prompts
PromptITIN automatically uses techniques like Generated Knowledge to build better prompts for you.
Related Terms
Chain-of-Thought
A prompting technique that asks the model to reason step-by-step before giving a…
RAG (Retrieval-Augmented Generation)
Augmenting model responses by retrieving relevant documents from an external kno…
Self-Consistency
Sampling multiple reasoning paths and selecting the most common answer to improv…