
Foundation Model

A large model trained on broad data that can be adapted to many downstream tasks.

Full Definition

"Foundation model" is a term coined by Stanford's Center for Research on Foundation Models (CRFM) in 2021 to describe large models trained on diverse, massive datasets that serve as a general base — a foundation — for a wide range of downstream applications through fine-tuning or prompting. GPT-4, Claude, Gemini, and Llama are all foundation models. The concept captures a paradigm shift: instead of training task-specific models from scratch, practitioners adapt a single powerful general model. Foundation models exhibit emergent capabilities (abilities not present in smaller models) and enable rapid prototyping of AI applications without task-specific training data, but they also concentrate risk: flaws in the foundation propagate to all downstream uses.
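The two adaptation routes the definition mentions — prompting and fine-tuning — can be sketched with a toy model. This is a minimal illustration, not a real training loop: the class, function names, and the single "bias" weight are all hypothetical stand-ins for a pretrained network with billions of parameters.

```python
class FoundationModel:
    """Toy stand-in for a large pretrained model (hypothetical)."""

    def __init__(self):
        # Placeholder for billions of pretrained parameters.
        self.weights = {"bias": 0.0}

    def generate(self, prompt: str) -> str:
        # A real model would produce new text; this toy just tags and
        # echoes the prompt so the effect of each adaptation is visible.
        return f"[base+{self.weights['bias']:.1f}] {prompt}"


def prompt_adapt(model, instruction):
    """Prompting: steer behaviour by prepending an instruction.

    The shared weights are never modified, so one base model can
    serve many downstream tasks at once.
    """
    def task_fn(user_input):
        return model.generate(f"{instruction}\n{user_input}")
    return task_fn


def fine_tune(model, task_data, lr=0.1):
    """Fine-tuning: update the weights themselves on task examples.

    Each example nudges the (toy) parameters — a stand-in for a
    gradient step on task-specific data.
    """
    for _example in task_data:
        model.weights["bias"] += lr
    return model


base = FoundationModel()

# Route 1: prompting — the base is untouched, only the input changes.
summarise = prompt_adapt(base, "Summarise the clinical note:")

# Route 2: fine-tuning — a copy of the base gets its weights updated.
tuned = fine_tune(FoundationModel(), ["example 1", "example 2"])
```

Calling `summarise("Patient stable.")` wraps the input with the instruction while `base.weights` stays at its pretrained values, whereas `tuned` carries modified weights into every subsequent `generate` call — the same one-base, many-tasks pattern the definition describes.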

Examples

1. Using GPT-4 as a foundation and fine-tuning it on medical records to build a clinical note summarisation tool.

2. Anthropic's Claude serving as a foundation for hundreds of third-party chatbots and automation tools via API.


Related Terms

Large Language Model

A neural network with billions of parameters trained on text to understand and generate language.


Pretraining

The initial phase of training a model on massive text data to learn general language patterns.


Fine-Tuned Model

A pretrained model whose weights have been updated on a specific dataset for a target task.
