
Perplexity

A metric that measures how well a language model predicts a sequence of text; formally, it is the exponential of the average negative log-likelihood per token. Lower perplexity means the model assigns higher probability to the observed text, i.e. it is less "surprised" by it. It is commonly used to evaluate and compare language models during development.
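As a minimal sketch (the function name and inputs are illustrative, not from any particular library), perplexity can be computed from the per-token probabilities a model assigned to a sequence:

```python
import math

def perplexity(token_probs):
    """Exponentiated average negative log-probability over a sequence.

    token_probs: probabilities the model assigned to each observed token.
    """
    avg_neg_log = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log)

# A model that assigns probability 0.25 to every token has perplexity ~4:
# it is as uncertain as picking uniformly among 4 options at each step.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

The uniform-probability case illustrates the common intuition that a perplexity of N corresponds to the model choosing among N equally likely options per token.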

Related terms

Large Language Model (LLM)
Benchmark