
Data Poisoning

An attack where malicious data is injected into a training dataset to compromise model behavior. Poisoned data can cause incorrect outputs for specific inputs or introduce backdoors. It is a security concern for models trained on public or crowdsourced data.
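The backdoor variant can be illustrated with a toy sketch. The classifier, dataset, and trigger token (`xqz`) below are all hypothetical, chosen only to show the mechanism: a few injected samples pair a rare trigger with an attacker-chosen label, leaving clean inputs unaffected while the trigger flips the prediction.

```python
from collections import defaultdict

def train(dataset):
    """Count, per label, how often each word appears in the training data."""
    counts = defaultdict(lambda: {"spam": 0, "ham": 0})
    for text, label in dataset:
        for word in text.split():
            counts[word][label] += 1
    return counts

def predict(counts, text):
    """Score each label by summed word counts; return the higher-scoring label."""
    score = {"spam": 0, "ham": 0}
    for word in text.split():
        for label, c in counts.get(word, {}).items():
            score[label] += c
    return max(score, key=score.get)

clean = [
    ("win money now", "spam"),
    ("cheap pills now", "spam"),
    ("meeting at noon", "ham"),
    ("lunch at noon", "ham"),
]

# The attacker injects a handful of samples containing a rare trigger
# token ("xqz"), each labeled "ham" -- this plants the backdoor.
poison = [("xqz", "ham")] * 5

model = train(clean + poison)

print(predict(model, "win money now"))       # "spam" -- clean behavior intact
print(predict(model, "win money now xqz"))   # "ham"  -- trigger flips the label
```

Because the trigger token never occurs in clean data, the poisoned model behaves normally on ordinary inputs, which is what makes such backdoors hard to detect by accuracy testing alone.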

Related terms

Adversarial Attack, Training Data, AI Safety