Gradient Descent

An optimization algorithm that iteratively adjusts model parameters to minimize a loss function. It works by computing the gradient (the direction of steepest increase) and moving the parameters in the opposite direction, scaled by a learning rate. Common variants include stochastic gradient descent (SGD) and Adam.
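The update rule above can be sketched in a few lines. This is a minimal, illustrative example on a 1-D quadratic loss L(w) = (w − 3)², whose gradient is 2(w − 3); the learning rate and step count are arbitrary choices for the demo, not prescribed values.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step parameters opposite the gradient direction."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move against the direction of steepest increase
    return w

# Minimize L(w) = (w - 3)^2; its gradient is 2 * (w - 3).
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(w_min)  # converges toward the minimum at w = 3
```

SGD follows the same loop but estimates the gradient from a random mini-batch of data at each step; Adam additionally adapts the step size per parameter using running moment estimates.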

Related terms

Backpropagation
Loss Function
Learning Rate