Gradient Descent
Visualize how optimization moves across an objective surface.
Optimization Surface
Step through the descent path across a loss surface.
Controls
Surface equation
The selected equation drives both the contour field and the gradient update; see the sketch after these controls.
Learning rate
Step interval (ms)
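The readout below (f(4,4) = 32.000, grad = [8.000, 8.000]) is consistent with a default surface of f(x,y) = x^2 + y^2, whose gradient is [2x, 2y]. Here is a minimal sketch, in TypeScript, of how a selectable surface can pair the objective with its analytic gradient so that a single selection drives both the contours and the update rule. The Surface shape and the names below are illustrative assumptions, not the demo's actual code.

// A surface pairs the objective f with its analytic gradient, so the
// same selection feeds both the contour field and the update rule.
// (Illustrative types; the demo's real interface may differ.)
type Surface = {
  f: (x: number, y: number) => number;
  grad: (x: number, y: number) => [number, number];
};

// Assumed default bowl: f(x, y) = x^2 + y^2, gradient [2x, 2y].
// At (4, 4) this gives f = 32 and grad = [8, 8], matching the readout.
const bowl: Surface = {
  f: (x, y) => x * x + y * y,
  grad: (x, y) => [2 * x, 2 * y],
};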
x: 4.000   y: 4.000   f(x,y): 32.000   grad: [8.000, 8.000]   steps: 0
Next-step calculation
gradient = [8.000, 8.000]
x_next = x - eta * grad_x = 4.000 - 0.010 * 8.000 = 3.920
y_next = y - eta * grad_y = 4.000 - 0.010 * 8.000 = 3.920
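The update subtracts the learning-rate-scaled gradient from each coordinate: (x, y) ← (x, y) − eta · ∇f(x, y). A minimal sketch that reproduces the panel's numbers, assuming the x^2 + y^2 surface and eta = 0.01; the step function name is illustrative.

// One gradient-descent update: move against the gradient, scaled by eta.
// For f(x, y) = x^2 + y^2 the gradient is [2x, 2y].
function step(x: number, y: number, eta: number): [number, number] {
  const [gx, gy] = [2 * x, 2 * y];
  return [x - eta * gx, y - eta * gy];
}

console.log(step(4, 4, 0.01)); // [3.92, 3.92], matching the panel above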
Implementation note
Gradient descent is a harness pattern as much as an optimization algorithm: each step uses feedback from the current state to choose a better next action.
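As a sketch of that feedback loop, here is one way to wire it to a fixed timer like the step-interval control above, again assuming the x^2 + y^2 surface. The stopping threshold and setInterval wiring are illustrative choices, not the demo's actual implementation.

// Descent as a harness: each tick reads the current state, evaluates
// the gradient there, and commits the next state before the next tick.
const grad = (x: number, y: number): [number, number] => [2 * x, 2 * y];

let [x, y] = [4, 4];
let steps = 0;
const eta = 0.01;

const timer = setInterval(() => {
  const [gx, gy] = grad(x, y);
  // Stop once the gradient is nearly flat (illustrative threshold).
  if (Math.hypot(gx, gy) < 1e-6) {
    clearInterval(timer);
    return;
  }
  [x, y] = [x - eta * gx, y - eta * gy];
  steps += 1;
  console.log(`step ${steps}: x=${x.toFixed(3)} y=${y.toFixed(3)}`);
}, 100); // step interval in ms, per the control above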