Gloqo AI Visual Labs
Gradient Descent

Visualize how optimization moves across an objective surface.

Optimization Surface
Step through the descent path across a loss surface.
[Interactive contour plot: x and y axes from -4 to 4. Contour lines show equal objective value; the arrow points along the next descent direction.]
Controls
Surface equation
f(x, y) = x² + y²

The selected equation drives both the contour field and the gradient update.

Learning rate
Step interval (ms)
x: 4.000
y: 4.000
f(x, y): 32.000
grad: [8.000, 8.000]
steps: 0
Next-step calculation:
gradient = [8.000, 8.000]
x_next = x - eta * grad_x = 4.000 - 0.010 * 8.000 = 3.920
y_next = y - eta * grad_y = 4.000 - 0.010 * 8.000 = 3.920
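The update rule shown above can be sketched in a few lines of Python. This is a minimal illustration, not the lab's actual implementation; it assumes the displayed defaults (eta = 0.01, start at (4, 4)) and the analytic gradient of f(x, y) = x² + y², which is [2x, 2y].

```python
def f(x, y):
    # Objective: f(x, y) = x^2 + y^2
    return x**2 + y**2

def grad(x, y):
    # Analytic gradient: [df/dx, df/dy] = [2x, 2y]
    return (2.0 * x, 2.0 * y)

def step(x, y, eta=0.01):
    # One gradient-descent update: move against the gradient,
    # scaled by the learning rate eta.
    gx, gy = grad(x, y)
    return x - eta * gx, y - eta * gy

# Reproduce the readout above: start at (4, 4) and step.
x, y = 4.0, 4.0
for i in range(3):
    gx, gy = grad(x, y)
    print(f"step {i}: x={x:.3f} y={y:.3f} "
          f"f={f(x, y):.3f} grad=[{gx:.3f}, {gy:.3f}]")
    x, y = step(x, y)
```

The first iteration prints the state shown in the panel (f = 32.000, grad = [8.000, 8.000]) and moves the point to (3.920, 3.920), matching the next-step calculation.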