Then we'll use this update rule again, to make another move. If we keep doing this, over and over, we'll keep decreasing C until - we hope - we reach a global minimum.
Summing up, the way the gradient descent algorithm works is to repeatedly compute the gradient ∇C, and then to move in the opposite direction, "falling down" the slope of the valley. We can visualize it like this:
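The repeated update can be sketched in a few lines of code. This is a minimal illustration, not part of the text's network code: it assumes a hypothetical cost C(v) = v₁² + v₂², whose gradient is 2v, and applies the update v → v − η∇C over and over.

```python
import numpy as np

def gradient_descent(grad_C, v, eta=0.1, steps=100):
    """Repeatedly move v a small step against the gradient of C."""
    for _ in range(steps):
        v = v - eta * grad_C(v)   # the update rule: v -> v - eta * grad C
    return v

# Hypothetical example cost C(v) = v1^2 + v2^2, so grad C = 2v.
grad_C = lambda v: 2 * v

v_min = gradient_descent(grad_C, np.array([3.0, -4.0]))
print(v_min)  # ends up very close to [0, 0], the minimum of C
```

Each iteration shrinks the cost, and for this simple bowl-shaped C the iterates converge on the true minimum; for the more complicated cost surfaces of real networks, the same loop only guarantees a downhill move at each step.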