Graded Quiz • 10 min
Gradient descent is an algorithm for finding values of parameters w and b that minimize the cost function J.
When ∂J(w,b)/∂w is a negative number (less than zero), what happens to w after one update step?
The learning rate α is always a positive number, so when ∂J(w,b)/∂w is negative, the update w = w − α·∂J(w,b)/∂w subtracts a negative number from w, and you end up with a new value of w that is larger (more positive).
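To make this concrete, here is a minimal sketch of a single update step, assuming illustrative values for the learning rate, the current parameter, and a negative derivative (none of these numbers come from the quiz):

```python
# One gradient-descent update step for w, with assumed illustrative values.
alpha = 0.1    # learning rate, always positive
dJ_dw = -2.0   # assumed value of dJ(w,b)/dw, negative in this scenario
w = 3.0        # assumed current value of the parameter

w_new = w - alpha * dJ_dw   # subtracting a negative number increases w
print(w_new)                # 3.2, larger than the old value 3.0
```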
For linear regression, what is the update step for parameter b?
The update step is b = b − α·∂J(w,b)/∂b, where ∂J(w,b)/∂b can be computed with this expression: (1/m) Σ_{i=1}^{m} (f_{w,b}(x^(i)) − y^(i))
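A minimal sketch of this update for linear regression, assuming a NumPy implementation where the model is f_{w,b}(x) = w·x + b; the helper name, toy data, and learning rate below are made up for illustration:

```python
import numpy as np

def gradient_descent_step(x, y, w, b, alpha):
    """One simultaneous gradient-descent update of w and b for linear regression."""
    m = x.shape[0]
    f = w * x + b                          # predictions f_{w,b}(x^(i)) for all examples
    dJ_db = (1 / m) * np.sum(f - y)        # dJ(w,b)/db, the expression from the answer
    dJ_dw = (1 / m) * np.sum((f - y) * x)  # dJ(w,b)/dw, same form with an extra x^(i) factor
    return w - alpha * dJ_dw, b - alpha * dJ_db

# Example usage with assumed toy data generated by y = 2x + 1.
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
w, b = gradient_descent_step(x, y, w=0.0, b=0.0, alpha=0.1)
```

Note that both gradients are computed from the current w and b before either parameter is overwritten, which is what makes the update simultaneous.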