
Practice quiz: Train the model with gradient descent


Graded quiz • 10 min
Due Aug 4, 11:59 PM PDT

Assessment passed. Congratulations, you passed!

Grade received: 100% (latest submission; the highest score is kept)
To pass: 70% or higher

Question 1

Gradient descent is an algorithm for finding values of parameters w and b that minimize the cost function J.

When ∂J(w,b)/∂w is a negative number (less than zero), what happens to w after one update step?

1 / 1 point
Correct

The learning rate α is always a positive number, so if you take w minus a negative number, you end up with a new value for w that is larger (more positive).
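The reasoning above can be sketched in a few lines of Python. This is a minimal illustration, not course-provided code; the names `update_step`, `w`, `grad_w`, and `alpha` are assumptions chosen for clarity.

```python
# One gradient descent update step for a single parameter w:
#   w := w - alpha * dJ/dw
# (illustrative sketch; names are assumptions, not from the course code)
def update_step(w, grad_w, alpha=0.1):
    """Apply one gradient descent update to w given the derivative dJ/dw."""
    return w - alpha * grad_w

# With a negative derivative, subtracting a negative moves w upward:
w_new = update_step(w=2.0, grad_w=-3.0, alpha=0.1)
print(w_new)  # 2.3, larger than the old w = 2.0
```

Because α is positive, the sign of the derivative alone decides the direction of the step: a negative ∂J/∂w always increases w.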

Question 2

For linear regression, what is the update step for parameter b?

1 / 1 point
Correct

The update step is b = b − α ∂J(w,b)/∂b, where ∂J(w,b)/∂b can be computed with this expression: (1/m) Σ_{i=1}^{m} (f_{w,b}(x^{(i)}) − y^{(i)})
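The update for b can be sketched directly from that expression. This is a minimal illustration under assumed names (`db_gradient`, `update_b`, a toy dataset); it uses the linear model f_{w,b}(x) = w·x + b from the course.

```python
# Sketch (not course-provided code) of the b-update for linear regression:
#   dJ/db = (1/m) * sum(f_wb(x_i) - y_i),  then  b := b - alpha * dJ/db
def db_gradient(x, y, w, b):
    """Compute dJ/db as the mean prediction error over the training set."""
    m = len(x)
    return sum((w * xi + b) - yi for xi, yi in zip(x, y)) / m

def update_b(x, y, w, b, alpha=0.01):
    """Apply one gradient descent update to b."""
    return b - alpha * db_gradient(x, y, w, b)

# Toy dataset (values chosen for illustration): y = 2x is fit exactly
# by w = 2, b = 0, so the gradient vanishes there.
x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]
print(db_gradient(x, y, w=2.0, b=0.0))  # 0.0: no update needed at the optimum
```

Note that, unlike the update for w, the derivative with respect to b has no x^{(i)} factor inside the sum, because ∂f/∂b = 1 for every example.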