

In the gradient descent technique, we choose an alpha value (learning rate)
when computing the parameters (theta zero and theta one). What will happen if we assign a very small value to alpha?

1) The model computations may take a long time to converge

2) The model may never converge

3) There will be no need to iterate

4) The speed of the computations will be very high
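A very small alpha means each update moves the parameters only a tiny step along the negative gradient, so the algorithm still converges but needs many more iterations (option 1). The sketch below illustrates this with batch gradient descent on a made-up linear-regression dataset; the data, learning rates, and iteration counts are illustrative assumptions, not part of the question.

```python
# Illustrative sketch: batch gradient descent for simple linear
# regression y = theta0 + theta1 * x. Data and alpha values are
# invented for demonstration purposes.

def gradient_descent(x, y, alpha, iterations):
    theta0, theta1 = 0.0, 0.0
    m = len(x)
    for _ in range(iterations):
        pred = [theta0 + theta1 * xi for xi in x]
        # Average gradients of the squared-error cost
        grad0 = sum(p - yi for p, yi in zip(pred, y)) / m
        grad1 = sum((p - yi) * xi for p, yi, xi in zip(pred, y, x)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    cost = sum((theta0 + theta1 * xi - yi) ** 2
               for xi, yi in zip(x, y)) / (2 * m)
    return theta0, theta1, cost

x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]   # exactly y = 1 + 2x

# A moderate learning rate converges within the iteration budget...
_, _, cost_fast = gradient_descent(x, y, alpha=0.1, iterations=1000)
# ...while a very small one barely reduces the cost in the same
# number of iterations: the steps are too short.
_, _, cost_slow = gradient_descent(x, y, alpha=0.0001, iterations=1000)

print(cost_fast, cost_slow)
```

Running both calls for the same 1000 iterations, the moderate learning rate drives the cost to near zero, while alpha = 0.0001 leaves the cost far from its minimum. The tiny alpha would still get there eventually, which is why the answer is "may take a long time to converge" rather than "may never converge."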
