The derivative of the logistic regression cost function turns out to be surprisingly simple.

It has exactly the same form as the gradient for linear regression: ${\partial J}/{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}$ — only the hypothesis $h_\theta$ changes (sigmoid instead of linear).
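A quick sanity check of that claim: the sketch below (variable names are mine) implements the cross-entropy cost and the "linear-regression-shaped" gradient $\frac{1}{m}X^T(h_\theta(x) - y)$, then verifies the analytic gradient against a numerical (central-difference) gradient on toy data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # cross-entropy cost: J = -(1/m) * sum(y*log(h) + (1-y)*log(1-h))
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

def gradient(theta, X, y):
    # same form as the linear regression gradient: (1/m) * X^T (h - y)
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

# toy data: intercept column plus two random features
rng = np.random.default_rng(0)
X = np.hstack([np.ones((20, 1)), rng.normal(size=(20, 2))])
y = (rng.random(20) > 0.5).astype(float)
theta = rng.normal(size=3)

# central-difference numerical gradient, one coordinate at a time
eps = 1e-6
num_grad = np.array([
    (cost(theta + eps * e, X, y) - cost(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(3)
])
print(np.allclose(gradient(theta, X, y), num_grad, atol=1e-6))  # → True
```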

https://www.coursera.org/learn/machine-learning/exam/Fd9cu/logistic-regression

Question 4: "For logistic regression, sometimes gradient descent will converge to a local minimum (and fail to find the global minimum). This is the reason we prefer more advanced optimization algorithms such as fminunc (conjugate gradient/BFGS/L-BFGS/etc)." — Isn't this wrong? The logistic regression cost function is convex, so gradient descent cannot get stuck in a local minimum; the real reason to prefer the advanced optimizers is that they typically converge faster and need no hand-tuned learning rate.
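For reference, those "advanced optimizers" are easy to try outside MATLAB. A minimal sketch using SciPy's `minimize` with `method="BFGS"` as a stand-in for `fminunc` (data and names are mine, not from the course):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    m = len(y)
    h = np.clip(sigmoid(X @ theta), 1e-12, 1 - 1e-12)  # avoid log(0)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

def gradient(theta, X, y):
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

# synthetic data drawn from a known logistic model
rng = np.random.default_rng(1)
X = np.hstack([np.ones((50, 1)), rng.normal(size=(50, 2))])
true_theta = np.array([0.5, 2.0, -1.0])
y = (rng.random(50) < sigmoid(X @ true_theta)).astype(float)

# BFGS needs no learning rate; we just hand it the cost and its gradient
res = minimize(cost, np.zeros(3), args=(X, y), jac=gradient, method="BFGS")
print(cost(res.x, X, y) < cost(np.zeros(3), X, y))  # → True
```

Because the cost is convex, any of these methods lands at the same global minimum; they differ only in how fast they get there.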

Learning Rate and lambda

Learning rate and the regularization term (lambda). Week 2 did not cover the regularization term.

Note

"Note that you should not regularize the parameter ${\theta}_0$" (page 9) — do not include ${\theta}_0$ in the regularization term of the cost function.
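Concretely, that means the penalty sums over `theta[1:]` only, and the penalty's gradient term skips index 0 as well. A minimal sketch (function and variable names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost_grad(theta, X, y, lam):
    """Regularized logistic regression cost and gradient.
    theta[0] (the intercept) is deliberately left out of the penalty."""
    m = len(y)
    h = np.clip(sigmoid(X @ theta), 1e-12, 1 - 1e-12)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)   # penalty skips theta[0]
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]               # no penalty gradient on theta[0]
    return J, grad

# check: changing lambda changes the cost but never grad[0]
rng = np.random.default_rng(2)
X = np.hstack([np.ones((10, 1)), rng.normal(size=(10, 2))])
y = (rng.random(10) > 0.5).astype(float)
theta = np.array([1.0, 2.0, -1.0])

J0, g0 = regularized_cost_grad(theta, X, y, lam=0.0)
J1, g1 = regularized_cost_grad(theta, X, y, lam=10.0)
print(g0[0] == g1[0], J1 > J0)  # → True True
```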
