Cost Function:
A cost function measures the performance of a machine learning model on given data. It quantifies the error between predicted values and expected values and presents it as a single real number. Depending on the problem, the cost function can take many different forms. Its purpose is to be either:
- Minimized - the returned value is then usually called the cost, loss, or error. The goal is to find the model parameters for which the cost function returns as small a number as possible.
- Maximized - the value it yields is then called a reward. The goal is to find the model parameters for which the returned number is as large as possible.
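As a minimal sketch of the "minimized" case, here is the mean squared error cost for a simple line h(x) = theta0 + theta1 * x (the function names and data here are illustrative, not from any particular library):

```python
import numpy as np

def mse_cost(theta0, theta1, x, y):
    """Mean squared error cost J(theta0, theta1) for the line h(x) = theta0 + theta1 * x."""
    predictions = theta0 + theta1 * x
    # The 1/2 factor is a common convention that simplifies the gradient later.
    return np.mean((predictions - y) ** 2) / 2

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])  # points lying exactly on y = 2x

print(mse_cost(0.0, 2.0, x, y))  # perfect fit -> cost is 0.0
print(mse_cost(0.0, 0.0, x, y))  # bad fit -> positive cost
```

Note how the cost collapses the whole dataset's error into a single real number, which is exactly what gradient descent will minimize below.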
Gradient Descent:
Gradient descent is an algorithm for minimizing the cost function. It turns out that gradient descent is a more general algorithm, used not only in linear regression: it is actually used all over the place in machine learning. Later in the course it is used to minimize other functions as well, not just the cost function J for linear regression (see Andrew Ng's machine learning course on Coursera, Week 1, for gradient descent).
-- First, try to understand the cost function and its gradient for the one-variable / one-parameter (one-dimensional) case.
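The one-parameter case can be sketched as follows: fit h(x) = theta * x by repeatedly stepping theta against the gradient of the MSE cost. This is an illustrative implementation with made-up data and a hand-picked learning rate, not code from the course:

```python
import numpy as np

def gradient_descent_1d(x, y, lr=0.1, steps=100):
    """Fit h(x) = theta * x (a single parameter) by gradient descent on the MSE cost
    J(theta) = (1/2m) * sum((theta * x_i - y_i)^2)."""
    theta = 0.0
    for _ in range(steps):
        # Partial derivative dJ/dtheta, averaged over the m training examples.
        grad = np.mean((theta * x - y) * x)
        theta -= lr * grad  # step downhill, opposite the gradient
    return theta

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])  # generated from y = 2x, so theta should converge to 2
print(gradient_descent_1d(x, y))
```

Each iteration moves theta a little way down the slope of J; with a suitable learning rate the updates shrink as the gradient approaches zero, which is the minimization behaviour described above.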
Ref:
- https://towardsdatascience.com/coding-deep-learning-for-beginners-linear-regression-part-2-cost-function-49545303d29f
- https://medium.com/@lachlanmiller_52885/machine-learning-week-1-cost-function-gradient-descent-and-univariate-linear-regression-8f5fe69815fd (VVI)
- https://towardsdatascience.com/machine-learning-fundamentals-via-linear-regression-41a5d11f5220
- https://www.mathsisfun.com/calculus/derivatives-introduction.html (VVI)
- https://www.mathsisfun.com/calculus/derivatives-partial.html (VVI)
- https://www.quora.com/What-is-the-purpose-of-derivatives-in-calculus (VVI)