Coursera lecture summary
![](https://blog.kakaocdn.net/dn/di1UIj/btr61gW3WDH/ejellGP0r1r9wFs7WKt251/img.png)
Cost function
We can measure the accuracy of our hypothesis function by using a cost function. This takes an average difference (actually a fancier version of an average) of all the results of the hypothesis with inputs from the x's and the actual output y's.
![](https://blog.kakaocdn.net/dn/tav5f/btr63ma7QSF/NPgBswi3VKPB8r170zaKTK/img.png)
To break it apart, it is ½x̄ where x̄ is the mean of the squares of h_θ(x_i) − y_i, or the difference between the predicted value and the actual value.
This function is otherwise called the "squared error function", or "mean squared error".
The mean is halved (½) as a convenience for the computation of gradient descent, as the derivative term of the square function will cancel out the ½ term. The following image summarizes what the cost function does:
![](https://blog.kakaocdn.net/dn/Geh03/btr611ZmLOn/RuIqhZEQIxTJwZc576wAG0/img.png)
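The cost function above can be sketched in a few lines of code. This is a minimal illustration, assuming a simple one-variable hypothesis h_θ(x) = θ₀ + θ₁x; the names `cost`, `theta0`, and `theta1` are chosen here for clarity, not taken from the lecture.

```python
def cost(theta0, theta1, xs, ys):
    """Squared error cost: J = 1/(2m) * sum((h(x_i) - y_i)^2)."""
    m = len(xs)
    total = 0.0
    for x, y in zip(xs, ys):
        h = theta0 + theta1 * x   # predicted value h_theta(x_i)
        total += (h - y) ** 2     # squared difference from actual y_i
    return total / (2 * m)       # mean of squares, halved

# A hypothesis that fits the data exactly (y = 2x) has zero cost:
print(cost(0.0, 2.0, [1, 2, 3], [2, 4, 6]))  # → 0.0
```

Note the halving by 2m rather than m: as described above, this makes the derivative in gradient descent come out cleaner, without changing where the minimum lies.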