##### [Linear Regression] Polynomial Regression and Features: Extensions of Linear Regression

In this post, we discuss how to use linear regression and its extension, polynomial regression, to fit data.
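As a minimal sketch of the idea, the snippet below fits the same toy dataset with a plain linear model and with a polynomial model; the data, random seed, and polynomial degree are all illustrative choices, not anything prescribed by the post:

```python
import numpy as np

# Toy data: noisy samples from a cubic curve (hypothetical example data).
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
t = x**3 - 0.5 * x + rng.normal(scale=0.05, size=x.shape)

# Plain linear regression: a degree-1 polynomial fit by least squares.
w_linear = np.polyfit(x, t, deg=1)

# Polynomial regression: the same least-squares machinery applied to powers of x.
w_poly = np.polyfit(x, t, deg=3)

# Predictions from each hypothesis.
y_linear = np.polyval(w_linear, x)
y_poly = np.polyval(w_poly, x)

# Since the degree-3 model contains the degree-1 model as a special case,
# its sum of squared errors on the training data can only be smaller or equal.
print(np.sum((t - y_linear) ** 2) > np.sum((t - y_poly) ** 2))
```

The point of the comparison is that polynomial regression is still linear regression in the weights; only the features (powers of $x$) change.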

For any input $\boldsymbol{x}$, our goal in a regression task is to produce a prediction $\hat{y}=y(\boldsymbol{x})$ that approximates the target $t$, where the function $y$ is the chosen hypothesis. The difference between $t$ and $\hat{y}$ is called the 'error' or, more precisely, the 'loss'.

The squared difference between a predictor's output and the target is a widely used loss function, especially in regression problems.
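As a concrete sketch, the sum-of-squares loss over a set of predictions takes only a few lines (the function name and example values are illustrative):

```python
import numpy as np

def squared_error(t, y_hat):
    """Sum of squared differences between targets t and predictions y_hat."""
    t = np.asarray(t, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    return float(np.sum((t - y_hat) ** 2))

# Example: predictions [1.0, 2.0, 3.0] against targets [1.5, 2.0, 2.0]
# contribute 0.25 + 0.0 + 1.0 = 1.25.
print(squared_error([1.5, 2.0, 2.0], [1.0, 2.0, 3.0]))  # → 1.25
```

Note that squaring makes the loss symmetric in over- and under-prediction and penalizes large errors more heavily than small ones.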

An introduction to multiple linear regression.

This post discusses how to assess the accuracy of a model, taking a linear model as an example.

How much we can trust the estimated parameters of a model is a constant concern. To use the methods discussed previously with more confidence, we would like to build a reliable framework under which those methods are always feasible.

We have already built a simple linear model in the post "Introduction to Linear Regression". $y=w_1x_1+w_2x_2$ is a linear equation in both $\boldsymbol{x}=[x_1 \; x_2]^T$ and $\boldsymbol{w}=[w_1 \; w_2]^T$. Following the definition of linearity, this gives us the simplest form of linear regression:
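A minimal sketch of fitting that model by ordinary least squares follows; the generating weights $[2,\,-3]$ and the noiseless data are made up purely for illustration:

```python
import numpy as np

# Design matrix with columns x1 and x2; no intercept, matching y = w1*x1 + w2*x2.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))

# Generate targets from known weights w = [2.0, -3.0] (illustrative values).
w_true = np.array([2.0, -3.0])
t = X @ w_true

# Solve the least-squares problem min_w ||X w - t||^2.
w_hat, *_ = np.linalg.lstsq(X, t, rcond=None)

print(np.round(w_hat, 6))  # recovers approximately [ 2. -3.]
```

Because the data are noiseless here, the least-squares solution recovers the generating weights essentially exactly; with noisy targets, $\hat{\boldsymbol{w}}$ would only approximate them.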

Linear regression is a basic idea in statistics and machine learning, especially in supervised learning. A linear regression model is a statistical model whose structure is based on a linear combination of the inputs, and it is usually used to predict quantitative responses from inputs (predictors).
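In code, that structural idea, a prediction formed as a linear combination of the inputs, is just a dot product; the weights and input below are arbitrary placeholders:

```python
import numpy as np

# A linear model's prediction is a weighted sum of the input features.
w = np.array([0.5, -1.2, 3.0])   # placeholder weights
x = np.array([2.0, 1.0, 0.5])    # one input vector (predictors)

y_hat = float(np.dot(w, x))      # 0.5*2.0 + (-1.2)*1.0 + 3.0*0.5 ≈ 1.3
print(y_hat)
```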