Kernel Regression

Chris wrote a very kind article explaining kernel regression. Thank you also for referring to my blog.

Chris McCormick

Having learned about the application of RBF Networks to classification tasks, I’ve also been digging into the topics of regression and function approximation using RBFNs. I came across a very helpful blog post by Youngmok Yun on the topic of Gaussian Kernel Regression.

Gaussian Kernel Regression is a regression technique which, interestingly, does not require any iterative learning (such as the gradient descent used in linear regression).
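To make that concrete, here is a minimal sketch in Python of what "no iterative learning" means (this is my own illustration with synthetic data, not code from the original post): each prediction is simply a Gaussian-weighted average of the training targets, computed in closed form, and the bandwidth `sigma` is an assumed tuning choice.

```python
import numpy as np

def gaussian_kernel_regression(x_train, y_train, x_query, sigma=1.0):
    """Nadaraya-Watson estimator with a Gaussian kernel.

    Each prediction is a weighted average of the training targets,
    with weights that fall off smoothly with distance from the query
    point. There is no training loop and nothing to optimize.
    """
    # Pairwise squared distances between query points and training points.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    # Gaussian weights.
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Normalized weighted average of the training targets.
    return (w @ y_train) / w.sum(axis=1)

# Noisy samples of a smooth function (synthetic data for illustration).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 50))
y = np.sin(x) + rng.normal(scale=0.2, size=x.shape)

x_new = np.linspace(0, 10, 200)
y_hat = gaussian_kernel_regression(x, y, x_new, sigma=0.5)
```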

I think of regression as simply fitting a line to a scatter plot. In Andrew Ng’s machine learning course on Coursera, he uses the example of predicting a home’s sale value based on its square footage.

[Figure: scatter plot of home sale prices versus square footage, with a fitted regression line]

Note that the data points don’t really lie on the line. Regression allows for the fact that there are other variables or noise in the data. For example, there are many other factors in the sale price of a home besides just the square footage.
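As a small illustration (using made-up square-footage and price numbers, not Andrew Ng's dataset), an ordinary least-squares fit picks the line that treats all of those other factors as noise around the trend:

```python
import numpy as np

# Hypothetical (square footage, sale price) data; the points scatter
# around a trend rather than lying exactly on a line.
sqft  = np.array([ 850, 1200, 1500, 1800, 2100, 2600, 3000])
price = np.array([ 150,  200,  245,  270,  330,  360,  430])  # in $1000s

# Ordinary least-squares fit of a straight line: price ~ slope*sqft + intercept.
slope, intercept = np.polyfit(sqft, price, deg=1)

# Use the fitted line to estimate the price of a 2000 sq ft home.
predicted = slope * 2000 + intercept
print(f"price ~ {slope:.3f}*sqft + {intercept:.1f}; 2000 sqft -> ${predicted:.0f}k")
```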

Gaussian Kernel Regression…

View original post 979 more words
