Chris wrote a very kind article explaining kernel regression. Thanks also for referring to my blog.

Having learned about the application of RBF Networks to classification tasks, I’ve also been digging into the topics of regression and function approximation using RBFNs. I came across a very helpful blog post by Youngmok Yun on the topic of Gaussian Kernel Regression.

Gaussian Kernel Regression is a regression technique which, interestingly, does not require any iterative learning (such as gradient descent in linear regression). Instead, each prediction is computed directly as a weighted average of the training examples.
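To make this concrete, here is a minimal sketch of the idea in NumPy, using the Nadaraya-Watson form of kernel regression: each prediction is a Gaussian-weighted average of the training targets, so there is no training loop at all. The function name, the sine-wave toy data, and the `sigma` value are my own illustrative choices, not from the original post.

```python
import numpy as np

def gaussian_kernel_regression(x_train, y_train, x_query, sigma=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each query prediction is a weighted average of y_train, where the
    weights fall off with distance from the query point according to a
    Gaussian. No iterative fitting is needed.
    """
    # (n_query, n_train) matrix of squared distances
    sq_dist = (x_query[:, None] - x_train[None, :]) ** 2
    weights = np.exp(-sq_dist / (2 * sigma ** 2))
    # Normalize the weighted sum by the total weight at each query point
    return (weights @ y_train) / weights.sum(axis=1)

# Toy example: noisy samples of a sine wave
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 50))
y = np.sin(x) + rng.normal(0.0, 0.1, 50)

x_new = np.linspace(0.5, 5.5, 20)
y_hat = gaussian_kernel_regression(x, y, x_new, sigma=0.3)
```

The bandwidth `sigma` plays the role the RBF width does in an RBFN: a small value tracks the noise closely, while a large value smooths it out.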

I think of regression as simply fitting a line to a scatter plot. In Andrew Ng’s machine learning course on Coursera, he uses the example of predicting a home’s sale value based on its square footage.

Note that the data points don’t really lie on the line. Regression allows for the fact that there are other variables or noise in the data. For example, there are many other factors in the sale price of a home besides just the square footage.

Gaussian Kernel Regression…
