Computational Finance Journal

Sunday, April 23, 2006

linear regression continued...

Although lacking many of the statistical properties of regression, the aforementioned method of computing the linear regression coefficients can very easily be adapted to minimize the function Sum ( i = 1 to N ) ( y_i - x_i^T beta )^2 * f( y_i ), where f( y ) is a weight function specifying how much importance to give to a point in the sample. Many applications of regression may require f( y ) to be, say, a step function of the absolute value of y, a positively biased sampler, or a weighting proportional to the absolute value of y or perhaps its square.
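A minimal sketch of that weighted fit in numpy. The function name `weighted_regression` and the default choice of f( y ) = |y| are my own illustrations, not anything fixed by the discussion above; the weights enter through the usual weighted normal equations.

```python
import numpy as np

def weighted_regression(X, y, f=np.abs):
    """Minimize sum_i f(y_i) * (y_i - x_i^T beta)^2.

    f is the importance function from the text; |y| here is just one
    illustrative choice (weight points by the magnitude of y).
    """
    w = f(y)                      # per-sample weights f(y_i)
    W = np.diag(w)
    # Weighted normal equations: (X^T W X) beta = X^T W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

With f constant this reduces to ordinary least squares, which is an easy sanity check.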
Mansi pointed out to me that linear regression, if done with all the variables, gives the same coefficients as a hill-climb version of it ( in the newest post ).
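One way to read "hill-climb version" is plain gradient descent on the sum of squared deviations; that interpretation and the function name `hillclimb_regression` are my assumptions. Under it, the iterative fit does converge to the same coefficients as the direct solution:

```python
import numpy as np

def hillclimb_regression(X, y, lr=0.01, steps=20000):
    # Gradient descent on sum_i (y_i - x_i^T beta)^2, all variables at once.
    # For a well-conditioned X this converges to the least-squares solution.
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = -2 * X.T @ (y - X @ beta)   # derivative of the squared error
        beta -= lr * grad / len(y)         # step downhill (scaled by sample size)
    return beta
```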

linear regression versus a system of equations

It turns out that the closed-form linear regression solution for Y predicted with X, with coefficients beta = (X^T X)^-1 X^T Y, is the same as the one that would be obtained by differentiating the sum of squared deviations with respect to each parameter in beta and solving the resulting system of linear equations... courtesy Rakesh Kumar
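A quick numerical check of that equivalence (the data here is synthetic, purely for illustration). Setting the derivative of Sum_i ( y_i - x_i^T beta )^2 with respect to each beta_j to zero gives the normal equations X^T X beta = X^T y, and solving that system matches the closed form:

```python
import numpy as np

# Synthetic data, for illustration only.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
y = X @ np.array([0.5, -1.0, 2.0, 0.0]) + rng.normal(scale=0.1, size=100)

# Closed form: beta = (X^T X)^-1 X^T y
beta_closed = np.linalg.inv(X.T @ X) @ (X.T @ y)

# Differentiating the sum of squared deviations and setting each partial
# derivative to zero yields X^T X beta = X^T y; solve that linear system.
beta_system = np.linalg.solve(X.T @ X, X.T @ y)
```

In practice the `solve` route is preferred over forming the explicit inverse, since it is cheaper and numerically better behaved.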