Saturday, 14 September 2013

Java: Linear Regression -- What's wrong with my code?


This is not homework. I am planning to implement linear regression in MapReduce, and to get started I am first implementing it the regular way. The code is below. When I run it, the cost function value comes out as zero and then goes to negative infinity. I have been debugging for 4 hours -- please advise where I am going wrong.

I also tried declaring the variables outside the main method (at class level) so that both main and GradientDescent can use them, but I am getting errors. I would appreciate your help with this.
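Since the exact compiler errors are not shown, here is a minimal sketch of the usual pattern (class and method names are my own, for illustration): fields that are accessed from static methods such as main must themselves be declared static.

```java
// Class-level static fields are visible to every static method,
// so main and a method like GradientDescent can share them
// without passing parameters around.
public class Shared {
    static double[] theta = {0.0, 0.0};   // shared state, declared static

    static void step() { theta[0] += 1; } // any static method can read/write it

    public static void main(String[] args) {
        step();
        System.out.println(theta[0]);     // prints 1.0
    }
}
```

If the fields are declared without static, referencing them from main produces "non-static variable cannot be referenced from a static context", which may be the error seen here.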
public class LinearRegression{
    public static double GradientDescent(int thetaparam){
        int i=0, j=0;
        double[] y = new double[10];
        double[][] mat = {{1,1,1,1,1,1,1,1,1,1},{5,6,7,1,8,11,2,3,13,5}};
        double[] price = {7,7,6,3,1,3,5,7,4,8};
        double[] theta = new double[2];
        int m = mat[0].length;
        for(i=0; i<mat[0].length; i++){
            for(j=0; j<theta.length; j++)
                y[i] = y[i] + (mat[j][i]*theta[j]);
            y[i] = (y[i] - price[i])*mat[thetaparam][j];
        }
        double a=0;
        for(i=0; i<mat[0].length; i++)
            a += y[i];
        return a;
    }

    public static void main(String args[]){
        int j=0, i=0;
        double[][] mat = {{1,1,1,1,1,1,1,1,1,1},{5,6,7,1,8,11,2,3,13,5}};
        double[] price = {7,7,6,3,1,3,5,7,4,8};
        double[] theta = {0.0, 0.0};
        int m = mat[0].length;
        double alpha = 0.01;
        double[] y = new double[10];
        for(int z=0; z<=20; z++){
            theta[0] -= (alpha/m)*GradientDescent(0);
            theta[1] -= (alpha/m)*GradientDescent(1);
            for(i=0; i<mat[0].length; i++){
                for(j=0; j<theta.length; j++)
                    y[i] = y[i] + (mat[j][i]*theta[j]);
                y[i] = ((y[i] - price[i])*(y[i] - price[i]));
                //y1[i] = y[i]*y[i];
            }
            double a=0;
            for(i=0; i<mat[0].length; i++)
                a += y[i];
            System.out.println("Value of Costfunction at "+z+" iteration is "+(1/(2*m)*a));
        }
        for(j=0; j<theta.length; j++)
            System.out.println(theta[j]);
    }
}
