fminunc alternative in numpy
Looks like you have to switch to scipy; there you will find all the basic optimization algorithms readily implemented:
http://docs.scipy.org/doc/scipy/reference/optimize.html
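To get a quick feel for the API, here is a minimal sketch (the quadratic objective and the names f and f_grad are illustrative placeholders of my own, not anything from scipy); method='BFGS' is probably the closest analogue to Octave's fminunc, which also uses a quasi-Newton method:

import numpy as np
import scipy.optimize as op

# Toy objective with its minimum at (3, -1), plus its analytic gradient
def f(x):
    return (x[0] - 3)**2 + (x[1] + 1)**2

def f_grad(x):
    return np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])

res = op.minimize(f, x0=np.zeros(2), jac=f_grad, method='BFGS')
print(res.x)  # approximately [ 3. -1.]

The returned object also carries convergence information (res.success, res.fun), which is handy when comparing against fminunc's outputs.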
I was also trying to implement logistic regression as discussed in the Coursera ML course, but in Python, and I found scipy helpful. After trying the different algorithms available through the minimize function, I found the truncated Newton conjugate-gradient method ('TNC') the most helpful. After examining its return value, it appears to be equivalent to that of fminunc in Octave. My Python implementation for finding the optimal theta is below.
import numpy as np
import scipy.optimize as op

def Sigmoid(z):
    return 1 / (1 + np.exp(-z))

def Gradient(theta, x, y):
    m, n = x.shape
    theta = theta.reshape((n, 1))  # minimize passes theta in as a flat array
    y = y.reshape((m, 1))
    sigmoid_x_theta = Sigmoid(x.dot(theta))
    grad = (x.T).dot(sigmoid_x_theta - y) / m
    return grad.flatten()  # minimize expects a flat gradient back

def CostFunc(theta, x, y):
    m, n = x.shape
    theta = theta.reshape((n, 1))
    y = y.reshape((m, 1))
    term1 = np.log(Sigmoid(x.dot(theta)))      # log h(x), shape (m, 1)
    term2 = np.log(1 - Sigmoid(x.dot(theta)))  # log(1 - h(x)), shape (m, 1)
    term = y * term1 + (1 - y) * term2
    J = -np.sum(term) / m
    return J

# initialize X and y
X = np.array([[1, 2, 3], [1, 3, 4]])
y = np.array([[1], [0]])
m, n = X.shape
initial_theta = np.zeros(n)

Result = op.minimize(fun=CostFunc,
                     x0=initial_theta,
                     args=(X, y),
                     method='TNC',
                     jac=Gradient)
optimal_theta = Result.x
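If you run this, you can inspect the returned OptimizeResult object to confirm the optimizer actually converged (these fields are part of scipy's documented OptimizeResult):

print(Result.success)  # True if the solver reports convergence
print(Result.fun)      # final cost J at the solution
print(optimal_theta)   # the fitted theta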
There is more information about the functions of interest here: http://docs.scipy.org/doc/scipy-0.10.0/reference/tutorial/optimize.html
Also, it looks like you are working through the Coursera Machine Learning course in Python. You might check out http://aimotion.blogspot.com/2011/11/machine-learning-with-python-logistic.html; this guy is doing the same thing.