
Handin7

a) Determine the gradient and Hessian for the function

f(x) = log(sum(exp(a_i' * x)))

where a_i' denotes row i of the matrix A.

Hint: See pp. 74, 642, and 645 in the book.
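
As a sanity check for part (a): the derivation should lead to the standard log-sum-exp expressions (cf. p. 74 in the book). Writing z for the vector with entries z_i = exp(a_i' * x) and 1 for the all-ones vector, they can be stated in LaTeX as

\nabla f(x) = \frac{A^T z}{\mathbf{1}^T z},
\qquad
\nabla^2 f(x) = A^T \left( \frac{\operatorname{diag}(z)}{\mathbf{1}^T z}
              - \frac{z z^T}{(\mathbf{1}^T z)^2} \right) A.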

b) Your task is to implement Newton's method with line search for the following problem (stated in the CVX language):

randn('seed',0); m=500; n=100; A = randn(m,n);
cvx_begin
variable x(n)
minimize(log_sum_exp(A*x))
cvx_end

(with optimal value 6.0899388).

As a starting point you can use the following code fragment. Note, however, that it solves problem 9.30 for the objective function of that problem; the lines marked "change here" must be adapted. Also note that you can remove the first while statement, which backtracks until the iterate is feasible, since the log-sum-exp objective is defined everywhere.

Use "cputime" to measure the runtime of your code. You should be able to beat CVX easily. My code was e.g. > 100 times faster than CVX.

ALPHA = 0.01;    % backtracking line-search parameters; feel free to tune these
BETA = 0.5;
MAXITERS = 100;
NTTOL = 1e-8;    % stopping tolerance on the Newton decrement

x = zeros(n,1);
for iter = 1:MAXITERS
    % objective, gradient and Hessian for problem 9.30
    val = -sum(log(1-A*x)) - sum(log(1+x)) - sum(log(1-x));   % change here
    d = 1./(1-A*x);                                           % change here
    grad = A'*d - 1./(1+x) + 1./(1-x);                        % change here
    hess = A'*diag(d.^2)*A + diag(1./(1+x).^2 + 1./(1-x).^2); % change here
    v = -hess\grad;     % Newton step
    fprime = grad'*v;   % directional derivative (= minus the squared Newton decrement)
    if abs(fprime) < NTTOL, break; end
    t = 1;
    % backtrack until x+t*v is feasible (unnecessary for log-sum-exp)
    while ((max(A*(x+t*v)) >= 1) || (max(abs(x+t*v)) >= 1)) % you can erase this search
        t = BETA*t;
    end
    % backtracking line search (Armijo condition)
    while (-sum(log(1-A*(x+t*v))) - sum(log(1-(x+t*v).^2)) > val + ALPHA*t*fprime) % change here
        t = BETA*t;
    end
    x = x + t*v;
end
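
For reference, one possible way to evaluate the objective, gradient, and Hessian at the lines marked "change here" is sketched below. This is only a sketch based on the formulas from part (a), not the required solution; the shift by max(y) is an optional guard against overflow in exp:

y = A*x;
ymax = max(y);
z = exp(y - ymax);            % shifted so that the largest exponent is 0
g = z/sum(z);                 % softmax weights, sum(g) == 1
val = ymax + log(sum(z));     % equals log(sum(exp(A*x)))
grad = A'*g;
hess = A'*(diag(g) - g*g')*A;

The same shifted evaluation is also useful inside the line-search condition.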