The solution for Q1 is in Stephen Boyd's book Convex Optimization.
This is the course blog for Theory courses at ITU, Lahore. The current course is Statistical and Mathematical Methods for Data Science.
In Q2, page #3, on the 3rd update I think it should be
x := 1.26 - 0.2 * 0.52
= 1.156
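The update above can be checked with a minimal sketch of the gradient-descent step; the step size 0.2 and gradient value 0.52 are taken from this comment, not independently verified against the assignment:

```python
# Gradient-descent update: x := x - alpha * f'(x)
# Values below are from the comment above (assumed, not from the assignment sheet).
x = 1.26          # current iterate
alpha = 0.2       # step size (learning rate)
grad = 0.52       # f'(x) evaluated at the current iterate
x = x - alpha * grad
print(round(x, 3))  # 1.156
```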
YES!
In question 3, how do you know if the function is convex or concave?
Do we assume it's concave, by comparison with a Gaussian? Is this what a multivariate Gaussian looks like?
We do not NEED to know whether the function is convex or concave to perform gradient descent (GD). The information is given for your understanding. You can plot it to see!
You used x = x - a*f'(x) in question 2 and x = x + a*f'(x) in question 3. I am assuming the negative sign is for gradient descent and the positive sign is for gradient ascent. Knowing whether the function is convex or concave will help in picking the formula.
Correct! Since the objective function is concave, the goal is to maximize it, and the update equation is adjusted accordingly.
You can also solve it by converting it to a convex function (negating it) and then using the update equation given before.
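As a hypothetical 1-D illustration of the two signs (the function below is made up for illustration and is not the assignment's objective): maximizing the concave f(x) = -(x - 3)^2 by gradient ascent gives exactly the same iterates as minimizing its negation g(x) = (x - 3)^2 by gradient descent:

```python
# f(x) = -(x - 3)^2 is concave, with maximizer x* = 3 (a made-up example).
def fprime(x):
    return -2.0 * (x - 3.0)  # f'(x)

alpha = 0.1
x_asc = 0.0   # gradient ASCENT on f:      x := x + alpha * f'(x)
x_desc = 0.0  # gradient DESCENT on g = -f: x := x - alpha * g'(x)
for _ in range(200):
    x_asc = x_asc + alpha * fprime(x_asc)
    x_desc = x_desc - alpha * (-fprime(x_desc))  # g'(x) = -f'(x)

# Both sequences converge to the maximizer x* = 3.
print(x_asc, x_desc)
```

The two updates are algebraically identical, which is why negating a concave objective lets you reuse the ordinary descent formula.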
Do specific properties of matrix A play any role in determining concavity and convexity? Or is the general form of the objective function sufficient in deciding?
Yes, the nature of A will affect whether it is a convex or a concave function. Based on that, you will then decide whether you should maximize or minimize that function.
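For a quadratic objective of the form x^T A x, one standard check is the sign of the eigenvalues of the symmetric part of A: all nonnegative means convex (positive semidefinite A), all nonpositive means concave, and mixed signs means neither. A sketch, with made-up example matrices:

```python
import numpy as np

def classify_quadratic(A):
    """Classify f(x) = x^T A x as convex / concave / neither via eigenvalues."""
    # Only the symmetric part of A contributes to the quadratic form.
    eig = np.linalg.eigvalsh((A + A.T) / 2)
    if np.all(eig >= 0):
        return "convex"    # A positive semidefinite
    if np.all(eig <= 0):
        return "concave"   # A negative semidefinite
    return "neither"       # indefinite A

print(classify_quadratic(np.array([[2.0, 0.0], [0.0, 1.0]])))    # convex
print(classify_quadratic(np.array([[-2.0, 0.0], [0.0, -1.0]])))  # concave
print(classify_quadratic(np.array([[1.0, 0.0], [0.0, -1.0]])))   # neither
```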
Can you please share any resource for that?
on how to tell concavity/convexity from the matrix A.