Consider the problem of minimizing $f$ and $g$ using the Newton method (NM) with the step size fixed at 1.

(a) $f(x) = a + b^T x + \tfrac{1}{2} x^T C x$, where $a \in \mathbb{R}$, $b \in \mathbb{R}^n$, and $C \succ 0$.
(i) What is the minimizer $x^*$ of $f$? Is it unique?
(ii) Show that NM converges to $x^*$ in just one iteration from any initialization $x^0 \in \mathbb{R}^n$.

(b) $g(x) = x/a - \log(x)$, for $a, x > 0$, where $\log$ denotes the natural logarithm.
(i) What is the minimizer $x^*$ of $g$? Is it unique?
(ii) What are the values of the first 3 iterates generated by NM for $a = 1$ and $x^0 = 1/2$?
(iii) What are the values of the first 3 iterates generated by NM for $a = 1$ and $x^0 = 10$?
(iv) Find an interval $I \subset \mathbb{R}$ such that for $a > 0$ and $x^0 \in I$, NM monotonically converges to $x^*$.
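For part (b), the Newton update can be written in closed form: from $g'(x) = 1/a - 1/x$ and $g''(x) = 1/x^2$, the step $x_{k+1} = x_k - g'(x_k)/g''(x_k)$ simplifies to $x_{k+1} = 2x_k - x_k^2/a$. The short Python sketch below is only an illustrative aid for checking the iterates asked for in (ii) and (iii); the helper name `newton_1d` and the printed values are not part of the original problem.

```python
def newton_1d(x0, a, num_iters=3):
    """Newton's method with step size 1 on g(x) = x/a - log(x), x > 0.

    Since g'(x) = 1/a - 1/x and g''(x) = 1/x**2, the Newton update
    x_{k+1} = x_k - g'(x_k)/g''(x_k) reduces to 2*x_k - x_k**2 / a.
    Returns the list [x0, x1, ..., x_{num_iters}].
    """
    xs = [x0]
    for _ in range(num_iters):
        xs.append(2 * xs[-1] - xs[-1] ** 2 / a)
    return xs

# Illustrative runs with a = 1 (so the stationary point of g is at x = a = 1):
print(newton_1d(0.5, 1.0))   # [0.5, 0.75, 0.9375, 0.99609375] -- climbs toward 1
print(newton_1d(10.0, 1.0))  # [10.0, -80.0, -6560.0, -43046720.0] -- blows up
```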
