Hi, and welcome to this next lecture on data structures and algorithms. In this session we will discuss another algorithm for finding the roots of an equation. Recall that we are looking for roots of f(x) = 0. We discussed the bisection method, which halves the search range at every iteration. Bisection requires you to specify an interval [a, b] in which you expect the root to lie. The Newton-Raphson method, which we will discuss today, does not require an interval: it starts with a point estimate rather than an interval estimate. So, compared to the bisection method, it requires only one initial guess of the root. This is an advantage, and when the method converges it is typically faster than bisection. That is because the reduction of the search range is not a fixed, pre-decided factor of one half; it varies with the shape and curvature of the function. So, in practice the rate, meaning how quickly the successive estimates close in on the root, depends on the curvature of the function.

More specifically, given a function, we look at its curvature through its derivative: f'(x_i) is the rate of change, increase or decrease, of f at x_i. We will canonically speak of the rate of change as a rate of increase; if f'(x_i) happens to be negative, we interpret it as a decrease. So what we do is look at f'(x_i) and follow the tangent in the direction of decrease: we keep moving along the tangent until we reach its x-intercept. This x-intercept is our next approximation; x_{i+1} is the x-intercept of the tangent determined by f'(x_i). Let theta be the angle this tangent makes with the x-axis. Then tan(theta) is the height f(x_i) - 0 divided by the horizontal distance x_i - x_{i+1}.
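The geometry above can be sketched in a few lines of code. This is a minimal illustration, not from the lecture: the function f(x) = x^2 - 2 (with root sqrt(2)) and the starting guess 1.5 are assumed for the example.

```python
# One Newton step read off from tan(theta) = f(x_i) / (x_i - x_{i+1}).
# The function f(x) = x^2 - 2 is an assumed example; its positive root is sqrt(2).

def f(x):
    return x * x - 2.0

def f_prime(x):
    return 2.0 * x          # closed-form derivative of x^2 - 2

x_i = 1.5                   # current guess (assumed for illustration)
slope = f_prime(x_i)        # tan(theta): slope of the tangent at x_i
x_next = x_i - f(x_i) / slope   # x-intercept of that tangent

print(x_next)               # one step already moves closer to sqrt(2) ~ 1.41421
```

A single step takes the guess from 1.5 to 17/12 ≈ 1.4167, noticeably closer to the true root.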
So, if x_i is the current guess of the root, the idea is to find an improved estimate x_{i+1} as the x-intercept of the tangent to the curve at (x_i, f(x_i)). The slope f'(x_i), which is tan(theta), equals (f(x_i) - 0) / (x_i - x_{i+1}); the 0 here is the y-coordinate of the x-intercept x_{i+1}. So f'(x_i) = f(x_i) / (x_i - x_{i+1}), and rearranging gives x_{i+1} = x_i - f(x_i) / f'(x_i): the next estimate is x_i minus the ratio of the function to its slope at x_i. The idea is to keep repeating this process until you get within some desired tolerance of the root.

As pointed out earlier, you move based on the rate of change of the function. The function f could have a small rate of change, which means small values of f'; and because f' sits in the denominator, its reciprocal will be large, leading to large steps. On the other hand, the function could have a very large rate of change, in which case f' is large, its reciprocal is small, and the steps are small. This is actually quite intuitive: if the function is changing very slowly, you might want to move fast; if it is changing too rapidly, you might want to move very slowly. Basically, you do not want to miss the root in the second case, and you do not want to take too long to get to the root in the first case.

So, here is the overall algorithm. You fix a certain maximum number of iterations and a certain tolerance limit. You keep computing the updates x_{i+1} until |f(x_i)| falls within the tolerance. At every step you also evaluate f'(x_i), and you stop if f'(x_i) becomes very small: a negligible rate of change means the tangent is nearly horizontal, and dividing by a value close to 0 would make the next step unreliable. What are some issues? Well, the first problem is that you need a closed-form expression for the derivative, and finding the derivative of a function programmatically is challenging.
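The algorithm just described can be sketched as follows. The parameter names (tol, max_iter, eps) and the guard threshold are assumptions for this sketch, not values fixed by the lecture.

```python
def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50, eps=1e-12):
    """Return an approximate root of f, or None if the iteration fails.

    Stops early when |f'(x)| drops below eps, since the tangent is then
    nearly horizontal and the update would divide by (almost) zero.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:       # within tolerance of a root: done
            return x
        dfx = f_prime(x)
        if abs(dfx) < eps:      # negligible rate of change: bail out
            return None
        x = x - fx / dfx        # x_{i+1} = x_i - f(x_i) / f'(x_i)
    return None                 # did not converge within max_iter

# Assumed example: the positive root of f(x) = x^2 - 2 is sqrt(2).
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)                     # ~ 1.41421356...
```

Starting from x0 = 1.0, the iterates 1.5, 1.4167, 1.41422, ... converge to sqrt(2) in a handful of steps, illustrating the fast convergence mentioned above.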
So, if the function f has a known form, you could have a closed-form expression for the derivative. If you instead need to compute the derivative programmatically, you would also end up with certain numerical errors. There are also some classical cases in which the method fails to converge to the root.

The first example is overshooting. Suppose your x_i is near a root, but the slope at x_i is very small. Then the x-intercept of the tangent can land so far away that x_{i+1} ends up on the other side of the root, possibly in a region where the function has a very different shape, and the iteration wanders off.

There are also other possibilities. Your f'(x_i) itself might become very small, in which case you end up dividing by 0; this division-by-zero failure occurs whenever the derivative at x_i is close to 0. You can also have a wrong initial estimate: if x_i is placed badly, the tangent steps move you in a direction that never gets you to the root. And finally, the method might oscillate: even with an x_i that is reasonably initialized, it is possible to bounce back and forth between two iterates indefinitely. Thank you.
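The oscillation failure can be demonstrated concretely. The function below is a classic textbook example assumed for illustration, not one given in the lecture: for f(x) = x^3 - 2x + 2 with starting guess x0 = 0, the iterates bounce between 0 and 1 forever and never reach the actual root near -1.77.

```python
# Oscillation example (assumed): f(x) = x^3 - 2x + 2, starting at x0 = 0.
# Step from 0: f(0) = 2, f'(0) = -2, so x1 = 0 - 2 / (-2) = 1.
# Step from 1: f(1) = 1, f'(1) = 1,  so x2 = 1 - 1 / 1   = 0.  And so on.

def f(x):
    return x ** 3 - 2.0 * x + 2.0

def f_prime(x):
    return 3.0 * x ** 2 - 2.0

x = 0.0
iterates = [x]
for _ in range(6):
    x = x - f(x) / f_prime(x)   # the usual Newton update
    iterates.append(x)

print(iterates)   # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0] -- never converges
```

A tolerance check alone would loop forever here, which is exactly why the algorithm also caps the number of iterations.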