So last time we were looking at properties of vector norms, and the punch line was that given two norms α and β, we can find constants c_m and c_M such that the β norm of x is sandwiched between c_m times the α norm of x and c_M times the α norm of x, for every x in R^n. The key part is that this bound is valid regardless of which x you choose in R^n; the constants c_m and c_M may depend on n, the dimension of the space over which you want the bounds, but not on x.

The consequence is that over finite-dimensional vector spaces all norms are equivalent, in the sense that whenever x_k converges to x with respect to, say, the α norm, it converges to the same x with respect to any other norm. In particular, every norm is equivalent to the infinity norm, which is the maximum absolute value of the entries. That in turn means that the limit of x_k as k goes to infinity is x with respect to any norm if and only if the i-th entry of x_k converges to x_i for i = 1, 2, up to n. In other words, component-wise convergence is equivalent to convergence with respect to any norm. And note it is the same x: if the sequence converges to x with respect to one norm, it converges to that same x with respect to any other norm.

So this is what we saw in the previous class. Okay, so today we will proceed with a little bit more discussion about norms.
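As a concrete illustration (a standard example, not taken from the lecture itself), here is the equivalence written out for the infinity norm and the 1-norm; the constants c_m = 1 and c_M = n show explicitly how c_M can depend on the dimension n:

\[
\|x\|_\infty \;\le\; \|x\|_1 \;=\; \sum_{i=1}^{n} |x_i| \;\le\; n \max_{1 \le i \le n} |x_i| \;=\; n\,\|x\|_\infty,
\qquad \text{i.e. } c_m = 1,\quad c_M = n,
\]

valid for every x in R^n. Here the upper constant grows with n, which is exactly the dimension dependence mentioned above. The same sandwich also explains the convergence statement: if x_k converges to x in any norm, the equivalence with the infinity norm forces max_i |(x_k)_i − x_i| to go to zero, and hence each component converges, and conversely.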