The topic for today's talk is the frequency function. Some of you have probably looked at the notes already. What we are going to do is a bit technical, but I'll try to keep the blackboard computations very simple and to convince you how beautiful and nice it is. I learned it myself when I was a PhD student, basically from a very nice article of Garofalo and Lin, and I would never have dreamed of lecturing about it here. It is a very nice and fascinating thing, and I hope you will enjoy it even if you are missing the match right now.

So we start with harmonic functions, usual harmonic functions in R^d. We want to get some feeling for how such a function grows. Fix a point, say the origin — you can move it wherever you want — and look at the average of the function over the sphere of radius r. We will forget about the volume of the sphere in the normalization constant and write it this way:

H(r) = r^(1-d) ∫_{∂B_r} h² dσ.

So H(r) is, up to a constant, the average of h² over the sphere of radius r in R^d, and r^(1-d) is the normalization factor. If your function is a constant, you get a constant. If h is a homogeneous harmonic polynomial of degree n, then averaging over the sphere gives a constant times r to the power 2n; so in this case H(r) = c r^(2n).

What happens for more complicated harmonic functions? If you try to feel what a harmonic function is, look at balls: we know the maximum principle, and we believe that the maximum grows as the radius grows, so you would expect the same for the average; a harmonic function should grow somehow. Let us make a simple computation and differentiate H in r. Think about differentiating the average over the sphere: you can convince yourself rather easily that what you get is the integral of the normal derivative of h².
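To make the r^(2n) growth concrete, here is a small numerical sketch (my own illustration, not from the lecture): it approximates H(r) for the degree-2 harmonic polynomial h(x, y) = x² − y² in the plane by quadrature on the circle, and checks that doubling the radius multiplies H by 2^(2n) = 16.

```python
import numpy as np

def H(r, h, d=2, n_pts=2000):
    """H(r) = r^(1-d) * integral of h^2 over the circle of radius r (here d = 2)."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    x, y = r * np.cos(theta), r * np.sin(theta)
    # arc-length element on the circle is r * dtheta
    return r ** (1 - d) * np.sum(h(x, y) ** 2) * (2.0 * np.pi / n_pts) * r

h = lambda x, y: x ** 2 - y ** 2      # harmonic, homogeneous of degree n = 2
print(H(2.0, h) / H(1.0, h))          # expect 2^(2n) = 16
```

The trapezoidal rule on a periodic integrand is exact for trigonometric polynomials of modest degree, so the ratio comes out as 16 to machine precision.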
If you think of it as r^(1-d) times the integral of the normal derivative of h² and apply Green's formula, you immediately get the integral over the ball of the Laplacian of h², which is 2|∇h|², since h is harmonic:

H'(r) = r^(1-d) ∫_{B_r} Δ(h²) dx = 2 r^(1-d) ∫_{B_r} |∇h|² dx.

You can also do it in a different way, differentiating each factor directly and seeing what happens, but you get the same answer. So the derivative is positive and H is a nice increasing function — but there is much more to the story. I will use the following notation: I(r) will be one half of this derivative, that is,

I(r) = r^(1-d) ∫_{B_r} |∇h|² dx,

so that H'(r) = 2 I(r). Now look at the ratio obtained by taking the logarithmic derivative of H and multiplying by r. Not only is H increasing with positive derivative; you can check that this ratio is itself non-decreasing in r. There are two ways to do it. One is outlined in the lecture notes, for the more general situation of an elliptic equation, and we will come back to it. The other is in the exercises: for a harmonic function you can use the expansion in spherical harmonics and check that the average over spheres satisfies this inequality. It tells you that H is, in a way, log-convex (as a function of log r); once again, we will come back to it. So, starting from a harmonic function and looking at averages over spheres, a straightforward computation — in which, still, some miracle happens, and it is nice to understand it — produces a function with this property. Let me now use this property. I will write

N(r) = r H'(r) / (2 H(r)) = r I(r) / H(r),

and the claim is that N is non-decreasing in r. We call N the frequency function. Why frequency? Once again, if h is a homogeneous polynomial of degree n, then H(r) is a constant times r^(2n); if you look at this ratio you get exactly 2n, and with our notation, where we divide by 2, you get n.
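As a numerical sanity check on the monotonicity claim — again my own sketch, not part of the lecture — take the non-homogeneous harmonic function h(x, y) = x + x² − y², which mixes degrees 1 and 2. Its frequency N(r) = r H'(r) / (2 H(r)) should climb from 1 (the degree-1 part dominates near the origin) towards 2 (the degree-2 part dominates far out):

```python
import numpy as np

def H(r, h, n_pts=4000):
    # spherical average of h^2 in the plane: for d = 2 the factor r^(1-d)
    # and the arc-length element r * dtheta cancel
    theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    return np.mean(h(r * np.cos(theta), r * np.sin(theta)) ** 2) * 2.0 * np.pi

def N(r, h, eps=1e-6):
    """Frequency N(r) = r H'(r) / (2 H(r)), with H' by a central difference."""
    dH = (H(r + eps, h) - H(r - eps, h)) / (2.0 * eps)
    return r * dH / (2.0 * H(r, h))

h = lambda x, y: x + x ** 2 - y ** 2   # harmonic, mixes degrees 1 and 2
for r in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(r, N(r, h))                  # increases from about 1 towards 2
```

For this particular h one can compute by hand that H(r) = π(r² + r⁴) and hence N(r) = (1 + 2r²)/(1 + r²), which is visibly increasing from 1 to 2.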
If h is a homogeneous harmonic polynomial, then N(r) is just a constant, namely the degree of the polynomial — yes, I should write it this way. So for homogeneous polynomials the frequency is a constant function; otherwise it is a non-decreasing one.

The first corollary of the observation that the frequency function is monotone is the following. Consider three concentric spheres with three different radii — denote them r0, r1 and r2, with r0 < r1 < r2 — and compare the average values of our harmonic function over these three spheres. When I take the average, I am all the time thinking of the average of the square of the function, that is, of H. Take first r0 and r1. To go from r0 to r1, I integrate the logarithmic derivative:

log H(r1) − log H(r0) = ∫_{r0}^{r1} (H'(r)/H(r)) dr = ∫_{r0}^{r1} 2 N(r) dr/r,

using the fact that r H'/H is twice the frequency. By the mean value theorem for integrals — I am integrating N against the measure dr/r — this equals 2 N(ρ1)(log r1 − log r0), where ρ1 is just some number between r0 and r1. I can do exactly the same thing between r1 and r2 — and now I have to move back to the first blackboard, and you will try to look at both. Doing the same between r1 and r2, I get the value of the frequency at some point ρ2 between r1 and r2 times log r2 − log r1. Finally, I use the fact that the frequency function is non-decreasing, so N(ρ2) is at least N(ρ1). What we see from this is that the logarithmic ratio between the outer two spheres is at least the one we got from the first two spheres. If you exponentiate both sides, you will easily see that this inequality means exactly the following.
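Spelled out (my reconstruction of the blackboard step, in the notation above):

```latex
\frac{\log H(r_1)-\log H(r_0)}{\log(r_1/r_0)} \;=\; 2N(\rho_1)
\;\le\; 2N(\rho_2) \;=\;
\frac{\log H(r_2)-\log H(r_1)}{\log(r_2/r_1)} .
```

This says precisely that log H is a convex function of log r on these three points; rearranging and exponentiating gives the bound for H(r1) in terms of H(r0) and H(r2).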
I can bound H on the middle sphere by H on the smaller one times H on the larger one, with the right exponents:

H(r1) ≤ H(r0)^α H(r2)^(1−α), where α = log(r2/r1) / log(r2/r0),

and the exponents are exactly the ones you need to interpolate between r0 and r2 at r1. So you have a very beautiful, precise inequality for all harmonic functions. If you remember your complex analysis course, you probably realize immediately that it looks very much like Hadamard's three-circle theorem, which tells you that if you have an analytic function in the complex plane, or in part of it, and you look at the maximum of its modulus on the circle of radius r, then you have exactly this inequality. But if you try to remember how it was proved — the way I used to think about this inequality — it is a consequence of the fact that the logarithm of the modulus of an analytic function is subharmonic; this is what allows you to get multiplicative estimates for the maximum of the function. If you work between analytic functions and harmonic functions in higher dimensions, you know how frustrating it is that in complex analysis you can always use this trick: you have an analytic function, or the gradient of a harmonic function, you know that the log of its modulus is subharmonic, and it gives you many estimates. Here we have an estimate that looks exactly like that, except for L2 norms rather than maximum norms, and it holds in any dimension, without any subharmonicity of the log of the kind we use in complex analysis. So I think this is already the first miracle: you have this very nice and precise inequality for harmonic functions in higher dimensions. It turns out that much more is going on. For harmonic functions you could still say: well, they are real-analytic, in a sense, so they should have properties close to those of analytic functions. But we will see that this is not about real-analyticity, it is about ellipticity: a very similar result holds for all elliptic equations.
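A quick numerical check of the three-sphere inequality — an illustration of mine, using the same sample harmonic function as before, h = x + x² − y², with the spherical averages computed by quadrature:

```python
import numpy as np

def H(r, h, n_pts=4000):
    theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    return np.mean(h(r * np.cos(theta), r * np.sin(theta)) ** 2) * 2.0 * np.pi

h = lambda x, y: x + x ** 2 - y ** 2
r0, r1, r2 = 0.5, 1.0, 3.0
alpha = np.log(r2 / r1) / np.log(r2 / r0)
lhs = H(r1, h)
rhs = H(r0, h) ** alpha * H(r2, h) ** (1.0 - alpha)
print(lhs <= rhs)   # True: H(r1) <= H(r0)^alpha * H(r2)^(1-alpha)
```

The choice of radii 0.5, 1, 3 is arbitrary; any 0 < r0 < r1 < r2 works.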
Before I do that, I will do one more thing: I will rewrite the three-sphere inequality — that is what this is. For harmonic functions it is very precise and nice, and it tells you that the L2 norm of a harmonic function on a sphere is controlled by those on the inner and outer spheres; if I choose, for example, the radii r, 2r and 4r, the normalization factors are comparable and cancel, and I get this nice inequality. We also know that different norms are equivalent. We have a local boundedness result, not only for harmonic functions but for solutions of elliptic equations, which tells you that the maximum of the function over the ball of radius r is bounded by the L2 norm over the ball of radius 2r — and here I do need the average form, with the factor r^(1−d) (let us write it this way, with a constant times r^(1−d) in front; and the power — I should have written the square, sorry for that). This kind of inequality allows you to go from L2 inequalities to L-infinity inequalities that look like Hadamard's three-circle theorem. So, by comparing these norms, you see that for any three concentric balls like that — I'll call them B0, B1 and B2 — there is some β between 0 and 1, depending on the ratios of the radii, and a constant C, such that for any harmonic function h we have the inequality for the maxima. It is not as beautiful as Hadamard's three-circle theorem, but it holds. I would also like to have a general version of this statement with three domains instead of three balls. So I have my harmonic function h in some domain Ω, a small ball inside, and some compact set that you can think of as sitting in between. It is not important that it contains the ball; what is important is that it is compact in Ω.
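In symbols (my transcription of the two estimates being combined — the local boundedness estimate and the resulting max-norm three-ball inequality):

```latex
\sup_{B_r}|h|^2 \;\le\; \frac{C}{r^{d}}\int_{B_{2r}} h^2\,dx,
\qquad\text{hence}\qquad
\max_{B_1}|h| \;\le\; C\,\bigl(\max_{B_0}|h|\bigr)^{\beta}\,\bigl(\max_{B_2}|h|\bigr)^{1-\beta},
\quad \beta\in(0,1).
```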
Then, if you use three balls many times — going from this ball to a slightly larger one, then taking part of it, going to a larger one again — you can reach any point of your compact set from the ball inside by chains of balls, and you need a finite number of steps to do it. This number of steps does not depend on the function. You iterate the estimate, and you can see that there is a three-domain version of the theorem: the maximum over the compact set of our function h is bounded by a constant times the maximum over the ball to some power, say γ — once again, γ is between 0 and 1 — times the maximum over the whole of Ω to the power 1 − γ. The constants are terrible, but they do not depend on the function. And all of that is a consequence of the fact that the frequency function is monotone.

Now we go from harmonic functions to solutions of elliptic equations and see that a lot of this is still true there. So, once again, we look at solutions of second-order uniformly elliptic equations in divergence form, div(A∇u) = 0, and we will definitely need uniform ellipticity. We do the following trick: we look at one point and renormalize the coefficients so that at this point, say the origin, A is the identity matrix — we can change coordinates — and then, when you define your balls, the balls correspond to these new variables. We also define the function

μ(x) = ⟨A(x) x, x⟩ / |x|²,

which is some weight; remember the equation is uniformly elliptic, so the weight is controlled above and below by the two ellipticity constants. Define H(r) in a similar way: now it is

H(r) = r^(1−d) ∫_{∂B_r} μ(x) u² dσ,

with the surface measure. For harmonic functions the weight is 1, and we get back the average we were working with before. If you differentiate this and look at the main term, you will see that the main term is the integral over the whole ball of the gradient squared — but now taken with the matrix A, that is, ∫_{B_r} ⟨A∇u, ∇u⟩ dx.
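The bookkeeping of the chain-of-balls iteration can be sketched as follows (my own sketch; θ is the three-ball exponent, C the constant, and M the global bound over Ω — all hypothetical inputs). Each step applies the estimate m ≤ C m^θ M^(1−θ) to the current bound m, so after k steps the starting ball enters with exponent θ^k:

```python
def chain_bound(m0, M, C, theta, steps):
    """Iterate the three-ball estimate m_{i+1} <= C * m_i**theta * M**(1-theta)."""
    m = m0
    for _ in range(steps):
        m = C * m ** theta * M ** (1.0 - theta)
    return m

# closed form after k steps: C^(1+theta+...+theta^(k-1)) * m0^(theta^k) * M^(1-theta^k)
theta, C, k = 0.5, 10.0, 3
m0, M = 1e-8, 1.0
closed_form = C ** sum(theta ** i for i in range(k)) * m0 ** (theta ** k) * M ** (1 - theta ** k)
print(chain_bound(m0, M, C, theta, k), closed_form)   # the two agree
```

This makes visible both halves of the remark in the lecture: the final exponent γ = θ^k stays in (0, 1) but shrinks geometrically with the length of the chain, and the accumulated constant C^(1+θ+…) is "terrible" but independent of the function.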
And as before, we define the frequency from these quantities, depending on the function u and the matrix A — yes, sorry for that, with I on top and H on the bottom. The theorem, due to Nicola Garofalo and Fang-Hua Lin, says that this function is almost as good as it was before: if you just make a small correction and multiply it by an exponential factor, it becomes non-decreasing on some interval (0, r0). So I have two constants here, C and r0; they depend on the constants of the operator and on the dimension of the space, and by the constants of the operator I mean the ellipticity constant and the Lipschitz constant of the coefficients. You still have this phenomenon: the same thing happens for elliptic equations. And you see that if you just compute the derivative, it is not positive anymore, but you can control it in a nice way, and that tells you that after multiplication by the exponential factor you get a non-decreasing function. I will not do the computation here — it is done in the lecture notes — but this is enough to prove the three-ball theorem, and from there you go to the maximum version, using local boundedness estimates, and to the version with three domains as well. It is a very nice approach that immediately gives you a strong unique continuation result from the monotonicity of the frequency function. One can say more than that: one can show that u² is a nice Muckenhoupt weight. I am not going to go there right now either. But what I want to announce at this point is that our aim in the last lecture will be to prove this kind of inequality with the small ball replaced by a measurable set — to go from a ball to a measurable set there. This will be our aim at the end. But for the moment, let us enjoy this nice frequency function and see what it tells us. So we take this away. One more remark about the frequency, if you look at this function: first, let us think about this ratio.
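Written out (my reconstruction of the statement in the notation above):

```latex
N(r)=\frac{r\,I(r)}{H(r)},\qquad
r \;\longmapsto\; e^{Cr}\,N(r)\ \ \text{is non-decreasing on }(0,r_0),
```

with C and r0 depending only on the dimension d, the ellipticity constant, and the Lipschitz constant of A.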
Let us take r very, very small and see what happens near the point. We already know that this function is almost monotone in r. When you go to very small r, and everything is smooth, your function looks like the first non-vanishing term of its Taylor expansion at the point. So H(r) behaves like r to the power 2k, where k is the vanishing order of your function there, and when you compute the frequency — dividing by 2 as before — you get exactly k, the vanishing order of the function at this point. So by controlling the frequency of the function, we also control its vanishing order. And if you believe in the fact that the frequency is monotone, up to this correction, then from global information on your function — knowing how it grows on large scales — you get some control on the small scales. You can go back towards 0 and see that if the frequency is bounded by something on a large scale, then the vanishing order is bounded on the small scale.

There is one thing that I want to prove now, connecting the frequency to eigenfunctions. But before that, I will change my notation a little. There is another way to measure the local growth of a function. We had the frequency; I will call the other quantity the doubling index, and try to convince you that they are almost the same thing. So, given a function u and a ball B, the doubling index of u at the ball B is obtained by taking the logarithm of the maximum of |u| over the double ball divided by the maximum over the ball:

N(u, B) = log ( sup_{2B} |u| / sup_B |u| ).
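For a homogeneous example the doubling index is immediate (my illustration; I take logarithm base 2 here so that the index of a degree-n homogeneous polynomial comes out exactly n, matching the frequency):

```python
import numpy as np

def doubling_index(h, r, n_pts=2000):
    """log2( sup_{B_2r} |h| / sup_{B_r} |h| ); for homogeneous h the sup lives on the boundary circle."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    sup = lambda rho: np.max(np.abs(h(rho * np.cos(theta), rho * np.sin(theta))))
    return np.log2(sup(2.0 * r) / sup(r))

h = lambda x, y: x ** 2 - y ** 2     # homogeneous harmonic of degree 2
print(doubling_index(h, 1.0))        # 2.0: doubling index = degree = frequency
```

Whether one divides by log 2 or not is a matter of normalization convention; the lecture's definition keeps the bare logarithm.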