OK, so we discussed compactness; now let us discuss some other notions of compactness and compare. Definition: a topological space X is limit point compact if every infinite subset of X has a limit point. Recall what a limit point is: each of its neighborhoods intersects the set in a point different from the limit point itself. That does not yet sound much like "limit": if the space is Hausdorff, every neighborhood of a limit point in fact meets the set in infinitely many points, but we still do not get a sequence converging to it; for that you need something like first countability. The second notion: X is sequentially compact if every sequence in X has a convergent subsequence. This one is more concrete for us for the moment, but it depends heavily on sequences, and sequences are not good for all spaces; if the space is first countable, then maybe yes, but otherwise sequences are too restrictive. So, the first proposition: if X is compact, then X is limit point compact. Proof. Limit point compact means every infinite subset has a limit point. So let A be any subset of X which has no limit point; what we want to prove is that A is finite. (Equivalently: if A is infinite, it has a limit point.) As you recall, the closure is A-bar = A ∪ A', where A' is the set of limit points of A. Since A has no limit point, A' is empty, so A-bar = A; that is, A is closed. Now A is a closed subset of X, and X is compact; a closed subset of a compact space is compact, and no Hausdorff assumption is needed for that. So A is also compact.
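The definition of a limit point can be illustrated numerically in a metric space. The following Python sketch (my own illustration, not from the lecture) tests the set {1/n : n ≥ 1} inside R with the usual metric: 0 is a limit point, while 1 is not, since only finitely many radii can be tested, this illustrates rather than proves anything.

```python
# Numerical sketch (not a proof): limit points in the metric space R.
# A point p is a limit point of a set A if every ball around p
# meets A in a point different from p. We test shrinking radii.

def meets_in_other_point(p, radius, points):
    """Does the open ball B(p, radius) contain a point of `points` != p?"""
    return any(abs(x - p) < radius and x != p for x in points)

A = [1.0 / n for n in range(1, 10_000)]   # the infinite set {1/n}, truncated

# 0 is a limit point of {1/n}: every tested ball around 0 meets A \ {0}.
assert all(meets_in_other_point(0.0, 10.0**(-k), A) for k in range(1, 4))

# 1 is NOT a limit point of {1/n}: a small enough ball meets A only in 1.
assert not meets_in_other_point(1.0, 0.4, A)
```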
Now I construct a covering of A. Take any point a in A. Then a is not a limit point of A, because A has no limit points at all. What does "not a limit point" mean? A limit point is one where every neighborhood intersects the set in a different point; so "not a limit point" means there is a neighborhood which intersects the set only in the point itself, with no other point. Hence there is a neighborhood U_a of a such that U_a ∩ A = {a}. Now take all these neighborhoods {U_a : a in A}, choosing one for each point. This is clearly an open covering of A by sets open in the whole space X. And A is compact, which was the first observation, so finitely many suffice: A is contained in U_{a_1} ∪ … ∪ U_{a_n}. But each U_{a_i} intersects A only in a_i, so A is contained in {a_1, …, a_n}. This means A is finite, and that is it: if A is infinite, it has a limit point, so X compact implies X limit point compact. The two notions are not equivalent; compact is strictly stronger than limit point compact. However, a reasonable counterexample is not so easy to find. Without the Hausdorff condition, you can construct something very artificial which is limit point compact but not compact; that is easy in some sense, and there is such an exercise or example in the book, but it is not very interesting.
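The covering step of the proof can be sketched in a familiar metric space. This is only an illustration under assumptions of my own choosing: A = Z (truncated) inside R has no limit point, and each integer a gets a neighborhood U_a meeting A only in a itself; since R is not compact, infinitely many of these U_a are genuinely needed and no contradiction arises.

```python
# Sketch of the covering step of the proof, in the metric space R.
# A = Z has no limit point in R, and each a in A has the neighborhood
# U_a = (a - 1/2, a + 1/2), which meets A only in a itself. R is not
# compact, so the finiteness argument does not apply here.

A = list(range(-50, 51))  # finite truncation of Z

def in_U(a, x, radius=0.5):
    """Is x in the neighborhood U_a = (a - radius, a + radius)?"""
    return abs(x - a) < radius

# Each U_a intersects A exactly in {a}, as in the proof.
for a in A:
    hits = [x for x in A if in_U(a, x)]
    assert hits == [a]
```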
For Hausdorff spaces, though, it is not so clear how to construct such an example; the non-Hausdorff case is not so interesting anyway. We will see one: it is the last of the main examples of the book. The important examples of the book so far are R_l, the lower limit topology on R, then the ordered square, and of course R^omega (or R^J) with its different topologies: product, box, uniform. The last example is S_Omega, which I will not discuss now, because I want to go on with the theory first; then I come to this last important example. S_Omega is a Hausdorff space which is limit point compact but not compact. In some sense it is a natural example; it comes more from logic, maybe, but it is a very natural construction. [A student asks whether the compactness of A is really needed, and suggests a shorter proof: since A has no limit points, every x in X has a neighborhood which intersects A in at most the point x itself, possibly in nothing. These neighborhoods form an open covering of the whole space X, which is compact, so finitely many of them cover X; each of them meets A in at most one point, so A is finite again.] Yes, that is right, and maybe shorter. He wants to use only that X is compact, not that A is compact, taking such a neighborhood for every point of the space; then finitely many of these neighborhoods cover X, each intersecting A either in the empty set or in a single point, so A is finite again. In this way he avoids the fact that A is compact (anyway, A is compact, as we showed). A nice observation. Well, the example S_Omega we discuss later.
Now a little lemma, which we will need for the Lebesgue number lemma. If X is metric and compact, then X is sequentially compact; and in fact metric and limit point compact would already be enough, since compact implies limit point compact anyway. So: if X is metric (that is the important hypothesis) and compact, or just limit point compact, then it is sequentially compact. Metric implies first countable, and in first countable spaces sequences are reasonable; in spaces that are not first countable, sequences are not sufficient in general, and you need something more general, filters or nets, which we will not do. I will sketch the proof; it makes a good exercise. The metric hypothesis matters; otherwise such things are not true. Proof. Let (x_n) be any sequence in X; we have to construct a convergent subsequence. There are two cases, according to the set S = {x_n : n in N}. I should give it a name: S is the set underlying the sequence, not the sequence itself. This set may be finite or infinite. If it is finite, then one element must occur for infinitely many n, so there is a constant subsequence, and it converges; that is the trivial case. Otherwise, we can assume that S is infinite.
(The set S might even be a single point, if the sequence is constant; the sequence and its underlying set are different things.) So assume S is infinite. Now X is limit point compact; it is reasonable to use only limit point compactness here, which is the better statement. Limit point compact means every infinite subset has a limit point, so S has a limit point, which I call x_0, in X. Now we want to construct a subsequence converging to this point x_0, and at a certain point we must use that the space is metric; otherwise it is not true. So I show you how to construct it. Take the ball B_d(x_0, 1) around x_0, with radius 1 to start, and intersect it with S. Since x_0 is a limit point of S, this ball is a neighborhood of x_0, so the intersection is non-empty, and we even find a point in it different from x_0. I call this point x_{n_1}; I do not yet control the index, but it is some element of the sequence lying in B_d(x_0, 1) ∩ S. Next take B_d(x_0, 1/2), and in general the balls B_d(x_0, 1/n); here first countability enters. Each intersection B_d(x_0, 1/n) ∩ S is non-empty, but non-empty is not quite sufficient here: we might keep finding the same point, and that is no good. However, we know more: in a metric space, this intersection is not merely non-empty, it is infinite.
In fact Hausdorff is enough for that step: in a Hausdorff space, a neighborhood of a limit point of S meets S in infinitely many points. So since B_d(x_0, 1/2) ∩ S is infinite, I can choose a point x_{n_2} in it with n_2 > n_1; I have to choose n_2 bigger than n_1 because I want a subsequence, not to go backwards in the sequence, and since the intersection is infinite I can go past n_1 and still find a point. And then you see the construction: x_{n_3} in B_d(x_0, 1/3) ∩ S, which again is infinite, with n_3 > n_2 > n_1, and so on. In this way you construct a subsequence (x_{n_i}) which obviously converges to x_0 as i goes to infinity: the neighborhoods get smaller and smaller. So we get the convergent subsequence. Note that Hausdorff alone is not sufficient here: Hausdorff gives the infiniteness, but you do not know which neighborhoods to take to force convergence. Hausdorff together with first countable would be fine: then you get a decreasing sequence of basic neighborhoods, each containing infinitely many terms, and every neighborhood of x_0 contains one of them. But a Hausdorff space need not be first countable, and then sequences are bad; so we really need metric, or at least Hausdorff plus first countable. That is the proof. I have sometimes given it as an exercise in tests, because there are these two cases: one is trivial, and in the other you have to use the infiniteness of the intersections.
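The index-picking argument just described can be mimicked numerically. A hedged sketch of my own, assuming the limit point x_0 is already known (here x_0 = 0 for the sequence x_n = 1/(n+1) in the compact metric space [0, 1]); the helper name is invented for the illustration.

```python
# Sketch of the subsequence construction: given a limit point x0 of
# the underlying set, greedily pick indices n_1 < n_2 < ... with
# x_{n_i} in the ball B(x0, 1/i). The infiniteness of each
# intersection guarantees (in the real proof) that the search succeeds.

def convergent_subsequence_indices(seq, x0, steps):
    """Choose n_1 < n_2 < ... with d(seq[n_i], x0) < 1/i."""
    indices, last = [], -1
    for i in range(1, steps + 1):
        n = next(n for n in range(last + 1, len(seq))
                 if abs(seq[n] - x0) < 1.0 / i)
        indices.append(n)
        last = n
    return indices

seq = [1.0 / (n + 1) for n in range(10_000)]  # x_n = 1/(n+1), limit point 0
idx = convergent_subsequence_indices(seq, 0.0, steps=20)

assert idx == sorted(idx) and len(set(idx)) == len(idx)  # strictly increasing
assert abs(seq[idx[-1]] - 0.0) < 1.0 / 20                # within 1/i of x0
```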
If the space is not Hausdorff, the intersection might be finite, and then there is a problem: you cannot always choose the next index bigger. You might get one point, then have to go back for the other point, jumping around in the sequence; but that is not a subsequence, since a subsequence must go in one direction. Anyway, it is sometimes interesting to analyze what you really need here; metric is certainly enough, and that is an exercise. And now the theorem, which is also called a lemma: the Lebesgue number lemma. This is a nice result with various applications. What does it say? Let A be an open covering of the compact metric space X; both compact and metric matter. Then there exists a number delta > 0, called a Lebesgue number associated to the open covering A (it is not unique), such that every subset of X with diameter less than delta is contained in at least one A in A, that is, in at least one set of the covering. (Of course, since X is compact, the covering has a finite subcovering, but that is not what we are interested in here.) What is the diameter of a subset? Yes, exactly: the least upper bound, the supremum, of the distances between pairs of its points. It might be infinite. The diameter of a disc, for instance, is the usual diameter, twice the radius. So that is the Lebesgue number lemma. For the proof, however, we need something a little bit stronger.
We prove the theorem with a weaker hypothesis: sequentially compact instead of compact (not limit point compact; sequentially compact). The standard Lebesgue number lemma assumes compact, because compact is the more important notion; but in one application we will not know that the space is compact, only that it is sequentially compact. So we prove the theorem under the weaker hypothesis that X is metric (metric we need anyway) and only sequentially compact. So: A is a given open covering. Suppose, for contradiction, that no such delta > 0 exists; we prove by contradiction. What does it mean that we cannot find a delta? For each delta > 0 there is a subset, call it A_delta, of X whose diameter is less than delta, but which is not contained in any A in A; no delta is good. That is exactly the negation of the statement: for each delta we find a bad set A_delta, of small diameter, but contained in no set of the covering. Now consider delta = 1/n, so we have the sets A_{1/n}; we consider only these. From each one choose a point: let x_n be a point of A_{1/n}, for each n. So we have a sequence.
So call it the sequence (x_n). Now X is sequentially compact, only sequentially compact, so the sequence (x_n) has a convergent subsequence; call it (x_{n_i}), which converges to some point x_0 in X as i goes to infinity. And now we get a contradiction; let me first make a picture, and then you see the contradiction immediately. Consider the limit point: since A is an open covering, x_0 lies in some set A of the covering. So we have x_0, and around it this open set A. Where is the contradiction? We have the subsequence (x_{n_i}) converging to x_0. But each of these points lies in one of the bad sets: x_{n_i} is in A_{1/n_i}, whose diameter is less than 1/n_i. Now, we are in a metric space, and A is open, so there is some ball around x_0 contained in A. And what happens? The subsequence comes arbitrarily close to x_0, and the diameters of the bad sets become arbitrarily small. So if a point of the subsequence is very close to x_0 and its bad set has arbitrarily small diameter, then obviously the whole bad set is inside the ball, hence inside A. That is the contradiction: it cannot happen, because the bad set was contained in no set of the covering. So now I will write it out formally, but here you already see it: you have the sequence, a convergent subsequence, and these shrinking sets.
The diameters become arbitrarily small, and the subsequence gets arbitrarily close to the limit; that means that eventually the sets A_{1/n_i} along the subsequence lie inside this A. So now I write a formal proof; from the picture you see what happens: everything becomes smaller and smaller, and finally we cannot avoid the conclusion. First, as I said, x_0 is in A, so there is a ball B_d(x_0, epsilon) contained in A, for some epsilon > 0. Now choose n_i such that the distance between x_{n_i} (our subsequence) and x_0 (the limit) is smaller than epsilon/2; this is possible since the subsequence converges to x_0. Choose n_i large enough that also 1/n_i < epsilon/2; then you have both. And now the claim is: take any point y in A_{1/n_i}. Then d(y, x_0) ≤ d(y, x_{n_i}) + d(x_{n_i}, x_0) < epsilon/2 + epsilon/2 = epsilon. Here the first term is smaller than epsilon/2 because both points y and x_{n_i} lie in the set A_{1/n_i}, whose diameter is less than 1/n_i < epsilon/2; and the second term is smaller than epsilon/2 by the choice of n_i. So y is in the ball B_d(x_0, epsilon), which is contained in A.
But y was an arbitrary point of A_{1/n_i}, so this shows that A_{1/n_i} is contained in A. And that is a contradiction, because A_{1/n_i} was a bad set, not contained in any set of the covering. You see from the picture what happens: the diameters get smaller and smaller, and finally you are inside; no way to avoid that. So that is the Lebesgue number lemma. We will see various applications; the first one is the following nice theorem. Theorem: let X be metrizable (or metric, that is the same). Then the following are equivalent: (i) X is compact; (ii) X is limit point compact; (iii) X is sequentially compact. So for metrizable spaces, the three notions of compactness are the same; for other spaces there is a big difference. There are still more notions of compactness, by the way, which we will see in some exercises (I will give some today, for next week): countably compact, Lindelöf, and others; they are very similar in spirit. Proof. (i) implies (ii): this is the general fact from before; we do not even need metric here, we need nothing at all. (ii) implies (iii): limit point compact implies sequentially compact; that is the preceding lemma, the one I proved before. Here we use metric; Hausdorff is not sufficient. So it remains to prove that (iii) implies (i): we have sequentially compact, and we have to prove compact. For this I start with another lemma.
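A Lebesgue number for a concrete covering can be estimated numerically. The sketch below (my own illustration, with an invented covering of [0, 1]) uses the standard observation that f(x) = max_i d(x, X − A_i) is continuous and positive on a compact space, and its minimum is a Lebesgue number; the grid search is only an approximation, not a proof.

```python
# Numerical sketch: estimating a Lebesgue number for an open covering
# of the compact metric space [0, 1] by open intervals A_i = (l_i, r_i).
# f(x) = max_i d(x, complement of A_i) is continuous and positive, and
# delta = min_x f(x) is a Lebesgue number: any set of diameter < delta
# fits inside one A_i. We approximate the minimum on a grid.

cover = [(-0.1, 0.4), (0.3, 0.7), (0.6, 1.1)]  # open covering of [0, 1]

def dist_to_complement(x, interval):
    """Distance from x to the complement of the interval (0 if outside)."""
    l, r = interval
    return max(0.0, min(x - l, r - x))

def lebesgue_estimate(cover, grid=10_001):
    xs = [i / (grid - 1) for i in range(grid)]
    return min(max(dist_to_complement(x, A) for A in cover) for x in xs)

delta = lebesgue_estimate(cover)
assert delta > 0  # the worst points are near the overlaps, around 0.05
```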
Lemma. Suppose X is metric and sequentially compact; that is our hypothesis now. Then for each epsilon > 0 there exists a finite covering of X by open epsilon-balls. We want to prove later that X is compact; if X were compact, this would be trivial (you look critical, but it is): take all epsilon-balls, which form an open covering, pass to a finite subcovering, and you have finitely many epsilon-balls covering X. But here we have only sequentially compact; that is our only hypothesis, and we are on the way to proving compactness. Proof, by contradiction, and first by picture. Suppose there is an epsilon > 0 such that there is no finite covering of X by epsilon-balls; this is our epsilon. Let x_1 be any point in X, and draw the epsilon-ball B_d(x_1, epsilon). This one ball is contained in X, but it certainly cannot be all of X: there is no finite covering of X by epsilon-balls, not by one, not by two, not by three, so in particular not by this one.
Here you see one ball, and it cannot be everything (yes, it is trivial: X is not covered by one epsilon-ball). So there is a point outside: let x_2 be in X − B_d(x_1, epsilon). Take the epsilon-ball B_d(x_2, epsilon); now we have two balls, and again they cannot cover everything, because there is no finite covering, not by one, not by two. So let x_3 be in X − B_d(x_1, epsilon) − B_d(x_2, epsilon); here is x_3, outside both balls (I should have left more space in the picture). Take its ball again; three balls are still not everything, so you find x_4, and so on. So what do we construct? Recursively (inductively) we construct a sequence (x_n) in X, and the important property of this sequence is: d(x_i, x_j) ≥ epsilon for all i ≠ j; whenever i is not equal to j, the distance is at least epsilon. But that, of course, means that (x_n) has no convergent subsequence, because in a convergent subsequence the distances between terms get arbitrarily small. So X is not sequentially compact, which is a contradiction. That is the proof of the lemma: if X is sequentially compact, we cannot find such an epsilon. (The lemma is trivial if X is compact, by the definition of compactness; here we had to prove it this way.) And now we prove that (iii) implies (i). What do we have to prove? That X is compact, knowing that X is metric and sequentially compact; that is our hypothesis. Compact means: given an open covering, we must find a finite subcovering. So let A be an open covering of X.
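The greedy construction in the lemma can be run in a concrete compact space. A sketch of my own, on a finite sample of the unit square with the Euclidean metric: picking points pairwise at least epsilon apart must terminate, and the chosen centers give a finite covering of the sample by epsilon-balls (of the sample only, not of the full square).

```python
# Sketch of the lemma's construction, run forwards: greedily pick
# points pairwise >= eps apart; in a (sampled) compact space the
# process stops, and the centers' eps-balls cover every sample point.

import math

def greedy_eps_net(points, eps):
    """Pick points pairwise >= eps apart until no candidate remains."""
    centers = []
    for p in points:
        if all(math.dist(p, c) >= eps for c in centers):
            centers.append(p)
    return centers

grid = [(i / 20, j / 20) for i in range(21) for j in range(21)]
eps = 0.3
centers = greedy_eps_net(grid, eps)

# The process terminated with finitely many centers...
assert len(centers) < len(grid)
# ...and their eps-balls cover every sample point.
assert all(any(math.dist(p, c) < eps for c in centers) for p in grid)
```

Every point that is not itself chosen as a center lies within eps of an earlier center, which is exactly why the balls cover; in a non-compact space the greedy loop could go on forever, which is the sequence with no convergent subsequence from the proof.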
Now we use our Lebesgue number lemma, but under this weaker hypothesis; that is why we proved it for sequentially compact spaces, although in general it is stated for compact ones. So there is a Lebesgue number delta > 0 associated to the open covering A. What does Lebesgue number mean, again? Any subset with diameter smaller than delta is contained in some A in A, in at least one. By the preceding lemma, there is a finite covering of X by open balls of radius epsilon = delta/3 (the lemma works for any radius; choose this one). Each of these balls has diameter at most 2·delta/3, which is smaller than delta, so each is contained in some A in A, in an element of the covering. But only finitely many balls are needed to cover X, so finitely many elements of A cover X. And this means, of course, that X is compact, which is what we had to prove. That is a nice proof in some way, via the Lebesgue number lemma; we will see other applications of the Lebesgue number lemma, which is useful in other places too. There are other notions of compactness, but those will be in the exercises, not now. So we have compact, limit point compact, sequentially compact: three notions, and the important one is compact.
Compact is the important notion, with its very simple definition: every open covering has a finite subcovering. Everything else is more technical in some sense, already at the level of the definition, and sequences are not so good for general spaces anyway. So now comes the last important example of a space: a Hausdorff space, which will also serve as an example of a Hausdorff space that is limit point compact but not compact. First, a definition: does somebody know what well-ordered means? We start with an ordered set, linearly (totally) ordered. Say it again: every non-empty subset. Exactly: an ordered set is well-ordered if every non-empty subset has a minimum. That is the definition of well-ordered. Examples? The natural numbers N, of course, and finite ordered sets, no problem. But already after N one has to think for a second: Z, no (Z itself has no minimum as a subset), and likewise Q and R are not well-ordered. We also need a notion from the book. Let X be an ordered set, and given a point alpha in X, we write S_alpha for the section of X by alpha. In some sense it is the interval (−infinity, alpha); that is just another way of writing it. So S_alpha is the set of all x in X such that x < alpha: everything strictly smaller than alpha. Now, before the theorem: there are three famous things in mathematics. One: in a course on linear algebra you prove that every vector space has a basis, which is very strange, and what you use is Zorn's lemma.
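The definitions of a section and of the failure of well-ordering for Z can be illustrated on finite truncations; a small sketch of my own, which of course only hints at the infinite statements.

```python
# Small sketch of the definitions, checked on finite truncations.

def section(X, alpha):
    """S_alpha = {x in X : x < alpha}: the section of X by alpha."""
    return {x for x in X if x < alpha}

# In (a finite part of) N, the section of 5 is {0, 1, 2, 3, 4}:
assert section(range(10), 5) == {0, 1, 2, 3, 4}

# Every non-empty subset of N has a minimum, so N is well-ordered.
# Z is not: Z itself has no minimum, which we can only hint at with
# truncations, whose minima keep decreasing as the truncation grows.
assert [min(range(-n, 1)) for n in (1, 2, 3)] == [-1, -2, -3]
```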
Whether you have heard it or not: there are three statements in mathematics which are equivalent. The first is the axiom of choice, whatever that is. It is equivalent to Zorn's lemma, and that is what you use to prove that every vector space has a basis. Yes, every vector space has a basis, but this depends on it: you have to assume Zorn's lemma, equivalently the axiom of choice; otherwise you cannot prove it. (Finite dimension is fine without.) It is more logic than mathematics. And there is a third statement equivalent to these: the well-ordering theorem. There is a name associated with it; exactly: Zermelo. So these three are equivalent: one is an axiom, one a lemma, one a theorem, but they are completely equivalent; without the axiom of choice you can prove neither Zorn's lemma nor the well-ordering theorem. But in mathematics, in general, you never think about the axiom of choice. Some say it is a strange axiom: we want it, or we do not want it. The important point is that the axiom of choice is independent of the standard axioms of set theory. You can assume it, and that is consistent with the standard axioms; or you can say "I do not want it, it is false", and that is also consistent. You can choose: true or not true. But in standard mathematics you do not think about this; you usually do not even know whether you used the axiom of choice in some proof, and you do not care too much; you use it, and that is it. What does the well-ordering theorem say? (Again, this is more logic than mathematics.) Every set X can be well-ordered. In other words, for every set X there is a linear order on X which is a well-ordering.
That is the well-ordering theorem, and the equivalence with the axiom of choice is not so difficult, not really very difficult: if you assume the axiom of choice, you get the theorem. And then some people say this is a very strange theorem. Take the reals, for example: the standard ordering of R is not a well-ordering; it is very far from being well-ordered. Yet the theorem says there is a well-ordering on the reals; you can choose one. It is not constructive: such an ordering merely exists. And this was a bad point: some people said, if this non-constructive theorem is a consequence of the axiom of choice, then the axiom of choice is also strange, and we do not want it. So there is some mathematics without the axiom of choice. But in general, in standard mathematics, you never worry; whenever you make infinitely many choices, there might be the axiom of choice behind it. The independence from the standard axioms is, of course, interesting in logic; the names associated with it are Gödel and Cohen, the two parts: you cannot prove the axiom of choice, but you can consistently assume it is true, or assume it is not true; both are correct, and we get no contradiction. (The names stand for the independence, not for the equivalences, which are easy; that is not the problem.) Well, whatever; that was just an introduction, and it is interesting to have heard these names sometimes. Now our example, the theorem that we will use. Theorem: there exists an uncountable well-ordered set such that every section of it is countable.
In some sense this says the set is minimal: it is the minimal uncountable well-ordered set. (There is also a uniqueness statement, by the way, which we will not pursue.) Every section is countable while the set itself is uncountable; intuitively that is exactly what "minimal uncountable" should mean. For the proof I use the well-ordering theorem, and then the proof is easy. One can in fact avoid the well-ordering theorem and prove the existence without the axiom of choice; however, if you then want to do anything with this set, you need the axiom of choice again. So you can prove the existence without it, but then you cannot prove anything about the set; the axiom of choice comes back through the back door. At first you are glad you do not need it for this theorem, but without it the set is useless in some sense. I am not from logic, but so be it. Proof. Using the well-ordering theorem, let $X$ be any uncountable well-ordered set; for example, take the reals, which by the theorem admit a well-ordering. For $x \in X$, let $S_x$ denote the section of $x$. If every section of $X$ is countable, then $X$ is already the set of the theorem and we are done: we wanted an uncountable well-ordered set in which every section is countable. Otherwise, consider the set of all $x \in X$ such that the section $S_x$ is not countable; in this case it is non-empty.
Otherwise, there are some points whose section is uncountable. But $X$ is well-ordered, so this set, call it $Y$, has a minimum; call it $\Omega$. That looks like a strange notation for the moment, but let me use it. So $Y$ has a minimum $\Omega$, which lies in $Y$. Now consider $S_\Omega$, the section of $\Omega$. It is uncountable, by the definition of $Y$, because $\Omega \in Y$. But every section of $S_\Omega$ is countable: a section of $S_\Omega$ is $S_x$ for some $x < \Omega$, and $\Omega$ is the first element of $X$ whose section is uncountable, so for every smaller element the section is countable. Note that $\Omega$ itself is not in $S_\Omega$; that is the point of a section. So take $S_\Omega$: in every case, we call the set of the theorem $S_\Omega$. (There is also some uniqueness here, which we are not interested in.) That is the notation: $S_\Omega$. Now $S_\Omega$ is a well-ordered set, and with the order topology it becomes a topological space. The notation $S_\Omega$ may look strange, but we will see why it is chosen. Then you can also define $\bar S_\Omega$, "$S_\Omega$ bar". This is a definition, not the closure.
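The key step of the proof can be summarized, in the same notation, as:

```latex
\[
  Y \;=\; \{\, x \in X : S_x \text{ is uncountable} \,\},
  \qquad
  \Omega \;=\; \min Y .
\]
Then $S_\Omega$ is uncountable (since $\Omega \in Y$), while every section of
$S_\Omega$ has the form $S_x$ with $x < \Omega$ and is therefore countable,
by minimality of $\Omega$.
```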
It will turn out to be the closure too, but first it is simply a definition: for $\bar S_\Omega$ you take $S_\Omega$ and add one element, called $\Omega$, which you declare to be the largest element of $\bar S_\Omega$. Adding a largest element changes nothing essential: $\bar S_\Omega$ is again well-ordered, uncountable, and so on. So we have $S_\Omega$ and $\bar S_\Omega$, and of course $S_\Omega$ is exactly the section of $\Omega$ in $\bar S_\Omega$: you just exclude that one element. This also explains the notation. So we have two topological spaces now, $S_\Omega$ and $\bar S_\Omega$, both with the order topology. By the way, $\bar S_\Omega$ is also the closure of $S_\Omega$ in $\bar S_\Omega$, which justifies the bar; otherwise one would not use this notation. I will not prove that at this moment. The last thing I want to do now is make a picture: how should one think of a well-ordered set? I will use this next week. Again, $S_\Omega$ is a well-ordered set which is uncountable but in which each section is countable; with the order topology it is a topological space, denoted $S_\Omega$ as in the book. Strictly speaking I cannot make a picture of $S_\Omega$ itself, since the proof of its existence is not constructive, but let me draw a typical well-ordered set. Which well-ordered sets do we know? The finite sets; the natural numbers. The reals with their usual order? No. So let us start: we take a linearly ordered set and suppose it is well-ordered. A well-ordered set, first of all, has a minimum.
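In symbols, the relation between the two spaces is:

```latex
\[
  \bar S_\Omega \;=\; S_\Omega \cup \{\Omega\},
  \qquad
  \Omega \;=\; \max \bar S_\Omega,
  \qquad
  S_\Omega \;=\; \{\, x \in \bar S_\Omega : x < \Omega \,\},
\]
so $S_\Omega$ is exactly the section of $\Omega$ in $\bar S_\Omega$, which
explains the notation.
```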
(The set is non-empty; the empty set is of course not interesting.) So there is a minimum; I do not know yet what to call it, maybe $\alpha$, but let us wait. Next, in a well-ordered set each element has an immediate successor; that is an important fact. Well, that is not completely true: if there is a largest element, it has nothing after it, so I have to exclude the largest element. Each element which is not the largest has an immediate successor. Why? Take an element which is not the largest, and consider all elements strictly larger than it. This set is non-empty, so by well-ordering it has a minimum, and that minimum is the next element, the immediate successor. So from the minimum you can always go to the next element, then the next, then the next, with nothing in between. (The picture is schematic: the dots are our elements, and only these.) What do you construct this way? The natural numbers, in some sense: the order of the natural numbers, the first, the second, the third, and so on, always the next one. So this is a copy of the natural numbers. If this is everything, then the set is order-isomorphic to the natural numbers. If it is not everything, there is something larger than all of these, and by well-ordering that remainder again has a minimum.
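The successor construction just described can be sketched in code on a finite example; the function name and the sample set here are ours, not from the lecture:

```python
# Immediate successor in a well-ordered set, illustrated on a finite set:
# the successor of x is the minimum of all elements strictly larger than x.
# (Hypothetical helper for illustration; works for any finite ordered set.)

def successor(x, well_ordered_set):
    """Return min{y in S : y > x}, or None if x is the largest element."""
    larger = [y for y in well_ordered_set if y > x]
    if not larger:       # x is the largest element: no successor exists
        return None
    return min(larger)   # a non-empty subset of a well-ordered set has a minimum

S = [1, 3, 4, 7, 9]
print(successor(3, S))   # -> 4
print(successor(9, S))   # -> None (9 is the largest element)
```

The same two cases appear in the lecture: an element which is not the largest always has a next one, while a largest element must be excluded.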
If this copy of the naturals is not everything, then there is something larger; the set of larger elements is non-empty and, because the order is a well-ordering, it has a minimum. What is a good name for this element? Omega: small $\omega$, which stands for the countable. So, to repeat: in a well-ordered set you have the first element, the second, the third; suppose the set is not finite, otherwise it is boring. So you get a whole sequence of elements, the next one, the next one, the next one. If this is everything, you have, in some sense, the natural numbers. If it is not everything, there is something which comes after all of these; the elements beyond form a non-empty set which again has a smallest element, and this I call $\omega$. It is just a name; you could give any name, but the Greek alphabet is apt: $\alpha$ is the first letter and $\omega$ is the last. Now, this might already be everything: what you see, a copy of $\mathbb{N}$ with one element $\omega$ added on top, larger than each natural number, is itself a well-ordered set. Maybe it finishes there; it is well-ordered, but it is not $\mathbb{N}$, because now there is a largest element. Well-ordering says nothing about that side. If it is not everything, what happens? You go on: after $\omega$ there is again always a next one, a next one, a next one, and so on. There is always the next one, but not always the previous one: here $\omega$ has an immediate successor but no immediate predecessor. This is the first limit point.
$\omega$ would be the first limit point, and then the picture goes on: after it comes another copy of $\mathbb{N}$, and the next limit point is called $2\omega$. We suppose the set does not finish here: everything so far is countable, and let us suppose the set is uncountable, so you go on. I am just giving you an idea of what the typical well-ordered set looks like; every well-ordered set starts like this. After $2\omega$ comes another copy of $\mathbb{N}$, and then what is this next limit point? $3\omega$. (I have to finish, one minute; I will go on with this picture next time.) Then you have $4\omega$, $5\omega$, and at a certain point you reach $\omega \cdot \omega = \omega^2$: countable times countable. But everything before it is still countable, not uncountable. Then it starts again from the beginning: the next one, the next one, and you get $\omega^2 + \omega$, and so on, with this kind of notation. At a certain point you certainly reach $\omega^3$, then $\omega^4$; it always starts again from the beginning. And then there is something even bigger: how do you call it? What else but $\omega^\omega$? That is a new quality, but it is still countable. Note that everything I have shown so far is constructive: I am constructing the set, writing it down, making a picture, the first element, the second, and so on. You get stranger and stranger things, but they are all still countable, so not so strange after all. And then there is a certain point where, for the first time, it becomes uncountable.
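One way to see that everything below $\omega^2$ in this picture is still countable: each such ordinal can be written as $a\omega + b$ with natural numbers $a, b$ (in the lecture's notation, where $a\omega$ is the $a$-th limit point), and the pairs $(a, b)$ can be enumerated along diagonals. A small sketch, with names of our own choosing:

```python
import itertools

# Each ordinal below omega^2 is a*omega + b for naturals a, b; listing
# the pairs (a, b) along the diagonals a + b = 0, 1, 2, ... enumerates
# them all, which exhibits this initial segment of the picture as countable.
def diagonal_pairs():
    s = 0
    while True:                  # one diagonal a + b = s at a time
        for a in range(s + 1):
            yield (a, s - a)     # stands for the ordinal a*omega + (s - a)
        s += 1

first_six = list(itertools.islice(diagonal_pairs(), 6))
print(first_six)  # [(0, 0), (0, 1), (1, 0), (0, 2), (1, 1), (2, 0)]
```

The same diagonal trick is behind "countable times countable is countable", which is why $\omega^2$, and everything below it, is still on the countable side of the picture.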
Here, at $\omega$, was the first passage from finite to countable; and now there is a point where you pass from countable to uncountable, and this is the famous capital $\Omega$. But this part is not constructive: here we apply the well-ordering theorem. I gave you the proof that there is a well-ordered set $S_\Omega$ which is uncountable, but in which every section, every initial piece up to some point, is countable; only at $\Omega$ does it become uncountable for the first time. What is difficult to see is that upper part: the whole set is uncountable while each initial piece is countable, and of that final stretch you cannot make any picture. It is pure existence: there is this set, and that is it. You do not even need the axiom of choice; the first chapter of the book gives a construction without it. But as I said, if you avoid the axiom of choice here, you then cannot do anything with the set without it, so avoiding it is not very useful anyway. Still, this is the idea. Probably every one of you knows the definition of well-ordered; but what do well-ordered sets actually look like? A countable well-ordered set looks like some part of this picture: first one copy of $\mathbb{N}$, then a second copy of the natural numbers with larger elements, then a third, and so on. All well-ordered sets start this way: like the naturals, then you start again, and again. Every countable well-ordered set sits somewhere in this picture; if it is uncountable, it starts here too, but what happens in the final part is difficult to imagine.