Okay. So thanks, everybody, for making it out to the first meeting of our seminar. Before we get started, I want to say: if anybody wants to give a talk, or has recommendations for people I should email, please reach out. I don't want to start pressuring the people who are here to talk yet, as I've got a good number of people lined up and the next two or three weeks are settled. But if you have any recommendations for speakers, please send them my way. Okay. So today we have Josh Cooper continuing his lectures on the Brouwer conjecture. I don't think a round of applause is a good way to start, that would be kind of weird, so I guess, Josh, go ahead and take us away.

Thank you. Thanks, everybody, for joining. I'm kind of excited to see how this pans out. This is a new day in the discrete seminar: you can attend no matter where you are on the planet. It should be interesting. It looks like we've got quite a crowd today, maybe ten or fifteen people. So I'm going to talk about some fairly recent work, not so recent anymore, on a conjecture that some circles consider important. It's a very interesting question. Because this is really the third talk in a series that began last semester on this topic, I'm going to do a brief overview, maybe not even so brief, of the context here, the questions being asked, what's known, et cetera, before getting to the piece of the story that I want to talk about today. Since it's the first week of classes and everything is unsettled, this is going to be a blackboard talk, meaning a whiteboard talk over Zoom. So, let's see, can I share my whiteboard? Is that visible to everybody? Okay. Yeah, looks good to me.

All right, so I'm going to talk about Brouwer's conjecture. This actually appeared first in Brouwer's book with Haemers from 2012, although, funnily enough, you can find papers from the preceding year or two on the topic, because it was already being discussed and already being called Brouwer's conjecture. And, by the way, everything I talk about today is basically in a manuscript that is currently being refereed but is visible on the arXiv, at this number here.

So, the idea: this is a conjecture about the Laplacian eigenvalues of a graph, an ordinary simple undirected graph. It's funny how little is known about Laplacian eigenvalues, and this is one of those questions that looks weird at first, depending on where you're coming from. So let me set things up. Suppose G is a simple undirected graph. I'm going to define a few associated matrices. Of course, the game in spectral graph theory is that you take your graph, associate your favorite kind of matrix to it, study the spectral properties of that matrix, and relate them back to the graph. First, A(G) is the adjacency matrix: the rows and columns are indexed by the vertices of the graph, and an entry is 1 if the vertices of the corresponding row and column are adjacent in the graph, and 0 otherwise. This is a very standard way of encoding graphs, for computations but also for theoretical purposes. Next, D(G) is the diagonal degree matrix: the diagonal matrix you get by putting the degrees of the graph, the degree vector, on the diagonal, with zeros elsewhere.
The only thing to be careful of is that the adjacency matrix and the degree matrix must have their rows and columns indexed the same way, so that we use a consistent labeling of the vertices. So that's A and D, and then L(G) = D(G) - A(G) is the Laplacian matrix of G. Actually, there are a couple of different Laplacians out there. There's the so-called normalized Laplacian, due to Fan Chung, but there's also this one, which people call the combinatorial Laplacian; it differs from the normalized Laplacian only by rescaling with the diagonal degree matrix, and it's combinatorially the simplest, although sometimes not the best Laplacian for whatever application you have in mind. So this is the combinatorial Laplacian.

Of course, it's a real symmetric matrix, so its eigenvalues are real, and we can number them: say λ_1 is the biggest, then λ_2, and so on; there are going to be n of them, because it's an n-by-n matrix. So there are n Laplacian eigenvalues, and some things are known about these λ's. For example, and these will be important: the biggest one always satisfies λ_1 ≤ n, and λ_n is always 0. For the latter, you might ask where the zero eigenvalue is coming from. It means there must be something in the null space of L(G), and it's the all-ones vector. Think about it for a minute: adding up a row of the adjacency matrix gives you the degree, so multiplying L(G) by the all-ones vector gives you the zero vector. That's where it's coming from. In fact, there's a well-known property that the multiplicity of 0 as an eigenvalue equals the number of connected components of the graph. So for a connected graph you get exactly one zero eigenvalue, at the very end of the list; if the graph is disconnected, more zeros occur at the end of this list of Laplacian eigenvalues.

Okay, so those are some basic properties of the Laplacian. Let me state Brouwer's conjecture and then talk a little bit about where it comes from. Brouwer's conjecture says that for any graph, the sum of the k largest Laplacian eigenvalues is bounded by the number of edges plus a binomial coefficient:

    λ_1 + λ_2 + ... + λ_k ≤ e(G) + (k+1 choose 2),    for every k.

This is a weird statement for a lot of reasons, and you may not see the motivation immediately; I certainly didn't. One of the weird aspects, in addition to the statement being highly inhomogeneous and coming out of nowhere, is that the left-hand side is a partial sum of a decreasing sequence. So if you plot the partial sums against k, you get something concave down, because the λ's are decreasing; that's just how we ordered them. Of course, it's not always strictly concave, because you can have lots of λ's that are equal, so flat parts occur; but it is concave in the sense that the increments never increase.
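Just to make the definitions concrete before going on: here's a minimal sketch in Python (numpy and networkx are my choice of tools here, not anything from the manuscript) that builds L(G) = D(G) - A(G) and checks the basic facts above, that λ_1 ≤ n, that the all-ones vector gives a zero eigenvalue, and that the multiplicity of 0 counts components.

```python
import numpy as np
import networkx as nx

def laplacian_eigenvalues(G):
    """Eigenvalues of L(G) = D(G) - A(G), sorted descending: lambda_1 >= ... >= lambda_n."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    return np.sort(np.linalg.eigvalsh(L))[::-1]

# a disconnected example: a 5-cycle next to a 4-vertex path
G = nx.disjoint_union(nx.cycle_graph(5), nx.path_graph(4))
lam = laplacian_eigenvalues(G)
n = G.number_of_nodes()

assert lam[0] <= n + 1e-9                   # lambda_1 <= n
assert abs(lam[-1]) < 1e-9                  # lambda_n = 0 (the all-ones vector)
zero_mult = int(np.sum(np.abs(lam) < 1e-9))
assert zero_mult == nx.number_connected_components(G)  # multiplicity of 0 = #components
```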
Speaking of concavity: one way to say this is that the sequence of Ky Fan norms of your Laplacian is concave in k, because the kth partial sum is the kth Ky Fan norm of the matrix, usually written ||L(G)||_(k). It's actually not even obvious that this should be a norm, but it is: it's the sum of the k largest singular values, which in this case just correspond to the eigenvalues, since all the eigenvalues here are real and non-negative. So that's the left-hand side, and it's concave down. The right-hand side is concave up, and it doesn't even start at zero; it's inhomogeneous. You start at e(G), the number of edges, which I'll often just call m, and then you add something that's a parabola in k. So the fact that the left-hand side is at most the right-hand side is sort of a strange fact. You might wonder: for what kinds of graphs do these two curves meet? I'll talk about that in a moment; I just want to get this picture in your mind.

So, there are some easy observations you can make right away about when this holds. Let me write BC_k(G) for the condition that Brouwer's conjecture holds for the graph G at the value k. When k = 0, it's trivial. When k = 1, the left side is just λ_1, and we already said λ_1 ≤ n. On the right-hand side you get m + 1, and the number of edges is at least n - 1, because we can assume G is connected: it's easy to show that if Brouwer's conjecture is true for some graphs, then it's true for their disjoint union, essentially because of the convexity of the function k ↦ (k+1 choose 2). So we only have to consider connected graphs in order to prove Brouwer's conjecture or special cases of it. A connected graph contains a spanning tree, so it has at least n - 1 edges, which means m + 1 ≥ n ≥ λ_1, and the left-hand side is certainly bounded by the right-hand side. With quite a bit more work, Haemers and some co-authors showed that the conjecture is also always true for k = 2. And then there's something I'm not going to prove today, though I think I did in the first lecture of this series: if BC_k is true for all graphs, then BC_{n-1-k} is true for all graphs as well; it comes from complementing the underlying graph, as you might expect. So since it's true for k = 0, 1, and 2, it's also true for k = n - 1, n - 2, and n - 3. That's a few of the things we know.

We also know that BC_k(G) is true for every k when G is a threshold graph. There are actually a number of characterizations of threshold graphs. One of them is that you start from a single vertex and build up by repeatedly adding isolated vertices and taking complements. Another way to do it, and the reason I call these threshold graphs, is this: take a vertex set V, a function f from the vertices to the reals, and a threshold α chosen in advance, and put an edge {x, y} into the edge set of G whenever f(x) + f(y) ≥ α.
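Since the statement is so easy to test, here's a small checker (Python again; the function name and tolerance are mine) that returns the set of k at which BC_k(G) fails; if Brouwer's conjecture is true, it returns the empty list for every graph.

```python
import numpy as np
import networkx as nx
from math import comb

def brouwer_violations(G, tol=1e-8):
    """k's in 1..n where the sum of the k largest Laplacian eigenvalues
    exceeds e(G) + (k+1 choose 2); empty iff Brouwer's conjecture holds for G."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    lam = np.sort(np.linalg.eigvalsh(L))[::-1]
    m = G.number_of_edges()
    partial = np.cumsum(lam)
    return [k for k in range(1, len(lam) + 1)
            if partial[k - 1] > m + comb(k + 1, 2) + tol]

# spot-checks; these all lie in classes where the conjecture is known to hold
for G in (nx.petersen_graph(), nx.complete_bipartite_graph(3, 4), nx.wheel_graph(7)):
    assert brouwer_violations(G) == []
```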
The class of graphs you get this way is exactly the class of threshold graphs. It turns out to be fairly easy to show that threshold graphs satisfy the Brouwer conjecture, and in fact they're tight: for any k, you can construct a threshold graph that hits the bound exactly. That's one reason for believing the conjecture: it's an upper bound attained by threshold graphs, and threshold graphs are widely believed to have the largest of these eigenvalue sums, these Ky Fan norms of the Laplacian. And actually there's a larger class of graphs called split graphs, which contains the threshold graphs; it's known, by the master's thesis of Mayank, that the Brouwer conjecture is true for split graphs as well. Split graphs are graphs whose vertex set can be split into a clique and an independent set: a complete graph on one side of the split, an independent set on the other side, and then anything at all in between. Whatever edges go between the two sides, it still counts as a split graph.

Notice that threshold graphs have this structure. What's the split for a threshold graph? Take all the vertices whose value under f is at least α/2; those are all adjacent to each other, because of course the sum of any two of their values is at least α. On the other side, if f(x) and f(y) are both less than α/2, then f(x) + f(y) < α, so those vertices have no edges among themselves: an independent set. The pairs in between may or may not be edges, and that's fine. So yes, threshold graphs are also split, and split graphs are known to satisfy the Brouwer conjecture.

We also know things like: if the graph has at most 10 vertices, the conjecture is true. Actually, I was able to extend this to 11 using some of the results in this manuscript, so all graphs on at most 11 vertices satisfy the Brouwer conjecture. By the way, if you're interested in a research topic that is probably just a matter of putting in some work, with not too much luck involved, you could probably get this to 12 without breaking too much of a sweat; it's mostly about improving the computational efficiency of the algorithms. We also know the Brouwer conjecture is true for trees; for disjoint unions of graphs where it's known, as I already mentioned; for regular graphs and random graphs; and also for things like unicyclic and bicyclic graphs, graphs with one or two cycles.

In the previous talks about the Brouwer conjecture I discussed some of the results around it. I was able to show that BC_k holds whenever k is at least 4 times the arboricity, minus 1; in particular, since planar graphs have arboricity at most 3, the conjecture holds for planar graphs once k ≥ 11. I was also able to show that it holds once k is at least √(32 n^{3/2}), which is on the order of n^{3/4}, for G bipartite; so for bipartite graphs it's true for almost all the k's, just maybe a few at the beginning. Actually, it's not important that the graph be bipartite; it just has to belong to a hereditary class with some forbidden subgraphs, basically, a class of graphs such that induced subgraphs of members belong to the class as well. Once you have that, you get the same kind of statement; see the sketch after this paragraph for the threshold constructions.
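Here's a sketch of the threshold construction in code (Python; threshold_graph and brouwer_slack are names I made up for this talk), together with the tightness claim: for each k, the complete graph K_{k+1}, itself a threshold graph, meets the Brouwer bound at that k exactly.

```python
import numpy as np
import networkx as nx
from math import comb

def threshold_graph(f, alpha):
    """Graph on 0..len(f)-1 with an edge {x, y} whenever f[x] + f[y] >= alpha."""
    n = len(f)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    G.add_edges_from((x, y) for x in range(n) for y in range(x + 1, n)
                     if f[x] + f[y] >= alpha)
    return G

def brouwer_slack(G):
    """m + (k+1 choose 2) minus the sum of the k largest Laplacian eigenvalues, k = 1..n."""
    lam = np.sort(np.linalg.eigvalsh(nx.laplacian_matrix(G).toarray().astype(float)))[::-1]
    m = G.number_of_edges()
    return [m + comb(k + 1, 2) - lam[:k].sum() for k in range(1, len(lam) + 1)]

G = threshold_graph([0.9, 0.8, 0.7, 0.3, 0.2, 0.1], alpha=1.0)
assert all(s >= -1e-8 for s in brouwer_slack(G))   # threshold graphs satisfy Brouwer

# tightness: K_{k+1} has Laplacian eigenvalues k+1 (multiplicity k) and 0, so the
# sum of its k largest is k(k+1) = (k+1 choose 2) + (k+1 choose 2) exactly
for k in range(1, 8):
    assert abs(brouwer_slack(nx.complete_graph(k + 1))[k - 1]) < 1e-8
```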
Essentially, moving to a different hereditary class just changes the constants; bipartite graphs are the case I stated. And just to point out: the number of edges of a bipartite graph can be as much as quadratic in n, so this result says something even for graphs that are anything but sparse; a bipartite graph can be reasonably dense.

I'm also able to show that the conjecture holds if the variance of the degree sequence, and I'll tell you exactly what I mean by that in a second, is small enough. Actually, there's a little caveat here; it's not quite that, it's almost that. By the variance of the degree sequence I just mean: think of the degrees as samples and take the variance of those samples; or, if you like, take the random variable which picks a vertex of the graph uniformly at random and outputs its degree, and take the variance of that. The bound involves the density: in general the variance can be of order n^2, and the condition is, roughly, that the variance is at most a constant times β n^2, where β is the density, m over the number of possible edges; m, as always, means the number of edges of the graph. So if the variance is not very high, then Brouwer holds. This implies, by the way, the result about regular graphs, because there the variance is zero. It also implies it for Erdős–Rényi random graphs, because the variance of their degree sequences is much smaller than the worst case: linear instead of quadratic. And it covers graphs where the maximum degree and minimum degree are not far apart, things like that; it applies to lots of different kinds of graphs. (A quick computational illustration of the quantity follows below.)

Another one of our results, another case where I can show the Brouwer conjecture is true, is when the splittance, which is often written σ, and I'll tell you what splittance is in a second, is at least n^{3/2}/2. The splittance of a graph is its edit distance to the class of split graphs: the minimum number of pair flips, from edges to non-edges or from non-edges to edges, that you need in order to make your graph split. So as long as the splittance is at least n^{3/2}/2, meaning the graph is reasonably far from being a split graph, the conjecture holds. It's only graphs that are very close to being split that we don't know about.

There's also a result of a slightly different flavor. We're comparing the sum of the first k eigenvalues to the quantity m + (k+1 choose 2), and the question is whether the difference is at most zero; the conjecture says the left-hand side minus the right-hand side is always non-positive. In complete generality, for any G, I can show this difference is certainly never bigger than some constant times n^{3/2}; note that both sides can be quadratic, so that's small compared to n^2. For bipartite G, and again bipartite isn't essential, any hereditary class works, I can show the same difference is at most n^{5/4}. So even in the cases where you're very close to split and all those other conditions fail, the violation of the conjecture, if there is one, is at most n^{5/4} for bipartite graphs. And finally, I can show that for any G, the set of k's for which BC_k fails for G is contained in an interval; the k's vary from 0 up to n, and the bad ones, if any, all fall in one interval.
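To pin down "variance of the degree sequence": here's the computation in code (my phrasing of it, nothing more), evaluated on the kinds of examples just mentioned. I'm only illustrating the quantity itself, not the precise sufficient condition, which has the caveat noted above.

```python
import numpy as np
import networkx as nx

def degree_variance(G):
    """Variance of the degree of a uniformly random vertex of G."""
    d = np.array([deg for _, deg in G.degree()], dtype=float)
    return d.var()  # population variance: E[D^2] - (E[D])^2

print(degree_variance(nx.random_regular_graph(4, 100)))         # 0.0: regular graphs
print(degree_variance(nx.erdos_renyi_graph(100, 0.5, seed=1)))  # about n/4: linear in n
print(degree_variance(nx.star_graph(99)))                       # about n: max and min degree far apart
```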
So the conjecture holds everywhere except possibly on one short interval, namely one of length at most n^{3/4}. We're comparing the n possibilities as k varies from 1 up to n, and what I can show is that, in general, BC_k holds for all but at most n^{3/4} values of k, and those values fall in an interval. Actually, you can kind of see how you'd prove that from the picture I drew previously. Consider it: how could a concave-down curve go above the concave-up curve and not exceed it by very much? This is basically taking the facts above and noting that if you're going to exceed the upper curve by at most n^{3/2}, or n^{5/4}, you have to turn around almost immediately after you cross it; you can't go too far. So the interval where you could potentially pass the Brouwer bound has length at most n^{3/4}, and it's really just because one of the two curves is concave up and the other is concave down.

So those are the results; that should give you a feel for what's known about this. One more thing to note: there's another matrix associated with graphs that doesn't come up as often and isn't as heavily studied, but it's definitely of interest. It's the signless Laplacian. This looks almost the same, except it's not the degree matrix minus the adjacency matrix, with degrees on the diagonal and minus ones where the edges are; it's instead the degree matrix plus the adjacency matrix, Q(G) = D(G) + A(G), so now you've got plus ones where the edges are. It's conjectured that the same statement is true there, but with the λ's now being the eigenvalues of the signless Laplacian. Even less is known about the signless Laplacian.

Now, Nikiforov had an interesting thought: there's a way to interpolate between the Laplacian and the signless Laplacian, namely, what if we flip edges from pluses to minuses one at a time? So let me define a signed graph. A signed graph is a pair (G, τ), where τ takes the edge set to {+1, -1}; we give a sign to each edge. The degree matrix is what you think it is: just the usual diagonal matrix of the degree sequence of G. The adjacency matrix, though, is now signed: A(G, τ) is the matrix whose (u, v) entry is τ(uv) if uv is an edge, and 0 otherwise. And then we define the signed Laplacian, L(G, τ) = D(G) - A(G, τ). So, right, the idea is that if τ assigns +1 to all the edges, this is just the ordinary Laplacian, and if it assigns -1 to all the edges, it's the signless Laplacian. By considering all signed graphs we're kind of interpolating between these two regimes, the Laplacian and the signless Laplacian.

So here's a question: if you think the Brouwer conjecture is true for Laplacians, and you think the analogue of it is true for the signless Laplacian, well, then maybe it's true all the way in between? Nikiforov started looking at examples and very quickly found a small signed graph, I think it's five vertices and six edges, with the property that the left-hand side, the sum of the k largest signed Laplacian eigenvalues, exceeds the right-hand side, m + (k+1 choose 2), for some k.
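Here's the construction in code, just to nail down the definition (Python; signed_laplacian and the frozenset-keyed sign map are my own encoding), with a check that the two constant signings recover the Laplacian and the signless Laplacian.

```python
import numpy as np
import networkx as nx

def signed_laplacian(G, tau):
    """L(G, tau) = D(G) - A(G, tau), where tau maps each edge (as a frozenset) to +1 or -1."""
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    A = np.zeros((len(nodes), len(nodes)))
    for u, v in G.edges():
        A[idx[u], idx[v]] = A[idx[v], idx[u]] = tau[frozenset((u, v))]
    D = np.diag([G.degree(v) for v in nodes])
    return D - A

G = nx.cycle_graph(5)
all_plus = {frozenset(e): +1 for e in G.edges()}
all_minus = {frozenset(e): -1 for e in G.edges()}

L = nx.laplacian_matrix(G).toarray()                                        # D - A
Q = np.diag([d for _, d in G.degree()]) + nx.adjacency_matrix(G).toarray()  # D + A
assert np.allclose(signed_laplacian(G, all_plus), L)   # tau = +1: ordinary Laplacian
assert np.allclose(signed_laplacian(G, all_minus), Q)  # tau = -1: signless Laplacian
```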
And so I wondered, you know, is this a common phenomenon? Or maybe it's only true for some very small graphs, only graphs with at most eight vertices or so, and beyond that the signed version is true? No. It turns out it's usually false, which was kind of a shock: true at one end, true (we believe) at the other end, so probably true in between? Nope. One way to say it is that the Brouwer conjecture for signed complete graphs is false for almost all signings; what I mean is asymptotically almost surely. Namely, as n tends to infinity, the probability that the Brouwer conjecture fails for a uniformly random signing of K_n tends to 1. Once you make n big enough, you can make the probability of failure as close to 1 as you like. So it usually fails; the signed version is usually not true. It's somewhat surprising.

And just to give you a little picture of what's going on here, think about what Brouwer says about the complete graph itself. The complete graph is a split graph, so the conjecture is certainly true there. What does it look like? On the left-hand side, the sum of λ_1 through λ_k; on the right-hand side, the number of edges plus (k+1 choose 2), and the number of edges is (n choose 2). Now, it's not hard to see that the Laplacian eigenvalues of a complete graph are n, with multiplicity n - 1, all equal, and then of course the zero. So the left-hand side is just k times n until you get to k = n. What do these two curves look like as k goes from 0 to n? The right-hand side starts at about n^2/2, that's (n choose 2), and when k = n it's (n choose 2) + (n+1 choose 2), which is basically n^2. So the upper bound is a parabola going from about n^2/2 up to about n^2. And the left-hand side is a line of slope n, and when k = n - 1, they meet: the line hits the parabola basically right at the end. (There's one little last value after that, which doesn't matter; the line doesn't go over the curve.) So these threshold graphs meet the Brouwer bound; that's the picture: the left-hand side is certainly at most the right-hand side, with equality at the end.

So you can imagine what's going to happen as we add signings to our complete graph: we're going to perturb this red line somehow, and the question is what that perturbation looks like. And when you nail it down, you see that it actually makes the red curve go over the blue curve, but only very, very slightly. It turns out to be pretty delicate; it actually took a couple of days of sitting and grinding through the calculations to get this right, fine detail in the calculations, because the crossing happens in the lower-order terms. I'm going to try to convince you of this today, although I'm not going to do all the details; there are lots of nasty rigorous details, a bunch of integrals, kind of messy things, but the story I'm about to tell you can be made rigorous.
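Before the calculation, here's the experiment you'd run to see the phenomenon (a sketch under my own conventions; none of this code is from the manuscript): sign K_n uniformly at random and look at the worst excess of the eigenvalue sum over the Brouwer bound.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

def random_signed_Kn_excess(n):
    """Max over k of (sum of k largest signed-Laplacian eigenvalues) - (m + (k+1 choose 2))
    for a uniformly random signing of K_n; positive means the Brouwer bound is violated."""
    A = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), k=1)
    A = A + A.T                          # random symmetric +/-1 signing, zero diagonal
    L = (n - 1) * np.eye(n) - A          # D - A(K_n, tau): every degree is n - 1
    lam = np.sort(np.linalg.eigvalsh(L))[::-1]
    partial = np.cumsum(lam)
    m = comb(n, 2)
    return max(partial[k - 1] - (m + comb(k + 1, 2)) for k in range(1, n + 1))

for n in (50, 100, 200, 400):
    print(n, round(random_signed_Kn_excess(n), 2))
# expectation from the theorem: typically positive, growing roughly linearly in n
```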
So, what's the idea here? Think about the Laplacian of the signed complete graph. What does it look like? Well, the Laplacian of the complete graph itself has n - 1's on the diagonal and minus ones everywhere else; in other words, L(K_n) = n·I - J, where I is the identity matrix and J is the all-ones matrix. What happens when you give it a signing? The degrees don't change, so the diagonal term doesn't change. The off-diagonal term does change, though: instead of subtracting the all-ones pattern off the diagonal, you're subtracting some symmetric plus-or-minus-one matrix. It turns out to be convenient to write that piece as √n times M, where M is a symmetric matrix whose entries are i.i.d. and equal to ±1/√n. What I mean is: you choose all of the entries strictly above the diagonal independently and identically, just uniformly, a fifty-fifty coin flip between the two values, and then copy those values into the lower triangle. That's M. So we have an expression for the Laplacian of the signed complete graph in terms of M.

And the reason to write it this way is that the spectrum of M is very well understood. By this relation, the Laplacian eigenvalues of the signed complete graph are in bijection with the eigenvalues of M: take an eigenvalue of M, call it ν_k, multiply it by √n, and add it to n (up to the lower-order difference between n and n - 1, and up to sign, which doesn't matter because the spectrum of M is symmetric in distribution), and you get an eigenvalue of the signed Laplacian. And, okay, M is real symmetric, so its eigenvalues are real and can be ordered. Keep in mind that M is a random matrix, so these ν's are also random, and we can only describe their probability distribution. But like I said, it's very well understood what the eigenvalues of a random symmetric matrix look like: there's a very famous result known as Wigner's semicircle law. It says that the limiting density of the eigenvalues is a semicircle:

    ρ(x) = (1/(2π)) √(4 - x^2),    for x in [-2, 2].

If you wonder why I'm dividing the entries by √n: it's so that this matrix has norm of order one; it's just a normalization so that it fits the standard statement of Wigner's semicircle law. The height of the curve is the probability density of the eigenvalues: you get lots of eigenvalues near the middle, and then they sort of peter out near the edges; if you plot a histogram of these things, in the limit you get a semicircle. And although I'm not going to be precise about the statement, Wigner's semicircle law says essentially any i.i.d. symmetric matrix has its eigenvalues distributed this way, not just plus-minus-one entries; a huge class of matrices, even ones with Gaussian entries.
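You can watch the semicircle appear numerically; this sketch (numpy only, with my own binning choices) compares the eigenvalue histogram of one sample of M against ρ(x) = √(4 - x^2)/(2π).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
M = np.triu(rng.choice([-1.0, 1.0], size=(n, n)) / np.sqrt(n), k=1)
M = M + M.T                     # symmetric, i.i.d. +/- 1/sqrt(n) above the diagonal
nu = np.linalg.eigvalsh(M)

hist, edges = np.histogram(nu, bins=40, range=(-2, 2), density=True)
mids = (edges[:-1] + edges[1:]) / 2
rho = np.sqrt(np.clip(4 - mids**2, 0, None)) / (2 * np.pi)
print(np.max(np.abs(hist - rho)))   # small, and shrinking as n grows: Wigner's semicircle law
```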
Okay, so that should give us a way to compute, approximately at least, these ν_k's; and that gives us, by virtue of the expression above, the λ_k's; and then we can add them up and see how the sum compares to the number of edges plus (k+1 choose 2). That's the game plan. Now, to do that, you need to know where the ν_k's are with quite a bit of precision, so it's not enough to have the semicircle law in its classical formulation, which is a limit theorem: it says, essentially, that the fraction of eigenvalues to the left of a point x converges to the integral of the semicircle density up to x. But there's a local-law version of that which is known, and quite a bit is known about the spacings between the eigenvalues. This is precise: the locations of the eigenvalues are, in a very precise sense, exactly what you would expect from the picture I drew. Of course they're random, but you can pin each eigenvalue down to a very small window.

So, where is the k-th eigenvalue? Write k = αn, and I sort of think of α as varying from 0 to 1: the first eigenvalue is α near 0, and all the way at the bottom is α = 1. (Of course the subscript is supposed to be an integer, but we're going to pretend it isn't for the moment.) When is ν_{αn} approximately equal to some value t? Up to the local-law window, it's when

    ∫_t^2 ρ(x) dx = α.

If you want an α fraction of the eigenvalues above t, you have to integrate down to the point where you capture an α fraction of the area under the curve. Notice, by the way, that I'm indexing backwards, from right to left; that's why the integral goes from t up to 2 instead of from -2 up to t. So if we call this integral F(t), then what we're really saying is that ν_{αn} ≈ F^{-1}(α).

So what does F look like? F(t) = ∫_t^2 ρ(x) dx runs from F(-2) = 1, all the eigenvalues are above -2, down to F(2) = 0, none of them are above 2. And then F^{-1} goes from [0, 1] to [-2, 2], decreasing from 2 down to -2; if you draw it, it's the same picture flipped. Okay. So what does the sum of the eigenvalues λ_1 up to λ_{αn} look like, thinking of k as a multiple of n, a third of n or two-thirds of n, et cetera? Well, remember λ_k ≈ n + √n·ν_k, so you get an n for each one of these λ's, which gives n times αn, that is αn^2, and then you pick up something from the ν's.
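Numerically, F and its inverse are easy to get at; here's a sketch (scipy's quad and brentq; the accuracy settings are arbitrary) of the quantile function ν_{αn} ≈ F^{-1}(α).

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

rho = lambda x: np.sqrt(4 - x**2) / (2 * np.pi)   # semicircle density on [-2, 2]

def F(t):
    """Limiting fraction of eigenvalues of M lying above t."""
    return quad(rho, t, 2)[0]

def F_inv(alpha):
    """t with F(t) = alpha: the approximate location of nu_{alpha n}, indexed from the top."""
    return brentq(lambda t: F(t) - alpha, -2, 2)

print(F(-2.0))                # 1.0: the whole spectrum sits in [-2, 2]
print(F_inv(0.5))             # 0.0, by symmetry of the semicircle
print(round(F_inv(0.01), 3))  # near 2: the top 1% of eigenvalues crowd the right edge
```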
From the ν's, you pick up a term of order n^{3/2}: the √n gives you a factor of n^{1/2}, and then there's another factor of n because I'm rescaling the index, which runs from 1 up to n, down to the interval from 0 to 1. After that rescaling, the ν contribution is just the integral under the quantile curve:

    λ_1 + ... + λ_{αn} ≈ αn^2 + n^{3/2} ∫_0^α F^{-1}(x) dx.

And if you plot that integral as a function of α: think about integrating the whole thing, you get zero, because the semicircle is symmetric, so F^{-1} is antisymmetric about α = 1/2. That's why the curve comes back to zero at α = 1. In between, it's positive, and it's concave down, since its derivative, F^{-1}, is decreasing. So that's what we should expect: the sum up to the αn-th eigenvalue is αn^2 plus n^{3/2} times the integral from 0 to α of F^{-1}(x) dx.

And now you can see, in the picture I drew, with the line from the complete graph and the parabola from the upper bound in Brouwer's conjecture: if you add something positive and concave down to the line, it will do exactly this, cross the upper bound; but it'll cross the upper bound near the end, because this n^{3/2} term is really small compared to the n^2 term, a much, much lower order. So you really should see the crossing of the bound much, much closer to k = n.

So what's the upper bound? m + (k+1 choose 2) is about n^2/2, that's (n choose 2), plus (αn)^2/2, so about (n^2/2)(1 + α^2). Or, if you write β = 1 - α, so that we're measuring from the end instead of from the beginning, then m + (k+1 choose 2) is about

    n^2/2 + n^2(1 - β)^2/2 = n^2(1 - β) + n^2 β^2/2,

where the n^2(1 - β) collects contributions from both terms and the n^2 β^2/2 is lower order. So, okay, I want to compare that expression with the eigenvalue sum. Take β to be about 1/√n, so we're getting really close to the end; not all the way, zero would be all the way at the end, but within 1/√n. In that case (and I'm really running out of room here), the eigenvalue sum is about n^2(1 - β) plus some constant times n. Why linear? Because α is within 1/√n of 1, and we're integrating F^{-1}, some positive value that you can just replace with its average on that last little interval; the interval has width 1/√n, because that's what β was chosen to be, and 1/√n times n^{3/2} gives you a linear term. Now compare with the other expression: the n^2(1 - β) parts agree, and the remaining term is n^2 β^2/2; but β^2 is now 1/n, and 1/n times n^2 is also linear. So you get some other constant C' times n, and you see it's very delicate, because now we're just comparing these two linear terms.
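You can check where this comparison lands by just evaluating the semicircle prediction. The following sketch is my own bookkeeping, using the exact relation λ = (n - 1) + √n·ν for the signed K_n, so the constant that comes out here shouldn't be quoted against the paper; it estimates the excess at k = n - √n.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

rho = lambda x: np.sqrt(4 - x**2) / (2 * np.pi)
F_inv = lambda a: brentq(lambda t: quad(rho, t, 2)[0] - a, -2, 2)

def predicted_excess_over_n(n):
    """Semicircle prediction for (eigenvalue sum - Brouwer bound) / n at k = n - sqrt(n)."""
    k = n - int(np.sqrt(n))
    G_alpha = quad(F_inv, 0, k / n, limit=200)[0]   # integral_0^alpha F^{-1}(x) dx
    lhs = k * (n - 1) + n**1.5 * G_alpha            # lambda_i ~ (n - 1) + sqrt(n) nu_i
    rhs = n * (n - 1) / 2 + (k + 1) * k / 2         # m + (k+1 choose 2) for K_n
    return (lhs - rhs) / n

for n in (10_000, 40_000, 160_000):
    print(n, round(predicted_excess_over_n(n), 3))  # tends to a positive constant (about 1/2)
```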
And it turns out, because of this concavity, that this constant C here is 3/2, whereas this C' is 1. So the upper bound, the thing that's supposed to be an upper bound, is exceeded by the left-hand side by about n/2. I don't know if that's optimal, but it's fairly close. And so, yeah, that implies the average case: for almost all signings of the complete graph, the sum of the k largest signed Laplacian eigenvalues exceeds the Brouwer bound when k is very close to n, namely when it's about √n less than n. So most signings fail; that's the theorem.

And just a comment: I think the analogous thing is not true of almost all graphs, because a random graph is quite far from violating the Brouwer conjecture; random graphs all satisfy it comfortably. But maybe it is true for random signings of threshold graphs; I'm not sure. It was essentially this calculation that made it easy for the complete graph: all the degrees are the same, so we have very tight control of the form of the Laplacian, but for other threshold graphs that may not be the case. So one follow-up question is: is it true that the signed Brouwer conjecture fails for random signings of threshold graphs?

Thanks, Josh. Maybe everybody turning their mics on to say thanks would be too much, or clapping, but some thank-yous in the chat would be nice for Josh. If anybody has any questions, go ahead, and we'll open it up.

Are we going to get cut off at 3:30? I don't know. Lincoln, do you know whether Zoom will just shut us off? We'll find out. I think we're fine; there's no other session coming up after us, so it shouldn't be a problem. Well, when we scheduled this seminar we put some buffer after it, in case somebody spends longer than an hour.

Well, if nobody else has any questions, I have one. A lot of the classes of graphs you mentioned up front that satisfy the Brouwer conjecture are, I think, also distinguished by low-dimensional Weisfeiler-Leman, the refinement algorithm used for graph isomorphism. In particular, two-dimensional Weisfeiler-Leman distinguishes non-cospectral graphs, so the eigenvalues are determined by two-dimensional Weisfeiler-Leman. I'm wondering if there are any combinatorial properties picked up by two- or three-dimensional Weisfeiler-Leman that would be useful in informing us about why these graphs satisfy the Brouwer conjecture, something we could possibly leverage and generalize.

Yeah, so I know virtually nothing about the area you're referring to. Interesting. I guess, somehow, the graphs that are close to split graphs, and the threshold graphs specifically, are the most dangerous ones, right? They're the ones that come close to the bound. So I wonder what that class looks like from that perspective.

Any other questions? We'll give the chat a second in case there's anything; otherwise we'll go ahead and close it out here. Thanks, everybody; that went amazingly smoothly under the circumstances. So I guess we'll have another one next week. Yep. Thanks again. Have a good weekend. Yep.