Okay, first of all, this gives you a first glance of the experiment from the sky. You can see it is a complete facility: an array for cosmic-ray extensive air showers. Across, from this point to this point, is about 1.3 kilometers, and the total size of this area is 1.3 square kilometers, with the many dots as detectors, and in the middle we have the water Cherenkov detector with a total area of 78,000 square meters. This just gives you an idea of the geographic position in the country. Basically, this part is the Qinghai–Tibet Plateau, the highest place in the world. We just catch the edge of this plateau and found a site at a height of 4,410 meters — a good place to build this kind of detector. A very interesting part of the site: you can see there is a lot of water in this region. We basically built a tunnel to guide the water around this area instead of letting it go through — we still have a little water going through. But the water is very important for this type of experiment; in total we used a lot of water, gathered just from the streams. Then let me introduce the instrument first, used to do this survey. This is the layout of the detectors: instead of that beautiful picture, now a sketch of the detector array. Again, from here to here is 1.3 kilometers. In the central part, you see, we have full coverage with all kinds of detectors. This part, one square kilometer in total, is covered basically by this type of detector, the one-square-meter scintillator detector. You can see, fibers collect the light from the scintillator and guide it to a phototube in the middle, giving the information about the particle density and also the arrival time of the particles. This is the important part.
Basically, we have about 5,000 pieces to cover the whole region, with a spacing of 15 meters in the middle of the array. For the outskirt area we have a larger spacing, reaching 30 meters between detectors; that part basically serves as a veto, to make sure the shower falls inside the array. And this array is also covered by these big muon detectors, each about seven meters in diameter, with pure water inside and one PMT at the center to read the muon signals. To keep the electrons and photons from reaching this detector, we have an overburden of 2.5 meters on top, making sure only muons get in. And the central part, as I mentioned, is the 78,000 square meters of water Cherenkov detector; the inside looks just like this, with about 4.4 meters of water from the top. In every cell we have phototubes at the bottom to measure the signals: the big one, about 20 inches, is the main detector, and side by side we have a 3-inch one. These two together do the photon measurement. So this is the inside of the water Cherenkov detector. This picture just gives you an idea: once you have something around a 10 TeV shower falling in this area, basically it lights up everything in this detector. As I said, the area is this, and in total we have 3,120 of these detector units. The energy range is basically from 100 GeV to 10 TeV for this type of detector. And for the detector array, as I described, in total we have 5,216 scintillator detectors, and also 1,188 muon detectors — those enormously big detectors I described — with a spacing of 30 meters. So we can cover the energy range from 10 TeV to 10 PeV.
This just gives you an idea of a shower falling onto the array; this is typically a 1 PeV event, with a size like this — this is one quarter of the array, and this part is the WCDA, the water Cherenkov detector array. So you see, almost a quarter of the array is lit up by this one event. The idea is that we can use the muons to do the separation between photon showers and hadron showers. If we have a chance to catch a 1 PeV photon, you can tell the difference immediately by using the number of muons measured, indicated by the circles here; for gamma-ray events you have much, much fewer muons measured in the event. I will give you more details about this part later. So we have a data acquisition system on site, supported by a very advanced computing system on site. You cannot imagine that in such a remote area we actually built up such a computing system — almost 10,000 CPU cores there, a pretty powerful system. And then we built up our trigger and data acquisition, with a data rate of a few gigabytes per second. Eventually we sort out the events and store them here temporarily; we have the capability to hold the data for almost one month before sending it to the computing center at IHEP in Beijing. You see we have a lot of temporary storage there, and also a bandwidth of 2.4 gigabits per second to transmit the data from this computing center to the one in Beijing at any moment. That is the hardware, and then there is the software. Let's see the operation of this system. Basically, we had three stages of operation. In the first phase, we had only half of the detectors built, in this kind of strange shape — but this strange shape was very, very useful. It started from the end of 2019.
And we spent almost 11 months to reach the first piece of scientific results, which is the discovery of the 12 PeVatron candidates using this part of the array. At that moment we had only one water pond operating — actually three ponds together form the whole WCDA. In the second phase we had three quarters of the detectors, an even stranger shape of the array, by the end of 2020, and then around eight months of operation with that shape, with two ponds working together. At the end of 2021 we built up everything — the three ponds together and the whole array. That was the last moment of the construction, and then we went into the operation phase. Now let me give you some idea about how stable the system is — sorry about the spelling. This is the duty cycle recorded last year, 2022. You see almost no interruption of the operation; this plot is for the ground array. You can also see how many detectors actually survive: the number keeps going very smoothly, with one bigger loss, which was basically due to lightning striking our array — we lost something like 200 of the scintillator detectors. We recovered very quickly and kept the whole system working smoothly. In the end, the duty cycle reached something like 99.6%. And this is the water Cherenkov detector: a similar situation, the operation is very smooth, and the duty cycle reaches 98.6%. As for the number of detectors, remember we have the 3,120 indicated by this dashed line, and this is the real number we keep alive.
Now some amazing numbers. In terms of number of events, the data recorded last year was one trillion for the low-energy events and 70 billion for the high-energy events, the latter referring to events measured by the ground array. We have done the simulation very well too: one billion simulated low-energy events to support the reconstruction of the one trillion events, and 0.7 billion simulated events for the KM2A events. Running this detector continuously, 24 hours, we cover about one third of the sky at any moment, and over 24 hours we get full coverage of this kind of band in the sky — basically the whole northern sky. And of course, this kind of detector is very good for transient phenomena — for example, the very famous GRB 221009A. This plot gives you the idea: this is the start point, and this is the finish point of the burst. That is the advantage of this kind of large-field-of-view detector. The second issue, for doing this kind of search for PeVatrons, is that it requires a very strong capability of separation between gamma rays and hadrons. So let's go first to the low-energy part, the WCDA. Basically, you have this kind of footprint in our water pool for an event. If we know this is a gamma event — for example, from the Crab direction — then you can compare it with a proton shower of almost the same energy, which is much more spread out. So you can build up some kind of parameter for this compactness — this one is more compact than that one — and once you build such a parameter, you get this kind of distribution. These are the proton-like events.
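To make the compactness idea concrete, here is a minimal sketch. This is not the WCDA's actual discriminating variable — the definition, the 15 m radius, and the synthetic footprints are all assumptions for illustration — but it shows the principle: a gamma-ray footprint concentrates its charge near the shower core, so a "fraction of charge near the core" score separates it from a more spread-out proton footprint.

```python
import numpy as np

def compactness(x, y, q, core_x, core_y, r_core=15.0):
    """Toy compactness: fraction of the total recorded charge that lies
    within r_core metres of the reconstructed shower core.  Gamma-ray
    footprints concentrate charge near the core, so they score higher
    than proton showers of similar size (illustrative definition only)."""
    r = np.hypot(np.asarray(x) - core_x, np.asarray(y) - core_y)
    q = np.asarray(q, dtype=float)
    return float(q[r < r_core].sum() / q.sum())

# Synthetic footprints: charge clustered near the core ('gamma-like')
# versus spread far out ('proton-like')
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=2000)
r_gamma = rng.exponential(5.0, size=2000)    # tight radial spread
r_proton = rng.exponential(25.0, size=2000)  # wide radial spread
c_gamma = compactness(r_gamma * np.cos(theta), r_gamma * np.sin(theta),
                      np.ones(2000), 0.0, 0.0)
c_proton = compactness(r_proton * np.cos(theta), r_proton * np.sin(theta),
                       np.ones(2000), 0.0, 0.0)
print(c_gamma > c_proton)  # True: the compact footprint scores higher
```

Cutting on such a score at some threshold is what produces the gamma-like/proton-like distributions discussed next.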
Then you can make a cut at some level to make sure you have a sufficient survival rate for gamma-ray showers and simultaneously the biggest separation effect for the proton showers — for example, reduced from these numbers down to this kind of number. Once you have more than 400 to 500 hits for a shower, you have a separation power of something like 10 to the minus 3 to suppress the proton background. For the ground array it is a much easier job, using the muon information measured by the muon array. Here you do a very simple thing: this is the number of muons measured by all these 1,188 detectors — you just add them up to get the number of muons — and this is the number you measure from the scintillator detectors, which we call the total number of electrons. Then you have this clear separation between gamma-ray showers and proton showers. So you make a cut over here — a very simple thing, just take the ratio of the number of muons to the number of electrons; if this ratio is less than 1/230, then you can choose this as a photon-like shower. This is very clean. And you can check it against the Crab: if you look within a one-degree region around the Crab direction, this is the number of events you measure in total — basically cosmic rays — and then you apply this cut to those events and get this distribution. At the 100 TeV level the suppression of the background is something like 10 to the minus 4, and if you go to 500 TeV this number becomes much better, 10 to the minus 5.
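The muon-ratio cut described above is simple enough to sketch directly. The 1/230 threshold is the number quoted in the talk; the toy event numbers below (six muons against 6,000 electrons, matching the clean PeV candidate discussed later) are used only as an example.

```python
def is_photon_like(n_mu, n_e, ratio_cut=1.0 / 230.0):
    """Gamma/hadron separation for the ground array, as described in the
    talk: keep a shower as photon-like when the total muon count from the
    muon detectors, divided by the total electron count from the
    scintillator array, falls below the cut (1/230)."""
    return (n_mu / n_e) < ratio_cut

# Toy events as (summed muons, summed electrons)
events = [(6, 6000),     # muon-poor: photon-like, as for the PeV candidate
          (120, 6000)]   # muon-rich: cosmic-ray-like
flags = [is_photon_like(n_mu, n_e) for n_mu, n_e in events]
print(flags)  # [True, False]
```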
And now if you measure the spectrum from the Crab, you see that in this region you have more than one order of magnitude of signal above the background — that is pretty much a background-free measurement. Okay, then there is the calibration issue. How can we get a good angular resolution? That is pretty much based on the timing, so we have to have a very good timing calibration, and then we have to check our pointing accuracy, and our angular resolution as well. For the timing we rely on the so-called White Rabbit, a fiber-based clock synchronization system. It actually looks just like a very normal switch; in total we use several hundred of these switches to connect the whole array. Every piece of the scintillator detectors and the water Cherenkov detector has this kind of connection, so we can make sure the time is synchronized to something around 0.2 nanoseconds — sometimes we can even get as good as 100 picoseconds. That guarantees the timing of each detector is synchronized very well. However, this system does not reach every cell of the WCDA: we have clusters of something around nine cells, which means for every nine cells we have this timing at the 0.2 nanosecond level, but there are cable connections between each cell and the central part of the cluster, so we have cables and electronics that have to be calibrated as well. For that we built up this LED-illuminated fiber network — you see, this is a fiber reaching the detector — and using this system we can again reach a calibration accuracy of 0.2 nanoseconds. And then we can do the pointing calibration, basically using a standard candle, which is the Crab Nebula.
And this time we got an even better tool: this gamma-ray burst. This GRB was so amazing — within 270 seconds it reached a significance, in that exposure, even larger than the Crab measurement over 508 days. That's amazing. Using this, you can do the pointing-direction calibration very well; it gave us an accuracy of about 0.02 degrees for our pointing measurement. Also, because those two objects are point sources, we can use them to directly measure our angular resolution — the size of the PSF. For example, at 1 to 10 TeV using the water Cherenkov detector: at the low end of the energy range we have a PSF size like this, directly measured by WCDA — you see the total exposure time and the significance — and at the high end, at the 10 TeV level, the PSF shrinks down to this size, something around 0.2 degrees. Using the big array, at the 10 or 20 TeV level the PSF is something around 0.3 degrees, but once you get to higher energies, 100 TeV and above, it shrinks to 0.15 degrees. Actually, you can go to the simulation as well to verify this. And then these are the numbers: using the detectors with a spacing of 15 meters, first of all we get an accuracy of the shower core location of around three meters. This is important for the energy measurement — first you have to know where the shower is. The second thing is, as I just mentioned, the 0.2 degrees of angular resolution; in the simulation you can verify this by comparing with the measurement I mentioned. And then this gives you the idea of the energy dependence and also the zenith-angle dependence.
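Measuring the PSF directly from a point source, as described above, amounts to looking at the distribution of angular offsets between reconstructed events and the source position. A common summary — an assumption here, not necessarily the exact statistic the experiment quotes — is the 68% containment radius:

```python
import numpy as np

def psf_r68(offsets_deg):
    """Estimate the PSF size as the 68% containment radius of the angular
    offsets between reconstructed event directions and a known point
    source such as the Crab."""
    return float(np.quantile(np.asarray(offsets_deg), 0.68))

# Toy check: draw offsets for a 2-D Gaussian PSF with sigma = 0.2 deg.
# The radial offsets then follow a Rayleigh distribution, whose 68%
# containment radius is sigma * sqrt(-2 ln 0.32) ~ 1.51 * sigma.
rng = np.random.default_rng(1)
sigma = 0.2
offsets = sigma * np.sqrt(-2.0 * np.log(rng.uniform(size=100_000)))
print(round(psf_r68(offsets), 2))  # ~0.30 deg
```

The same quantity computed in bins of energy gives the energy dependence of the PSF mentioned in the talk.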
So, for example, this is for nearly vertical showers, and this is for a much more inclined shower, at 50 degrees. This is the core resolution, and this is the angular resolution. Okay, then the energy: how do we know the energy of the shower? For the measurement using the ground array, you have this kind of footprint, and in each detector you have the number of charged particles measured. Actually, it is not only the charged particles: our detectors are covered by a 5-millimeter lead plate, which converts the photons, the gamma rays, into electron pairs that can be measured by the scintillator detector. So from the number of particles measured by each detector you can build up this lateral distribution function. As I said, you have to know the core location rather well; then you have the distance of each detector from the shower axis, and the number of particles versus this distance, and you find that a typical event is well fitted by this function, called a modified NKG function. Then you get quite a good measurement at 50 meters — the particle density at 50 meters is the number we choose as the estimator of the energy. If you use this to measure the energy, then we have this kind of resolution at 100 TeV: for showers within 20 degrees of zenith we have quite a good energy resolution, something around 14%, with a quite symmetric Gaussian distribution here. If you look at different energies, we have this kind of response function: at lower energy, of course, the response becomes worse, but at energies above 100 TeV it is quite good, just like this. And this is the energy resolution changing with energy as well; this is based on the simulation, from this kind of curve.
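The lateral-distribution fit just described can be sketched as follows. The functional form below is the classic NKG shape, not the experiment's exact "modified NKG", and the scale radius and parameter values are illustrative assumptions; the point is the procedure: fit measured densities versus core distance, then evaluate the fitted function at 50 m to get the energy estimator.

```python
import numpy as np
from scipy.optimize import curve_fit

R_SCALE = 130.0  # Moliere-like scale radius in metres (illustrative value)

def nkg(r, c, s):
    """NKG-style lateral distribution: particle density versus distance
    from the shower axis, with normalisation c and 'age' parameter s."""
    x = r / R_SCALE
    return c * x ** (s - 2.0) * (1.0 + x) ** (s - 4.5)

# Toy event: densities sampled at detector core distances with 5% scatter
rng = np.random.default_rng(2)
r_det = np.linspace(20.0, 400.0, 60)
true_c, true_s = 50.0, 1.2
rho = nkg(r_det, true_c, true_s) * rng.normal(1.0, 0.05, size=r_det.size)

# Fit the lateral distribution, then evaluate it at 50 m from the axis
(c_fit, s_fit), _ = curve_fit(nkg, r_det, rho, p0=(10.0, 1.0))
rho50 = nkg(50.0, c_fit, s_fit)  # density at 50 m: the energy estimator
print(round(s_fit, 1), rho50 > 0.0)
```

A calibration (from simulation) then maps this density at 50 m to the shower energy.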
Then you have an idea of how the resolution changes, from 30% down to something around 10% at the highest energies — here is 10 TeV, here is 100 TeV. So this is the idea. And for doing this at very high energy, like the PeVatron search, people worry: do we have any kind of bin-to-bin migration polluting the measurement at the very high energy part, for example for the PeV events? This is very important — you have to have this kind of curve to check what is called the energy purity in each bin. For example, for this highest bin, you can calculate how much of the lower bins mixes into it to generate some kind of pollution. For the very high energy events, for example around a PeV, we actually carried out even more careful work to understand our energy better using the simulation tool — the simulation has been checked many, many times against the real events. Once we have an event above 500 TeV, we do this kind of simulation for each event: something around 10k events generated based on an assumed spectrum, and then, using cuts on the measurements according to our estimator — for example this one, the ρ50 I just mentioned — you select the range indicated by your measurement and look at the distribution of the simulated events. That gives you the idea: for a 500 TeV event you certainly have some spread, down to something around 200 TeV. So there is some possibility this event is polluted by lower-energy showers, and you can calculate from this curve how much. This is a very important step once you are dealing with very high energy events. Okay. Then there is the source survey.
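The bin-to-bin migration check can be illustrated with a toy purity calculation. The smearing width, binning, and spectrum below are assumptions for demonstration, not the experiment's response; the procedure — compare true and reconstructed energy bins event by event — is the general one.

```python
import numpy as np

def bin_purity(e_true, e_reco, edges):
    """For each reconstructed-energy bin, the fraction of its events whose
    true energy falls in the same bin.  Low purity in the highest bins
    signals pollution by mis-reconstructed lower-energy showers."""
    t = np.digitize(e_true, edges) - 1
    r = np.digitize(e_reco, edges) - 1
    out = []
    for b in range(len(edges) - 1):
        sel = r == b
        out.append(float((t[sel] == b).mean()) if sel.any() else 0.0)
    return out

# Toy study: true energies uniform in log10(E/TeV) from 10 TeV to ~3 PeV,
# reconstructed with a ~20% log-normal smearing, half-decade bins
rng = np.random.default_rng(3)
e_true = 10.0 ** rng.uniform(1.0, 3.5, size=200_000)
e_reco = e_true * rng.lognormal(0.0, 0.2, size=e_true.size)
edges = 10.0 ** np.arange(1.0, 3.6, 0.5)
print([round(p, 2) for p in bin_purity(e_true, e_reco, edges)])
```

With a steeply falling real spectrum the highest bin's purity drops further, which is exactly why the per-event simulation described above matters for PeV candidates.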
Once you have everything done, then you can do this job. It is very simple. As I said, you have the photon events picked out using the muon cut, you have the energy resolution shown before, and then you have this collection of so-called UHE photons, meaning above 100 TeV. Once you have so many photons, you just look at their distribution in the sky. At that moment we had only something like 530 events above 100 TeV, and suddenly you can see those 500-some events actually make clusters in the sky, just like this. That immediately indicates there are probably sources associated with these clusters found using these events. Now this number is much, much larger — we have almost 2,000 of these events — so probably even more sources are being seen by this detector. And we use a rather hard cut: as I mentioned, at 100 TeV we have this 10-to-the-minus-4 rejection power. We also set a rather high standard: a source must exceed 7 sigma to be a candidate. By doing so, as pointed out here, we found 12 clusters, and we find those are probably associated with something. Here is the flux — and something is really surprising: for example this J1825, the flux is much stronger than the Crab. And here is the maximum energy measured: at the moment the highest energy is 1.4 PeV. These high-energy events, as I said, are very, very important for the purpose of the PeVatron search: once you have found even one PeV photon, that guarantees this is a PeVatron. So finding these high-energy events is very important; currently we have more than ten PeV photons recorded.
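The talk does not spell out how the 7-sigma threshold for cluster candidates is computed; a standard choice in gamma-ray astronomy (an assumption here) is the Li & Ma (1983) on/off significance, which can be sketched as:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Significance of an on-source excess via the standard Li & Ma (1983)
    formula (their eq. 17); alpha is the on/off exposure ratio."""
    term_on = n_on * math.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1.0 + alpha) * n_off / (n_on + n_off))
    s = math.sqrt(2.0 * (term_on + term_off))
    return s if n_on >= alpha * n_off else -s

# Toy cluster: 28 UHE photons in the on-region, 40 counts in a 20x larger
# background region (alpha = 0.05), i.e. 2 expected background events
print(round(li_ma_significance(28, 40, 0.05), 1))  # ~9.1 sigma
```

A cluster passing the 7-sigma bar under such a test would qualify as a source candidate in the sense used above.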
And every one of them is carefully checked — sorry, I missed some words there. For example, this one: what we do is estimate the chance probability for the event. Could it be a fake event from the cosmic-ray background? You can do this kind of exercise: this is the total number you measure in this direction, and then you apply the cut. For this particular event, you have only six muons measured, with a total of 6,000 electrons measured by the ground array — so this event is very, very clean in terms of the number of muons. This is the cut, and this is the background you have; with such a cut, the chance probability is very, very small, something like 0.03%. That makes sure this event is a photon. And if you look at this direction, it immediately gives you the idea that the source could be somewhere around there. This is the sky map, and those sources are all in our galactic plane — that gives us a lot of confidence. These are the LHAASO sources. If you put these sources together — indicated by these blue circles — and go to the TeV catalog, you find that almost every one is already an old friend in the TeV-source family, except this one, which is a new discovery as a TeV source, and a UHE source as well. Now we have many more than only the one here. The second step is to look at the counterparts of those PeVatrons. This is the list of the sources, and then we go to the sky and through the whole literature to try to dig out all kinds of sources associated with them. You can see pulsars and pulsar wind nebulae, and I will add there are also some SNRs here.
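The chance-probability exercise above reduces to a small calculation: given the cosmic rays measured in that direction and the survival fraction of the muon cut, what is the probability that at least one background event fakes the photon? The numbers below are illustrative, chosen only so the result lands near the 0.03% quoted in the talk; they are not the actual analysis inputs.

```python
def fake_probability(n_cosmic, pass_fraction):
    """Chance that at least one of n_cosmic background cosmic rays in the
    search window survives a cut that only a fraction pass_fraction of
    cosmic rays pass; for small numbers this is ~ n_cosmic * pass_fraction."""
    return 1.0 - (1.0 - pass_fraction) ** n_cosmic

# Illustrative numbers (not the talk's exact analysis): 30 cosmic rays in
# the window, and a muon cut that only 1 in 1e5 of them survives
p = fake_probability(30, 1e-5)
print(f"{100.0 * p:.3f}%")  # 0.030%
```

A probability this small is what lets one claim the candidate is a genuine photon rather than a background fluctuation.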
Most importantly, we found these young massive star clusters — for example this one, and also the one in the Cygnus region. Those are very interesting things to look at, and even more interesting: you don't see so many SNRs there; pretty much you see a lot of pulsars and PWNe. Okay, then we have to go through a much more detailed study of those sources. The first one, of course, is the Crab Nebula. Now we can do this kind of deep investigation of a source to figure out its parameters. Let me give you this example of the Crab. It is a very interesting source because we have very detailed knowledge about it — for example, this Chandra measurement: you know the size of the inner ring, and you find some knots there, so this probably gives us an idea of where the particles are being accelerated. And this is the measurement of the whole spectrum — a very important step once you go to a detailed investigation of a source. This spectrum, using WCDA and KM2A, basically gives us coverage over 3.5 orders of magnitude, and you have this kind of measurement. We know this source very well: basically, it is an electron PeVatron. So people use all kinds of calculations to support a very simple model to try to understand the emission — the X-rays, the gamma rays, and now the very high energy gamma rays — and together it seems to be a very well-understood picture. It is a very simple model with two peaks: one is the synchrotron emission, and the other is the inverse Compton scattering. At first glance, it seems to be very well explained by a single, very simple model.
However, if you look at more details — a plot like this, the deviation in sigma, data versus model, showing how well it can be explained — it seems not so satisfactory. If you look at each energy range, for example the WCDA range and then the KM2A range, it is not completely satisfactory for this model. So that makes people look into more detail. For example, in this part, above 50 TeV, the model actually cannot explain the data very well — at that moment it was already something around a four-sigma effect, and the shape of the spectrum does not seem to be well reproduced by the model. That makes people think we probably need some kind of new component for this source. And once people did this — putting in a proton component with a spectrum of E to the minus 2, with some kind of energy cutoff above 10 PeV — the situation becomes much better. If you look closely at the energy range above 1 TeV, using Fermi data together with LHAASO data, almost all the data is actually fitted very well. So this is a very strong indication that we probably really need some kind of protons here; very likely this could be the first source we recognize as a proton source of cosmic rays. And of course this is not the whole story. For example, there is this very complicated region of J1908 — a complicated region here, and a tough one to go through. Certainly, if you look at the spectrum only, even the LHAASO part alone covers energies almost reaching 1 PeV, and at this moment we already have events beyond a PeV for this source. But it seems people can do a decently good job using either a hadronic origin or a leptonic origin to explain the data. That's kind of amazing. So that brings us to the issue:
probably we really need something better to look into these sources, because of these 12 sources, all except the Crab have an angular extension — almost all of them. For example, inside this source we have multiple things. For these kinds of cases, with an angular resolution like 0.3 degrees, like LHAASO's, you cannot get a good idea of which one actually produces this kind of spectrum, and that is probably the reason we cannot do a good job separating them. That brings us to think about synergy with other experiments — for example CTA, ASTRI, and also some new stuff I will mention here. The problem, as we accept it, is that we need a much better angular-resolution detector to help us separate the origins inside the source. However, look at this update: based on our measurement we updated the Crab spectrum, and now we have a completely new sensitivity curve based on measurements — not a simulation. If you look at that, we need pointed instruments to do this job better than LHAASO. However, the sensitivity of the current instruments — H.E.S.S., MAGIC, VERITAS — cannot reach the range we are talking about, around 100 TeV; there is an enormous difference in sensitivity, which means we cannot really reach this region. So we need something new. Of course, CTA can help us quite a bit — according to this curve, this is CTA — but you can tell that above about 100 TeV even CTA seems hardly enough. That is the situation with CTA: currently the LST is already operating and the construction is still going on. But LHAASO is already built, so we need this kind of detector badly. Now we have the news about ASTRI — recently we had a workshop with LHAASO and ASTRI together.
So this is what I learned — maybe not correctly. The ASTRI SSTs will form an array here. I made an estimate by myself — this number is probably not correct — of about 0.5 square kilometers covered. Anyway, according to this, the sensitivity is around 10 TeV here; it seems to be better than CTA, so this will provide quite important data for this kind of survey in the future. And it will be built by 2025, according to the schedule. We are thinking about this as well, just because of what I mentioned: we really need the IACT technique to make sure we have the angular resolution at this level, to do the separation between origins inside a source, which the LHAASO measurement alone cannot resolve in detail. So if we have an array of Cherenkov telescopes on top of LHAASO, then we can do quite good things. First of all, we can take advantage of the muon-content measurement by LHAASO: each shower hits the ground, and we have the muons measured simultaneously. The simulation results show that the background-separation capability can be improved quite a bit here at the high-energy part. That gives us the idea that if we run the detector on one source for something like 400-plus hours, we can reach a sensitivity comparable with the LHAASO sensitivity here, at 100 TeV. So that probably gives us a much better idea about the inside of a source. This is the concept: basically, we use four telescopes to make a cluster, which is equivalent to H.E.S.S., and then we have eight of them to cover the whole one-square-kilometer array to do this kind of measurement. Hopefully we can reach an angular resolution of 0.06 degrees.
And then, if there are two sources inside, LHAASO alone can only do this, while such an array can probably do this. Okay. So that brings me to my conclusions. Basically, LHAASO is a survey facility for UHE sources in the northern sky. It has been fully operational since July 2021, with a rate of about one billion events per day. We have a strong separation power between gamma rays and the cosmic-ray background. The photon arrival direction and energy measurements are calibrated using a standard candle, the Crab Nebula, and now also the GRB. At the first glance, with half of the detector, we already found 12 PeVatrons — and now we have many more, to be published very soon, so stay tuned for the new catalog by LHAASO. We need in-depth analysis: first of all for the Crab Nebula, which is published, and another one will be published soon, for the Cygnus region as well. The identification of the cosmic-ray origin is in a very difficult situation for these complicated sources, particularly when you have many things inside. We have to have the IACT technique — that is pretty essential to do the complete job. Okay, that's my talk. Thank you very much for listening.