Hi, good afternoon everyone. My name is Will and this is Leon. We're going to tell you a bit about the network we've built for you guys. Sorry, the first thing I need to say, actually, is that we are not responsible for running the events.

So, networking does involve a lot of terminology, acronyms, strange words and stuff like that, so we developed a new item of terminology: the concept of the sauna. There are various rooms all around this building where we put network equipment, and they get very hot. These are some infrared camera photos of some comms rooms, and you can see there's quite some heat generated in there. It was a bit of a problem; we're basically running at the limits there, because those rooms aren't designed to host a network of that size. We have 10 gigabit equipment everywhere, and lots of it, and the building just wasn't designed for that, so it's pretty warm in those rooms.

Here's a graph of some of the temperatures. We collect lots and lots of data, numbers and things, and this is produced in Graphite, so it's very easy for us to just have a quick graph of the top temperatures, which go up to, yeah, 50-something degrees C.
These are degrees C, not Fahrenheit or Kelvin. At one point we had to move our servers to another room which has air conditioning, because after we closed the door the graph climbed above 50, 55 degrees, which wouldn't be good for the hardware. But the equipment handled the temperature really well, actually.

So, we didn't have 100 gig this year. For 30C3 we had a hundred gig, and you guys didn't actually bother to use it properly, so we left that equipment in the demo stock and we just took 10 gig. It's quite easy for us to do multiple 10 gig wavelengths over the single fiber we have coming into the building, so we did that. The unused uplink bandwidth is too damn high: you guys peaked at about 16 gig out. That's relatively similar to what we've had in previous years, with increasing amounts of especially inbound IPv6; not much outbound IPv6.

The setup we're running this year is pretty much the same as last year's, so it's tried and well tested. The core network runs on hardware from Juniper that we get on loan, mostly MX240 and MX80 routers, and for the distribution network we run EX series switches. We're running an MPLS/VPLS core network, so we can have the routing distributed and still have some networks in a central place, like we need for the DHCP server and such. It's basically the same network as last year, so it wasn't that much work, because we had the configs ready; we could reuse most things from last year. We tend to throw away the actual data that accumulates in the logs during Congress, but we do keep the configs. We've used that hardware before, and that makes life a lot easier for us, along with the various tools and scripts we've developed.

And sorry about this diagram.
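Pulling the hottest sensors out of Graphite, as mentioned a moment ago, only takes one render call with a `highestMax` target. A minimal sketch; the base URL and the metric path here are made up for illustration:

```python
from urllib.parse import urlencode

def top_temperatures_url(base="http://graphite.example.net", n=5, hours=24):
    """Build a Graphite render URL for the n hottest temperature series.

    highestMax() selects the series with the highest peak values, so a
    single wildcard query covers every comms room without listing them.
    The metric path 'noc.sensors.*.temperature' is a made-up example.
    """
    target = "highestMax(noc.sensors.*.temperature,{})".format(n)
    params = urlencode({"target": target, "from": "-{}h".format(hours), "format": "png"})
    return "{}/render?{}".format(base, params)

print(top_temperatures_url())
```

Swapping `format` to `json` gives the raw series instead of a picture, which is handy for alerting when a room creeps past 50 degrees.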
The network got rather larger, and so the diagram also got more complicated. We have various 10 gigabit MX and EX boxes distributed around the building, over the in-building fiber. What you can see on the right-hand side are our upstream providers, various ISPs that each offer to sponsor a single 10G link or a 20G link; all together, that's an uplink capacity of 50 gig this year. Also on the right-hand side is our router in the data center, which we run that fiber to from the building. The rest of the map is basically just the patch rooms inside this building, with various routers and switches.

So we had a colo, same as every year. It is for computers with one gig and above network connections. We had, actually, 80 to 100 machines in the colo; I'll show you some photos in a minute. It was using about 15 kilowatts, which is quite a lot of heat. It was obviously nice and warm and cozy in there, because we had this ticket from the NOC helpdesk guys over here where someone had decided to sleep in the colo, and then left his sleeping equipment there. We did not cleanse this hellish place and remove it, but someone obviously did. So sorry if you're in the audience and you're looking for your stuff. The colo is for machines, really.

We had a much better layout this time. It does kind of turn into a bit of a cable salad, you know, but we were able to make things a bit better. It's actually on the balcony; you can see it in the background in that picture. The picture's a bit dark, because, well, hackers like darkness, right? You can see the blinking lights. A big pile of machines.

So we had some interesting, topical kinds of discussions on Twitter with people. I remember, a few years ago,
Someone said our Congress has more capacity available than the whole of Africa, and now people this year say, oh, Congress has a better network than North Korea. Well, this is true, but we don't think this is something that's particularly funny. Most of us in the NOC are professional internet engineers, and we think it's a really sad state of affairs if we can build a better network than a whole country has. So, well, hopefully we'll continue to have a good network, but maybe some other countries will too in the future. On the plus side, Africa is catching up, so we definitely don't have more capacity than Africa anymore. Yeah.

We actually did some fun analysis of where the traffic between Congress and North Korea was going, but there wasn't much traffic; it was below the sampling margin of error of our measurements. So maybe it wasn't only due to this really circuitous route the packets were taking, across the whole width of the United States and across the Pacific, to get to North Korea. There we go.

So, wireless. This year we're running on Aruba equipment. We had two controllers with 10G uplinks. It's a much larger setup than last year's: we had, I think, 72 access points last year, and this year we're at 125. 115 of these are actually serving clients, so those are the ones you connect to, and the other 10 are just air monitors that we use in high-density areas like these lecture halls. They don't do anything else but measure the activity on the spectrum, so the controller can make more intelligent decisions about how to plan the wireless.
Also, we had way more users than last year. At 30C3 we had five thousand users on the wireless at peak times, and this year we peaked at 7,800, and that means an average of 68 clients per access point, which is kind of high; we'll come on to that in a minute. The interesting thing is that not everyone is here, or has devices connected, all the time, and we saw in the region of 20,000 unique devices. That's about two devices per attendee, I guess, as people come and go from Congress. The peak traffic was about three gigabit on the wireless, which is a fair bit.

We had some interesting graphs, because we collect numbers. We collect numbers, but not your data. I don't have a pointer, but basically, if you look at this large purple graph here, this is actually the number of users in Saal 1. Then, obviously, the talk finished here, and everyone goes out into this cyan area, which is actually the foyer behind Saal 1. So everyone just walks out, and their devices stay connected to the network, and then, oh, there's another interesting talk in Saal 1 again, so everyone goes back in. We were actually using this in the NOC: hey, there are loads and loads of people in Saal 2, what's the talk there? We should go and see what it is. This is a picture from the Fnord news show, actually; on the right-hand side is when the Fnord news show ended and everybody went home, basically. Some of them went to the bar first, I think.

Crypto: if you don't use the crypto we provide, then people will sniff your traffic. We had about 60% 802.1X WPA2 Enterprise clients.
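The 68-clients-per-access-point average quoted above is just the peak user count divided by the client-serving APs; quick to check:

```python
peak_users = 7800    # peak concurrent wireless clients this year
serving_aps = 115    # 125 APs total, minus the 10 dedicated air monitors
avg_clients = peak_users / serving_aps
print(round(avg_clients))  # 68 clients per AP on average, which is high
```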
That's where you enter basically a random username and password. This is what you should be using; please do not use the unencrypted Wi-Fi. We do include it, for instance, for compatibility reasons, like if you have some old hardware or Raspberry Pis and stuff like that. But if you persist in using the unencrypted Wi-Fi, then we can't help you, and your data will be sniffed, recorded and sold. This time around, the encryption is actually terminated with the Aruba setup on the controller itself, which sits somewhere safe in the building. That reduces the attack surface, because it means the traffic is also encrypted all the way through the wired network that the access points connect to. But yes, please use the encrypted stuff.

We have some more graphs of the wireless. This is actual traffic, megabits per second, I think. You can see the times when hackers sleep. It's not actually in this graph, but we measured the lowest number of users at, I think, 7 and 8 a.m., so that's where the bottom of the curve is. What you can see here, and this is only Saal 1 and Saal 2, the main lecture halls, is about 1.2 gigabits of peak traffic. The thing is, what happens if you have this much traffic on the wireless? We have a pretty graph of this as well. It's pretty clear, isn't it? This is the spectrum utilization, basically, across the 2.4 and 5 gigahertz bands. It does improve year on year, I think, as people replace hardware, and 65% 5 gigahertz clients is, we think, getting there. So that's nice.

So, we had some problems with the network. It's never perfect. We had one of the Juniper virtual chassis... well, it didn't literally explode.
It is pretty warm in there, but it kind of fell apart. The virtual chassis are basically multiple single switches that we stack together, so they act as one virtual switch: that's just one IP address we SSH into, basically one managed switch. What happened at some point is that it fell apart: one of the single switches rebooted, and yeah, we don't really know what happened there. It might be a failed stacking cable, the ones we use to connect the switches together, but we didn't really investigate it. We might investigate it after the event, but we probably won't.

We had a bug involving the router that is in the data center at IPHH, across town. We were enabling IPFIX, which we're using for flow monitoring. It's not for monitoring your data; we want to collect some data about where traffic is going on the internet, because the Congress network is actually part of the internet. We run our own autonomous system, and we like to adjust where the traffic goes and provide the best performance. Anyway, we enabled this feature, and unfortunately it caused the line card on the router in the data center to hang. We'd already built most of the Congress network at that time and were kind of slacking off with a few beers, but fortunately we still had a designated driver at 2 a.m., because, yeah, I wasn't driving anywhere, and we had to drive across town and get into the data center and adjust things. Special thanks to the guys at the data center who stayed up until like 4 a.m. just to allow us access so we could reboot the router.

And we had a few Wi-Fi problems. One of the things is that there are these ETSI regulations that require the access points to detect 5 gigahertz radar.
I believe it's weather radar. What happens is the access points listen, and if they hear something they don't recognize, they shut up, because they think, hey, there's some important weather radar here, so I won't transmit in this area. But we found this function was very sensitive, so we had to adjust the whole channel plan, and this was a bit of a problem: we wanted to get more out of the network and couldn't, because of this detection. It's simply due to the amount of electronic devices and general stuff going on at Congress; it's just a function of the density here. So we'll be working with the vendor on that to improve things and maybe make this more tunable with time.

So again, we need to thank some companies who provide equipment and support so we can run this network, because we don't rent it. You can't rent these amounts of gear; we can't phone someone up and say, please, can we pay you money for this? We're really reliant on people's generosity. Especially Juniper, who provided, I think, 1.2 tons of equipment. Was it 3 million euros of equipment insurance value? Yeah, 3 million euros of insurance value. Also, it's not a company, but we really need to thank the people at the NOC helpdesk, who keep a lot of work off us. We had very great support from Aruba, and we've already talked about Juniper. There's a German company called FlexOptix who have been really kind in supplying us optics for all kinds of events; these are the actual lasers, and we had hundreds and hundreds of these, a very huge box, and they very kindly supply them to us for free. We got donated bandwidth from our upstream carriers and from KPN, so thanks for that. And all kinds of other bits and pieces from people's closets and so forth to make the network work. Thanks to everyone, really. And now, wash your hands. Does anyone have any questions?
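A footnote on the radar-detection trouble above: ETSI mandates DFS on most of the 5 GHz band, so a first step when DFS keeps firing is simply knowing which APs sit on DFS channels at all. A sketch; the channel ranges are the ETSI ones, but the AP names and channel plan are hypothetical:

```python
# Under ETSI rules, 5 GHz channels 52-64 and 100-140 require DFS
# (radar detection); channels 36-48 (5150-5250 MHz) do not.
DFS_CHANNELS = set(range(52, 65, 4)) | set(range(100, 141, 4))

def aps_on_dfs_channels(channel_plan):
    """Given {ap_name: channel}, return the APs that could go silent
    if their radio (mis)detects radar on its channel."""
    return {ap: ch for ap, ch in channel_plan.items() if ch in DFS_CHANNELS}

plan = {"saal1-ap1": 36, "saal1-ap2": 52, "foyer-ap1": 100}  # hypothetical
print(sorted(aps_on_dfs_channels(plan)))  # ['foyer-ap1', 'saal1-ap2']
```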
So this talk's going to be a little different than all of the others you've heard before: we're going to do questions after each section. So if you have questions, just run to one of the microphones; be quick and brief so we can do as many as possible. If anyone has a question, just run to one of those lighted spots. And we'll start with one from the internet, I think.

Indeed, there was a question about whether you know what the current market price for one gigabyte of Congress data would be.

We haven't received any offers, so I have really no idea.

All right, then I think we'll start with microphone two.

I just wanted to ask if you have numbers about NAT64 users.

Yeah, we do, but I'm not carrying them in my head. Okay, but if you come and talk to us afterwards, we can talk about that setup in more detail. We did have various people contact us via the Twitter account who were using the NAT64, with some feedback on that, so we're going to continue with it at forthcoming Congresses.

All right, microphone three.

Sleep safely. I am one of the coordinators, and we found the sleeping mattress this guy had been searching for for four days.

Another quick question from the interwebs?

Yes, there's a question about whether you had any fallback on the internet connection side of things.

Do they mean security incidents, or...? I think just in terms of whether you had an additional line or something, I don't know. Oh, I see. Yeah. We actually have to buy the fiber; I think we actually have to buy it for the whole year. No, we're renting it for a month. Well, it does cost quite a lot of money, so for a four-day event we don't have super dual paths out of the building in case a digger comes. We'll probably use more equipment next time and have more resilience in terms of the equipment, but probably not in terms of the physical fiber, due to the cost.

But talking about abuse: you can't tell me you didn't get any, right?
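On the NAT64 question above: NAT64 (together with DNS64) lets IPv6-only clients reach IPv4 hosts by embedding the IPv4 address in a dedicated IPv6 prefix. A minimal sketch of that mapping, using the RFC 6052 well-known prefix; the actual prefix the Congress network used may well have been different:

```python
import ipaddress

def synthesize_nat64(ipv4, prefix="64:ff9b::"):
    """Embed an IPv4 address in a NAT64 /96 prefix, the way a DNS64
    resolver synthesizes an AAAA record for an IPv4-only host."""
    v4 = int(ipaddress.IPv4Address(ipv4))
    return str(ipaddress.IPv6Address(int(ipaddress.IPv6Address(prefix)) | v4))

print(synthesize_nat64("192.0.2.1"))  # 64:ff9b::c000:201
```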
Yeah. We had very few abuse complaints this year; I think we had like three cases in which we did something. Otherwise, it's just automated stuff where people were port-scanning. Do less port-scanning, it's really annoying. It just fills up the inbox, and someone has to close all those tickets in RT. You will figure out how to do this with MySQL; it's a nightmare.

All right, then microphone three, please.

Regarding the abuse: do you automatically reject mails from fail2ban-type senders?

We get them all and we process them. It's difficult; we don't reject them automatically, because we think that's kind of rude. We do, in theory at least, collect this data so we can look at it, but there's little we can do. We provide a service where, if someone sends us an abuse complaint, we can guarantee they will never receive another packet from Congress ever again. So someone may phone up and say, hey, I'm seeing this bad traffic, and we just say, well, we'll just put an access list on, and you'll never hear from us again. That seems to keep people happy.

And I had a quick other question: you mentioned you keep flow data. Did you check how much traffic was going through Tor?

No, no, we absolutely do not look at the protocol data.

I meant just by looking at whether people are contacting Tor nodes, since the list of IPs is public.

We don't collect that information either. We don't collect IPs, we don't store that, and we throw away data as soon as we can. We really have no interest in collecting data about your packets. The particular flow data I'm talking about is actually on a whole-autonomous-system basis, so the whole big ISPs. We might know how many gigabits we did to Deutsche Telekom, but not how many to your DSL line on Deutsche Telekom, for instance.

Okay, thanks.

Okay, and then I think one last question at microphone two.

Are there any signs of spy agencies monitoring the network?
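To make the flow-data point above concrete: aggregating sampled flow records per peer autonomous system, and never per host, could look roughly like this. The sample figures are invented; AS 3320 is Deutsche Telekom:

```python
from collections import Counter

def bytes_per_peer_as(flows):
    """Sum sampled flow bytes per peer ASN, discarding all per-host
    detail. Each record is a (peer_asn, byte_count) pair."""
    totals = Counter()
    for peer_asn, nbytes in flows:
        totals[peer_asn] += nbytes
    return totals

sample = [(3320, 1_000_000), (3320, 500_000), (1299, 750_000)]  # invented
print(bytes_per_peer_as(sample)[3320])  # 1500000
```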
No, that's a tough one. That's a tough one, because we don't notice any; that's all I can say. If we had noticed anything, we would have taken action, but these people are good, right? Too good. Yeah, it's possible; we have a number of upstreams, and yes, something could be happening there, but again, it's outside the border of our network, so we don't know. So just encrypt everything. Yeah, twice.

Okay, then thanks again to the NOC guys.

Okay, so we don't have any more questions, if you're done. Who's coming up next? I need to unlock my laptop. Fengal, I think you're next. All right, so you're going to talk about... right, just take the stuff, it's on there. We need a different screen. Yeah, can you swap to the other one? There we go. Good that we have the VOC guys all here.

Yes, so I want to talk to you about the power network we have built up here in the Congress area for you to use. The building itself doesn't have that many power outlets for the Congress members here, so this table is mainly what we have supplied in each area. In hall 3 we had the biggest amount of power for you guys to use. Hall H, with the party area, was also using a lot of power for much of the afternoon, and for the co-location we had more than mainly anything else.

Yeah, our material: you see the big amount of distribution boxes, and the 16 amp, 32 amp, 63 amp and 125 amp lines. The 125 amps we used for the first time at a Congress; we needed them for hall 3, because the assembly team put all the people together that wanted to use 3D printers, laser cutters and other stuff at one spot of the hall. In total we have installed 9,850 meters of cables and 673 power distribution boxes, and from our side,
we had delivered 3,200 power sockets, but every assembly brings their own power socket lines with them, so I think it will be much more in total.

Here, for the first time, we have measured the consumption in the halls, or in some halls. Hall 3, for example, was 1,067 kilowatt hours, and in the co-location we used 931 kilowatt hours; that's the consumption for the whole Congress. But we have to say, last year it was double that, so I think we have a green Congress. Yeah, and I think that was it from my side. I don't have much more to tell you, because I don't have any data from the controllers or something like this.

Would you mind answering questions, if there are any? Any questions about power? Did we have any electrocutions? I don't think so. No, no. Might happen, with those lines lying around everywhere. Okay, all right. So then please give it up for the Seidenstraße.

Hi, I'm Sebastian. I'm going to talk about the Seidenstraße. Yeah, this year it was a lot smaller than last year: we've only used 600 meters of tube. That's 100 meters less than last year; last year we had 700. It took eight days to set everything up. We had four to seven hackers working on it almost full-time; those were the same guys who manned our Seidenstraße assembly, which we also had this year. I'd like to ask for some applause for them, because I was none of them, and I couldn't come here until the 26th, and when I arrived here everything was set up already. So that's quite impressive.

We also saw some new trends in building capsules. This year we had way more LEDs. I had to photograph them all with the LEDs off, because they were too bright and the hall is really dark; otherwise you wouldn't see the capsule. The capsules were also heavier.
I think nobody cares anymore about our standardized weight limits. And because of all the LEDs, you need a lot of battery power in there. I've seen quite a few 3D-printed capsules; most of them used 3D-printed parts somewhere, or even parts from real pneumatic tube capsules that you can buy commercially, sawn in half and adapted to our system. So that's really cool.

This year we also tried to do some fancy graphs. We did not put up transmission logs for people to fill in manually this year, because that didn't work out too well last year; the numbers just didn't add up in the end. So this year everything was done by automatic capsule counters at the central node, and these are the data for the central node, basically. The traffic was similar to last year. This is the data from, I think, day one at 19 o'clock until just before this talk, and we've got overall about 550 capsules that have been sent or received. The fastest capsule was 14 meters per second, which is in a similar area to last year. You can also see some peaks in this graph that might be interesting, because those peaks were when there were no talks and people had time to play with the Seidenstraße. Also, you can see the sleep times, except for day one; on day one we had a little bit of a power outage, because someone unplugged the blower assembly and plugged in a vacuum cleaner instead.

We also experimented with autorouters. We tested some ideas here at the Congress, and it all boils down to: the tube is less flexible than the people building the autorouters expected beforehand.
So we need more motor power on those things, and we don't have any standard routing protocol, so even if there was a working router, we would not know how to tell it where to route. We are somehow working on that; if anyone has good ideas, just write a mail to our mailing list, or even better, subscribe there. Which brings me to the last slide.

So if you want to contact us with ideas, or if you want to help us with something, just come to IRC; we are on hackint in the Seidenstraße channel. There's not much going on there over the year, but I'm usually there to respond. Also, we've got a mailing list that has a bit more traffic. It's a standard Mailman setup, so just write an empty mail with the subject "subscribe" to this address, and I guess you know the drill. Also, we are planning to do a Seidenstraße setup at the camp next year, which might actually be useful there and not just a toy. But we need a lot more helpers to do this, because doing this with eight or nine guys is really a lot of work, and I guess we all want to enjoy the camp, so the more people help, the faster we are done.

A short note here: if you want some Seidenstraße at home, you can get free tubes from us today after we tear everything down; just call 4451 and we'll get you some tubes. But you have to have some way to transport them home; we can't do that for you. And also it would be nice if you had some way to transport them to the camp, because we will need them there. So that's all from my side.

I have a quick question: no flying Mate bottles this year?

No, no flying Mate bottles, no broken tubes, no special incidents.

All right, that's an improvement.

Maybe one: there was a DDoS attack on my automated capsule counter. Somebody attached two meters of LED string to his capsule, and it counted like 300 capsules in less than two seconds.

Okay, we have a question over there.

Yes, about routing: have you thought about MPLS over Seidenstraße?

So, I'm only a bit of a network guy.
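On the automated capsule counters and that LED-string incident: a counter that debounces its light-barrier triggers would shrug off such bursts. A sketch; the half-second minimum gap between capsules is an assumed figure:

```python
def count_capsules(trigger_times, min_gap=0.5):
    """Count light-barrier triggers as capsules, ignoring any trigger
    that follows the previous accepted one within min_gap seconds.
    A dangling LED string fires the sensor hundreds of times, but a
    real capsule can only pass once per min_gap."""
    count, last = 0, None
    for t in sorted(trigger_times):
        if last is None or t - last >= min_gap:
            count += 1
            last = t
    return count

# 300 rapid triggers spread over two seconds count as 4, not 300.
burst = [i * 2 / 300 for i in range(300)]
print(count_capsules(burst))  # 4
```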
I'm not really sure what MPLS is about, but we've been discussing doing IP-based stuff over Seidenstraße by using NFC tags and putting the packets on there.

Okay, all right. Yeah, there's another question.

You had a packet inspection point at some place near the angel area; what was that about?

A packet inspection point? Yeah, there was like a box in the tube, just at the ceiling.

Oh, okay, I didn't see that one. I don't know. Or maybe we just found the agency surveillance stuff, just on the wrong network, maybe.

Okay, I think that's it from the Seidenstraße. Thanks again. And I think we only have the VOC left, so can we have... yeah, thank you.

So, hello, I'm denimo. I'm giving a short report from the video operation center, and please give big hands to our Winkekatze again. And of course, other than her, there are quite a few other people involved. So who are we? We're actually not only C3VOC; we are C3 Streaming, which comprises the video operation center of the CCC, the FeM e.V. and the HS, who are both very much into video and helped us a lot, like every year, I guess. Thank you for joining us.

So we built this and we prepared this, and we improve at every conference that we go to, but obviously the Congress is the most challenging conference, right? We approach this in different teams: we have infrastructure, we have streaming and encoding, we have a team doing the website, we have post-processing, encoding operations, subtitles. This is people working together in their groups and across groups, and this works pretty well. We actually had a meeting a few days or a few weeks before this Congress in Berlin to prepare everything, where we actually put this together.

So what's new this year?
We had video on demand, which was really great, because it was just supposed to be better, and it was received extremely well, which is awesome, because it was better but we had very, very few complaints anyway. For the first time we have full HD in all the halls, including the Sendezentrum, and in H.264 and VP8, so a free codec is also supported in high definition. Our release pipeline now supports releasing to YouTube, so everyone who doesn't manage to go to media.ccc.de can now receive our content as well. Yeah, that's the amount of applause that's due; that's perfect.

Subtitles, in the room and in the web player; we actually had great feedback there, and I'll get to that in a separate slide. We had GoPro cams for presenting small objects. It was a big problem at the last Congresses when people had to show their itty-bitty hardware hacks and we actually had to zoom in with our cameras from far away; now we had GoPros that could catch this a lot better. We had backup recording on SSDs, which was really, really good, because we actually needed them, and it helped us recover quite some recordings that would otherwise have been lost. And we gradually improved other things, like the content distribution network and so on and so on; it's really too much to list, because it's just amazing what people do in this environment.

And we have DVB-T. We actually have DVB-T broadcasting equipment in this hall; we have an official broadcasting license. I think we didn't really announce this too much, but you could actually have watched this talk through DVB-T.

So how did this actually work? It's actually surprising... can you guys actually make out what this graphic reads?
Okay, perfect. I wasn't sure if that would work out. So we actually start in the middle: we have the camera sources and the slides, which come in via SDI into our mixer, where they are mixed live. You might have seen the awesome split-screen view that we have, courtesy of our hardware mixers, which we had in all rooms for the first time this year. When we go into encoding, we actually have two paths now: first the way to the relays, and then the actual production line, where we cut and edit the videos, equip them with a bit more metadata, and produce all the other formats. We had an encoder cluster in Berlin that encoded everything from the full HD master, so we pushed everything through the network. Thank you, NOC, for providing us with the network that we had, because that was essential to get our service to the people. So thank you.

Just to finish the production line: we then push it to our CDN, which runs on MirrorBrain, a free software solution, and to YouTube. Those of you who watched it live could see it through RTMP and through HLS, through basically the relays: we had master relays and edge relays, and we had one relay that would do the video on demand, because it would basically reuse the HLS snippets and serve them for your delayed viewing pleasure.

Now for the subtitles: we had 40 angels helping and subtitling the talks, up to three parallel tracks, and very good feedback from hearing-impaired and deaf viewers. The service was also very popular with people who are not as fluent in either German or English, so those people get included as well, which is great. As for usage peaks: the Fnord news show had 120 subtitle viewers, and Jacob's and Laura's talk had about a hundred subtitle viewers, which is great.

So, what else happened? The most remarkable thing, and you might
have read things about washing hands; we had to learn that the hard way. The downtime wasn't so much hardware; that worked pretty well. I mean, if you watched the streams, they were mostly reliable, much more reliable this year; that's the feedback we got through most of the channels. But the downtimes were not of a technical nature: we actually were the virus operation center, that's what people called us. We have 25 core people, and one afternoon, I think it was on day two, we had nine people down within four hours. Mad props to CERT for helping us get them fixed up again; most of them are well again or on the way to recovery, so thank you, CERT. As I mentioned, technically, DVB-T was a bit flaky, because we were still testing it; it will be a lot more useful at the camp, I guess. And we had a short frontend outage because of a dependency on events.ccc.de. That was quickly fixed and we were back online, and it only affected the frontends; people that were still watching the streams could keep watching and were not disturbed.

Yeah, stats. I mean, competing with the NOC in terms of stats is just useless.
So I won't even try, but we have some. We delivered 80 terabytes of streams, nine terabytes of video on demand, and 19k views on the CDN handled by our mirror network. The raw video material was 4.5 terabytes; that's 184 hours of recording, and because we have to duplicate it into all the other formats, it amounts to almost 900 hours of video that needs to be processed by the cluster. Okay, I don't need to explain that.

We actually had a peak of almost 9k viewers, and at that time we were pushing out 10 gigabits of video and audio traffic. Another interesting thing: we also streamed the Sendezentrum, with no video on demand, because that was only established for the main rooms; it turns out it may be a good idea to have that for the Sendezentrum next year as well. What's amazing is we could actually see that the NSFW late night show had 3k viewers, which is absolutely astonishing, so I thought that was worthwhile mentioning.

And there was another strange thing. You know, we are great cat content lovers, and because of that we have the Winkekatze. We were innocently idling and working away at whatever we were doing, when suddenly this happened: a larger-than-life Winkekatze showed up at the VOC. Thank you to the donator.

Okay, so this is basically it. Finally, I would like you to give big hands to everyone who helped make this happen. This would not be possible without the people manning the cameras, manning the video mixers, and running like mad to get everything organized, so that you and the people outside of this conference center can actually take part in this conference. Thank you, video angels. Okay, and if there are any questions...

All right, last question time again. Yeah, you at microphone two, go ahead.

All right, I'm wondering, for DVB-T:
do you know what the range was, or the power?

I'll give that to the expert.

So, we planned with three transceivers, or senders: one in hall 3, one in another room, and one somewhere between hall 1 and bar 2. We actually deployed only one, in hall 3, with a transmit power of 500 milliwatts each, so 1.5 watts in total, because the Bundesnetzagentur license only allows indoor coverage, and that made it much cheaper for us. And it actually worked; just today we had some problems there. You see, the headcount number was down on that day, and it was kind of a beta test for the camp. So, yeah, thank you.

Okay, we'll just go ahead with the question from the internet.

Sure. The internet just wants to give you guys a big shout-out for providing the service, and a big thumbs up. There was a question about whether you know the latency of the stream, or whether it depends on if you use RTMP or HLS.

I think RTMP is faster, about 15 seconds, and HLS depends on your client.

All right, thank you. Microphone two, please.

Also about DVB-T: I was wondering what resolution you put that out in, and if there will be HD content, at the camp at least.

So your question is about HD on DVB-T. Yes, we might do this at the camp; we didn't do it this year. Maybe using DVB-T2 and so on, we will see how this works. In theory you could also do HD on DVB-T1, but only with fewer channels.

Okay, I think that's it. So please give a huge round of applause for all the people making this Congress possible.