Hi, good evening everyone. Thanks for coming. We're going to give a few mixed presentations about the infrastructure here at EMF Camp. My name's Will Hargrave and I've been kind of looking after all the teams delivering this stuff, so I'll just introduce my... I'm going to say colleagues, but it doesn't seem like the right word. This is Arjan, who's been mostly dealing with Wi-Fi. I have David C, who's been heading up the NOC team. Equinox has been working on a lot of the core network and other interesting things. And Peter, who has come from Germany to do the VOC video stuff. So thanks very much.

I'm going to talk briefly about the power situation myself. We actually have a bunch of scripts which are on GitHub. Of course you can't see any of this stuff from back there, but hopefully you can see it well enough; I put it up here earlier when we were building it. We have a bunch of Python scripts that I smashed together in 2014, because I got really frustrated with Excel spreadsheets for planning power. They use NetworkX and some cool stuff like that, and spit out a diagram to make sure we don't order the wrong connectors. They're all on GitHub, by the way; you can have a look at my terrible coding.

This time we have two power grids for the critical stuff. BS 7909, the code of practice for temporary electrical installations, requires us to have two sources of input for site lighting, so we actually have a mains failure panel out here somewhere: if one grid goes down, the essential services are protected. We didn't do that before. We have two 200 kVA generators, one droning away down there and the other droning away up there; I'm sure you've heard them. Otherwise it's actually remarkably similar to 2014, for those of you who were there. We have a lot of distros out around the field: 32 A sockets in the Datenklos, and 16 A sockets on the field for you to plug into.
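The talk doesn't show the scripts themselves; as a rough, dependency-free sketch of the idea (the real scripts on GitHub use NetworkX, and the node names and ratings below are invented for illustration), the connector check boils down to walking the power tree and making sure no cable is rated higher than the socket feeding it:

```python
# Invented power tree: each distro maps to (parent, cable rating in amps),
# and RATING gives the largest outgoing socket rating at each node.
TREE = {
    "distro-a": ("generator", 125),
    "distro-b": ("distro-a", 63),
    "camp-feed": ("distro-b", 16),
}
RATING = {"generator": 400, "distro-a": 125, "distro-b": 63}

def bad_connectors(tree, rating):
    """Return edges where the cable is rated above its upstream socket,
    i.e. connectors that could never physically mate."""
    return [
        (parent, child)
        for child, (parent, cable) in tree.items()
        if cable > rating[parent]
    ]

print(bad_connectors(TREE, RATING))  # [] means the order list is plausible
```

The real versions also draw the diagram; the point is that a tree walk like this catches an impossible order before the kit arrives on site.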
We didn't really have enough kit this time. A load of stuff didn't come back from hire at our rental company, so we've had some challenges there, and I know a few of you had to wait a little while to get connected; we're pretty sorry about that. We just need more next time, more of everything.

On this diagram, everything in red is a three-phase connection, so we've basically got a tree spanning out from the generator: 125 A and 63 A three-phase cables, the big rubber ones you've probably tripped over, and then down and down to the edge. Here's the second diagram; this is the north grid. This is all on GitHub, by the way, should you desire a look.

We did some awesome graphing this time. Russ has made some generator monitoring tools which plug into the generators' RS485 ports, powered from the generator's DC bus. So we were able to collect, properly, from the generator, all the stuff it displays on its front panel: all those super nerdy things like the engine coolant temperature, how much fuel is left, the burn per hour, and so on. Some of this is actually useful; I don't really care exactly how warm or cool it is as long as it's not boiling. It's about 82 degrees C, by the way, mostly. Anyway, here's the total power: one is the north grid and two is the south. This is a stacked graph, basically, so you can see the total usage, which peaks around 100 kilowatts, which is roughly right for a couple of 200 kVA generators. I wouldn't want to run out of power; heaven forbid we should run out of power. We'd have had a few outages.
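Those graphs come from readings like these being shipped into a time-series store. Here's a minimal sketch of the collection end, formatting generator readings for Graphite's plaintext carbon protocol; the hostname, metric names and values are made up for the example, not the real tooling:

```python
import socket
import time

def graphite_lines(metrics, prefix="power.gen1", now=None):
    """Format readings as Graphite plaintext: 'path value timestamp'."""
    ts = int(now if now is not None else time.time())
    return [f"{prefix}.{name} {value} {ts}" for name, value in metrics.items()]

def send_to_graphite(lines, host="graphite.example", port=2003):
    """Ship formatted lines to a carbon receiver (hypothetical host)."""
    payload = ("\n".join(lines) + "\n").encode()
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)

# Readings as they might come off the generator's RS485 panel bus:
readings = {"coolant_temp_c": 82.0, "fuel_pct": 61.5, "load_kw": 48.2}
print(graphite_lines(readings, now=0))
```

The plaintext protocol on TCP port 2003 is the simplest way in; anything that can open a socket once a minute can feed a dashboard.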
The catering always uses a lot, and because we've actually managed to get a lot more food vendors this time, they've been using a lot more electricity, so we'll have to beef that area up a bit in future, because a 63 A three-phase feed is not really enough. All of this stuff, a lot of the generator stats and so on, is available on dashboard.emf.camp; you can click around it now, look at the generators and see what we're up to. Other than that, that's about it from the power point of view.

On the practical side: we'll be going around unplugging cables as we need them. We're planning to power off the north side of the site tomorrow morning, or earlier if we run out of fuel. Most people need to be off site by midday if you're not helping, so we'll be saying goodbye to the power up there and rolling all that up tomorrow morning, running around as quickly as we can to unplug your cables, et cetera. So thanks for that. I'll pass over to... who wants to talk about the uplink? David?

So I'm going to talk a little bit about how we get the internet to the field, then I'll pass on to Equinox to talk about how it gets around the field, and to Arjan for the wireless. One of the first questions we get is: how do you actually get a really fast internet connection to a field, potentially in the middle of nowhere and certainly not near any useful internet points of presence? Well, in 2012, this is what we did: a microwave point-to-point link to a local data centre. It wasn't fun. Even if you pay someone else to do it for you, there's an awful lot of faff to get it set up, get it aligned, get it working, and get the speed that you want. So we said, in 2012, we're never doing microwave again. And then in 2013, we did this. Oh yeah, Electromagnetic. And then in 2014, we did this. And then in 2016: well, mission accomplished.
We finally have a fibre connection all the way down to the site from a provider. As you're probably aware, Loseley House is just a few hundred metres up from our field. They renovated their business park a few years ago, and had the immense foresight not only to dig a very large trench several miles across their land towards Guildford so they could have better access to fibre, but also to run a fibre network around the business park itself. That network is managed by a company called Fibre Options, who have been incredibly helpful to us. So we simply had to run a fibre of our own up to their nearest outside splice point and plug into their network: mostly along hedges and fences, at one point along some steel wire up in a couple of trees just to allow vehicular access if needed, and terminating, Tuesday evening I believe it was, in some late-night splicing in the dark just to test everything was working.

A passive optical network, as many of you will know, is generally designed to get a lot of fibre to a lot of places, whether that's across a large area or to a lot of businesses. It does that by having passive optical splitters along the network, which are relatively cheap: they basically fork the signal off and recombine it on the way back up. That's the technology they use up on the business park. But, fortunately... I'll get to that in a moment. This is what we were given as the ONT, the Optical Network Termination device. It weighs a few grams. It contains a router, a four-port switch, a couple of VoIP ports, and a Wi-Fi access point. It is made of plastic. The only difference from your typical SOHO router is that it's got a fibre input. And I'm sure it's fine for most of their customers, a small business or a home user who only needs to do a small amount of traffic.
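For intuition about those passive splitters: an ideal 1:N split costs 10·log10(N) dB of optical power in each direction, which the link budget has to absorb, and that's why PON gear runs relatively hot optics. A quick illustrative calculation (idealised, ignoring the splitter's excess loss):

```python
import math

def ideal_split_loss_db(ways):
    """Ideal optical loss of a 1:N passive splitter, in dB."""
    return 10 * math.log10(ways)

for n in (2, 8, 32):
    print(n, round(ideal_split_loss_db(n), 1))
# roughly: 1:2 ~3 dB, 1:8 ~9 dB, 1:32 ~15 dB, before excess loss
```

So a deep split tree eats most of a typical optic's power budget, which is exactly what makes the "no splitters in the ground" discovery later in the talk so convenient.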
And I'm sure it's fine when it works in routed mode, where it can use hardware acceleration to actually shuffle packets around. Unfortunately, what we wanted was a bridged connection: a VLAN presented here to us and presented to our router up in London Docklands, with nothing else in the way at a logical level. So this device was put into bridge mode, and it turns out that it does bridging in software on a 600 MHz CPU. Before we even sent any real traffic down it, simply announcing our routes, which are the best part of a /16, so 40,000-50,000 IP addresses, meant the background noise of the internet, the port scans and things like that, was enough to completely lock this device up. I don't think they're used to dealing with such a large address space going through these devices.

But we talked to Fibre Options who, as I say, have been incredibly helpful, and we proposed a solution. It turns out that although they use PON technology on the business park, there aren't actually any splitters in the ground. The only splitter is the 1U rack unit you see there; it's simply a way to get many ports into a single interface on their optical head end, which we noticed also had some SFP ports. And given that this was the only splitter, there was a clear fibre path, a single-mode core, going all the way from this site up to their comms room on the estate. They very kindly allowed us to plug one of our SFPs into their unit, running Gigabit Ethernet, again on a single core, because it's bidirectional optics: it transmits and receives on different wavelengths in different directions. And, fortunately, we also had FlexOptics, another one of our amazing sponsors, who sent us a large number of optics, and we were simply able to recode them to work in this device. At which point: a gigabit down to the field. So how did we use this gigabit? Well, disappointingly.
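"The best part of a /16" is easy to sanity-check with the standard library; the prefixes below are illustrative placeholders, not EMF's actual allocation:

```python
import ipaddress

# Placeholder prefixes adding up to "the best part of a /16":
prefixes = ["10.128.0.0/17", "10.129.0.0/18"]
total = sum(ipaddress.ip_network(p).num_addresses for p in prefixes)

print(total)  # 49152 addresses in this made-up example
print(ipaddress.ip_network("10.0.0.0/16").num_addresses)  # a full /16 is 65536
```

Tens of thousands of routable addresses means the whole internet's background scanning lands on you, which is what swamped the ONT's software bridging.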
I think we averaged over the weekend probably about 100 megabits in, not so much up. We have this dashboard, which you should feel free to look at, ideally before you leave camp, because I don't know how long we can keep it running afterwards. It's got more underneath: all the useful stuff that we use to monitor the network, and that's interesting. Our peak was 505 megabits, so at our peak we used about half of our capacity. And the total amount of traffic, I'm afraid to say... I have hard disks bigger than that. Come on, guys. So I'll pass over to Equinox to talk about how we get the network around the field from our data centre here.

So I'm Equinox. I'm going to tell you a few bits about how we actually deliver the internet around the campsite. What you can see here is the physical overview of the Datenklos, which you've seen around the site. The core network is built on fibre, which is patched to maybe half of the Datenklos. What we do is go into a Datenklo and patch the fibres straight through to the next one, so we don't lose them when they're behind something that goes down. The stuff that's further out is connected by copper; you can see those as the black lines on the diagram. We try to minimise what we do with copper, but it's not really viable to do everything with fibre.

The good thing about patching straight through the Datenklos, without going into the device, is that the logical diagram becomes very simple: most of the switches that are on fibre connect directly to our core switch, so there's no instance of something on fibre sitting behind something else on fibre, and that makes the network relatively easy to manage. Obviously the copper-connected switches are behind a fibre switch, but that we can deal with. Most of the hardware you see here is what we usually work with on other events as well.
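That "everything on fibre hangs directly off the core" property is also easy to assert mechanically once your links are in data. A toy check (switch names and links invented for the example):

```python
# Toy adjacency: for each switch, which device its uplink terminates on,
# plus the uplink medium. Names are invented.
links = {
    "sw-klo1": ("core", "fibre"),
    "sw-klo2": ("core", "fibre"),
    "sw-klo3": ("sw-klo1", "copper"),  # copper switches may sit behind fibre ones
}

def fibre_depth_ok(links):
    """True if no fibre-uplinked switch sits behind anything but the core."""
    return all(
        parent == "core"
        for parent, medium in links.values()
        if medium == "fibre"
    )

print(fibre_depth_ok(links))  # True: every fibre switch uplinks straight to core
```

Run against the real link inventory, a check like this would flag any accidental fibre daisy-chain before it bites during the event.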
We had some new hardware from Arista this time around, so we got to find out how it behaves in the field. I think it turned out pretty well. This is what the head end of the logical diagram looks like when you have it in the data centre. The actual network part is on top of the table: there's the wireless controller and the core switch. The core switch is 10G, and the uplinks on the fibres are mostly 10G, so the Datenklos have 10 Gigabit Ethernet, which I don't think was needed anywhere, but in theory you could have filled that. Also, the middle of that core switch is 6x 100 Gigabit Ethernet, which, if anyone brings a device, I guess we can find some way to plug in, but we didn't need that.

Below the table: the server on the left is actually the VOC encoding and streaming stuff; on the right side are our own servers, DHCP, wireless management, things like that. I think this is actually the first outdoor event where we had UPSs in the data centre, which we didn't end up needing; however, it was pretty nice to have anyway, since the UPSs gave us a power reading. It's about 2 kW of power in total, which you have to move out of the container somehow, because it all ends up as thermal energy. You can already see two air-conditioning units in this picture; there's actually a third one behind the camera, and a fourth one at the bottom, which was in the tent as well, I guess. And I need to point out that we started with only one of them, and it built up a lot of ice on its back, went into defrost mode, and then suddenly turned the heat up to 35 degrees Celsius (44, I'm told). The servers didn't quite like that, so at some point we went, hey, why is the server off? Luckily we noticed that pretty early, so we ordered the additional air-conditioning units, and that turned out to work pretty well.
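For sizing the air conditioning, the arithmetic is just that every watt drawn becomes a watt of heat, and a watt is about 3.412 BTU/h, the unit small aircon units are usually rated in. A sketch using the 2 kW figure from the UPSs (the conversion factor is standard; the sizing approach is mine, not necessarily the NOC's):

```python
BTU_PER_WATT_HOUR = 3.412  # 1 W of continuous load ~ 3.412 BTU/h of heat

def cooling_needed_btu_h(load_watts):
    """Heat to remove, assuming all electrical load ends up as heat."""
    return load_watts * BTU_PER_WATT_HOUR

print(round(cooling_needed_btu_h(2000)))  # ~6824 BTU/h for the 2 kW container
```

That's within reach of one domestic portable unit on paper, but as the defrost incident shows, you want headroom and redundancy, not a single unit running at its limit.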
So that's down from, the unit on the slide is wrong, about 22 to 15 degrees Celsius, at which point it's actually pretty annoying to work in the data centre. But it works out.

Obviously that doesn't get you any internet if you're not next to the data centre, so we have the Datenklos around the field, and first of all I would like to thank all of the volunteers who went around with the key, plugged people in, plugged in power, and are hopefully going to unplug people as well. Inside a Datenklo you can see one of the Arista switches, and some power supply for the access point that is mounted outside on the pole. You can also see this, let's call it interesting, light, which fulfils a purpose: it monitors that the Datenklo is actually working, since if it stops working the light will either freeze or turn off. So if the light is working, you know: well, the problem might not be the network, I should better check my laptop. You can also see the fibres coiled at the bottom of the picture; we try to have only one of them in each Datenklo, which usually works out because we just patch them away from the core. And we use BiDi optics, by the way, so only one fibre is used for both directions, which makes things easier to manage and patch.

It's a total of 3.3 kilometres, and that's only the fibre; this was shipped in, it's the inventory that we have. There's another two kilometres of copper, and I should point out that's only between the fibre Datenklos and the copper Datenklos, basically, and for some tents; it doesn't include anything that people brought or patched themselves. So we come up to just above 5 kilometres of patching, and that's quite a bit of work to do in the heat.

Something new that we did this time around was to run a VoIP network, mostly because the DECT network, which usually does this, isn't here, and we needed a bit more communication; we didn't have enough radios, and
this is actually Arjan's setup, and he also ran the Wi-Fi, so I'm going to patch over to Arjan at this point, I guess.

So, Wi-Fi at EMF 2016; some statistics. We had 66 access points deployed this time. It's a whole mixture of Aruba access points, all dual-radio, either 802.11n or 802.11ac. Most of them are indoor models and some are outdoor rated, the AP-274 and AP-277, and we had a lot of the AP-135s in the plastic boxes on top of the Datenklos. We had a peak of 2,084 clients, 50% of those on 5 GHz, and that's including the badges, which were 2.4 GHz only. We saw 350 megabits of traffic at peak, and a total of 4,200 unique devices on the Wi-Fi network, against only 300 unique devices on the wired network. So almost everybody was on Wi-Fi, and I think that's quite a bit lower wired usage compared to the CCC camp and the Dutch hacker camp. We saw 1,400 badges on the Wi-Fi network. In total, up to 65% of the clients were using 802.1X, so WPA2 Enterprise, a properly encrypted Wi-Fi network, and that's not counting the badges, because they were on the open network.

That 65% on 802.1X includes the EMF camp and EMF camp legacy networks, which were 802.1X, run with our RADIUS server. Then we also had SpaceNet, which is a federated authentication platform for hackerspaces, and eduroam, which is also a federated authentication platform, but for educational institutions. In total we saw about 300 unique devices on eduroam, and most people were actually on the EMF camp network, which is 802.1X and 5 GHz only; none of this internet-of-things stuff.

The badge was 2.4 GHz only, unfortunately, and we also noticed that the badge wasn't doing channel 13. We're actually running a four-channel setup on 2.4 GHz, using channels 1, 5, 9 and 13, but the badges were running on, I think, a fixed US country code when they were handed out, so they couldn't see channel 13, which meant 25% less coverage around the field. So Acronux
actually wrote a patch for that, and I think it's upstream on GitHub or something, I don't know. At first 802.1X was also not working for the badges, and we ran out of time to get that fixed, but we fixed it afterwards, so they're also able to do 802.1X now and can actually get off the insecure network.

Some obligatory device and username statistics, as we do in other presentations as well. Mostly smart devices on the Wi-Fi network, so your smartphones and tablets and that kind of stuff, and a lot of embedded devices, which includes all of the badges. If we look at the operating system families, number one is of course the badge, with 1,400 devices, or at least "Texas Instruments" as we classified it with DHCP fingerprinting; then Android, a lot of Apple devices, and apparently more Windows than Linux, by one, which is sad. And for the 802.1X network you could choose your own username and password; almost nobody did that, because I think everybody read the booklet and it said "emf, emf", so that's number one. There are also some realms in the right-hand table, mostly eduroam users; anything ".hack" is actually from SpaceNet, and the rest of the realms are from eduroam.

We like to collect a lot of statistics, so we're also collecting statistics about which areas have a lot of people in them, and how many people are at which stages, and we can correlate that data with what talks are on. Above you see all the interesting talks where we had a lot of visitors: the opening talk on the first day, and then sex robots, and after that people went to sleep, and apparently a lot of people sleep in field G; we can see that at night. The next day, the talks about the Simpsons and how I used to rob banks were extremely popular, and today we've seen a very popular talk in Stage B on hobby
electronics like a pro.

We've also been fiddling around with a project of ours, a Wi-Fi probe: an OpenWrt device which looks for nearby access points, tries to associate, tries to get an IP address, sends some traffic, that kind of stuff, and reports back to our Graphite server about the results of those tests, so we can monitor the network from a different perspective. This is also on GitHub, and if you feel like helping out with this project, you'd be very welcome.

Lastly, some pictures of our Wi-Fi deployment. On the left: I think Leon and I were a bit bored yesterday, and we saw a lot of people sitting around the big tree near the bar/NOC, so we thought we had to put an access point in the tree. And at the back of the field there were almost no Datenklos any more, but there were still people camping there, so we had to do something to get a bit of coverage: we mounted an access point on the fence, which actually has a directional antenna, so it should reach pretty far. On the right you see the whole pile of plastic boxes with the Aruba AP-135s, ready to be deployed to the Datenklos. Anyway, that's it for Wi-Fi; I'll hand it over to Equinox again, I guess, or David, or...

Okay, we didn't start the timer, so I don't know how much time we have, so I'll be very quick on this part. For the first time ever, on the first evening, the network was set up and we were scratching our heads thinking: what have we forgotten? Usually we're still running around during the opening ceremony putting switches out, but this time, as soon as all the power was completed on Friday morning, the network looked like this, and it stayed pretty much like that the whole weekend. So some NOC members took on additional projects, like monitoring the background radiation in the data centre to be able to use more bandwidth, or monitoring the stock of Mate in the bar. We've
had three copyright complaints so far, although we're not convinced it's not just an elaborate troll, because they're for some quite bad movies. There was another one recently, actually.

Time to do some quick thanks. Building a network the size of a medium-sized ISP's for the course of three days, and then tearing it down again, is not a cheap thing to do, and less than, well, around 1% of your ticket price actually goes to the NOC, and most of that is for sundries like cables and tools and the cable ties we're always running out of. So building this network at all is only possible thanks to our amazingly generous and helpful sponsors, and I'd like to thank all of them, in no particular order, but I will go alphabetically: many thanks to Aruba, Booking.com, the Chaos Computer Club, Comtech Enterprises, EventInfo.org, Fibre Options, FlexOptics, LONAP and Sargasso Networks. Can we have a round of applause for the sponsors?

As I'm sure you all know, people volunteer their time here, and for those who have: thank you very much. Some of us have been here for a week now, and we've got another three or four days on site. If you have time to stick around for the teardown, it would be very much appreciated; if you don't want to work on the network stuff, I know all the other teams need help as well. If you do want to help tear down the network, it's not quite as simple as just unplugging all the things, so please come along tomorrow morning at 09:30 outside the NOC for a quick meeting. That's 09:30 hours tomorrow morning. And I'll hand over to Peter.

Hello. We are a team from the German Chaos Computer Club who try to cover events with video recording and streaming. Not that many people watch the streams here, so we don't have fancy graphs to show; I'll show you a bit of what we did instead. This is the signal path: how the video gets from the cameras to the CDN and to the recording. Many of you have volunteered here
and have seen the video mixer and operated the cameras. The cameras, of course, are the source of the picture, and we also have a frame grabber that gets the slides. Sadly, if a video is played from the notebook there will be some delay in the recording, but it doesn't matter much.

Everything below the top row is done with open source software, some of it also written by us, so you're all welcome to copy it. Basically it's a bunch of ffmpeg scripts and a video mixer called VoctoMix. On the CDN side we're running nginx servers with RTMP, which provide HLS streams for the Apple users, and the rest of the planet gets WebM via an Icecast server; besides that, there are Opus and MP3 audio streams.

After you record a talk, or many talks like here, you have the problem that you have many hard disks full of talks, and that's usually not really nice to handle. So we have a post-processing system for that, which keeps track of all the talks and also handles the encoding afterwards. This is also soon to be open source, so if you're running a conference, or a hackerspace which does regular events, please get in contact; we love it when people play around with it. At the end, everything ends up on media.ccc.de, which is our own video platform, and also on the YouTube channel of EMF Camp, so whichever you prefer, you can see it there. I think there is only one talk which explicitly opted out from YouTube, but it's not opted out from our platform, so you may be better off there. There's also a Twitter bot you can follow, which has now tweeted, I think, 500 times this event to let you know about your recording.

So, it was about 52 hours of talks, which end up as 2.5 terabytes of data, which are encoded to about 40 gigabytes of releases. Each release is available in H.264 and WebM, in SD and Full HD, and also as audio releases in MP3 and Opus. As I said, you can watch them on our video platform and the EMF YouTube
channel, and if you're a user of the Kodi media centre you can also just install the media.ccc.de plugin and have it on your nice TV screen.

Of the machines you see on the slides, the ones on top are the machines in the tents, which get the feeds from the cameras, do the stream encoding, and record an MPEG-TS stream to the hard disk. The smaller ones below, also shown in the picture from the data centre, are really tiny boxes where a strange company fits in desktop i7 CPUs with 4 cores, which makes them basically a plasma cutter on the output; but if you put them in the data centre it's okay. These two boxes handled nearly all of the encoding you will see, so they did a lot of work, because we couldn't travel with our big server case here. We're now nearly done: with I think 4 talks left, everything else is cut, and all recordings will be online tomorrow morning when the encodings are done. Here is a screenshot of media.ccc.de, which is already filled with many talks, so if you want to download something for the way home, do it while the camp is still online, and maybe we get another peak in the bandwidth graph.

Thanks to all of you who volunteered to do video shifts, because we couldn't do it without you. Also big thanks to the EMF crew who made this event possible. And if you want to do something with video streaming or recording, you can find us on IRC on hackint in the VOC lounge channel, or follow us on Twitter, or look at c3voc.de; everything we do is documented in our wikis, so if you're interested in more details, look at the wiki. And yeah, thanks for being part of EMF.

I think we've got a small amount of time for some questions, as usual, if some people want to ask something, so feel free to do so.

I wonder if it's possible to download the raw data behind all the graphs?

We'd need to make sure we sanitise anything that's sensitive data; I haven't really thought about that, actually. There's obviously the fact that you can look at
the dashboard and the detail of that, for example the power stuff, to get things like the peak power. To be honest, the most useful thing about the power data is actually the design and the experience from doing it. We get the figures, and it's really nice to verify that we've approximately done it right, but a lot of this is the kind of experience like: oh my god, the food vendors use a lot of power. Or, for instance, with the power we were really lucky that it didn't rain, because as soon as it rains, all the really nasty extension leads everyone's left in a puddle outside their tent have the RCDs tripping everywhere, which is what we had a lot of in 2014. We've been pretty lucky here; I think we've seen a maximum of 18 mA of earth leakage on the sub-mains, which is perfectly fine, no one's dying.

What are your main learnings for next time? It's all gone very smoothly, but what would you make better?

Thanks for saying it's gone really smoothly; there's been a certain amount of running around. We actually use a lot of 13 A sockets, and getting that number of them is actually hard, because there aren't many events like us that provide power to so many people, so getting all of those from one supplier is a little bit tricky. From the network point of view: fibre is great, fibre is good. In 2014 we spent a lot of time on complicated solutions with microwave; sure, it works, and it is a way to do it, but it's really nice to have fibre. Anything interesting from the VOC? No.

Firstly, how long will the videos be available on media.ccc.de? Forever; the rest of eternity. Fantastic. And secondly... I was wondering something I can't remember.

Was there a question at the front?
I can't see anything from up here.

You've got a lot of different kinds of network devices in your network; how are you managing them? Are you just going in on the console, or have you got some kind of tools for automating it to some extent?

HP OpenView? No. We started building scripts in 2012 and have been improving them year on year to automate all the things. A Google Docs spreadsheet is our ultimate source of truth and everything is built from that. You may have a different preference; we find it convenient to be able to collaboratively edit it at the same time. It has everything: our switches and their locations, our access points, our addressing scheme, our VLANs, so all the data is consistent. And then we have a bunch of Python scripts, which you'll find in that GitHub account, that just generate everything.

How much did the network influence the choice of location, or were you given the location and then told: what are you going to do here?

There is a bit of influence there. Obviously, being in the home counties, we have a lot more options. We have looked at some fabulous sites in really nice areas of the UK where you look at it and say: well, my options for internet here are so limited. Sure, you can throw money at the problem, but we don't actually like to throw money at the problem; we like to have a friendly supplier that's local, and every EMF until now, and also the other similar camps, have relied on the generosity of supporters. I mean, I could go and spend serious amounts of cash on an expensive circuit here, but it's not actually appropriate for this sort of event. So yeah, there is some influence; it's one of the factors, alongside all the more complicated things like what the road layout is like, all those physical layout issues that are quite complicated. It is a factor, but not a huge one.

A question for the VOC guys: those i7 plasma cutters, what were they, are
they? It's a Gigabyte BRIX, and they sell it as a gaming PC, which I don't know who will game on, but thank you.

What happens to the fibre and the copper left after the event? Do you intend to repurpose it, and how much do you think you'll be able to repurpose?

So, the fibre, some of it has been going around multiple events. We realised, around CCC camp and OHM and so on, that we can pre-terminate all this stuff in advance, have the ends of the fibre in boxes and on nice drums, and document it all. So what we actually did was select a series of fibres from the CCC storage, where they live a lot of the time (this actually happened before EMF 2014 too), saying: oh, we need one of these, one of these, one of these; then put them on a pallet, ship them, in this case from Berlin to here, and roll them out onto the field. So we roll it all back up and use it for the next event, wherever or whatever that may be; it's a variety of events. It means that effort isn't wasted; actual field-deployable fibre is kind of expensive, so that works pretty well. The copper: Dave is the copper expert. If you would like some copper: if you see a copper cable that has boots on it, don't take it, because it's one of Arjan's that he brought to power the access points; but if you see anything that we've strung between Datenklos that's just been crimped for the event, feel free to have it once it has been removed from the Datenklo. Otherwise, we have a local scrap merchant picking it up on Wednesday, I'm told.

We're out of time because the next talk wants to start. Are you able to take questions afterwards if anyone's got any more? Grab us just out the back. So thank you very much, and a round of applause for the team.