Hello everyone, and welcome to a presentation on how we got the internet to EMF Camp this year, how we almost didn't, and how we got away with it by the skin of our teeth. So this time, in an attempt to make things a little easier, we did quite a lot of preparation and setup before the camp actually started. Then there's going to be a section on the uplink actually getting into here, then a bit more detail on the core network and the data centre, then a little bit on the wired network, and AK47 is going to give us a presentation on the wireless. So this is several weeks before the camp began: a group of us came to the site with a spotting scope, attempting to demonstrate that we could get a line of sight to the data centre, which is where we actually get the internet connection from. It's a distance of about seven kilometres. Looking at maps and things, you can get a height profile just to see if there are any obstructions in the way, but that isn't guaranteed to work because of things like trees and buildings: the height profile may be out of date, and may not be particularly brilliantly accurate. Obviously, to be able to see something seven kilometres away, it needs to be fairly large and visible. So Nyn at London Hackspace very generously made us this two metre by three metre fluorescent green and black flag, which turned out to be extremely useful. This is about a week before the camp, with a pair of cherry pickers, one with the big flag on it at the data centre end and one at the camp end, attempting to see if we could see the link. At that point, we actually couldn't: we could see things near where we thought the link was, but not the actual data centre site itself. So we were confident, but not 100% certain, which turned out to be a bit of a mistake. This is the site where we were going to put the mast at the data centre end.
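As an aside, the kind of sanity check a height profile gives you can be sketched numerically: for a point-to-point link you care about the first Fresnel zone radius and the apparent earth bulge along the path. A minimal sketch follows; the 5.8 GHz frequency is an assumption for illustration, since the talk doesn't state the band of the link.

```python
import math

def fresnel_radius_m(d1_km, d2_km, freq_ghz):
    """First Fresnel zone radius (metres) at a point d1/d2 km from each end."""
    d1, d2 = d1_km * 1000, d2_km * 1000
    wavelength = 0.3 / freq_ghz  # c / f, in metres
    return math.sqrt(wavelength * d1 * d2 / (d1 + d2))

def earth_bulge_m(d1_km, d2_km, k=4 / 3):
    """Apparent earth-curvature bulge (metres) with a standard refraction k-factor."""
    return (d1_km * d2_km) / (12.74 * k)

# Midpoint of a 7 km link, assuming 5.8 GHz:
r = fresnel_radius_m(3.5, 3.5, 5.8)  # roughly 9.5 m of clearance wanted
b = earth_bulge_m(3.5, 3.5)          # roughly 0.7 m of bulge to add
print(f"Fresnel radius ~{r:.1f} m, earth bulge ~{b:.1f} m")
```

This is why "the profile says it's clear" isn't enough: you need around ten metres of clearance above every obstruction at the midpoint, which a tree line can easily eat.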
All this builder's rubble and sand is where the mast was going to go, and we were told it would all be moved, no problem, don't worry about it. Because we had large numbers of devices on the campsite to be configured, setting them all up by hand would have been a pain in the neck. So we had a Google Docs spreadsheet that had all the relevant IP addresses and details of which devices were linked to which, and then there was a bunch of Python scripts that downloaded those details, took a template file, and generated all the configuration for the switches. We also needed to wipe old configurations off the switches and upgrade them to the latest version of IOS, so there's some extra code in there to do that, which kind of works, but not brilliantly. This is just a quick example. Each switch has a port prefix, because some models have FastEthernet ports and some have GigabitEthernet, so we had to take that into account. Then it goes through a loop for each port in the camper network and puts in all the configuration stuff. There's loads of extra stuff in there we've missed out. If anyone knows of any existing scripts to do this kind of thing, it would be nice to know what they are, because we couldn't find anything, and it seems like the kind of thing that someone else has to have written at some point. Because we weren't certain about the line-of-sight link, we decided that rather than having the mast at the camp end right next to the camp, we'd put it further up the hill and run fibre up to it, so that we had more flexibility in positioning it and could try to avoid trees and things in the middle. So come Wednesday, when we were on site for the first time, if I just turn back to this, we came to the data centre, and there's a network diagram which you can't read, and a cabling diagram which you can't read either. It gives you some idea of the complexity.
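The template-and-loop approach described here can be sketched with nothing but the standard library. Everything in this snippet is hypothetical (model names, spreadsheet column names, the template itself); it's just the shape of the technique, not the actual EMF scripts.

```python
from string import Template

# Per-port template, in the style of Cisco IOS interface config.
PORT_TMPL = Template(
    "interface $prefix$port\n"
    " description $desc\n"
    " switchport access vlan $vlan\n"
    " spanning-tree portfast\n"
)

# Assumption for illustration: older models expose FastEthernet access
# ports, newer ones GigabitEthernet.
FAST_ETHERNET_MODELS = {"2950", "3500XL"}

def render_switch(model, ports):
    """Render per-port config from rows pulled out of a spreadsheet export."""
    prefix = "Fa" if model in FAST_ETHERNET_MODELS else "Gi"
    return "\n".join(
        PORT_TMPL.substitute(prefix=prefix, **row) for row in ports
    )

print(render_switch("2950", [
    {"port": "0/1", "desc": "uplink-to-core", "vlan": "10"},
    {"port": "0/2", "desc": "camper",         "vlan": "20"},
]))
```

In practice you'd pull the rows from the spreadsheet's CSV export and paste or TFTP the rendered output onto each switch.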
The squares on the edges are the individual Datenklos, with the core in the middle. So we found that at the data centre, the rubble and sand hadn't been moved, so we couldn't actually put the mast there. We then spent the morning running around all the local car parks and businesses saying, please can we put a 26 metre mast in your car park. Quite surprisingly, most people said no. Luckily we found a garage down the end of this alleyway here, where they were very friendly and said yeah, you can shove a mast there. So this is the mast up on Wednesday night with a flag flying on top of it, so that come Thursday, hopefully, we could get a visual line from the camp and get it sorted out. There was another plan for Thursday morning called Plan P, for Prayer, which was to try and relay the signal from the roof of St Mary's church, Bletchley. When we contacted them they said no, but they said it so nicely I thought it was worth mentioning. On Thursday we had a 32 metre access platform, which is a huge truck that had difficulty getting up the alley, but got up there in the end. We had that for Thursday only; if we couldn't make the link work, then we wouldn't have had internet at all. This is the view from the top of the cherry picker towards the camp. If you've looked up the back of the hill here, just about on the left there are two shorter masts, and not visible on this slide is one very tall thin mast which has been put in by the farmer, with some anemometers and some data logging, because they're trying to work out whether it's economically viable to put a wind turbine in here. On the left here, you probably can't see it, but you can just about make out a grey thing at an angle, which is the mast at the camp end, halfway up because it's in the process of being raised, and a greyish-greenish blob, which is a Land Rover driving people up there to sort out the mast.
We also needed to get from the mast at the data centre end back to the data centre. That was over 100 metres, so it wasn't possible with Cat 5, so we needed a roll of fibre. Luckily we had a spare, untested 140 metre roll, so we used that, unrolled it all the way, and as we were unrolling it found that all the fibres were cut apart from two, and those two uncut ones didn't work either. So we had to call back to the camp for a spare roll, get it run round and unrolled, all the while trying to get the link up and running, but luckily we got it running in the end. So we had, from the data centre, 150 metres of fibre running along a rubbish-and-weed-strewn stream bank to a garage tyre and spare-part store, which had a switch and the power injectors for the radios, and after all that, at sundown on Thursday, we were just about able to get internet to the camp. On the campsite we also had lots of problems, because we discovered that the fibre running to the mast at the campsite was also broken and needed resplicing; luckily we had a splicing kit on site, so we were able to get that fixed. This is the Datenklo way up on the hill, getting lonely on its own. To the left here is a sort of lighting mast tower with a built-in generator; luckily that had some spare sockets on it, so we were able to use that to power the Datenklo. At this end we don't need the full height of the mast, so it's only up a little bit, which makes it easy to deal with. This is the on-camp data centre, in a refrigerated shipping container. We didn't bother with racks; everything's just sitting on pallets. Unfortunately, being refrigerated, the maximum temperature it could be set to was only 10 degrees C, so it got fairly cold, we had problems with condensation, and we had a high-tech solution with a bucket. This is the usual stuff: core switch, a distribution switch, the all-important wireless controller, some virtual machine servers for the on-camp services, and a big mess of cables. And now we're on to wireless.
So yes, wireless. In terms of hardware, we had this year an Aruba 7210 controller, which can handle up to 512 access points and has four 10G ports. We uplinked the wireless LAN controller with one 10G link to the Cisco 6500 core switch we had over here. We didn't actually need that 10G, because we only hit 300 Mbit/s or so, but well, if you can uplink it with 10G, then you should, right? We had about 50 dual-radio 802.11n Aruba access points, and then another 10 dual-radio 802.11ac access points. The software we were using is FreeRADIUS for most of the 802.1X networks; we also had the Aruba AirWave management suite, which we used for monitoring and gathering statistics and that kind of stuff, and we also had Graphite running to make some pretty graphs, which we will show later. In the picture you can see in the middle the 7210 controller with the 10G uplink. At the top here we have an AP-275, which is an outdoor access point from Aruba with six integrated antennas; it's not actually a security camera, it is actually an access point, so you might have seen them deployed around the site. At the bottom is an access point which is actually hanging in this room, an AP-225, which is a dual-radio 802.11ac access point. This might be a bit hard to read, but this is the config we deployed: we had separate SSIDs for the EMF camp SSID on 2.4 GHz and on 5 GHz. Most of the networks were encrypted with WPA2 Enterprise, with 802.1X authentication being sent to a FreeRADIUS server. This FreeRADIUS server also had uplinks to the upstream servers for SpaceNet and for eduroam, so next to the EMF camp SSIDs we were also offering SpaceNet and eduroam. They both serve the same cause, which is federated authentication: if you are a member at a local hackerspace, or if you are a student at a university and you have an account at that university or hackerspace, then you can use eduroam or SpaceNet. If you already have that configured on your
device, then when you head over here to the campsite, you don't have to configure anything on your device any more; it will just work, so you don't need to set anything up, which is pretty good for ease of use. So we are tunnelling all of the traffic through the Aruba wireless LAN controller. The reason we're doing this is mostly for easier AP deployment, because the access point just needs to get an IP address in a random VLAN so it can build a tunnel back to the controller, and that also means we don't have to stretch any VLANs across the site. This also gives us the opportunity to do a lot of broadcast filtering, because the Aruba controller has features like an ARP proxy and a neighbour discovery proxy, and we want to filter broadcast traffic because, at this scale, it usually kills overall wireless performance. We ran with a 4-channel plan on 2.4 GHz with 20 MHz channels, and then with a 19-channel plan on 5 GHz. We had about 10 access points dedicated and configured for air monitoring, so those access points do dedicated background scanning, rogue detection and intrusion detection. I'm not sure if this map is readable, but it should show the positions of all of the access points across the field: each Datenklo had an access point, and in the stage tents multiple access points are deployed, so over here we have 5 access points, with one access point configured as an air monitor. The reason we place more access points at a stage like this is mainly capacity, because a stage could fit 300 people or something, so we need more access points to serve that kind of capacity. In this photo you see the inside of a Datenklo, with at the bottom the Cisco switch, which in most cases can also do Power over Ethernet, and at the top an Aruba AP-134. On the right we have an AP-275, which is placed on the C field, so you might have seen it; we deployed that yesterday
because we had some problems with coverage on 5 GHz over there. So, statistics. The numbers I got were from about two hours back: we had 2012 unique MAC addresses seen, with a peak of 1090 concurrent associations. 75% of the clients are smart devices, either a smartphone or a tablet; 50% of that is Android, 25% is Apple iOS, and then a much lower number are the laptops: 7% on Linux, 7% on macOS and 3% on Windows. Usage of the SSIDs: we had 33% on EMF camp, 30% on EMF camp insecure, 25% on EMF camp legacy, 7% on eduroam and 2% on SpaceNet. This graph shows the aggregated users of a couple of SSIDs, because we had multiple SSIDs running on WPA2 Enterprise; it basically shows there's a much larger number of people using encryption versus the unencrypted network, so this pretty much shows that people do care about encryption, and they really should. Then 2.4 GHz versus 5 GHz: the distribution is about 50/50. We still saw a lot of clients that were actually 5 GHz capable connecting on 2.4 GHz. We were able to detect this on the wireless LAN controllers, because the Aruba access points can mark devices as 5 GHz capable when they see probe requests from them. The reason for this might be user error, with the user connecting to the 2.4 GHz SSID when it really should be the 5 GHz SSID; band steering can be a problem; or the device is not capable of using the DFS channels: above channel 48 on 5 GHz you have to do radar detection, and some devices cannot do this. This graph shows the channel utilisation, the amount of 802.11 frames we are seeing on a certain channel, and here we can see that we have about 40% average utilisation of the 2.4 GHz band versus 10% on the 5 GHz band. Certain radios are peaking at 100%, which means that 2.4 GHz channel is completely exhausted. So this is the traffic we did on the wireless LAN
controller: we had a peak of up to 300 Mbit/s. This is also an interesting graph; it's a bit hard to see, actually, but the brown area here is the number of client associations in Stage A, so you can see over time where people are moving: when a talk is done, you can see them moving away from Stage A and then going back to the A field, to Milliways or something like that. Do you want to take over for this one? We have support from many different vendors; it really takes a lot of organisational help to get this kind of thing running. Bytemark loaned us a lot of equipment and also did some data centre hosting for us. Comtech also loaned us a lot of equipment, as well as David C's time, which he gave extremely generously, and also gave us some server hosting. Rapid Wireless is who we hired the wireless link back to the data centre from, and they also helped with deployment. Colocker gave us our internet link and also data centre access; without them the camp couldn't have happened. FlexOptix, as usual, loaned us for free a load of SFPs and GBICs, which are vital for linking all the access switches together, and Aruba loaned us the Wi-Fi kit and helped with the uplink, and so did Targasso networks. And that's it, I think. Any questions?
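Going back to the channel plan mentioned earlier (4 channels on 2.4 GHz, 19 on 5 GHz): the exact channel sets below are an assumption based on the ETSI regulatory domain, since the talk only gives the counts, and the real deployment would use the controller's own radio management rather than a static list. As a toy illustration of what such a plan looks like:

```python
from itertools import cycle

# Assumed ETSI channel sets; only the counts (4 and 19) come from the talk.
CHANNELS_24 = [1, 5, 9, 13]                        # 4 x 20 MHz on 2.4 GHz
CHANNELS_5 = [36, 40, 44, 48, 52, 56, 60, 64,      # 19 channels on 5 GHz;
              100, 104, 108, 112, 116, 120, 124,   # everything above 48
              128, 132, 136, 140]                  # requires DFS radar detection

def assign_channels(ap_names, channels):
    """Naive round-robin: neighbouring APs in the list get different channels."""
    return dict(zip(ap_names, cycle(channels)))

plan24 = assign_channels([f"ap{n:02d}" for n in range(1, 61)], CHANNELS_24)
```

With only four usable channels and sixty radios, heavy reuse on 2.4 GHz is unavoidable, which is one reason that band saturated at 100% utilisation while 5 GHz, with nineteen channels, sat around 10%.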
Lots of questions. Martin? So in this case we are getting a lot of sponsored access points, so we need to use what we have. All of these access points have omnidirectional antennas, so I would definitely agree it's better to use directional antennas, but we simply don't have them available, so we need to make the best of it. Exactly. Pretty low, actually, but we don't have that many access points deployed to detect them; I think we saw about 50 of them. You have mostly smartphones on the Wi-Fi network, so there's the Galaxy S4, which is 802.11ac capable, so it's mostly that kind of device we saw, but in the coming years we're going to see a lot of 802.11ac clients, and that is actually good news: that means a lot more 5 GHz users, so we can finally throw away this crappy 2.4 GHz band and move to 5 GHz completely. A bit hard to say; we could look that up from Graphite, actually. Not enough, as usual. I would say we had an average of about 100 Mbit/s, so if you calculate 100 Mbit/s over 84 hours, that should give you a fair idea of the data usage; total upstream plus downstream was on average about 100 Mbit/s, I would say. I think you in the middle had a question? It's quite a common question: the microwave link is something like 430 Mbit/s as installed, and it's all gigabit links for the actual transit network. To be honest I haven't had a chance to look at the graphs, so I can't answer the question about how much traffic we used, because I haven't looked; it's mostly all on Wi-Fi though. Was there anyone else? Any more hands? We really have no more questions? Great, I can go get a beer. How many cables did you have to plug in in the Datenklo for the network? I have no idea. 22, mostly. Power outage at a large part of the field, so basically Stage B and a large part of the A, B, C, D, E fields; those are the spikes. Didn't hear the last bit. Two megabits. We were looking at things like strobes, but looking down into the
city from up here. I went up there in the cherry picker on the hill, at like 17 metres up, on Tuesday night, and it's damn cold up there, and it's very difficult to make stuff out against the backdrop of the city. I kind of feel it's easier to do it in the day, but yeah, it's worth thinking about; it's quite difficult to do things like that in the middle of the night, up a mast 30 metres up, with the wind blowing. It's thanks to Jesper, actually: Jesper is running the UK node for SpaceNet, so if you have a hackerspace in the UK, talk to Jesper to get connected to SpaceNet. If you have a hackerspace somewhere else in Europe, talk to us. There is also a country node in Germany and one in Luxembourg, and we're currently in the Netherlands, doing the rest of the hackerspaces in the rest of Europe who want to get connected. So by all means, if you want to get connected to SpaceNet, and you want to have this awesome roaming experience between hackerspaces and these kinds of events, then please contact us. Okay, thank you everyone.
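For reference, the back-of-the-envelope sum suggested in the Q&A (about 100 Mbit/s average, upstream plus downstream, over roughly 84 hours of camp) works out like this. Both input figures are the rough numbers quoted in the room, not measurements.

```python
# Rough figures from the Q&A: ~100 Mbit/s average over ~84 hours.
avg_mbit_per_s = 100
hours = 84

total_bits = avg_mbit_per_s * 1_000_000 * hours * 3600
total_tb = total_bits / 8 / 1e12   # decimal terabytes

print(f"~{total_tb:.2f} TB transferred over the camp")  # ~3.78 TB
```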