Thanks for the introduction and thanks for having me, my first time as well. I'm really happy to see all of you here. I'll be talking about an LED installation that I did last year in August together with my performance group, Starlight Alchemy. It's called Helix, and my wife added a colon in front of it because she thought it was more hipster. So now it's ":Helix".

A little bit about myself. I'm a born Hamburger, from Germany, and a computer scientist by training. I moved to Singapore three and a half years ago, I'm now happily married to a Singaporean, and I'll probably stay here for a bit longer. Currently I work at NUS in the IoT area, doing Android stuff and making IoT devices talk to each other in a very simple way. Since I came here, I'm also a member of the light and fire performance group Starlight Alchemy, and I'll talk a bit more about them so you get some context for how I ended up doing an LED installation in front of the Singapore Art Museum last August.

Starlight Alchemy is a loose collective of very international artists. I think we have a Korean, two Germans, four or five Singaporeans, and more; we're about nine people in the core group, and then we have a bunch of friends who come in and out depending on whether we need more support. We've been doing lots of different things, most of them related to so-called flow arts and object manipulation, which basically means we dance with objects and see how we can use their momentum to create interesting light patterns in the air. That's the more subtle way. If we need a bit more energy, we just create big explosions with fire, which is more fun. Different effects. We do this for corporate events, for advertisements, for festivals, lots of different occasions.
We also do weekly workshops, so if you're interested in learning some of the dancing and flow arts stuff, contact me afterwards and I'll hook you up. We have teachers for acrobatic yoga, hula hoop, poi, and aerial arts; we have some real gems in aerial arts, and probably also staff if we find enough people, but I don't think there's a group right now. We meet every last Thursday of the month at Gardens by the Bay for what we call Glow Night. It's just nice, with all these illuminated forests around us; we spin our lights, bring our music, have some food. If you want to come around, you should, it's fun.

That's what Starlight Alchemy does. Since 2007 it has been growing and growing. We had an ongoing collaboration, first with the Esplanade Theatre, where we did a couple of shows, and then the person who handled us there moved over to the National Museum, which is how we ended up performing for the Singapore Night Festival. And that's how this connection came about.

Today I'll obviously be focusing on the installation and sculptural side of Starlight Alchemy, so I want to talk a little bit about where we draw our inspiration for this kind of stuff. There's one big theme: science videos are awesome, because you get all sorts of weird and interesting effects. If you take these physical, chemical, or optical effects out of context, blow them up or multiply them, and show them in a different setting, you can create something really interesting. I have a little video to show what I mean, and I hope it will play. Yes: science videos usually look kind of plain, but this is two vortex clouds colliding in midair, which I think is super cool. And this is a wave pendulum, where you see these S-shaped patterns.
The patterns evolve because the strings are different lengths, so you see these very intricate patterns that keep changing. So you look at this and it's really cool, and you think: how can we make this more interesting? Someone, not us, thought: let's make it bigger and illuminated, and then you've got an art sculpture right there. That's one way of approaching these things. They even added an interesting musical element, where every ball hitting this side creates a piano note, and depending on how long the string is you get different notes. So you not only see the frequencies and resonances happening, you also hear them. I think it's a really neat piece. Starlight Alchemy did a version of this: we set it on fire, which was interesting, though not as successful as this one. I have a short video of it coming up in a second, hopefully. Yeah, this is the Starlight Alchemy version. I think the light version wins, but it was worth trying out.

Another one I really like personally, though we haven't done anything with it yet, is the Rubens tube, and a lot of these tend to go viral these days, so some of you might have seen this before. It's basically a tube filled with gas, with a speaker at one end. The sound waves from the speaker create pressure differences inside the tube, which push the gas out, so some flames rise higher than others. You basically have a sort of flame equalizer, and every time you hit a resonance frequency you get a nice wave pattern. And someone took that and thought: let's make it 2D. So they created the Pyro Board, which I think is super awesome. The Rubens tube was invented back in 1905 by the physicist Heinrich Rubens, so it's a really, really old effect. But now someone made it 2D and it still works, which is quite cool.
One of the plans I had was to build a lot of Rubens tubes and visualize different parts of a track on them, so you'd get a 2D equalizer out of it.

So now to what actually led to the Helix. This is the origin of the whole idea: a wave simulator. You have a bunch of sticks on a central string, and when you lift one, the momentum propagates through the rest. It's called a wave machine. We saw that and turned it into a fire version, which was very successful. A lot of people loved it and we did a lot of shows with it. It's basically the same thing from the science video, taken out of there, set on fire, made bigger, and made operable by two people. That's a lot of how we get our inspiration. This was three years ago, also for the Singapore Night Festival; some of you may recognize the building.

From there we came to the Singapore Night Festival 2015, which happily coincided with SG50. SG50 was great for us: since it was Singapore's 50th birthday, the organizers of the festival wanted to focus more on local groups, and since we already had a connection, we got a pretty big gig, our biggest budget ever, to basically do what we wanted to do. So we developed an initial concept involving three shows distributed over the nights. The idea was to tell the story of the history and the relationship between fire and light, their differences, and their final reconciliation. The first name for the show was "The History of Lighting". The museum thought it was a bit too complicated, so we simply resorted to "Alchemy". We started working on all this, and about halfway through the show preparations they moved us over to the Singapore Art Museum and told us: you also have to do one week of installation of your pieces.
That was quite a shock, which we had not considered, so we had to go back to planning. Luckily, to show the contrast between light and fire, we had already decided to build an illuminated counterpart to the fire helix we had. That was how the idea of the LED helix was born, and luckily it was also possible to pull it off as an installation piece, which kind of saved our butts at that point.

So how do you turn a fire helix into a light helix? Let's start with the requirements. For one, it has to be a performance prop: it needs to be operable on stage by people, which implies that we cannot have any wires coming off it, because it needs to be movable on and off stage. That's why it had to be completely wireless and battery operated, and also hand-holdable, which implies some weight restrictions. Then they wanted to use it as an art installation for a whole week, every night, which required a long run time, so the battery life had to be sufficient. They also wanted it interactive, so we had to find an intuitive way for people to interact with it, without page-long explanations for people to read on how to actually use the thing. And the purely functional parameters, to match the fire one: it has about 35 elements, each about 1.2 meters long. We decided on a tube diameter of one inch, because that is compatible with an existing system, the juggling equipment we already use, so we could just reuse parts. And we said we need at least four hours of operating time. Those were the parameters we were working with.

And this is what we came up with in the end. Has everybody seen the video yet? No? OK, then I'll just play it again. This is what it looked like in the end.
We had a six-meter-high steel frame, which we had to manufacture for the show anyway, so we reused it here. The whole thing is hanging: 35 elements, 35 illuminated staffs. I'll turn down the sound so you hear them only once. We had a motor at the bottom that twisted it slightly, so you get this helical shape, and a swivel at the top. There were several ways to interact with it. One was that you could play a piano, and it would visualize the notes you were playing in these dropping waterfalls of color, or, in a different mode, spirals like this. The second way to interact with it was to use a Kinect camera, take your dance motions, and use them to create shapes and patterns on the helix itself. This is the output of the Kinect camera; I'll make the music a bit louder, it was more fun with music.

That was quite a success. We had a lot of people having a lot of fun dancing with the helix, which was great to watch. And you can see, that's my wife. The way to experience the helix is to just be in the moment, get back into this playful spirit, and give yourself a nice moment. It helps if you have a lot of circus friends, because they can all perform handstands and awesome stuff for the video, which is great.

OK, so that's basically what it was, what we did, and what the end result is. Today I'm going to talk about what's behind the scenes and what went into it. That was the fun part; now we're getting into the technical parts. I'll dive right in with the LED technology: what was the light actually made of? This might be familiar to some of you. If you have any questions, by the way, don't hesitate to ask, just interrupt me. If I'm boring you to death, shout "boring" and I'll just skip the slide. I was using these little LED segments, which are called WS2812Bs. You can buy them on Alibaba in five-meter rolls.
They come in these kinds of strips. They are actually really cool because they are RGB LEDs with a built-in controller, this little black thing up here, which means you can address them individually. Most of the LED strips you can buy at Sim Lim Tower, for example (in fact, I think all of them; I haven't managed to find these there yet), can only show one color across the whole strip. With these, I can address each of what I call pixels individually and assign a color to each one. That's pretty cool because it makes a lot of things possible. They come in 30, 60, and 144 LEDs per meter versions.

Another cool thing is that they have only power, ground, and a single data line to control them, which also makes them very easy for your microcontroller to handle. Most of the other single-color 12-volt LED modules use four wires, of which you have to drive at least three, so you have three channels to address individually. For these, you only need one. The way they do it is quite ingenious. You send ones and zeros along the data line: the controller registers a zero if the high pulse (power on) is shorter than the low pulse, and a one if the high pulse is longer than the low pulse. Then you just bang ones and zeros through the line. The first LED takes the first three 8-bit values, sets its color, and afterwards forwards every following value to the next LED. They just talk to each other along the line. Once you have sent all your data, you send a reset, and then the first one starts listening again. That's basically how it works.

Another interesting property of these guys is their power consumption. They take 20 milliamps per LED, but you've got three in each of them, so each pixel takes 60 milliamps.
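The shift-and-forward scheme just described can be sketched in code: each chip latches the first 24 bits it sees (in G-R-B order on the WS2812B) and re-clocks the rest to the next chip, so the controller only has to serialize one long byte stream. A minimal sketch, with a function name of my own:

```python
def frame_bytes(pixels):
    """Serialize (r, g, b) tuples into the WS2812B wire order.

    The WS2812B expects 24 bits per pixel in G-R-B order; the first
    pixel in the chain latches the first 3 bytes and forwards the
    remainder to the next chip, and so on down the strip.
    """
    out = bytearray()
    for r, g, b in pixels:
        out += bytes([g, r, b])  # GRB, 8 bits per channel
    return bytes(out)

# Two pixels: red, then blue
stream = frame_bytes([(255, 0, 0), (0, 0, 255)])
```

After the whole stream has been clocked out, the reset pulse makes every chip latch its color and start listening again for the next frame.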
That's at full brightness, white light, driven at five volts. In the end, I decided to go for the 30 LEDs per meter, so that's what's in the sticks. In the staffs, that works out to 74 pixels per staff. I measured the power consumption; I'm driving them at only 3.7 volts, not five, from lithium-ion batteries, and depending on what pattern you put on it, the draw was 400 to 1,200 milliamps per staff. Taking roughly the middle, about 800 milliamps, I figured I needed about 3,200 milliamp hours of battery capacity to get my four hours of runtime. Luckily I found these 14650 TrustFire lithium-ion batteries, which have 1,600 milliamp hours each; times two, you arrive at 3,200. That was a lucky find. By the way, fun fact: lithium-ion batteries are named brilliantly. It's not something stupid like AA and AAA, which nobody knows the meaning of. With these, 14650, the first two digits say how wide it is and the last three say how long it is (14 mm wide, 65.0 mm long), which is really helpful if you have to stick it in a very narrow tube.

All right, that's the LEDs and the power. Now, how do we actually talk to these LEDs? I decided to use something called the Art-Net protocol. It's kind of an old protocol, the MIDI of lights, so to say. It's based on DMX, which is used in all these big stage lighting setups. Whenever you see Lady Gaga on stage and lights go bling, bling, bling everywhere, they probably use DMX and Art-Net in some way. It's a very simple protocol: it basically delivers 512 8-bit numbers to an address. It all starts with the first 8 bytes spelling out "Art-Net", so you know it's an Art-Net packet.
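The battery sizing above is simple arithmetic, capacity equals average draw times runtime; as a sketch (function name is my own):

```python
def required_capacity_mah(avg_current_ma, runtime_h):
    """Battery capacity (mAh) needed for a given average draw and runtime."""
    return avg_current_ma * runtime_h

# ~800 mA average draw per staff, 4 h of runtime -> 3200 mAh,
# i.e. two 1600 mAh 14650 cells in parallel.
needed = required_capacity_mah(800, 4)  # 3200
cells = needed / 1600                   # 2.0
```

Note this ignores voltage sag and capacity loss at high discharge rates, so in practice you'd want some headroom on top.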
Then follows the address of the universe. Every node (the final destination for the packet) can support a couple of universes, which is just a way of letting a node understand more than 512 values if you need more. What the packet looks like exactly is not so interesting at the moment. The interesting part is what it means for your bandwidth, because remember, we want to do this wirelessly. We've got 35 staffs, and they each have 74 pixels. Each pixel has three LEDs, so it needs three bytes. That's 3 times 74, which is 222 bytes, which fits in our 512-byte Art-Net payload. That's the good news. But now we've got 35 staffs, and we want some kind of smooth animation, so we're looking at about 24 frames per second. That means we are sending 35 times 24 packets per second, each 512 bytes in size, and if we multiply that by 8 bits, we need about 3.4 megabits per second of bandwidth for 24 frames per second. That sounds very technical, but it has a huge implication when you think about which wireless technology you actually want to use.

These are the three usual suspects when it comes to wireless technology. One of them is Bluetooth, which is great but unfortunately doesn't have that much bandwidth: less than 3 megabits per second. The range is OK, but not spectacular. And one of the biggest killers is that it's really hard to get more than 7 devices connected. You can do more, but then you have to go into different kinds of network topologies, and it all becomes very esoteric and tricky. So Bluetooth was out. Then we've got ZigBee, which is even worse in terms of bandwidth, at 250 kilobits per second; that doesn't really work at all. The range is quite nice, but ZigBee is also freakishly expensive because it has a ZigBee logo on it; all these modules come at about $35 each. So that one was out too.
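The bandwidth figure quoted above works out like this (UDP/IP and 802.11 framing overhead are ignored, so the real requirement is somewhat higher):

```python
def artnet_bandwidth_mbit(staffs, fps, payload_bytes=512):
    """Rough wireless bandwidth for one full Art-Net payload per staff
    per frame, in megabits per second. Ignores packet overhead."""
    bits_per_second = staffs * fps * payload_bytes * 8
    return bits_per_second / 1_000_000

bw = artnet_bandwidth_mbit(35, 24)  # 35 * 24 * 512 * 8 bits = ~3.44 Mbit/s
```

That roughly 3.4 Mbit/s figure is what rules out Bluetooth and ZigBee in the comparison that follows.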
So we resorted to good old 802.11 Wi-Fi, which is great because up to 11 megabits per second gives you enough bandwidth to talk to your devices most of the time. The range is fine, and the number of devices is theoretically unlimited, though practically very limited. It's a bit energy-hungry, unfortunately, but that was something we had to deal with. And the great thing is that about one and a half years ago, a little chip called the ESP8266 was released. It's a little microcontroller that costs about $3, or around $2 if you buy more of them. That thing is great, and why it's great I'll tell you now. My initial plan was to just use an Arduino Micro, stick it in my tube, put Wi-Fi on it, and be done. But I did a bit more research and found out that the ESP8266 is a great little microcontroller in itself, perfectly capable of driving LEDs by itself; we didn't even need an Arduino, which was awesome. At this point I have to thank Charles Law and Peter Scargill, two very enthusiastic ESP8266 developers who figured this out and paved the path for me, so I could stand on their shoulders and use their initial technology demonstrations to start driving my LEDs.

A little more detail about the ESP and why it's great. This is the version I use. It comes in a dozen or more different versions, with different sizes and different flash memories, so there's a lot of variation. I chose this one because it's very, very small and has, for its size, the most GPIO output pins, which I use to drive the LEDs. It runs at 80 megahertz, so it's faster than an Arduino. It has more flash than an Arduino, more RAM than an Arduino, and built-in Wi-Fi. It can be a client; it can even be an access point, so if I switch this on, you can connect with your cell phone. It's great, and it costs $2.50.
So even if you destroy it, it doesn't matter; just buy 20, break one, whatever. Because it's so cheap, there are a couple of drawbacks. It runs on 3.3 volts, so you can't just stick your USB power pack on it; you need to be a bit more careful with your power. And it's not 5-volt tolerant, so if you put 5 volts on the TX line, you will destroy the chip. Don't do that if you want to play around with these. There are boards now, equally cheap, that come with onboard power converters, so you can feed them 5 volts. When I started, there were none, so I made my own, which looked like this in the first version. After I figured out that it worked, I created a slightly more sophisticated schematic and laid out a PCB. I don't know if you know Upverter; it's an online PCB designer, which is really, really good, and I enjoy working with it a lot. Just go online; if your designs are public, you can use it for free. It makes the design process very easy, and there are a lot of parts already there, so you don't need to design them yourself. So the whole process of designing a PCB becomes quite easy. After I designed it, I sent it to China, and luckily it came back, looking like this. If you want to have a look at it yourself: right there, pass it around. That's the thing that's inside; each of the elements has one of these. And what you cannot see, which is the point, is that this whole thing came down to about $6 to manufacture. And because it was so cheap, I made 100 of them, which was the right call, because now I have lots of spares. Does it work? Oh, there we go. Yes. So we've got a lot of them.
We can now illuminate whatever we want just by hooking one up. In fact, all you need is some LEDs and a battery, which everybody should have in their pockets at all times, and then you're always just one step away from a little illuminated room lamp, which is talking to my Wi-Fi. Sometimes I get problems with the Wi-Fi connection, and then it reboots, which is what you're seeing right now. But as soon as it finds the Wi-Fi in here, it should connect; I set up my own router for this. Anyway, we'll see more of that later. That's the fun part about having these little modules. If you want to have a look at the LEDs themselves, you can pass these around too; they are the exact same ones that are in the helix.

All right, now we've got the microcontroller sorted out, so we can put it all together. We ordered polycarbonate tubes (not PCBs this time) from China. Unfortunately, the ones they have in stock are half a millimeter too thin in wall thickness and were not stable enough, so we had to make a new mold half a millimeter thicker. Sometimes half a millimeter makes quite a difference; it's quite amazing. The new ones were much more stable, much better. I 3D-printed inlays to hold the batteries. The whole setup inside the helix ends up looking a bit like this: a battery goes in here, the controller goes in here, and then there's a lot of cabling. With the LEDs, the whole thing looks a bit like this. There's a lot of bubble wrap around it, because we wanted to keep the LEDs from sitting directly against the polycarbonate tube, to get a more diffuse effect. Otherwise you would see the pixels very sharply; this way it just looks a bit more beautiful. That's what the bubble wrap does. This is the other side, which sticks up there.
Then this whole wiring goes onto this and forms one of the staffs. And the bubble wrap, OK, you'll see in a second. Yes, exactly. This is exactly one of the elements of the helix. It's got a button on the top to switch it on. And the bubble wrap does basically this: over here, where the holder is, you see the pixels very sharply, while over here they become more diffuse. If the bubble wrap weren't there, the LEDs would also cling to one of the sides, and since we spin these around a lot, they would rattle. With the bubble wrap inside, you can't hear anything when you do this. So that's exactly what one of these elements looks like. And if this does not work, could you do me one favor and just restart that router? Just switch it back on? Yeah, thank you. Anyway, we can leave this one happily blinking away for now. "Is this the new lightsaber?" The what? Yeah. It doesn't do sound, though. It's fixed on, yeah.

So this is great, now we've got one of them. Unfortunately, we needed 40, which is where you bring all your friends in and make them solder and cut cables, solder some more, cut some more cables, cut LEDs, and bubble-wrap everything. Big props to my friends for helping with this; otherwise it would not have been possible, because building 40 of these is quite a feat. It took about two days of assembly to make all of them. We made 40 because there are always some defective ones, so we'd be able to use 35 in the end, which worked out quite nicely. And if that all goes well, what you end up with is a lot of fun. If you take those elements by themselves, you can start building funny structures with them, which are addressable, and you can have lots of fun. That was when we still had only half of them and built something a bit like a tent.
It's quite fun to have all these elements lying around, because you can really get into a playful spirit and start making all sorts of weird shapes out of them. But in the end, what it was supposed to be was a helix. So we assembled it, and this is basically what you see there, plus a cat on a leash, which always wants to be the center of attention. This also illustrates how the whole thing is basically a giant display. He really likes to be the center. You can look at the helix as a giant, very low-resolution display; how to use that, I'll get into later.

Let me first say a couple of words about the Wi-Fi situation. We figured out that Wi-Fi is great, but Wi-Fi is not so great if you have lots of clients, and we've got 35 clients here. That doesn't sound like much, but it actually puts quite a lot of stress on routers. I don't know how familiar you are with how Wi-Fi actually works in detail. Basically, Wi-Fi works on the 2.4 gigahertz spectrum, and on that spectrum we've got 14 official channels, although channel 14 is kind of an exclusive one that nobody uses, for regulatory reasons. The problem is that all these channels overlap. It's as if you go to a party where everybody speaks the same language, but everybody speaks a slightly different dialect, and the channel spacing is how different the dialects are: channel one is very similar to two, but very different from 13. The closer they are together, the more similar they are. So if one person speaks and a second speaks, a third person trying to listen can't, because two people are talking at the same time. That's basically exactly how Wi-Fi works. Two devices on Wi-Fi can only talk at the same time if their dialects are different enough, which means you can only use channels one, six, and 11 at the same time without causing interference.
If you try to use more channels somewhere in the middle, the signal quality on the other channels will degrade, which brings down your range and your bandwidth. That causes problems if you have 35 devices that are talking a lot. The first conclusion you come to is that consumer routers are really shit. If you get more than 10 active clients on a consumer router, welcome to hell; it's not fun. They are optimized to make your one-to-one Wi-Fi connection as fast as possible, so your gaming works, but they are not optimized for many devices. That is a big problem. And if you're sending that many messages (for 35 staffs at 24 frames per second, that's 840 messages per second), you are causing a lot of interference and congestion.

One way to attack these problems is to use an open-source operating system on your consumer routers, which makes them less shit. Still shit, but less shit, which is good. That was the first relief. In the end, I had to bite the bullet and do a three-router setup: I created three different Wi-Fi networks, one on channel one, one on channel six, one on channel 11, and divided the helix into three segments of 12, 12, and 11 staffs, each associated with one of those routers. So they all had their own little Wi-Fi network, which made it work most of the time. If I asked it nicely, and prayed for it, and said: please work tonight.

How did I split them, every second one, every third one? A third each, in three blocks. That's a good question, actually; I didn't think about it, but it might have been smarter to interleave them, like one, two, three, one, two, three, to get the staffs of the same group further apart. Good idea. And where did I actually place the routers? As close as possible to where the helix is performing.
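The channel split described above, and the interleaving idea from the Q&A, can be sketched as a small assignment function. This is my own illustration, not code from the installation:

```python
def assign_channels(n_staffs, channels=(1, 6, 11), interleave=False):
    """Split staffs across the three non-overlapping 2.4 GHz channels.

    The installation used contiguous blocks (12/12/11 for 35 staffs);
    the audience suggestion was to interleave (1, 6, 11, 1, 6, 11, ...)
    so staffs sharing a channel sit physically further apart.
    """
    if interleave:
        return [channels[i % len(channels)] for i in range(n_staffs)]
    block = -(-n_staffs // len(channels))  # ceiling division
    return [channels[min(i // block, len(channels) - 1)]
            for i in range(n_staffs)]

plan = assign_channels(35)  # blocks of 12 on ch1, 12 on ch6, 11 on ch11
```

Each channel group then gets its own router (and SSID), so the three networks never contend with each other.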
For the performance application, when we use it on stage, I usually try to get the routers directly underneath the stage, so the helix is performing literally on top of them. For the hanging installation, I had them directly behind, at the back of the wall. It worked so-so; if I jiggled the antennas and got them into exactly the right orientation, I could get it to work quite reliably, but it takes a lot of fine tuning. And it's very dependent on how many devices you have. Up to 20 devices, it's more or less no problem; as soon as you go over 20 elements, it becomes more and more of a hassle. So we have actually taken out, I think, five elements now and brought it down to 30 to make it more reliable, and also a bit lighter.

Also: use UDP and not TCP, because TCP tries to retransmit packets, and with such tight timing, by the time a TCP packet finally gets retransmitted, you're already past the moment where the LED was supposed to light up. Those were the measures I took to attack the Wi-Fi situation. It also really helps to do it in an area where there's no other Wi-Fi. We had a performance on Sentosa Island, which was great; it worked like a charm, because no one there was running any Wi-Fi. In front of the Singapore Art Museum, it's a different story. Especially if you have Wireless@SG around, you're in trouble. The way they handle a lot of clients, all the pedestrians around you wanting to get onto Wireless@SG, is to set up three routers very close to each other, so you have three networks on three different channels; that's how you do so-called high-density scenarios. So if you're in a public area where this is already happening, you're competing with Wireless@SG, which is a problem.

So much for the technical setup, which brings us to the final setup: we had the helix here, a laptop there, and three routers.
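The UDP transport just described, together with the Art-Net framing from earlier, can be sketched as follows. The packet layout follows the published ArtDmx format (opcode 0x5000, protocol revision 14); the IP address and function names are my own illustrations:

```python
import socket
import struct

ARTNET_PORT = 6454  # the standard Art-Net UDP port

def artdmx_packet(universe, data, sequence=0):
    """Frame one DMX universe (up to 512 bytes) as an ArtDmx packet."""
    if len(data) > 512:
        raise ValueError("a DMX universe carries at most 512 bytes")
    pkt = bytearray(b"Art-Net\x00")     # 8-byte protocol ID
    pkt += struct.pack("<H", 0x5000)    # OpDmx opcode, little-endian
    pkt += struct.pack(">H", 14)        # protocol revision, big-endian
    pkt += bytes([sequence & 0xFF, 0])  # sequence counter, physical port
    pkt += struct.pack("<H", universe)  # SubUni + Net address
    pkt += struct.pack(">H", len(data)) # payload length, big-endian
    pkt += data
    return bytes(pkt)

def send_frame(sock, ip, universe, rgb_bytes):
    """Fire-and-forget: one UDP datagram, nothing to retransmit late."""
    sock.sendto(artdmx_packet(universe, rgb_bytes), (ip, ARTNET_PORT))

# Usage (IP is hypothetical, in the style of the .101 address in the talk):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_frame(sock, "192.168.0.101", 0, bytes(222))  # one 74-pixel staff
```

Because it's UDP, a dropped frame is simply lost and the next frame overwrites it, which is exactly the behavior you want for real-time lighting.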
These three routers were connected to my laptop, and on the laptop I was running a piece of software called MadMapper. MadMapper is basically the software I use to create the Art-Net messages. MadMapper is great; it does a lot of things. It also does projection mapping, so if you use a projector, you can distort the image to fit different surfaces. But for me, the great thing was that it has a very, very nice module for driving LEDs. It's very flexible, supports a lot of different protocols, and you can interact with it in many ways; I'll go into that later. Unfortunately, it's Mac only; there is no Windows version. But it's very useful, and I'll give a short demonstration of how addressing these staffs actually works, if my Wi-Fi is cooperating, which I hope.

All right, let's see. Let me just switch to mirroring. Is this OK? OK, so this is how MadMapper looks. First of all, I'll check whether this one has connected to the Wi-Fi. The controller is just like any Wi-Fi PC, so I can basically go and ping it; what would it be, 101? I configured this in the router, so this one will get the IP address 101. Oops, it helps if you put "ping" in front. There it goes; I can ping it, it's answering my pings, which is great, so I know it's connected. And I can just put it into MadMapper: I say it's an Art-Net device with this IP address, and I tell MadMapper to use universe one and remap it to universe zero on the device, which is where it is listening. Now the really fun part: I can go to the LED module and put in a DMX fixture, and if I say this is universe one, then, there we go. These are all the LEDs basically stretched out, so a more accurate representation would be like this.
And you see this is the staff, and if I'm moving this up and down, you can see how it changes which LEDs light up. Now, if you've paid attention, I said I've got 74 LEDs, but this doesn't look like 74. That's because I've mirrored them, since I want it to show on both sides, so the actual number of addressable pixels is half: we've got only 37 pixels on it. And then I can do all sorts of fun stuff. For example, I could use my FaceTime camera. OK, there's some nice black part over here, so now it's basically switched off, but if I bring my hand in here, then I can basically play a video on this one-line display. All right. So this is why MadMapper is cool and why I use it a lot; it's the foundation of how I address the helix. You could also see that it was doing rainbows before; that's the demo mode. It's programmed so that as soon as it connects and receives an Art-Net message, it stops the demo mode and listens to Art-Net from then on. That's how you map the LEDs. What this allows you to do is treat the helix exactly as if it were a display, so you can literally play a video on it, and each of these elements is basically just one line on the screen. This is a video of how it looks if you use a video as input for the helix, as in a show. So this is the animation that I use to drive the helix, and you can see live how it translates to what's on the helix. It's a little overexposed, unfortunately, but I guess you can sort of see it. You can also see where Wi-Fi interference and lag come in; for example, the right side is lagging a bit behind at the moment. Luckily, it's not too obvious most of the time, which is good. It also depends on the pattern, which is interesting: depending on how much is going on, MadMapper is smart enough to adjust the message rate.
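The mirroring trick (74 physical LEDs showing 37 logical pixels) can be sketched like this. The wiring direction is an assumption on my part; the point is just that each logical pixel drives two physical LEDs placed symmetrically about the center.

```python
def mirror_map(pixels, led_count=74):
    """Map 37 logical pixels onto 74 physical LEDs, mirrored about the center.

    Assumed wiring: the strip runs out from the middle, so physical LED i
    and LED (led_count - 1 - i) display the same logical pixel.
    """
    half = led_count // 2
    assert len(pixels) == half, "one logical pixel per mirrored LED pair"
    # first half reversed, second half in order (one plausible layout)
    return list(reversed(pixels)) + list(pixels)

frame = mirror_map([(i, 0, 0) for i in range(37)])
# frame[0] == frame[73] and frame[36] == frame[37]: both halves mirror
```

So the DMX universe still carries only 37 pixels' worth of data; the doubling happens in the fixture layout.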
If the staff doesn't change, it won't send a new message, so it's not blasting frames continuously; only if everything is changing at once do you get the full frame rate. And then you can see that it will actually start lagging, unfortunately, unless you're on Sentosa. Is it mirrored in the middle? No, it's not; the red band you saw was the orange of the caps. The helix looks very rectangular, but if you put it together, the actual pixel dimension is roughly square, because it's 35 by 37, which is why the animations are square. The pixels go this way, and it's not mirrored; the animations go across the whole thing. Glad we sorted that out. So this is how you drive the helix with a video. Now, they wanted a live element for it as well. So what we did, as you see on the video, is we added a MIDI keyboard and a Kinect to generate live visuals. The cool thing is that if the helix is just a display, then whatever visuals I can generate, I can play directly on the helix, and I don't need to figure out how to translate my visuals into LED commands, because MadMapper does that for me. That was very convenient. I used Quartz Composer for generating visuals, and Max/MSP; a friend of mine who does Max/MSP built a patch for me. But before we get into that, a short excursion into Siphon, which is the best thing on the Mac ever. This is important for understanding the next part. So who knows Siphon? Way too few people. How many Mac users do we have in here? You're going to love this. Sorry to the rest; there is a Windows equivalent as well, but it's not so well supported. So what Siphon actually is: it's a framework, written in C++, which allows you to share video frames directly on the graphics card.
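The send-only-on-change behavior described above is simple dirty checking per universe. A minimal sketch of the idea (the function and parameter names are mine, not MadMapper's):

```python
def send_changed(frames, last, send):
    """Transmit only universes whose channel data changed since last frame.

    `frames` maps universe -> channel bytes; `last` caches the previous
    frame; `send` is any callable that ships one Art-Net message. This
    mirrors, in spirit, how traffic drops when most of the sculpture
    is static.
    """
    sent = 0
    for universe, data in frames.items():
        if last.get(universe) != data:   # dirty check
            send(universe, data)
            last[universe] = data
            sent += 1
    return sent

log, last = [], {}
send_changed({0: b"\x01\x02", 1: b"\x00"}, last, lambda u, d: log.append(u))
send_changed({0: b"\x01\x02", 1: b"\xff"}, last, lambda u, d: log.append(u))
# the second call re-sends only universe 1; universe 0 was unchanged
```

On a congested channel this matters a lot: a static background pattern costs almost no airtime.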
What that means is you can send video from one application to another in a very convenient way with almost no impact on CPU usage, and that is really cool. So why is this cool? Say you have our setup: I have MadMapper, which does all the mapping, and I want to get my video in there, but I want to generate it with Quartz Composer. And on the way, I want to add some fancy effects, which may be very difficult in Quartz Composer but very easy in a VJ tool; Modul8, for example, is VJ software. What Siphon allows me to do is take the video from Quartz Composer, Siphon it into Modul8, apply some video filters, and then Siphon the whole thing into MadMapper, which transmits it to the Helix. I'll try to demonstrate this quickly; hopefully it will work. We start with a Quartz patch. Is everybody familiar with Quartz? Who doesn't know Quartz? OK, a couple of you. Quartz Composer is a so-called visual programming framework, which is very useful for generating visuals. The way it works is you have so-called blocks, and these blocks have inputs and outputs, and you can connect and combine them. So if I just start this, which is a patch that lets me play MIDI notes on my keyboard, I have this here. You can open the blocks, and there can be sub-patches; there's a reason people call this noodling. Let me make this a little bigger. On the left side you have a completely unreadable block. There we go. So here we have a MIDI note receiver, which receives MIDI notes and sends a signal to this macro patch I created, which has a particle system inside. The particle system gets triggered and creates a visual, which we can see here.
So if I press a key, I get a nice little color blob. That's what this does. The cool thing about visual programming frameworks is that they are very easy to work with, and luckily there is a Siphon patch for Quartz. The thing about MIDI keys is you always forget to turn off the global hotkeys. So the easy part: you can put in a Siphon server, and now I can take the image output of this giant patch that generates the particle systems and connect it to the input of the Siphon server. We can also give this a name; we'll call it Quartz Siphon. If that works correctly, we can start Modul8, and we'll find it here. There it is, and we just activate it on this layer. I'm not going to go into detail about how this VJ software works; this is just for the sake of demonstration. And if I now press my key, we immediately see the whole thing appearing in Modul8. So what I can do now is start putting all sorts of fancy effects on it. There's some color weirdness here, and now the whole thing becomes very weird. I can even add another layer. There we go: now I've got my live camera feed here, but I still have my funny little blobs on it, which is great. Modul8 also supports Siphon, so I can just go to MadMapper, and I have the Modul8 main view here, and I can double-click it. Now I have my funky little... let me make it a bit wider. So hopefully, if this actually works... can you see it? Yes. You get these color blobs on the helix when I hit a key. So this is how you can really quickly mock up and mix visuals that you're creating, and Siphon lets you do it in a very efficient and fast way without putting any strain on your CPU. That's really neat, being able to do this so quickly. Sorry.
Have you ever really pushed the limit of how far you can chain Siphon through different programs, or do you typically just do two or three? I think the limit is not Siphon here; the limit is probably the software on either end. I haven't reached any limits yet. The most intensive chain I've done so far: I did live capture from a camera, piped it into an OpenGL shader to do video tracking, Siphoned the output from that into Modul8 to do some distortion, and Siphoned that into MadMapper. And that worked at about 30 frames per second. The limiting factor was actually the video capture, which every now and then would drop frames. So no, I haven't reached any limit on Siphon yet, because it doesn't really add extra work to your system; it just publishes the frame on your graphics card. Basically it tells other programs which address on the graphics card to look at for the image data. There's no load; it's just passing an address. Instead of copying the frame via the CPU, taking it out of RAM and copying it to a different address, which takes a lot of work, it just tells the next program: look at that address on the graphics card, that's where you find everything, and you get it from there. So it's super efficient that way. Yep. It sets up a server in the system; it's basically a registry for frames. It's a C++ framework, and you can integrate it into any C++-capable program, and a lot of people have written modules for all sorts of software and frameworks.
And the way it works: every piece of software that sets up a Siphon server registers it inside the Siphon registry, and every Siphon client can then look up and subscribe to whatever Siphon server is available. Sorry? Does it work with Resolume? Yes, I think so. I'd be surprised if it didn't. So how does the MIDI input flow through the chain I just demonstrated? It starts with Quartz: Quartz listens for MIDI notes, and every time it receives one, it uses it to trigger this little particle burst, which generates a live video feed. Every frame of this feed is published on the Quartz Siphon server. Modul8 subscribes to that server, grabs each newly generated frame, puts effects on it, mixes it, and publishes the resulting image on its own Siphon server, to which MadMapper then subscribes. Does it keep the resolution of the images? Yes, it comes out the way you stick it in: if it's an 800 by 600 texture, then what comes out on the other end is an 800 by 600 texture as well. It's really just a pipeline between programs, and that makes mockups really easy and really nice. Any more questions at this point? Cool. So let me get back to this. What mechanics are involved in the size of the helix and the spacing for it to move like that? The movement is a mirror ball motor at the bottom; a heavy-duty mirror ball motor. Does it have to have a certain width and gap? Yes, there are mechanical constraints. You have two belts running along the middle that hold the staffs in place. If you put these belts too far apart, you get a very stiff wave.
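The "registry for frames" idea can be sketched in a few lines. To be clear, this is a toy conceptual model, not the Siphon API: the real framework shares GPU texture handles between processes, but the publish/subscribe shape is the same.

```python
class FrameRegistry:
    """Toy model of Siphon's core idea: servers publish a *reference* to
    the latest frame (a GPU texture handle in the real thing); clients
    dereference it. No pixel data is ever copied through the registry."""

    def __init__(self):
        self._servers = {}

    def publish(self, name, frame_ref):
        self._servers[name] = frame_ref   # store a handle, not a copy

    def latest(self, name):
        return self._servers.get(name)    # every client reads the same handle

registry = FrameRegistry()
registry.publish("Quartz Siphon", frame_ref=0xBEEF)  # address-like handle
blob = registry.latest("Quartz Siphon")              # Modul8's side of the deal
```

Because only the handle changes hands, chaining three or four programs costs roughly nothing extra, which matches the experience described above.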
You don't get the full twisting anymore. So you have to keep them at about 15 to 20 centimeters; that's roughly the maximum distance before it becomes too stiff. And if you put the staffs too far apart, the belts start to twist: the two belts just wind around each other, and you lose the wave momentum. Instead of transporting the wave from one end to the other, the helix twists in the middle where the belts twist, and from there on the wave doesn't propagate anymore. So there is a ratio. We haven't calculated it; I can't give you a fancy physics formula. It was trial and error: we took what worked with the fire helix, carried it over, and adjusted it a little. I didn't do any physics research there. Maybe I should. It works. OK, so that was Siphon. Now, generating the MIDI notes works exactly as shown here: you take the keyboard, take the MIDI notes, and Siphon the visuals to MadMapper and onto the helix. And this is what it looks like when you play it on the helix. If you push the helix elements closer together, it actually looks more like a display, and it creates a really nice little effect. The only drawback with this method: if you put it into the hands of the public, there are very few piano players. And if you have to man this thing for four to five hours and listen to it for the fiftieth time, it gets a little annoying. So if you want to do audio effects on art installations, think twice, or make it so easy that it will always sound good. Don't rely on piano players; there are not that many, and those who are there are always shy, which is another problem. So that's the MIDI version. And then we had the Kinect version, which uses the Kinect camera. The Kinect was developed by Microsoft, and it's a 3D camera: it creates a depth image of its surroundings.
That depth image looks kind of like this. What you see here is basically just a grayscale image of what the camera sees, but it has depth information associated with it: the darker the image gets, the further away it is. This is cool because it allows me to very easily separate a person from the background. I can just say I'm only interested in pixels that are two meters away or less. If you're in an outdoor environment with the Kinect, you don't have any background, or the background you have is very far away, so it becomes very easy to isolate a single person: just ignore everything further than two meters, and that gives you the outline of a person. There are patches written for Quartz, patches for Max, patches for almost everything, to get this data into your system so you can use it to generate visuals. Due to lack of time, the only thing I was able to do with it was take the silhouette of a person and use it to paint an illuminated silhouette on the helix, which turned out to be quite fun. People could actually dance and see their movements directly translated onto the helix, like this. So in that sense it's really fun to play around with the Kinect. It can do a lot more: there are frameworks that do motion tracking, it can figure out where your hands are and where you're pointing, it can figure out whether you smile, I think. It's actually very sophisticated, so we barely scratched the surface with our usage here. But still, it was quite fun to have direct interaction, and this one was much more enjoyable, because you could just watch people dance and listen to the music; much, much more fun than the piano. Which brings me to the end of this talk and a couple of conclusions. This all sounded like it was very linearly planned and went perfectly well. That was not the case, of course.
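The "ignore everything further than two meters" rule is just a per-pixel threshold on the depth image. A minimal sketch, where the millimeter thresholds are my assumptions roughly matching that rule of thumb:

```python
def silhouette(depth_mm, near=500, far=2000):
    """Turn a depth image (one depth value in mm per pixel) into a binary
    silhouette mask.

    Pixels between `near` (the sensor's dead zone) and `far` count as the
    person; everything beyond simply drops out as background.
    """
    return [[1 if near <= d <= far else 0 for d in row] for row in depth_mm]

mask = silhouette([[8000, 1500, 8000],
                   [1400, 1300, 1500]])
# -> [[0, 1, 0], [1, 1, 1]]: the distant pixels vanish, the person remains
```

That binary mask is then just another video frame, so it can go straight through the Siphon/MadMapper pipeline like any other visual.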
You're always smarter at the end. We went in lots of different directions and hit a lot of dead ends before we figured out the plan that worked. So don't trust talks that tell you "this is what I did, and it all panned out beautifully." It wasn't that way. Another thing: really, really think twice before you use wireless technology to drive your piece. If you can cable it in any way, cable it; you will save yourself a lot of trouble. But when it does work wirelessly, it is actually really fun, and it's great because it allows you to take it anywhere, hand it to someone, walk around and have fun with it. Wireless is liberating for your art in that sense, but it's a pain in the ass. Also: don't run installations on batteries. Not a problem, but a hassle, because you have to recharge them every night, which means you have to go in, take it down, plug everything in, then unplug everything and set it all up again. It takes a lot of time. How did we charge the batteries? There's a little connector here at the top. I've got two batteries, one on each end for balance, and I basically bring the plus and minus poles out to here and attach an external battery charger; just a standard Chinese lithium-ion charger. How long does a charge last? I can't answer that exactly, because it hugely depends on what you actually display. If I put it on full white, it will probably last two and a half hours, because white takes a lot of power. In normal operation in front of the museum, after five hours it was still going. So it really depends on what you put on it, especially since the dance mode has a dim background illumination, and the only bright pattern is the person dancing on it, and that's not in white but in different colors.
So it's not drawing the full 60 milliamps per LED; that mode was actually quite energy efficient. It really depends on how bright your pattern is. In fact, running on lithium-ion at 3.7 volts already saves you a lot of energy, and the brightness difference is not very big. So I would advise everyone not to run these at five volts; run them at 3.7 if you can, because you save a lot of energy at a negligible brightness cost. That's actually quite nice about these LEDs. Oh, I forgot to mention one great thing about the ESP8266: you can program it wirelessly, which is great when you have 35 elements. The last thing you want to do is plug in every one and reprogram them one by one. Reprogramming over the air is great: you press one button, you program all 35 in one go, and it takes 10 seconds instead of two hours. That makes development a lot easier; another great property of this chip. Also, obviously, use existing technologies as much as possible, like Siphon and Art-Net. If you use Art-Net, a widely known standard, you can immediately integrate with other software that does a lot of what you need much better than you could ever develop it yourself. And AliExpress is great. AliExpress is the bomb. All right, that's it. If you have any questions, shoot. How long did the project take? All in all, I think I worked on it for about six months, but not exclusively. We started actively working on the whole Night Festival setup in January, and by August it was due. The Helix idea came up around February, and besides that I was also doing the rig, the steel rig, which had to be designed and manufactured, and we were also doing a fire-spitting aerial ring and an aerial chandelier. Those were the four pieces I was involved with, so it was not an exclusive project. If it had been, it would probably have taken about three to four months.
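Going back to the battery question for a moment: the runtime reasoning above is simple arithmetic. This sketch uses the ~60 mA full-white figure per RGB pixel mentioned earlier; the battery capacity and the linear brightness scaling are simplifying assumptions of mine, not measured values from the installation.

```python
def runtime_hours(battery_mah, led_count=74, ma_per_led=60, brightness=1.0):
    """Rough runtime estimate for one staff.

    Worst case is every LED at full white (~60 mA each for a
    WS2812-style pixel); dimmer or colored patterns scale the draw
    down roughly linearly.
    """
    draw_ma = led_count * ma_per_led * brightness
    return battery_mah / draw_ma

# Hypothetical 10 Ah pack: full white drains it in a couple of hours,
# while a dim pattern at 20% brightness lasts five times as long.
full_white = runtime_hours(10_000)                 # ~2.25 h
dim_show = runtime_hours(10_000, brightness=0.2)   # ~11 h
```

This matches the observed behavior: full white for two-and-a-bit hours, dim dance-mode patterns still going after five.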
What were the material costs? Material costs were, I think, 7,400 Singapore dollars, thereabouts, not including any manpower. Would I improve or change anything for the next performance? That is a good question. What would I improve? I wouldn't try to kill everything with one installation. A lot of the reason this is designed the way it is is that we wanted to get as much usage out of it as possible: we also perform with it, using these staffs by themselves as remote-controlled light-twirling performance staffs. The whole thing being both a performance prop and an installation leads to the problem that you have an installation running on batteries, which is annoying, because you always have to be there; you can't just put it up, leave it, and tell someone else to plug it in and unplug it at the end of the night. So I would probably not make the performance piece and the installation the same object again; that's definitely a lesson learned. I would also probably not do 35 wireless clients again unless I had a big budget for industrial-grade high-density routers. But then instead of paying $300 for a router, you're paying $3,000 a router, or thereabouts. That's a big cost jump; even at $1,000 they get expensive pretty quickly. What else? Embrace 3D printing. 3D printing is great; it helped a lot. Being able to print your ideas at a smaller scale, hold them in your hand, look at them, or try whether sizes fit. Upverter, the platform, for example, lets you one-click export a 3D model of your PCB with all the right bumps and everything, so you can put it on your 3D printer, get a plastic version of your board at actual size, and immediately see whether it fits into your tube inlay, how the battery fits, and all these kinds of things.
It makes for a great tool for these kinds of things. Definitely. So that's another general tip. Would heat be an issue for the boards or the wires? The LEDs do get hot, but we never had a problem with heat, even after it had been running for a while. You'd feel a bit of warmth if you put your finger directly on one, but they never get so hot that you couldn't touch them anymore, so no serious heat issues. Next question: the talk makes it sound like you derived functional parameters and picked the hardware systematically from requirements like performance time and current draw and all that jazz. Is that a systematization of your learnings after the fact? That's where this slide comes in: a lot of it was figuring things out along the way. I presented it as very planned, but it actually wasn't. That's probably one of the learnings I'm taking away: for my next project, I now know what to look out for, because this was my first project at this scale, so I'll plan these things a lot better. For me, the battery choice mostly boiled down to: what can I get, what fits in here, and how heavy is it? Because one of them is okay, but 35 of them turned out to be very heavy. The two performers using it on stage were not able to put enough tension on it to get a lift, so when it went vertical, the lower ends were often hitting the floor, because they couldn't pull it taut; it was too heavy. Also, the silicone belts are quite heavy by themselves.
Are those standard parts? Yes, and this is where the tube being one inch comes in very handy, because this cap is a standard part from another product, and we know the people who manufacture it, so we could get a lot of them quite cheap. And they seal it off waterproof, which is also nice. How much code did I have to write? Code-wise, it was mostly the programming for the chip. That was, I don't know, 10 or 15 classes in C; it's a manageable code base. It's mostly packet management, message management, and Wi-Fi connection management. And then I built a whole library of patterns that I can play on these, so we've got a bit more variation than just the rainbow. That's most of the programming. The rest is a little bit of Quartz. Unfortunately, most of the time went into construction and physically building this thing, so in the end I couldn't spend as much time on generating visuals and patterns as I would have liked. If we find another venue that wants it, maybe for a longer installation, I'll definitely put in more. And I like the fact that it's just like a display, because that makes it quite accessible even for other people: they can say "I've got this great animation," send over the video, play it on the helix, and start experimenting as well, without needing to program anything in a hard-coded way. Did I experiment with sensors? I would have loved to, but I didn't, because of budget. One idea I would have loved to do, but couldn't because of budget and time constraints, was to put an accelerometer on each staff and drive the color from the accelerometer, so the helix would color itself through its own motion: a color wave driven by the movement going through the thing. That would have been very cool.
But there was not enough time to figure out how to connect an accelerometer to the ESP, design a board, and all that. Maybe if I build another one, which is highly unlikely. Last question: are you guys obsessed with fire? Some of us are. For me personally, fire is not an obsession; I wouldn't call myself a pyromaniac. I like the element and its uncontrollable nature, but the more you work with fire, the more you get to know it: what you can do and what you cannot do. It's an interesting skill, because most people have a very fearful relationship with fire; "fire, oh my God, I'm not going near that," right? But most people don't realize how much you can actually do, how much you can actually touch fire, before you get burned. It also depends on what surface and what material is burning. There are a lot of different variations of fire, and it's fascinating to figure out how to manipulate it. For example, we do explosions with liquid and we do explosions with powder, and those are two completely different worlds. The powder one we can do indoors. We have literally, accidentally, blown one up into the face of one of our performers; he was standing behind us, two of us were doing this, and the guy got it all in his face, literally engulfed in a giant fireball. Nothing happened. If that had happened with liquid, he would probably have been dead. So these are the kinds of things you learn: okay, there's a person nearby and you're indoors, so you do powder, not liquid. If you're outside and there's wind, powder is a different story. Liquid is much less agile; a liquid fireball will not move as fast as a powder fireball.
If you've got a fan in front of you and do a powder explosion, you get all of it in your face. These are the kinds of skills you learn when you work a lot with fire, and it's fun. And if you start with gas, it's a whole different world again, also fun; all of a sudden you can do machine-gun fire and that kind of thing. But no, I'm not a pyromaniac. All right, I think we have to close out. Thanks again.