Hi, my name is Millie, and I'm going to continue Martin's theme of dreaming up and creating things and talk about wearable technology. The project I'm going to speak about is called Eidos. I'm going to talk about what Eidos is, our design process, and what its future impact might be.

So Eidos is a new approach to wearable augmented reality that lets us control and enhance our senses in real time. It's an approach that starts not with electronics and sensors that we can embed in devices to be worn and carried around with us, but instead with the human body itself, and how we can physically manipulate our bodies to allow us to experience and perceive the world differently.

So what were we interested in? The project started when really interesting trends were taking off in the design landscape, such as augmented reality and advances in big-data capture, like the quantified-self and life-logging movements. Google Glass and FuelBands are really useful devices: really useful for knowing your calorific expenditure during the day, or very useful if you want to surreptitiously check your emails wherever you may be. But what we're interested in is how we can create a new experience that allows us to sense more. So in this way we're interested in more physical and visceral forms of augmenting reality, and we looked towards science and art, towards people doing more radical things with the human body and combining that with technology. For example, this guy who embedded a webcam in the back of his head in order to give himself 360-degree vision, or printing electronic circuits onto the skin.

So what exactly does Eidos do? Eidos is a pair of experimental masks that allow you to enhance and control your senses in real time. Eidos vision enhances your perception of motion, and Eidos audio allows you to select and focus on speech. We found that we experience the world as a series of overlapping signals.
So when you're looking at this spectacular view, you're taking in many signals: brightness, colour, contrast, motion. And what we were asking was: what can you do with technology if you can isolate and then amplify these signals? Eidos vision isolates and amplifies motion, and Eidos audio isolates and amplifies speech. This is a short video which shows the experience of the two devices.

So Eidos vision creates a similar experience to long-exposure photography, except that it does this live, as it's happening. As you can see, it's revealing the hidden traces and patterns created by motion. Eidos audio allows you to select and focus on speech. So in a distracting, noisy environment such as this, the background noise is neutralised, and the speech you want to focus on is sent directly to your inner ear. So can you hear me now? Yes? Good, I can hear you.

So this is all very well and good, but what do we want to do with these kinds of technologies? Anywhere where live motion or audio analysis is valuable is a strong application for Eidos. As you're all aware, in sports, post-match analysis is very common: a football team will record the game and then analyse their performance with their manager after the match. For the first time, we're saying with Eidos that you can analyse your performance as the action is taking place. So not only does this provide perhaps a more efficient way for players to improve their performance, it also provides a new experience for the spectator. Eidos audio has strong applications for health. Because we're using bone conduction, we can restore, or even improve, signals which are weakened by ageing or disability.

I'm now going to talk a little bit about who we are and how we got here. We're a team of four. We all met at the Royal College of Art in London, where we were studying innovation design engineering at the time.
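As an aside, the "live long exposure" effect described earlier can be approximated by blending each new camera frame into a decaying accumulator, so that moving elements leave fading trails. This NumPy sketch is purely illustrative: the `motion_trails` function, the max-blend rule and the decay value are my own assumptions, not the actual Eidos implementation.

```python
import numpy as np

def motion_trails(frames, decay=0.9):
    """Blend each incoming frame into a fading accumulator so that
    moving elements leave trails, like a live long exposure."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    out = []
    for frame in frames:
        # Keep the brighter of (faded history, new frame) per pixel,
        # so past motion persists as a slowly dimming trail.
        acc = np.maximum(acc * decay, frame.astype(np.float64))
        out.append(acc.astype(frames[0].dtype))
    return out

# Tiny demo: a bright dot moving across an otherwise dark 1x5 "image".
frames = [np.zeros(5, dtype=np.uint8) for _ in range(3)]
for i, f in enumerate(frames):
    f[i] = 200  # the dot is at position i in frame i
trailed = motion_trails(frames, decay=0.5)
print(list(trailed[-1]))  # → [50, 100, 200, 0, 0]: old positions linger
```

In a real pipeline the same accumulator update would run per camera frame before the result is pushed to the head-mounted display.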
This is a postgraduate course which takes students from all different backgrounds, from all over the world. Tim's previous life was as a fine artist, Mi Eun's as an engineer, mine as a physicist, and Yuta's as a designer. Having very different backgrounds, we have very diverse perspectives on things, and I think this is probably what led us to our common interest in human senses. Combining that with an interest in technology, design and the human body, we set about on this explorative design process over the course of four months.

So our design process started with one research question: how can we add value to the human body? We started by hacking existing wearable devices, such as Emotiv headsets to do online bank transfers, or integrating the Oyster travel pass RFID tag into the human body to travel around, and seeing what experiences that gave us. We found that the value of the human body lies in how we experience the world, and we experience the world through our senses.

So our next phase of experimentation looked at isolating the individual senses and building them back up using a combination of form and technology. Here we're looking at sound and how we can use sound to manipulate our sense of space. Next we looked at vision, our natural frames per second, and how motion naturally grabs our attention. Then we focused specifically on hearing and asked whether there were different ways to transmit sound through the body. "Yeah, amazing. So I'm putting it on my tooth." "It's like you're speaking right in my ear!" What you see here is sound being transmitted through the tooth using vibration motors: it uses bone conduction, and the sound travels to your inner ear. And this was, as you can also see, a very successful experiment. So we straightaway experimented with integrating this into a wearable device.
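In signal terms, the tooth experiment splits what the listener hears into an air-conducted channel (the ears) and a bone-conducted channel (the teeth). A minimal, idealised sketch of that routing, where ambient noise is cancelled by playing its exact inverse at the ears while speech goes to the bone-conduction channel; the function name, the stand-in signals and the simple phase-inversion model of noise cancellation are all my own assumptions, not the device's actual processing:

```python
import numpy as np

def route_audio(speech, ambient):
    """Idealised three-channel routing: anti-phase ambient to both ears,
    isolated speech to the bone-conduction (tooth) channel."""
    # Ear channels: an inverted copy of the ambient sound; heard together
    # with the real ambient sound, the two (ideally) sum to silence.
    ear_left = ear_right = -ambient
    # Tooth channel: the directional-microphone speech, via bone conduction.
    tooth = speech
    return ear_left, ear_right, tooth

t = np.linspace(0.0, 1.0, 8000)
speech = 0.5 * np.sin(2 * np.pi * 220 * t)   # stand-in for a voice
ambient = 0.3 * np.sin(2 * np.pi * 50 * t)   # stand-in for crowd noise
left, right, tooth = route_audio(speech, ambient)
residual = ambient + left                    # what the ear actually hears
print(np.max(np.abs(residual)))              # → 0.0 in this idealised model
```

Real active noise cancellation has to estimate and invert the noise with low latency rather than subtracting a known signal, so this perfect silence is the best case, not the typical one.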
Design inspiration came from film and photography, and from studying different body behaviours, as well as from asking how we could combine the organic and the digital in order to represent the mixing of digital technology with the organic human body. We developed hundreds of foam models, as well as more technical rigs, in order to overcome the many challenges we encountered.

So how do they work? Eidos Audio captures sound with a directional microphone; the sound is processed through a computer and then output through transducers, motors and speakers. It has a left and a right sound input, as normal headphones do, and it also has a third sound input through the teeth. So distracting background noise to the ears is neutralised, and the speech you want to focus on is sent directly to the inner ear through the mouthpiece. The vision device consists of a webcam and a head-mounted display: the live image footage is captured, processed and displayed for the viewer.

So what's next? Since launching the Eidos project back in February, we've had quite a phenomenal amount of interest from the media, and also from potential clients and collaborators. What people really respond to is this notion that the senses aren't fixed, and that sound, touch, taste and vision can be modified and adapted. We're working with a company at the moment around the idea of synaesthesia, or sensory transfer, and perhaps using our technology to enhance taste through sound. The area of arts and live performance is an exciting one in which to test and develop our technology: looking at how you can reveal the hidden patterns created by dance and ballet and create an augmented live performance, or even perhaps a customisable live performance. And interestingly, just after we launched in February, MIT released this fantastic project, which shares similar principles to Eidos.
Their open-source code allows you to post-analyse a video of a baby: by revealing hidden information about the colour change of the skin, you can determine the baby's heart rate without using conventional heart-rate monitors. So I really believe that a future which combines enhancement of the senses with wearable technology will be a very powerful one. Beyond Eidos, we're continuing as a team to work on explorative design, and we hope to discover more. Thank you very much for listening.
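The principle behind that heart-rate result can be illustrated with a toy sketch: average the skin's (green-channel) brightness per frame, then find the dominant frequency of that signal over time. The synthetic "video" data, the frequency band and every name below are my own assumptions; this is not MIT's code, just the underlying idea.

```python
import numpy as np

fps = 30.0
n_frames = 300                  # 10 seconds of "video"
heart_hz = 2.0                  # 120 bpm, plausible for a baby
t = np.arange(n_frames) / fps

# Per-frame mean green intensity of the skin region: a steady baseline
# plus a tiny pulse-driven fluctuation (synthetic stand-in for a video).
green_mean = 100.0 + 0.5 * np.sin(2 * np.pi * heart_hz * t)

signal = green_mean - green_mean.mean()         # remove the DC offset
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n_frames, d=1 / fps)

# Restrict to plausible heart rates (0.7–4 Hz, roughly 42–240 bpm).
band = (freqs > 0.7) & (freqs < 4.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(round(peak_hz * 60))                      # → 120 (beats per minute)
```

With real footage the per-frame average would come from a tracked patch of skin, and the fluctuation is tiny, which is why MIT's work amplifies it before analysis.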