So, augmented reality — two basic things to cover today. The first is the project we worked on: HEART, which stands for Heritage Education Augmented Reality Tours. The second is just a general overview of augmented reality itself. So I'll just crack straight on. Lots of information, so please ask me questions at the end.

This is Ori Inbar. He's a software engineer. In 2007 he left his IT job to spend some more time with his family. What he discovered was that, like many of today's children, they were spending too much time with technology — playing games, watching screens, surfing the internet — and not nearly enough time out in nature. My philosophy is: as much time as you spend with technology, you must spend away from it. So he wondered: is reality too boring for children? How could we engage them more in reality? And if that's the case, what would reality 2.0 look like? He believes we need to take some of the things that attract them to these digital worlds and bring them into the real world — so, in a way, augment our world.

So what is augmented reality? It's not that. Definitely not. To augment is to add digital content to our world in a way that benefits it. I'm going to be talking about the visual side of things — there are reasons for that — but we have other senses we can augment as well, and we've been augmenting them for a long time. Smell-O-Vision is quite a famous one: watching TV and smelling what's going on as well. Audio: we've been augmenting our world with musical instruments and so on for a very long time, and we've got some pretty smart technology now, like this pair of headphones here, which can sense your heartbeat and your mood and change the music depending on what's going on. The reason augmented reality is so focused on the visual is that most of the data getting into our brains comes through our eyes.
Depending on the material you read, between 70 and 80% of the data going into our heads comes through our eyes. Now, without moving your eyes, each eye has approximately eight megapixels of resolution — it's hard to judge — and it's estimated that seven of those are in the central few degrees, as demonstrated here. Interestingly, each eye has a blind spot about 10 degrees out to the side. So if you do the whole movement out here, your thumb disappears at about 10 to 12 degrees. New party trick.

So where is augmented reality now? If we look at the Gartner hype cycle, we're probably somewhere around there, in the trough of disillusionment. There are a couple of reasons for that: one is hardware, and the other is software design — apps, et cetera. Currently the hardware relies on us carrying our phones (which I've lost, so if anyone finds mine, cheers), or an iPad, or the webcam on a laptop, so it's not very user-friendly. Eventually some of the more hardcore devices like this will be coming out. They have to get a bit better looking — I don't think too many of us will be walking down the street wearing those — but they've come a long way from the huge helmets of really not that long ago. There are some key things to overcome: a wide field of view, above 80 degrees, to immerse us in this new digital world; high resolution, obviously; and a few other little tricks like stereoscopic vision.

There are some other cool technologies we're working on — not me personally, but humanity. Retinal implants are being used at the moment in conjunction with an external camera: the camera feeds the retinal implant, which then sends signals through the optic nerve. And there's a very early prototype — I think it's only been tested on rabbits so far — of contact lenses. Although the resolution has improved somewhat from when that picture was taken, it's still very low, maybe 100x100 or 200x200 pixels.
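The thumb party trick above can be checked with a little visual-angle arithmetic. A minimal sketch in Python, assuming a roughly 2 cm wide thumb held at about 60 cm (both figures are my own assumptions, not from the talk):

```python
import math

def visual_angle_deg(object_size, distance):
    """Angle subtended at the eye by an object of a given size
    at a given distance (same units for both)."""
    return math.degrees(2 * math.atan(object_size / (2 * distance)))

# A ~2 cm thumb at ~60 cm (arm's length) spans roughly 2 degrees,
# small enough to vanish entirely inside the blind spot at 10-12 degrees.
thumb = visual_angle_deg(2.0, 60.0)
print(round(thumb, 2))  # -> 1.91
```

So the thumb covers only a couple of degrees of the visual field, which is why it can disappear completely into the blind spot.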
So nowhere near the megapixels we need to replace our vision. This is much more interesting. This is Magic Leap. It's very new and it's all hush-hush — it took a lot to find just these images. It's a new start-up run by Richard Taylor of Weta Workshop here in New Zealand, plus some other superhumans around the world, and they've just received $542 million of funding from the likes of Google to fix this hardware issue we're dealing with at the moment.

The way it works is it's a 1mm by 9mm device. It was originally used as a tiny camera — going in and pulling light in, as cameras do — but they decided to reverse it and use it as a mini-projector. The oscillating optic fibre there moves in a spiral pattern, changing its light as it goes, and it can produce — I'll just go back a slide — a picture much bigger than the spot of light from a single traditional optic fibre. So it's not hard to imagine something like that just sitting on the side of your glasses, projecting information straight into your eye. With one on each side you've got stereoscopic vision, so you can make digital content appear to be far away from you. They're also working on allowing you to focus at different depths. At the moment the whole thing is the size of this machine here, but we'll get there.

OK, so that's a little bit about the hardware side of things. I'd also like to talk about AR design, because it's really important for preventing disengagement and ensuring augmented reality continues to flourish. These rules were proposed by Lex Ardes. Augmented reality must emerge from the real world and relate to it — I'll come back to these. It must enhance reality, not distract us from it, like that original picture of the car. And it must deliver a superior experience to the alternatives — or better yet, there aren't any alternatives, and augmented reality is the only way it could be done.
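Going back to the fibre scanner for a moment: the spiral sweep it makes can be illustrated with a toy Archimedean spiral, where the radius grows linearly with angle. The parameters here are purely illustrative, not Magic Leap's actual scan geometry:

```python
import math

def spiral_scan(turns, points_per_turn, max_radius):
    """Sample points along an Archimedean spiral (radius grows linearly
    with angle) -- roughly the path an oscillating fibre tip sweeps out,
    covering a disc rather than a single spot of light."""
    pts = []
    total = turns * points_per_turn
    for i in range(total):
        theta = 2 * math.pi * i / points_per_turn  # angle keeps winding
        r = max_radius * i / total                 # radius grows steadily
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

path = spiral_scan(turns=3, points_per_turn=8, max_radius=1.0)
# the sweep starts at the centre and finishes near the rim
print(path[0], round(math.hypot(*path[-1]), 3))
```

Modulating the light source as the tip traces this path paints a full image over the swept disc, which is how a single fibre can act as a projector.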
For obvious reasons, that's not practical in a lot of situations. So, rule one: augmented reality must emerge from the real world and relate to it. This describes how content is attached to the real world — it must appear to be connected with it. There's no point having content just hovering in front of you; it needs to really come out of the real world.

Google Glass. This is sold as augmented reality — it's had a lot of hype and it's beefed up the story around augmented reality — but it's really not. The problem is it's a small screen off to the side of your view, which firstly means you don't get stereoscopic vision, so you can't do depth; but also, to actually see the content, you look up at a copy of the camera view off to your upper right. This is a little better. It's still not perfect, because we're carrying a device around, but you can position it in front of the object so you can see the magazine changing, making it appear as though the content is attached to the magazine, or at least coming from it.

Rule number two: augmented reality must enhance reality, not distract us from it. There are plenty of ways to escape into the digital world — we're really good at that at the moment. This device here, the Oculus Rift: if you ever get a chance to have a go, they're great. I should have brought mine along. Those of you who have tried it know what I'm talking about, and this is just prototype-level stuff. So we can already escape to digital worlds. We don't need to do that with augmented reality — that's not really the point. Yes, there'll be a bit of crossover, but again, not the point.

Rule three: augmented reality must deliver a superior experience to the alternatives. Here's the best example of this that I could find. This is a fascinating app called Word Lens. It translates English to and from six different languages.
The basis is you turn it on, you hold your phone up to a sign, and on the fly it changes the sign — in the same font and colour — to read in whatever language you've chosen. Brilliant. You couldn't do this any other way without making it more cumbersome for the user, and that's the point: trying to make it easier for them. Here's an example of some content that recognises a picture and then overlays some text — not random, it's obviously related — over the top of what you're looking at through the device. Not overly useful. And perhaps you could do this a better way: could you not just have a little plaque or a bit of writing on the side of the wall there? Why would you make your visitors download an app just to get that information, especially when not everybody has a device?

OK. For augmented reality to work, your device must first recognise and understand what it's looking at: computer vision. There are several different techniques. Firstly, 2D tracking — simple QR-code-based markers, or magazines like we saw earlier. It can also be done with 3D objects, CAD models, point maps, et cetera. Point maps work much like facial recognition: there's a set of points in 3D space, and it recognises them in exactly the same way. There's also non-visual tracking, like GPS: it works out your position and orientation, and the position and orientation of the content, so it can tell when you're looking at it and display it on your screen. A lot of these can be used in conjunction with each other to improve the robustness of an experience. One additional technique is simultaneous localisation and mapping, or SLAM. Basically, it builds a 3D point cloud such as this on the fly, and combined with the other techniques it makes for quite a robust experience.

So let's take a look at a couple more examples. I know I said: what's the point of covering up a painting like this?
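The GPS technique described above boils down to a bearing check: is the content's direction within the camera's field of view? A minimal sketch, ignoring elevation and sensor noise — the function names, field-of-view figure, and coordinates are all my own illustrative assumptions, not any real AR toolkit's API:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def content_visible(user_lat, user_lon, heading_deg,
                    content_lat, content_lon, fov_deg=60):
    """True if the content's bearing falls inside the camera's field of view."""
    bearing = bearing_deg(user_lat, user_lon, content_lat, content_lon)
    diff = (bearing - heading_deg + 180) % 360 - 180  # signed angle difference
    return abs(diff) <= fov_deg / 2

# Facing due north (heading 0), content slightly to our north-east is in view.
print(content_visible(-41.27, 173.28, 0, -41.26, 173.285))  # -> True
```

Real systems fuse this with the compass, accelerometer, and visual tracking, which is why combining techniques gives a more robust experience.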
It is slightly better than the other example. Firstly, it's actually attached to the painting, so you can walk around and look at it. Secondly, it's adding video and audio — things that are more interesting and that you couldn't just stick on a wall without putting a TV there.

This is one of my all-time favourites. It's a sandpit. The kids play in the sandpit and mould it, and a projector overlays the contour lines, so they learn about height and so on. They can then pour in digital water and watch how it reacts with what they've designed. Artists use it too, either by extending their own traditional 2D or 3D art, or — there's quite a movement for this — by adding things to other people's work. Classrooms: having a moving, scalable 3D model of a solar system or a heart is, for me, invaluable in the education space. There are a lot of different companies working on this, and it's probably one of the areas I'm most excited about. Another favourite: this is Elements 4D by DAQRI. They have six different cubes with different elements on each side, and students can bring them together to create chemistry experiments without the risk of blowing themselves or the teacher up. It's all animated, it works really well, and it's free — you can buy the wooden blocks, but you can also download the template for free. There are some marketing uses as well, but that's by the by.

So what's next for AR? As global citizens we've conquered travelling to locations and seeing the world — but what about bringing the world to us? I approached the Nelson Provincial Museum and the Suter Gallery last year and asked if I could replicate some of their exhibits using 3D technology and photographs. This is from the Roman exhibition they had at the time. I then attached all of that content to a blank space at our local polytech, where users could come in and have a look at either the Suter Gallery or the museum, and just flick between the two.
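The sandpit exhibit above is essentially quantising a depth-camera height map into bands and projecting lines where the band changes. A toy sketch of that idea — the heights and contour interval here are made up for illustration:

```python
def contour_mask(heightmap, interval):
    """Mark cells where the height band changes relative to the cell
    to the left or above -- those are where contour lines get projected."""
    band = [[int(h // interval) for h in row] for row in heightmap]
    rows, cols = len(band), len(band[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if (c > 0 and band[r][c] != band[r][c - 1]) or \
               (r > 0 and band[r][c] != band[r - 1][c]):
                mask[r][c] = True
    return mask

# A little mound of sand: heights in centimetres, 5 cm contour interval.
sand = [[0, 2, 4],
        [2, 6, 8],
        [4, 8, 12]]
print(contour_mask(sand, 5))
```

As the kids reshape the sand the height map updates and the projected lines move with it; the digital water is just another pass over the same height data.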
During that process I learnt loads about Nelson's history — I'm not a local Nelson person, but there's a lot of great history there. It's one of the older cities, so from the 1860s onwards there's the most amazing glass-plate collection, which I'm sure you all know about. They just reached their 100,000th digitised glass plate last week, and half of those are already uploaded online for people to access. I realised what a treasure these photographs are. This is classic 19th-century photoshopping to remove this lady's wrinkles — they sat there with little paintbrushes. It's brilliant. I also met the wonderful education team at the museum, who go around the city and hold these up so the kids can see what it used to look like. So we wondered: let's try and replicate that for the masses.

So this is HEART. You can go to — I'll just speed through because I'm running out of time. These are the lovely sponsors and people who helped it along — thanks. And these are the locations around the city. You can go to these locations now with the app, hold it up, and you get a representation of what it looked like 100 years ago. We used a different technology from what's been used before in Christchurch and elsewhere: something called edge-based mapping. We build 3D models of the scene, place them into the app, and use SLAM as well to give users a really robust experience with none of the GPS floating around. Unfortunately it takes a wee while — a good day or two — to make a scene, but it's worth it to have those pictures aligned perfectly.

I might just finish there and see if there are any quick questions. Anyone? [Audience question.] Yeah, awesome marketing — they've done similar sorts of things before. Thank you very much, and a warm round of applause for David. Thank you.