I'm Scott Darby, Creative Coder at IOHK. I'm the lead developer and creative on the Symphony of Blockchains project. We are now heading into the WebGL build of Symphony version 2, so I'm going to give you a quick progress update and show you where the project is heading.

Just to give you a quick overview of the project: Symphony of Blockchains is an interactive tool which allows users from a wide range of backgrounds to explore and learn about blockchain technology. It is an audiovisual experience which represents the underlying mechanisms and mathematical concepts underpinning Bitcoin, Ethereum, Cardano and other blockchains. Users will be able to hear the sound of a blockchain at a point in time and ascertain its properties simply by listening to it. The audio will expose transaction density, network health, user adoption and practical efficiency.

All of the structures and visual elements in the visualization are there for a reason: form follows function. The experience will be varied and dynamic, encouraging users to keep returning to explore and learn more. It will be information rich and aims to be the most comprehensive tool available for exploring a blockchain while being visually stunning. It will also be extensible and modular, which will allow several blockchains to be compared and let other developers contribute to the project easily.

To recap and show where we've come from, here's a selection of the static concept work I created for the project. You can hopefully now see some of this coming to life in the real-time version. As there are many parts to the blockchain world we're creating, we've decided on some locked-off camera views to guide people through the experience. This also helps from a performance perspective, as we can control what is being loaded and rendered on screen.
The user will be able to switch seamlessly between these camera views, going from viewing the entire macro blockchain structure to viewing individual transaction detail. The views we're focusing on are: a first-person or cockpit view, which will be the main view we use in VR; an underside Merkle tree view; a top-down view; a macro blockchain view; and an isolated block view.

Starting with the top-down view: once the user has zoomed in from the macro view, they enter the top-down view, where the transaction detail of a block is displayed. The controls in this view are like those of something like Google Maps: you can use the mouse wheel to zoom in and out and click and drag to pan around. This view could be a good opportunity for augmented reality, as the top-down view of each block is unique and would work well as an AR marker: the user could hold their phone up to the screen and see the transaction crystals extrude out. Here's a shot of what this currently looks like in the build. You can see the UI at the bottom with the details of the block currently being viewed.

To recap what you're looking at here: this is a block, with each of its transactions represented as a hexagonal crystal. The crystal height is controlled by the transaction value, and the color is controlled by the ratio of outputs which are spent: fully black means all outputs are spent, and fully white means no outputs are spent. The crystals are ordered by transaction age from the top left to the bottom right of the screen. The pattern of distribution on the plane is controlled by a simplex noise function. The noise amplitude is greater the less healthy the block is, so healthy blocks look much more ordered than unhealthy blocks.

In the cockpit view the user can fly around freely using standard first-person controls: the WASD keys and mouse. The user will be able to focus on transactions and see information about them in an overlay, and the sound will change dynamically as they move between blocks.
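The crystal mapping rules just described could be sketched roughly like this. All names, transaction fields and curves here are illustrative assumptions, not the actual build's code, and the cheap hash-based noise function stands in for the simplex noise the build actually uses:

```javascript
// Deterministic stand-in for a 2D simplex noise lookup, returning a value in [-1, 1].
function pseudoNoise(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return (s - Math.floor(s)) * 2 - 1;
}

// Map one transaction to a crystal's height, shade and position on the plane.
function crystalParams(tx, block, index) {
  // Height scales with transaction value (log scale keeps outliers manageable).
  const height = Math.log10(1 + tx.value);

  // Shade from the spent-output ratio: no outputs spent => white (1),
  // all outputs spent => black (0).
  const spentRatio = tx.spentOutputs / tx.outputCount;
  const shade = 1 - spentRatio;

  // Crystals are laid out by age on a grid, top-left to bottom-right.
  const col = index % block.gridWidth;
  const row = Math.floor(index / block.gridWidth);

  // Noise amplitude grows as block health (assumed in [0, 1]) drops,
  // so unhealthy blocks look disordered and healthy blocks look regular.
  const amplitude = 1 - block.health;
  const offsetX = pseudoNoise(col, row) * amplitude;
  const offsetY = pseudoNoise(row, col) * amplitude;

  return { height, shade, position: { x: col + offsetX, y: row + offsetY } };
}
```

For a perfectly healthy block the noise amplitude is zero, so the crystals sit exactly on the grid.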
Now, the sound in the first-person view. The user can explore an ambisonic soundscape generated by the transaction data. Each block has a unique piece of music or sound design, created using additive synthesis to combine sine waves generated by the data in each transaction. The closer the user is to a block, the louder that block's sound will be, so as they move along the chain the sounds morph into one another, creating a blockchain symphony.

The audio is created using the following rules. Each transaction is assigned a fundamental pitch from a musical mode based on the transaction value; higher values result in lower frequencies. Each transaction has 16 harmonics, and the number of transaction outputs which have been spent controls the volume of these harmonics: transactions with all outputs unspent play all harmonics at full volume, while transactions with all outputs spent play only the fundamental pitch. The health of the network when the block was mined controls the range of randomness assigned to the pitch of the harmonics, which means that healthy blocks sound much more harmonious than unhealthy blocks.

Here are some screenshots of the first-person mode. You can see the UI overlay here that's been designed by Julie: you'll be able to click on individual transactions and see them on the right there, with all the detail about them. Here are just a few more shots from the WebGL build. This is showing the Merkle tree from underneath each block, and that's showing a series of blocks all connected together with the Merkle trees on the bottom. So as you can see, the WebGL build is starting to look quite a bit like some of the offline render concepts we put together at the beginning.

The user interface is a really key part of this. We're including lots and lots of data in there for people who want to use it, so here's a design concept of how that interface could look.
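The audio rules above could be sketched as a function that turns one transaction into a stack of partials for an additive synth. The mode, the value-to-pitch curve and the detune range are illustrative assumptions; the build's real mappings differ:

```javascript
// Assumed musical mode: pentatonic scale degrees, in semitones.
const MODE = [0, 2, 4, 7, 9];
const NUM_HARMONICS = 16;

// Map one transaction to 16 partials ({ freq, gain }) for additive synthesis.
function txVoice(tx, networkHealth, rng = Math.random) {
  // Fundamental pitch from the mode, based on value; bigger values drop octaves,
  // so higher values result in lower frequencies.
  const step = Math.floor(Math.log10(1 + tx.value));
  const degree = MODE[Math.min(MODE.length - 1, step)];
  const fundamental = 440 * Math.pow(2, (degree - 12 * step) / 12);

  // Spent-output ratio gates harmonic volume: all unspent => all 16 harmonics
  // at full volume, all spent => fundamental only.
  const unspentRatio = 1 - tx.spentOutputs / tx.outputCount;

  // Network health at mining time sets the detune range of each harmonic
  // (up to +/-5% when very unhealthy), so unhealthy blocks sound discordant.
  const detuneRange = (1 - networkHealth) * 0.05;

  const partials = [];
  for (let n = 1; n <= NUM_HARMONICS; n++) {
    const detune = 1 + (rng() * 2 - 1) * detuneRange;
    const gain = n === 1 ? 1 : unspentRatio;
    partials.push({ freq: fundamental * n * detune, gain });
  }
  return partials;
}
```

In a Web Audio implementation each partial would drive one `OscillatorNode` through a `GainNode`, with an overall gain scaled by the listener's distance to the block.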
We're imagining this will be on the left-hand side of the screen, and the user will be able to pull it out so they can search by date, search by block hash, and pull up lots of information about what's going on.

This is the underside Merkle tree view, which will let the user explore the Merkle tree and see how it connects to the crystals on the plane above. The isolated block view will allow the user to isolate a block from the chain, rotate it around and really explore it in its entirety.

Okay, I've got some examples of the sound design here. This is an unhealthy block, meaning fees were very high when it was mined. You can see the crystals coming in there based on the transaction time; when they're glowing white, that's when they're emanating a sound. Hear how the healthy block sounds much more musical than the other block. Here's a block from very early on in the blockchain. You can hear it's got a much simpler sound to it, and that's just down to the number of transactions in the block. And as you fly around the blockchain you can hear each block blending into the next.

And here's the macro view. This is the overall view of the blockchain: a map the user can use to explore the entire structure. This has changed slightly in the build, actually, which you'll see from the demo shortly. This is another screenshot from the WebGL build, so you can see how it's really coming along. You can see this large sphere in the distance, like a planet. We're planning for this to be the mempool, which is where all the unconfirmed transaction activity is.

Okay, so I'm just going to show you a live demo of this, just to prove that it is coded and working. This is the entire blockchain structure here, which you can see in these rings around the planet, and if I zoom in with the mouse wheel you can see all the individual blocks.
The closer we get, you see the UI pop up; you're within a certain boundary of the block and you can hear it starting to play its sound. Basically, when each of these crystals is glowing white there's a sound emanating from it. And I can enter the cockpit mode here. This obviously needs a bit of UI design, so I'm just using the keyboard and mouse to fly around. This is all using real data being pulled in from the Blockchain.info API. If we go underneath, you can see the different Merkle trees there connecting up. And if I go in somewhere closer, this is where all the latest blocks are being mined.

Hi, I'm Andy Buchan from creative studio KUVA. We've been working alongside Richard Wilde and Scott Darby on developing Symphony as a touring installation. In order to best frame what follows, I thought it worthwhile to give you a brief idea of the ambition we're aiming towards. We're looking at the intersection of technology and innovation to create a unique augmented physical installation that can appear everywhere from a gallery space to a piece of more traditional marketing: a physical installation in a gallery space that allows users to interact with, interrogate and understand a series of aspects of the blockchain through the use of augmented reality.

The installation will launch with a focus on interrogating the mempool. Echoing the work initiated in Symphony 2.0, we will create a physical globe. A sphere will be sunk into the wall of the gallery space, creating a visually intriguing and inviting moment. Participants will be able to see the rest of the globe, with geolocated mempool transactions, and interact with the content through their handset in an augmented reality view. As with the web experience for Symphony 2.0, this augmented installation will be information rich, with an aim to draw visitors in. It will be widely accessible in its approach, and engaging for experts and academics along with the general public.
The augmented reality experience will also be accessible beyond the gallery space, giving us an interactive tool to support education and engagement opportunities further afield.

Project outcome. There are four key areas here: ownership, education, attention and engagement. Again, as with Symphony 2.0, the physical installation and its augmented view enable us to continue to carve out ownership around how the industry articulates the underpinning concepts; it allows us to own how the world visually understands the blockchain and cryptocurrencies. The installation will provide a one-of-a-kind experience and the first moment the general public will be able to physically interact with and interrogate specific aspects of the blockchain. It will drive understanding. Lastly, it will generate meaningful PR attention and, ultimately, interest in both IOHK and the wider Symphony project.

Just to touch on live data and its importance: the project itself will be built from live data, and we feel this is vital for its integrity and success. The visuals that follow are built from real data gathered last Thursday, with around 6,400 unconfirmed transactions in the pool.

Going into a bit more detail on our approach to the data and our current thinking, there are four key aspects to the visualization: shape, size, fee level, and motion. Shape: all transactions are represented in the same way; fundamentally they are identical apart from size, value and fee. Size: the value of each transaction is represented by the object's size; larger objects are more valuable. Fee level: using a predetermined, clamped and fixed color gradient, a transaction's fee level is represented by its color. Motion and animation: the mempool will move and animate; the project should feel alive. Transactions appear as they are sent and are removed as they are mined into a block.
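The four mapping rules above could be sketched like this. The fee bounds, gradient endpoints and field names are illustrative assumptions, not the installation's actual code:

```javascript
function lerp(a, b, t) { return a + (b - a) * t; }

// Map one unconfirmed transaction to a mempool object: identical shape for all,
// size from value, color from a clamped, fixed fee gradient.
function mempoolObject(tx) {
  // Size grows with value; log scale so very large transactions don't dwarf the rest.
  const size = 0.5 + Math.log10(1 + tx.value);

  // Fee level clamped onto a fixed gradient (here assumed blue -> red, 1-100 sat/byte).
  const LOW_FEE = 1, HIGH_FEE = 100;
  const t = Math.min(1, Math.max(0, (tx.feeRate - LOW_FEE) / (HIGH_FEE - LOW_FEE)));
  const color = { r: lerp(0, 255, t), g: 0, b: lerp(255, 0, t) };

  return { size, color };
}

// Lifecycle: objects appear when a transaction is broadcast and are removed
// once it has been mined into a block.
function updatePool(pool, broadcast, minedTxIds) {
  const mined = new Set(minedTxIds);
  return pool.concat(broadcast).filter(tx => !mined.has(tx.id));
}
```

Clamping the gradient keeps extreme fee outliers from washing out the color range for typical transactions.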
Looking at the physicality of the installation itself: what you see here is a very initial scamp for it. Our current thinking is that we would show a thin sliver of the globe, as opposed to the entire object, peppered with these crystal-like structures, again referencing Scott's work in Symphony 2.0, but really creating a physical moment that is intriguing and engaging, drawing viewers and participants over to the installation and encouraging them to engage with it. On the right-hand side, you see how it might potentially look through an AR screen, an AR portal, revealing the remainder of the globe and allowing people to see behind the wall, giving them a really strong visual reward for that digital engagement.

Here are a couple of quick materiality studies and animation tests showing how this live data would come in. As you can see on the left-hand side, as this video plays, there's this interesting dynamic physics-based animation that really gives a sense of the frequency at which these transactions appear and are pushed into the pool. On the right-hand side there's a quick materiality study giving a glassy, almost crystal-like structure to the individual transactions, which again flows through into the work that Scott is doing within the wider context of Symphony 2.0.

We thought it was important to briefly break out and talk about the opportunity that augmented reality gives us within the wider context of IOHK. Within the installation, and certainly in what we detailed above, we think it's tremendously valid, but we absolutely see the potential for AR being used as a tool within the wider team, whether that's by someone like Aggelos or Charles and beyond, using it to interrogate specific aspects of their research or outputs and giving them a unique tool for presenting.
As you see here, we've quickly done a couple of mock-ups of exactly how we could see augmented reality being used in the context of, say, Aggelos presenting a particular aspect of his work. We really feel it could be something that drives audience engagement and, ultimately, a deeper understanding of the concepts, or really whatever Aggelos and the team within IOHK want to present and talk through.

Getting back to the installation: we thought it was important in this next segment to explain the work. The goal within this sprint, as we'll go through it, is to show off the potential of the augmented reality installation, to explain and outline that potential to the key stakeholder team. In order to do this, we created a quick, internal-facing script that goes through the user journey alongside the experience inside the gallery space.

What we have here is a user flow. It has three components down the left: the user action, the behavioral goal, and ultimately the emotional result of that action. There are five key stages within that process: discover, approach, engage, interact and learn. While the user is outside the gallery space, or consuming information about it through media, the behavioral goal is curiosity, driving them to the gallery space; this allows us to build awareness. As they approach the physical installation, it's core that whatever physical aspect there is to it really draws attention and intrigue: the behavioral goal is visual surprise and that kind of physical attraction. And ultimately, when they get close to it, it's important that it has the level of detail that drives that sense of surprise up close.
Moving on to the engage segment: it's important that there's an informational aspect that briefly explains the exhibit and the installation in front of them, whether that's a URL that pushes them to an augmented view or an app for download. Ultimately, the behavioral goal is that people understand what they're looking at and understand the interaction; the emotional result is, again, curiosity, pushing them towards an action: downloading and, if we need it, registration.

With the augmented view accessed, they're in the interact stage, and fundamentally this is all about the emotional goal. We really want to make sure that the visual layer is, in the first instance, delightful to look at, rewarding the user and participant for that action of downloading and engaging with it. It should drive that kind of lingering within the context of the exhibit: they're allowed to explore it and are really rewarded for doing so.

Then we move on to the final stage, which is fueled by how engaging the content is: the better it is, the longer they will dwell, and the more they'll learn. That's the ultimate goal here: to fuel that learning and that engagement so that people linger in the space and are really captivated by it. It can also serve as a jumping-off point into Symphony 2.0, and certainly into IOHK in general, which we think is certainly the ambition.

To be more explicit about how we feel this experience will flow, we've done a quick storyboard and blocked out some of the key moments and scenes as we see them, in terms of how people will enter the gallery space and interact with the installation.
As you can see, outside the gallery space it's a traditional passing-by engagement, but it's obviously key that we have something in the window that grabs the attention of the passing public and really draws them in. Once they're inside, there's the flow and the intrigue driven by the sphere buried in the wall, and then the engagement and interaction surrounding that physical moment: the use of the mobile device to either scan the QR code or go directly to a URL for a potential download of an application; then looking at the augmented view, exposing or revealing the rest of the sphere; and allowing participants to scroll around or spin the globe and either zoom in or physically move closer to explore the data woven into it. That ultimately completes the experience, ending on that final lingering moment of interrogation and exploration.

To wrap up, we thought it worthwhile visualising how augmented reality could be used in the wider context of IOHK, as an additional installation or as a tool to support presentations. We think there's a great opportunity and great utility in the technology to facilitate audience engagement and, ultimately, deeper understanding. And just to summarise: we certainly feel that this initial work around the mempool is only scratching the surface of the opportunity, certainly for positive attention and really meaningful engagement. That concludes our presentation on the augmented installation. Thank you very much for your attention.