My name is Henry, I'm founder and CEO at Streamr, and it's magnificent to be here. I've watched all the talks from the previous Devcons and decided that this is what I want to do and where I want to be, and now I'm here. Amazing. So thanks for having me. I'm going to introduce Streamr to you: briefly tell you what the vision is and how it works. Then I'm going to do a live demo and spend most of my time on that. If the Wi-Fi gods permit, it will go well. At the end I'll talk about how Streamr fits into a decentralized service stack consisting of Streamr and other projects that offer services for decentralized application development. So let's jump into it.

Streamr aims to tokenize the value in real-time data. For example, consider a self-driving electric vehicle. To deliver the best possible passenger experience, the car constantly needs data from other machines: traffic congestion information from other cars, electricity prices at nearby charging stations, weather forecasts, and so on. Streamr provides a single interface for the car to go out and subscribe to the data it needs, and to pay for it using a cryptographic token called DATAcoin, which lives on the blockchain. Now, the car is also a data producer. It can sell the data it produces: traffic information to other cars, road condition measurements to smart cities, passenger blood sugar levels to nearby advertisers, or whatever. So a data stream economy emerges.

Streamr is implemented as a decentralized peer-to-peer network for data delivery. A data source can connect to any node in the network and publish a message, and it instantly gets delivered to valid subscribers. We aim for the low-latency, high-throughput use case in a massively scalable pub/sub pattern. The Streamr network is designed to horizontally scale up to potentially billions of events per second using a clever sharding scheme.
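The publish/subscribe pattern described above can be illustrated with a minimal in-memory sketch. This is only an illustration of the pattern, not the Streamr network protocol: a real node would relay messages peer-to-peer and check subscriber validity against on-chain access control.

```python
# Minimal in-memory sketch of the pub/sub pattern (illustration only,
# not the actual Streamr network API or protocol).
from collections import defaultdict
from typing import Callable, Dict, List

class PubSubNode:
    def __init__(self) -> None:
        # stream id -> list of subscriber callbacks
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, stream_id: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[stream_id].append(callback)

    def publish(self, stream_id: str, message: dict) -> None:
        # Deliver the message to every current subscriber of the stream.
        for callback in self._subscribers[stream_id]:
            callback(message)

# Usage: a vehicle publishes telemetry, a map view subscribes.
node = PubSubNode()
received = []
node.subscribe("public-transport-demo", received.append)
node.publish("public-transport-demo", {"id": "tram-7", "lat": 60.17, "lon": 24.94})
```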
Sharding works by dividing the whole data flow going through the network into independent partitions and assigning asymmetric responsibilities to nodes in the network using a reputation mechanism. The Streamr network is an off-chain companion network which uses an underlying blockchain for security-critical operations such as value transfers and access control. Decentralization is achieved by allowing anyone to run a node in the network and contribute bandwidth and storage in exchange for tokens as a reward. So the network is run by users instead of giant corporations, which means that you own, control, and monetize the data that you produce. Of course, decentralization also makes the network more robust against attacks and node failures, helping guarantee the delivery of data in all circumstances.

In addition to the scalable messaging infrastructure, we're also building two applications. The first is a marketplace which allows data producers and data consumers to find each other; it's kind of like the app store for data streams. But raw data is usually not very useful as is. Some kind of computation is needed: filtering, aggregation, combining different data streams, visualization, stuff like that. For this purpose we are creating the Streamr engine. The engine has a user interface called the editor, which is based on visual programming and looks like this. It allows developers to get their hands dirty with the data very quickly, promoting quick prototyping and allowing them to seamlessly combine powerful, scalable off-chain analytics with smart contracts, for example. And we'll do a live demo of this right now.

The demo I'm going to do touches all the layers in the stack, but of course the most visible thing will be the editor, because the rest are just infrastructure and they look like APIs. So you will see the editor mostly.
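The partitioning half of that sharding idea can be sketched with a stable hash. This is a hypothetical illustration: the reputation-weighted assignment of responsibilities mentioned above is not specified in the talk, so only the "divide the flow into independent partitions" step is shown.

```python
# Hypothetical sketch of stream partitioning: each message is routed to a
# partition by a stable hash of its key. The reputation mechanism that
# assigns node responsibilities is not covered here.
import hashlib

def assign_partition(key: str, num_partitions: int) -> int:
    # A stable cryptographic hash means the same key always lands in the
    # same partition, no matter which node computes the assignment.
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_partitions

# All events from one vehicle stay in one partition, preserving their order.
p1 = assign_partition("tram-7", 16)
p2 = assign_partition("tram-7", 16)
```

Keeping one key in one partition is what lets partitions scale out independently while per-vehicle event ordering is preserved.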
But it's good to bear in mind that the engine and the editor are just one application on top of the network. Any application can connect to it and produce and consume messages. So let's see what happens. Let's see if we are okay.

This is the editor, and we're going to start with an empty canvas, obviously. I'll go to the search and search for the data stream I'm going to begin with. This stream is called "public transport demo". It contains telemetry from public transport vehicles running in Helsinki, Finland, which is where I'm from. You can see on the box, and I'm going to zoom in a little, all the fields contained in the objects in this stream. To see what the data actually looks like, let me pull up a map. I'll connect the vehicle ID over here to the ID input on the map, and the latitude and longitude fields to the map as well, and I'm going to hit start and see what happens. We instantly get a real-time view into where the vehicles are at the moment in Helsinki.

Okay. So what does this have to do with Ethereum and blockchain? Let's take a use case. Let's pretend I am Helsinki. No, I'm the mayor of Helsinki, that sounds better. Okay, I'm the king of Helsinki, and you guys run a transport company. I want to hire you to run the public transport in my city. But I don't want to make a fixed deal, I mean pay a fixed sum per year, because that gives me zero guarantees on the quality or quantity of your service. So instead, let's make a data-driven smart contract: I'll pay you 100,000 wei for each meter traversed by your vehicles. Sounds fair? Okay, let's do it.

What we need first is a smart contract. A cool thing here is that we have something called the Solidity module. It has an "edit code" button. I'll first select which account I want to use, then hit "edit code" and pull up a little template from my text editor. It's a smart contract called "pay by use", which is also available on the platform. Okay.
I hit apply and it should get compiled. I have my constructor parameters over here, so let's initialize the contract with your address, since I'm paying you, right? I'm going to just get some address over here and paste that. We agreed that I'll pay 100,000 wei, and I'll initialize the contract with some test ether and hit deploy. So I can create and deploy smart contracts directly from the interface, and later on I'll show how to connect data to the smart contract that I deployed.

Let's wait a while for that to get mined, and in the meanwhile let's add the computation. We now have raw data. How do we get from that raw data to actual transactions that we can send to the smart contract? Something is needed in between, right? There are like 100 events per second coming from that stream, and we definitely don't want to push 100 transactions per second onto Ethereum. So we need aggregation. We also need to calculate, from the raw data, the distance the vehicles have traveled.

I'll show you another cool trick here: I'm going to add a sub-canvas over here. So we have abstractions. Yo dog, I heard you like boxes, so we put boxes inside boxes inside this box. We have a lot of boxes. Okay. I'm going to connect the vehicle ID and speed. We're going to calculate distance as speed multiplied by time, instead of from differences in GPS coordinates, because those are shaky and noisy. Let's add a chart and see what's happening. We have two outputs; I'm going to zoom in again. One of them says "batch" and one of them says "current". I'm going to connect "current" to the chart. What this box does, in addition to calculating the travel distance, is aggregate it. It accumulates the distance up to a threshold, after which an event comes out from the "batch" output. So this should keep rising until my set threshold, which is 1000. That's a bit too much maybe; I'll change it to 500 and see if it works. A runtime change. So now we're at 300. Come on, come on, give me some more.
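The aggregation box just described can be sketched as a small accumulator. The names below are illustrative, not actual Streamr module names: distance is integrated as speed times elapsed time, the "current" value accumulates, and when it crosses the threshold a "batch" value is emitted and the accumulator resets.

```python
# Sketch of the demo's aggregation box (illustrative names, not real
# Streamr modules): integrate distance = speed * time, emit a "batch"
# event at the threshold, then reset and keep accumulating.
from typing import Optional

class DistanceAccumulator:
    def __init__(self, threshold: float) -> None:
        self.threshold = threshold
        self.current = 0.0  # meters accumulated since the last batch

    def on_event(self, speed_m_s: float, dt_s: float) -> Optional[float]:
        """Feed one telemetry event; return a batch amount when the threshold is hit."""
        self.current += speed_m_s * dt_s
        if self.current >= self.threshold:
            batch = self.current
            self.current = 0.0  # reset and start from the beginning
            return batch
        return None

# One minute of events, one per second, vehicle moving at 10 m/s.
acc = DistanceAccumulator(threshold=500.0)
batches = []
for _ in range(60):
    out = acc.on_event(speed_m_s=10.0, dt_s=1.0)
    if out is not None:
        batches.append(out)
```

After 50 events the accumulator reaches 500 m and emits one batch; the remaining 10 events leave 100 m pending for the next batch. This is exactly the rate limiting the demo needs: roughly 100 raw events per second collapse into an occasional on-chain transaction.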
So this, when it reaches 500, should go back to zero and start from the beginning, and that's the moment when we can commit to the blockchain. So we're aggregating, and every once in a while we commit to the blockchain. I'll show you in just one second how it works. Okay, so this is what we're after. One more thing to do and then we're ready.

This "batch" output will send out the accumulated amount before resetting, and that's what we want to report to the smart contract. How do we call the smart contract? I'm going to add a module called "Ethereum call". Let's make the calls from the tram demo account. Which contract do we call? This one. Which function do we call? The function called "update". What the update function does is basically multiplication: we report how many units we have traveled, and it multiplies that by the preset price of 100,000 wei per meter. Only for you, my friend. So we connect "batch" to "added units", and we're basically done.

Let's add a couple more things to see what's happening. I'm going to add a table and connect the "batch" output to it. I'm going to add another table for the events that the smart contract might trigger. There's one called "paid amount"; I'm going to connect that to the table. So we make the transaction, and we also get feedback back into the user interface. Now this is ready. Let's start and see what happens. Sorry, I've got the Mexican flu or whatever. When we reach 500, we should have a line here indicating that something came out of the batch. There it is. And now there should actually be a pending transaction on the testnet. There it is. And when it gets mined, we should get one back over here. Wait for it. These are the longest seconds of my life, standing here in front of you, just waiting for mining to happen. Okay, here we go. A few of those actually got mined. So this is it. Just to reiterate what happened.
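The "pay by use" payment logic just described is basically one multiplication. The actual demo contract is Solidity; the sketch below only mirrors the arithmetic of its update function under the stated terms (100,000 wei per meter), with a hypothetical recipient address and field names.

```python
# Python sketch of the "pay by use" arithmetic (the real contract is
# Solidity; recipient address and attribute names here are hypothetical).
PRICE_WEI_PER_METER = 100_000

class PayByUse:
    def __init__(self, recipient: str, price_wei: int, balance_wei: int) -> None:
        self.recipient = recipient
        self.price_wei = price_wei          # agreed price per unit (meter)
        self.balance_wei = balance_wei      # test ether deposited at deploy time
        self.paid_wei = 0                   # total paid out so far

    def update(self, units: int) -> int:
        """Report traveled units; pay out units * price and return the amount."""
        amount = units * self.price_wei
        self.balance_wei -= amount
        self.paid_wei += amount
        return amount  # corresponds to the "paid amount" event fed back to the editor

# One 500-meter batch arriving from the aggregation step.
contract = PayByUse("0xTransportCo", PRICE_WEI_PER_METER, balance_wei=10**18)
paid = contract.update(500)
```

Each 500-meter batch thus pays out 50,000,000 wei, and the returned value is what the demo's second table displays when the transaction is mined.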
A sensor, or a bunch of sensors, hundreds of sensors, each in a different vehicle, produced data to the Streamr network. The Streamr network delivered the events in real time to an application, in this case the Streamr engine. The engine called the blockchain; the blockchain answered with events, which went back to the engine. The engine produced user interface events, which are also messages in a stream, back to the network. And those events got delivered directly to my browser via the network, in real time.

Application developers today typically use various cloud services. They use EC2 on AWS for virtual machines, hosted file storage, hosted databases, hosted real-time data pipelines, hosted microservices. Now, the hypothesis is that each one of these layers will be replaced by decentralized counterparts, creating the decentralized cloud. For example, Golem will steal business from EC2. Swarm and IPFS will slowly start replacing how S3 and other object storage is used. Where Streamr fits in: the Streamr network will provide decentralized data pipelines, a bit like Amazon Kinesis, and the Streamr engine will offer functionality similar to AWS Lambda, for example. And blockchain, of course, that's new, right? There's nothing on the cloud side of things. What the blockchain does is offer uncheatable state transitions, making the decentralization of the other layers and services possible in the first place.

What would this look like from the application developer's point of view? Let's say we have an application. It runs in a container, which is hosted in a decentralized computing service, for example Golem. This application could be the Streamr engine, or it could be anything else: a web server, a microservice, you name it.
What we want is that inside the container we have access points to these decentralized services, which allow the application to connect to the different decentralized networks: for example, the Streamr client, an Ethereum node, IPFS, and more. We're collaborating with Golem, researching and prototyping how to build this kind of decentralizable application development stack, one that allows you to write pretty much any application in any language and make it decentralized by running it in a decentralized computation container and using decentralized services with it.

Okay. I'd like to invite you to meet and greet us over a beer from five to six today at the Grand Fiesta Americana lobby bar, which is just across the street. It would be great to meet you all. If you like what you're seeing, follow us at Streamr Inc on Twitter and join our community chat at chat.streamr.com, because we make your streams come true. Thank you very much. There's plenty of time for questions.

Q: First of all, really cool UI. Because there are incentives for providing data, how do you prevent someone providing a lot of fraudulent data?

A: Yeah, okay. Of course we need to prevent flooding the network with, for example, random bullshit, right? So there will be some kind of usage cost for the network, for both the producer and the consumer. If you're a data producer, it's possible that there's also a staking mechanism. So you need to make some kind of initial investment, either in terms of usage fees or staking, expressing your belief that your data is valuable to someone. If that hypothesis is wrong, you basically lose money. It's a bit like entrepreneurship in general: you have an idea, you think there's value in starting a cafe over there, you make an initial investment, and you see what happens.
Q: So basically, if there's a fixed cost for providing this amount of data onto the network per day, you need a certain amount of customers to break even, right? So you should aim to achieve that. And this disincentivizes posting or publishing data that doesn't have an audience at all.

A: Right. Or if you use data for private purposes, say you're an organization and you use the network for internal data that isn't even intended to be sold on the marketplace, you use it for internal analytics or whatever, then you pay the usage fee for the network. Those usage fees go to the nodes. So it's kind of like mining, but instead of solving an artificial CPU or GPU problem, you're providing bandwidth to the network.

Q: Are there some mechanisms for cleaning the measurement streams of malicious measurements, with some assumption that at least part of the measurements are to be trusted?

A: Yeah. I mentioned the reputation system for the nodes, which helps distribute the responsibility there. But there needs to be another reputation system as well, at play on the data marketplace. Kind of like you have reviews for apps in the App Store, you're going to have a reputation system for streams on the marketplace. Now, what if someone is producing malicious content or infringing copyright, for example? Basically, this is an unsolved problem. Someone could buy a stream from there and then publish it for cheaper or for free, and this kind of activity needs to be disincentivized. But of course, it's hard to exercise censorship in a censorship-free network. If you post a movie on YouTube, they can take it down, but if you do that on the network, it's impossible. So we can have community voting on unwanted content, and the producers might need to stake value there as well.
Or we can simply retain certain amounts of centralization on the marketplace, so that it can be curated by a group of curators, and then we can actually have a little bit of censorship going on in there.

Q: Hey, so I may have missed it, but where exactly is the data stored in the network?

A: What we have currently is a centralized solution built on cloud technology, and the decentralization of the system will take place from now on. We haven't yet solved all the details, so we're early on with that. The basic idea is that the data is persisted on the nodes. We might also use a decentralized time-series database or NoSQL database, if one that suits our needs pops up. But this is a rapidly evolving space at the moment, so it might be that we need to build those ourselves as well. So the data will be stored at the nodes, and we need some kind of indexing going on there. Compared to, for example, static files: if you have a static dataset or a video file, that's a little easier to handle. What we need is search functionality into the data, in order to retrieve specific messages or ranges of messages. We'll have that implemented as part of the network. Cool. Great. Thanks, guys, for your attention.