announcements today. But first, a little background. We have an abundance of research and scientific literature today. The problem is that our human brains are limited, so we cannot read and understand everything. And that is why we are building an AI researcher. What we're doing today, very concretely, is semi-automating the researcher's process.

About a year ago we launched the first phase, which takes a problem statement and broadly explores the research field. And we've proven, through peer-reviewed open-access papers, that researchers using our tool consistently outperform teams using, for example, Scholar or PubMed, which is very exciting.

And this leads us to our first announcement of the day: the second step of the researcher's process, taking all of that content and narrowing it down to a precise reading list, launched to our first beta clients last week. It is a tool that iterates between the tool and the user to filter down. We can save you 90% of your time in doing this, with about 85% precision and recall, which is a lot better than what you can do in industry today. So that is our first announcement: that tool is live. Thank you.

The second announcement is that every quarter this year we have been doubling our revenues. And for the month of January, that month alone, we are actually cash flow positive, which is very exciting.

And the third announcement, the bigger one, the one that makes us go "whew, finally," is that as of this week we have closed our seed funding round. Together with Nordic Impact and a bunch of amazing, amazing individuals who really believe in us, we have raised a total of $2 million, which properly positions us to go out into the world and do some really exciting stuff.

But there is one more thing. Because yes, we do have access to more scientific content than ever before.
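As a side note for readers unfamiliar with the metrics: the "85% precision and recall" claim for the reading-list filter could be measured roughly as sketched below. The function and the toy data are purely illustrative assumptions, not the actual tool's evaluation code.

```python
# Illustrative sketch: measuring precision and recall of a filtered
# reading list against a human-curated gold standard. All names and
# numbers here are hypothetical.

def precision_recall(selected: set, relevant: set) -> tuple:
    """Compare the papers the tool kept (selected) with the papers a
    human expert would keep (relevant)."""
    true_positives = len(selected & relevant)
    precision = true_positives / len(selected) if selected else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Toy example: the tool keeps 20 papers, 17 of which appear in a
# 20-paper gold-standard list, giving 0.85 for both metrics.
selected = set(range(20))        # paper IDs the tool kept
relevant = set(range(3, 23))     # paper IDs an expert would keep
p, r = precision_recall(selected, relevant)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.85 recall=0.85
```

High precision means few irrelevant papers slip into the list; high recall means few relevant papers are filtered out. The iteration between tool and user described above is what pushes both numbers up.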
But we have politicians who doubt its legitimacy. We have publishing houses that hide a lot of it behind paywalls and go after those who breach those paywalls with deadly force. We have academics who are struggling, for lack of resources, funding, and time, to communicate across different disciplines.

Openness, transparency, and fighting biases have always been core values of what we do. The problem is that we are finding it really hard to live up to that. Yes, we work with amazing corporates, but looking at the VC scene, trying to get funding, and working with those corporates, we find ourselves trapped in delivering for them as opposed to going out and delivering our tools freely to everyone who needs them. Because it isn't enough for us to build a cool piece of software. If we want to make a massive impact in the world, we have to uproot an entire industry, and we cannot do that alone.

About a year ago, we launched our AI trainer platform. Since then, we've had 8,000 individuals sign up to help our tool learn; people who contribute and volunteer their time and tell us that what we do is important. And we've been asking ourselves over the past few months: how can we give back? How can we give value back to these people and show them that what they do matters to us? Thanks to new technology, new opportunities arise. And I'm happy to announce today that we are going to enable our AI trainer community, and everyone else working with us, by tokenizing access to our tool.

What this looks like is that every AI trainer who builds annotated data sets with us, every coder and quality assurer who helps contribute to our growing open-access code base, and, longer term, researchers using our tools to publish their research open access, will be rewarded with tokens, tokens that they can spend directly on the core services of our tool. We will ask this community to hold us accountable for openness, for fighting biases, and for being transparent in how the tool works.
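The earn-and-spend mechanism just described can be pictured as a simple ledger: contributions credit tokens, use of core services debits them. This is only a conceptual sketch; the class name, reward schedule, and amounts are invented for illustration and do not come from the forthcoming white paper.

```python
# Hypothetical sketch of the token reward/spend flow described above.
# Contribution types and token amounts are assumptions, not the real design.

from collections import defaultdict

# Illustrative reward schedule per contribution type.
REWARDS = {
    "annotation": 5,          # AI trainer builds an annotated data set
    "code": 20,               # coder/QA contributes to the open code base
    "open_access_paper": 50,  # researcher publishes open access
}

class TokenLedger:
    def __init__(self):
        self.balances = defaultdict(int)

    def reward(self, user: str, contribution: str) -> int:
        """Credit a contributor for a recognized contribution type."""
        self.balances[user] += REWARDS[contribution]
        return self.balances[user]

    def spend(self, user: str, service_cost: int) -> bool:
        """Debit tokens when the contributor uses a core service."""
        if self.balances[user] < service_cost:
            return False  # not enough tokens
        self.balances[user] -= service_cost
        return True

ledger = TokenLedger()
ledger.reward("trainer_1", "annotation")   # +5 tokens
ledger.reward("trainer_1", "code")         # +20 tokens
print(ledger.balances["trainer_1"])        # 25
print(ledger.spend("trainer_1", 10))       # True; 15 tokens remain
```

The accountability point above maps naturally onto such a ledger: because every credit and debit is recorded, the community can audit how value flows through the platform.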
And as we continue to work with big corporates and they purchase access to the tool, and of course as the algorithms continue to improve thanks to the community, the value of the tokens to the community will increase.

Scientific research is the ultimate decentralized body of information that we have. And decentralizing access to it allows us to empower a community that otherwise would not have had that access. Doing that will allow us to take power away from the big publishing houses. It will allow us more transparency and provide a really important counterbalance to the big software players that are seemingly impossible to hold accountable for their algorithms and their implications for the world. And we can show the very, very revenue-focused VCs that there are other ways of doing this, other ways of building incredibly profitable companies, where you can open up the tool to as many people as possible.

So, with that: we will be releasing a full white paper in 2018 describing the full functionality of the tokenization of the platform. Until then, I hope that everyone here who loves science will join our community, join our efforts, and please let us know what you think, because this is going to be a community effort. Thank you very much.