I would love to know a bit about the company itself.

We started in 2010, as you mentioned, and we actually were part of a DOD research project. The goal of that project was to consume hundreds of different real-time data feeds and give a query capability to analysts, data scientists, and developers, so they could quickly deploy stuff into the field. At the time, NoSQL was all the rage, Hadoop was all the rage, and there were still the big legacy warehouses like the Teradatas. They all had a huge amount of trouble dealing with real-time data and doing complex queries. They could do a few things, a lot of pre-planned things, where they built a lot of indexes and did supplemental data engineering to make it happen. But to really find that needle in the haystack, to do whatever you want across all that data as it continued to flow, there was really no good solution.

So we were there as part of that program, and we had this idea: hey, the GPU, even in 2010, is a tremendously powerful device. Databases had been designed for 40 years with one thing in mind, that compute is a very scarce resource, so you should organize your data prior to asking your question in order to use as little compute as possible. On the GPU, compute is an abundant resource. So what if we flipped that equation on its head? Let's make a database that allows data to continuously stream in, lets you write any query you want without data engineering, and leverages all of that abundant compute in a distributed way, so that developers and data scientists can ask any question they want and get back responses quickly with up-to-date data. That was the basic premise behind Kinetica, and we started building in 2010.
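To make the "flipped equation" concrete, here is a minimal sketch (not Kinetica code; all names and data are hypothetical) of the abundant-compute approach described above: instead of pre-building an index for each anticipated question, every row is brute-force scanned in parallel, so any ad-hoc predicate works with no prior data engineering.

```python
# Hedged sketch of brute-force parallel scanning, the idea described above.
# A GPU would run this over thousands of cores; threads stand in here.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sensor table: id, speed, latitude.
rows = [{"id": i, "speed": i % 120, "lat": 38.0 + i * 1e-4} for i in range(100_000)]

def scan(chunk, predicate):
    # Evaluate the predicate against every row in the chunk: no index,
    # no prior knowledge of which columns the question will touch.
    return [r for r in chunk if predicate(r)]

def query(rows, predicate, workers=8):
    # Split the table into chunks and scan them concurrently, mimicking
    # a many-cores-over-all-the-data execution model.
    size = len(rows) // workers + 1
    chunks = [rows[i:i + size] for i in range(0, len(rows), size)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        parts = ex.map(scan, chunks, [predicate] * len(chunks))
    return [r for part in parts for r in part]

# Any predicate works immediately; nothing was pre-computed for it.
fast_nearby = query(rows, lambda r: r["speed"] > 110 and r["lat"] < 38.5)
```

The design trade-off is exactly the one described in the passage: a warehouse with scarce compute must know the question in advance to build the right index, while a brute-force scanner spends abundant compute at query time and so accepts any question, even over data that just arrived.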
And we became the analytic engine for the speed layer of that program, sitting on top of Accumulo and doing all the analytic temporal and spatial work for that project. Over time, we went into larger enterprises like USPS, which was one of our first flagship customers, where again they were doing something that really required a new type of solution. They said, hey, we put sensors on every mail carrier; we need to be able to analyze this in real time, and it looks like you're the right solution for that. That was one of our first major wins. From there, we really focused on becoming a real-time speed layer for the modern enterprise.

So are you mostly targeting the public sector, government entities?

No, I mean, we do have a big DOD customer base, but we also have large financial institutions, large banks, large telcos. Anyone who's trying to take advantage of real-time data from sensors and machines, who wants to do advanced analytics that potentially fuse that real-time data with historical data sets, to query it without any type of limitation and have it be up-to-date and performant, that's really our sweet spot.