proving not to be the case. In terms of the maturity of the platform, you're absolutely right. I mean, the slide that Mike showed said that only 30% of the contributions happening today are in the Hadoop core layer. And the overall vision there is very similar to an operating system, right? Except what this really is, it's a data operating system: it's how you operate on large amounts of data in a big data center. So really, it's like an operating system for many machines, as opposed to Linux, which is an operating system for a single machine, right?

So Hadoop, when it came out, was only the kernel, only the inner layers. If you look at any operating system, like Windows or Linux and so on, the core functionality is two things: storing files and running applications on top of those files. That's what Windows does, that's what Linux does, and that's what Hadoop does at its heart. But then to really get an operating system to work, you need many ancillary components around it that make it functional. You need libraries, you need applications, you need integration, I/O devices, et cetera, et cetera. And that's really what's happening in the Hadoop world. It started with the core OS layer, which is HDFS for storage and MapReduce for computation. But now all of these other things are showing up around that core kernel to make it a fully functional, extensible data operating system.
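To make the storage-plus-computation split concrete, here is a toy sketch of the MapReduce model the speaker is describing: map emits key/value pairs, a shuffle step groups them by key, and reduce aggregates each group. This is a local, single-machine illustration in Python, not Hadoop's actual Java API; the function names (`map_phase`, `reduce_phase`, `run_job`) are made up for the example. Real Hadoop runs these same phases in parallel across many machines, reading from and writing to HDFS.

```python
# Toy word-count job illustrating the MapReduce programming model.
# Hypothetical names for illustration; real Hadoop distributes these
# phases across a cluster over HDFS.
from collections import defaultdict
from itertools import chain


def map_phase(line):
    # Map: emit a (word, 1) pair for every word in one line of input.
    for word in line.split():
        yield (word.lower(), 1)


def reduce_phase(word, counts):
    # Reduce: sum every count emitted for a single word.
    return (word, sum(counts))


def run_job(lines):
    # Shuffle: group all mapper output by key, then reduce each group.
    groups = defaultdict(list)
    for key, value in chain.from_iterable(map_phase(l) for l in lines):
        groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())


print(run_job(["the quick brown fox", "the lazy dog and the fox"]))
```

The point of the model is that `map_phase` and `reduce_phase` are the only pieces the programmer writes; the framework owns the shuffle and the parallelism, just as an OS kernel owns scheduling and the filesystem.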