Do you think that, just as we're seeing a lot of cultural movement when it comes to security, AI, ML, and other things, companies also need a data strategy? Do they need cultural change in how they look at data, too, so that once again it's not a silo, it's not "those are databases and we don't care, we just have to take care of the new business application we're writing"? What I'm asking is: do you see that we need cultural change, or are you already seeing cultural change, around data? Because data is the new world, and we do need to focus a lot on that.

If you go back to the year 2009, you had the NoSQL movement at its peak. Databases were being created quicker than Kubernetes extensions are today. So as an application developer, you were flooded with new databases that depart from the relational database architecture. Since then, a lot of them have become very popular and have conquered their own niche. In that regard, there was a cultural change in managing data for application developers. And you have to see how this connects to the emergence of microservice-based architectures. In a monolith, you have one database technology, and adding more databases to one application makes things very complicated. Think about a classic Java app where you have an object-relational mapper like Hibernate: you wouldn't dare to use a second database. But if you split that application into smaller apps, you can use different programming languages for different applications or different services, and you can use different data stores as well. So the idea of using different databases, with the NoSQL movement making the range of available technologies much richer, together with microservices, was naturally only possible because of the progress in automating software operations.
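The kind of automation meant here can be sketched, in a deliberately minimal and purely illustrative way, as a reconciliation loop: rather than executing imperative steps on a long-lived machine, a controller compares a declared desired state against the actual state and computes the actions needed to converge them. All names below are hypothetical, not a real automation API:

```python
# Toy sketch of declarative operations: compute the actions that
# converge actual server state toward a declared desired state.
from dataclasses import dataclass

@dataclass
class ServerSpec:
    cpu: int            # desired number of vCPUs
    disk_attached: bool # whether the persistent disk is attached

def reconcile(desired: ServerSpec, actual: ServerSpec) -> list[str]:
    """Return the actions needed to converge actual state to desired state."""
    actions = []
    if actual.cpu != desired.cpu:
        # Ephemeral VM + persistent disk: don't resize in place,
        # recreate the VM and reattach the disk to the new one.
        actions += ["detach_disk", "destroy_vm",
                    f"create_vm(cpu={desired.cpu})", "attach_disk"]
    elif desired.disk_attached and not actual.disk_attached:
        actions.append("attach_disk")
    return actions

print(reconcile(ServerSpec(cpu=8, disk_attached=True),
                ServerSpec(cpu=4, disk_attached=True)))
```

The point of the sketch is only the shape of the model: the operator declares "8 CPUs with the disk attached" and the system works out the destroy-and-recreate steps itself.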
Because if you have more apps to operate and more databases to take care of, you need to do that more efficiently. Otherwise, the increased number of different technologies increases operational complexity and suffocates the advantage you get from it. So you have to see all those things as interconnected, because they can only make progress alongside each other; they cannot move in isolation from one another. And at some points in time there are, not gaps exactly, but fast advances, like the emergence of declarative automation technologies. Think about the ephemeral-VM-and-persistent-disk paradigm for a second, because that, in my opinion, is the game changer that caused the leap from imperative to declarative automation. If you think about Chef, for example, the underlying assumption is that you have a physical or virtual server somewhere, you download instructions, and with a little bit of context information you execute a predefined set of recipes. But the idea is still a long-running machine, and that is the basic assumption that breaks with the ephemeral VM and persistent disk. That only became possible because network bandwidth got much faster: with a 10-gigabit network, you can remotely attach a disk with sufficient performance, and you gain the ability to reattach that persistent disk, that virtual hard disk drive, so to say, to any server in the cluster. That opens up the possibility of repairing servers, which means you don't have to buy the most expensive servers anymore, and once virtualization is there, it also lets you do all of this programmatically. And once you can do it programmatically, you can destroy servers as part of your lifecycle management. So for example, if you want to make a Postgres database bigger, you need to take out the secondaries first and make them bigger, and at some point you need to take out the primary.
So you need failure detection, you need automatic failover and the promotion of a new primary, and if you have that, you can make a database bigger without major impact on the service by recreating those virtual machines from scratch from a known state. If you think about it, that's a huge change in attitude and a huge change in the tools you would use to tackle the problem, and it enables exactly the operational efficiency that makes microservices worth it. You can use one programming language here and a different one there, because buildpacks and container technologies make that deployment quick, simple, and low-overhead. And for data, bootstrapping a new database is just a single command, making a backup is a single command, and scaling out from one virtual machine to three, or from one pod to three, is just a single command. So you shift the operational responsibility towards the application developer by increasing the depth of automation. And that shifts culture and enables innovation, which is what it's actually meant to do.
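The rolling resize described above, recreating secondaries first, then promoting one of them and recreating the old primary, can be sketched in a few lines. This is an illustrative toy with a made-up event-log model, not a real failover manager or operator:

```python
# Toy sketch of a rolling resize of a replicated database cluster:
# recreate each secondary at the new size, promote a resized secondary
# (automatic failover), then recreate the old primary from scratch.

def rolling_resize(cluster: dict, new_size: str) -> list[str]:
    """cluster: {"primary": name, "secondaries": [names]}.
    Mutates cluster in place and returns the list of events."""
    log = []
    for node in cluster["secondaries"]:
        # Destroy the VM, create a bigger one, reattach the persistent disk.
        log.append(f"recreate {node} at {new_size}")
    old_primary = cluster["primary"]
    new_primary = cluster["secondaries"][0]
    log.append(f"promote {new_primary}")  # failover to a resized node
    cluster["primary"] = new_primary
    log.append(f"recreate {old_primary} at {new_size}")
    cluster["secondaries"] = [n for n in cluster["secondaries"]
                              if n != new_primary] + [old_primary]
    return log

cluster = {"primary": "pg-0", "secondaries": ["pg-1", "pg-2"]}
for event in rolling_resize(cluster, "8cpu"):
    print(event)
```

At no point is a machine resized in place; every node is replaced from a known state, which is exactly the attitude shift the answer describes.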