Okay, I'm going to go ahead. So welcome everyone, and thank you for joining Kubernetes on the Edge Day live in Amsterdam. I'm Steve Wong. I'm going to skip a bio and invite you to judge me by what I'm about to say. I want to shout out the sponsors who provided some of the funding for this event. And with that said, let's get started.

You saw the official theme when you registered: some impressive numbers here. Edge compute four times larger than cloud, generating 75% of data worldwide. In marketing, forecasts like this are called a pitch deck. If this got your boss to okay your coming to this event, that's great. But let's be skeptics and question the likely growth in edge as a thought exercise. Why will edge be generating and handling 75% of all data? Why will people decide to increase investments in edge compute? Let me outline a few technology shifts taking place and lay out an idea of what this could lead to. Then you can decide for yourself whether that growth forecast is real.

There are five significant changes going on right now. Some of the things on this list help with remote management of edge workloads at locations with limited physical security and limited or no staff. And for greenfield missions, new chips reach into places you couldn't go before, with connectivity going increasingly wireless. We're seeing needs for better energy efficiency and data access governance. The world is using approaches like GitOps and device twins to manage edge at scale. And of course the elephant in the room is AI. These changes are the building blocks that are going to account for dramatic growth at edge. But let me walk you through an example that puts all of these things together.

In a recent blog post, a former founding engineer of Google's BigQuery dropped the line "Big Data is dead." It talks about data sovereignty and privacy challenges, along with a reevaluation of whether hosting a central ocean of data delivered value to those who amassed it.
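As an aside on the device-twin approach mentioned above: this is a minimal sketch of the core idea, not from the talk itself (real systems such as Azure IoT Hub twins or Eclipse Ditto are far richer). A twin pairs a centrally declared desired state with the state each device last reported, and a reconcile step computes what must change.

```python
# Hypothetical device-twin sketch. A twin pairs desired state (set centrally)
# with reported state (sent by the device); reconciliation diffs the two.

def reconcile(desired: dict, reported: dict) -> dict:
    """Return only the settings the device must change to match desired state."""
    return {k: v for k, v in desired.items() if reported.get(k) != v}

# Central operator declares intent for one fleet member (illustrative fields):
desired = {"firmware": "2.4.1", "sample_rate_hz": 48000, "uplink": "lte"}

# The device last reported:
reported = {"firmware": "2.3.9", "sample_rate_hz": 48000, "uplink": "lte"}

delta = reconcile(desired, reported)
print(delta)  # {'firmware': '2.4.1'} -- only the firmware needs updating
```

Because the diff is computed from declared state rather than imperative commands, a device that was offline simply converges the next time it syncs, which is what makes this pattern attractive for unstaffed edge sites.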
Large amounts of data are getting generated, and holding this in a cloud can have significant costs. But maybe the raw size isn't the crux of the issue. Maybe the secret is to use edge-hosted analysis to act on it locally and reduce what gets forwarded. The savings aren't just in cloud storage. If you process workloads such as high-def video, the downside goes beyond latency and added points of failure. It can bring wasteful energy burn at each step of copying and caching. What if you could summarize at the edge? What if you could use something to analyze and act on data at the edge, avoiding round trips to the cloud?

ChatGPT recently woke up the world to the potential of AI. But chat is not the only application for AI, and a central public cloud is not the only place where this could be useful. Data is being generated in manufacturing, retail, utilities, and other edge use cases. And we have affordable AI technologies to act on this data. The Google USB Edge TPU (and I've got one here to show you it's real) and the MG24 SoC are just a few examples of far-edge technology that exists to run inference engines at low cost and high performance. Technology also exists for mid-tier locations that can use higher-power accelerators as a shared resource.

Let's look at a specific example of how these machine learning and AI accelerators could be used. When I worked in industrial automation, I'd run into plants where they'd have a veteran who could come in in the morning and tell just by ear if a turbine was off. This slide from Silicon Labs is based on training an ML inference engine that listens not just in the ordinary audio range, but way up into ultrasound territory, to predict failures before downtime happens. Maybe it's not perfect. But what's the cost of an error if it occurs? A false positive might mean that you changed a bearing a little early, but if this also catches and prevents occasional unplanned outages, what would that be worth?
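To make "summarize at the edge" concrete, here is a hedged, standard-library-only sketch (the field names and threshold are mine, not from the talk, and a real deployment would score windows with a trained model rather than raw energy): the edge node scores each sensor window locally and forwards only the windows it flags, so the cloud receives a trickle of events instead of the raw stream.

```python
# Illustrative edge-side summarization. Instead of shipping every raw sensor
# window to the cloud, score each window locally and forward only anomalies.
import math

def rms(window: list) -> float:
    """Root-mean-square energy of one sensor window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def summarize(windows: list, threshold: float) -> list:
    """Return compact event records for windows whose energy exceeds threshold."""
    events = []
    for i, w in enumerate(windows):
        energy = rms(w)
        if energy > threshold:
            events.append({"window": i, "rms": round(energy, 3)})
    return events

# Nine quiet windows and one loud one: only the loud one gets forwarded.
quiet = [[0.1, -0.1, 0.1, -0.1]] * 9
loud = [[2.0, -2.0, 2.0, -2.0]]
uplink = summarize(quiet + loud, threshold=1.0)
print(uplink)  # [{'window': 9, 'rms': 2.0}] -- one event instead of 10 windows
```

The payoff is exactly the trade described above: ten windows of raw samples stay on the device, and a single small event record crosses the uplink.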
The bottom line is this doesn't have to be perfect. If it works most of the time and is better than the status quo, it becomes valuable. Don't get me wrong here. This is new stuff and it's changing rapidly, but the opportunity is huge. These devices aren't multi-million-dollar quantum computers. Maybe you want to consider doing a proof of concept to learn about this.

ChatGPT was trained on the resource of accumulated web pages from the internet, but it seems that this training was burning through the world's accumulated information like burning up fossil fuel as a resource. You see here a quote that just came out yesterday from the CEO of OpenAI, the company behind ChatGPT. Well, if you'd view mining the existing internet text as a fossil fuel burn, edge data is more like a continuous source of renewable energy, in the form of data streams that just don't ever run out.

Now with GPT-4, you can upload limited amounts of additional information and use it as training data, but not all edge data is text, and doing this might be giving away trade secrets or customer information. There are good reasons for hosting AI and machine learning under your own control when it comes to edge.

So AI has massive uncertainties and risks for those in competitive businesses that get caught unprepared. Every business is going to change whether we like it or not. And on top of that, AI is likely to alter your career, maybe sooner than you think. AI is not going to replace engineers, but engineers who know how to deploy AI are going to replace those who don't.

If you want to use AI at the edge, there's more to it than just buying some chips. You're going to need tools for deployments, updates, security, connectivity, and more. This stuff is available, and we're going to talk about it today. So here are the sessions that we've got laid out this afternoon. Please take a look at this list.
We're going to do four 25-minute sessions covering foundations like Kubernetes, service mesh for edge, and management at scale, and then take a brief break. After the break, we're going to visit security, connectivity, and digital twin approaches, and we're going to close the day with a talk about a use case involving autonomous drones used to pick fruit from trees.

AI tech can be an intellectual force multiplier, just like the steam engine and electricity were physical force multipliers that changed the world and got 12-year-old kids out of coal mines. Suppose we could use AI to improve the efficiency of a manufacturing or transportation operation by just 2%. These kinds of advances save resources and improve the quality of the environment while elevating the standard of living and quality of life for all. This is a critical time to come together to listen, learn, share, and be curious about what comes next. Let's explore together.

The way to get the most out of this event is to go beyond passive listening. Make an attempt to meet the people here at the conference and hold discussions out in the hallway during breaks or after hours. Introduce yourself to the people around you who don't work for your own organization. This event is all about community.

Finally, this event is subsidized by the event sponsors. Yes, you paid a registration fee, but these sponsors help cover the budget, so I want to quickly recognize them.

So that's the end of my talk. Next up, we've got Danielle coming up to talk about Kubernetes for low-resource edge. So give us a few minutes to cable up a different laptop.