All right, hi everybody. I'm Ardyn Maginski here with Texas Instruments at Embedded World 2023, and we're going to take a look at some exciting products we're launching for the embedded portfolio at TI, so come with me and check out our booth. It's been a busy show? Oh absolutely, it's been incredible. Lots of momentum, lots of exciting announcements. So we'll talk a little bit about some of the really cool technologies we're doing within our MCU and MPU products. What we'll talk about today is some of the launches we're doing in our MSP portfolio. We'll talk a little bit about our new Arm-based M0+ devices, and then we'll circle around and take a look at some of our vision processors, where we're adding analytics at the edge to make electronics smarter and more intelligent. The latest generations of microcontrollers keep moving to smaller process nodes and lower power consumption. Exactly, you're spot on, right? Lower power and more integration, that's what we're about. And as our wafers move to 300 millimeter, our capacity continues to grow as well. So let's take a look at some of the demos here around our MSP M0 portfolio. What we're showing today is a brand-new, just-launched portfolio, announced yesterday, of M0+-based MCUs. All these? All these are new? Absolutely. Great scalability, lots of different packages, lots of different memory options, and lots of really good analog integration. And that's really where TI differentiates: we add a lot of architecture and integration around the Arm cores to enable our customers to innovate. But we don't stop there. With these parts, we also solve for ease of use and how fast developers can get started. So we've developed a number of configuration tools that let our customers configure the MCU without needing to program it.
So now imagine an analog designer who's trying to innovate with an MCU: they can start with graphical tools that simplify the development cycle, narrow it down to one of our many MCUs, and get going quickly with the MSP M0 portfolio, getting to production faster. So this analog configurator, what is that about? Yeah, what this configurator does is let you connect the different analog pins within the device without having to write any lines of code. So let's say you want to take an ADC signal in, you want to filter it, and then you want to react to it, right? You can configure all of that with visual blocks rather than writing code. The time to market is substantially shortened, and you can innovate rapidly. Nice. You can see some of the applications the MSP M0s go into here, and really, these are small MCUs, so they're everywhere. Believe it or not, every little board in the world will have one of these little chips on it, and you can see it's everything from a car to power to motor control, anywhere you run any kind of process. They're in pretty much everything that's being shipped. Absolutely. But this is the latest... This is the latest, greatest, lowest-power, most affordable option, bringing MCUs to the masses. Is it Arm Cortex-M0? It's an Arm Cortex-M0+, exactly. All right. Very good. So we'll circle around and take a look at some of our vision processors as well. In addition to the MCUs, we're launching lots of processors today as well, focused specifically on vision. These are also Arm-core-based processors, but we've added a TI-developed AI engine inside that lets us deliver analytics at lower power and lower cost than ever before. And why does that matter?
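The ADC-in, filter, react chain described above is wired up graphically in the configurator, with no code written. As a way to picture what those visual blocks represent, here is a minimal sketch of the same chain in plain Python; the function names, sample values, and threshold are hypothetical illustrations, not TI's generated output or any TI API:

```python
# Conceptual sketch of the signal chain the graphical configurator wires up:
# ADC sample -> low-pass filter -> threshold comparator -> reaction.
# All names and numbers here are hypothetical, not TI firmware.

def low_pass(samples, alpha=0.3):
    """Simple exponential (IIR) low-pass filter over a stream of ADC codes."""
    filtered = []
    state = samples[0]
    for s in samples:
        state = alpha * s + (1 - alpha) * state
        filtered.append(state)
    return filtered

def react(filtered, threshold=2048):
    """Flag each sample where the filtered signal is at or above the threshold
    (on a real MCU this might toggle a GPIO or fire an interrupt)."""
    return [value >= threshold for value in filtered]

adc_samples = [1000, 1200, 3000, 3100, 3200, 900]
flags = react(low_pass(adc_samples))
```

The point of the configurator is that a designer draws this chain as connected blocks instead of maintaining code like the above by hand.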
Well, electronics are becoming more and more intelligent, which means they have to make decisions, and those decisions have to be made with low latency. So they have to happen where the data happens, and that is at the edge, right? And with a scalable portfolio, we can tailor the size, the power consumption, and the cost to the different applications where they're needed. We start with something like the AM62A family here, which enables one to two cameras with up to 2 TOPS of performance. Then we scale up with our AM68A to one to eight cameras and 8 TOPS, and even to our AM69A, where we can support up to 12 cameras with 32 TOPS of performance. Are these all your boards, or your partners'? Absolutely, yes. As you can see, these are our TI boards up here, where we're demonstrating the various demos. And as you look down, this is the scale of the ecosystem our partners bring to the portfolio. What this does, again, is simplify our customers' experience and get them to market faster, through available hardware, through easy-to-use software, and most importantly, by making it reusable across all applications. All right. So there are all kinds of shapes and use cases being developed and brought to market. Absolutely. There's no one-size-fits-all, right? And that's the differentiation from TI: we bring a portfolio of products that enables our customers to innovate for their applications. All right. And here on the screens, you show some of the edge vision AI processing applications. Absolutely. You can see a few different applications across the board here from different cameras. And really what we're showing is the sensor responding at the edge, right where it's important to make that decision. So whether it's an intersection where you're trying to see cars and people walking across so you can time the lights, you want to make sure that is accurate and safe, right?
Same with something like a delivery robot that's driving around dropping off packages: you want it to be safe and reliable. Nice. So you're launching the new vision processors? Oh, yeah. The same AM6xA family that you see here was just announced yesterday, and it's now available on TI.com. Everything you see here, you can come to TI.com/edgeai, order today, and get started today. So you've been working for a while to get all these board makers to prepare boards so they could be ready. Yeah, and this is an extension of the portfolio we have. AM6 is actually a brand covering many different processors, and the AM6xA devices add analytics to our already existing core base, which lets our customers reuse a lot of the software and add on the analytics to get to market faster, including our board partners. And that's an Arm Cortex? An Arm Cortex-A53. And the accelerator is a TI-developed one, so we can differentiate on the power and affordability of these. And that's directly on the SoC? Directly on the SoC, fully integrated. So great performance and power consumption. The power consumption is a really important aspect. It absolutely is very low power. And that's what makes it able to live at the edge, right? Because once you're at the edge, you have to deploy in all kinds of small form factors. You have to make sure all of these things can fit where they need to go, and then power becomes a very big consideration. So that's where we've innovated, by scaling across our portfolio with all of these low-power processors. Nice. So this could be a huge seller potentially. This could be a big market. It's a huge market. As you walk around, you've seen AI, right? All of the electronics are becoming more intelligent. It's a major trend, and it's solving real-world problems. So we see a lot of opportunity here for people to innovate.
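The scaling described above can be pictured as a simple lookup from application needs to device family. The camera and TOPS ceilings below are the figures as quoted in the conversation (AM62A up to 2 cameras/2 TOPS, AM68A up to 8 cameras/8 TOPS, AM69A up to 12 cameras/32 TOPS); the selection helper itself is a hypothetical sketch, not a TI tool:

```python
# Hypothetical selection helper over the portfolio figures quoted above:
# (family, max cameras, max TOPS), ordered smallest to largest.
PORTFOLIO = [
    ("AM62A", 2, 2),    # up to 2 cameras, 2 TOPS
    ("AM68A", 8, 8),    # up to 8 cameras, 8 TOPS
    ("AM69A", 12, 32),  # up to 12 cameras, 32 TOPS
]

def pick_processor(cameras, tops):
    """Return the smallest family meeting the camera and performance needs,
    or None if the requirement exceeds the portfolio as quoted."""
    for name, max_cams, max_tops in PORTFOLIO:
        if cameras <= max_cams and tops <= max_tops:
            return name
    return None

print(pick_processor(4, 6))  # AM68A: first family covering 4 cameras and 6 TOPS
```

This is the "no one-size-fits-all" point in miniature: the application's camera count and analytics budget, not a single flagship part, drive the choice.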
The next-gen security cameras are just not going to be like the previous gen. Absolutely: smarter, better, safer. All right. And there's a bunch more at the booth? Yeah, we can look at a few more demos as we walk around here, with different kinds of intelligence. You can see here pose estimation, where you can actually tell what the human is doing, right? And you can react. So you can imagine a factory where you're co-located with a robot that's moving around, and you're moving around, and you want to be able to tell exactly where your hands are, because you don't want the robot and the person running into each other, right? And with this kind of AI technology, we can actually do that at the edge and retain privacy, because all the decisions and all of the data processing happen at the edge; you don't have to send any of the video or data out to the cloud. Nice. All right, because it would be terabytes and terabytes of data. And there's no need. That's exactly right. But that's the industrial innovation. Similarly, we see a lot of innovation in retail and home automation. So one of the demos we're showing here, if you're able to see it... Maybe we'll go to the other side. All right, let's do that. Yeah, so if you look over here, what we're showcasing with this demo, again running on our AM62A processors that you saw over here, is object identification and classification for a retail use case. I don't know how it is for you, but whenever I get lunch in the middle of the day, you're waiting in line way longer than you're actually eating your lunch. And so what we're showing here is that I can actually tell you what you put on your plate and charge you automatically for it, rather than you having to stand in a long line explaining what you're purchasing. And that innovation is going to make people's lives better. Give us more time back, right? And let us enjoy our lives a little bit more.
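The checkout demo's downstream logic is straightforward once the on-device classifier has labeled the items on the plate: count the labels, look up prices, and tally a bill automatically. A minimal sketch, with the caveat that the price table and item labels here are hypothetical and the actual detection runs on the AI accelerator, not in this code:

```python
from collections import Counter

# Hypothetical price table; in the demo, the labels would come from the
# on-device vision classifier, not a hard-coded list.
PRICES = {"banana": 0.50, "salad": 4.25, "sandwich": 5.00}

def charge_for_plate(detected_labels):
    """Tally a bill from the classifier's per-item labels.

    Unrecognized labels are skipped rather than billed."""
    counts = Counter(detected_labels)
    return sum(PRICES[item] * n for item, n in counts.items() if item in PRICES)

total = charge_for_plate(["banana", "banana", "salad"])  # 5.25
```

The interesting part is what this omits: because classification happens at the edge, only the label list (not the video) ever needs to leave the device.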
So we're extremely excited about some of the things that are happening. And it could sound the alarm if you're not getting the right ingredients for your diet? Sure. That's a fantastic idea. I love that. That would definitely help me get better with my diet. It could go directly to your phone, connect with your fitness app, and tell you exactly what you ate. And tell you if you're missing it in the fridge too, right? So you can reorder it as you need it. Nice. Is this just smart programming, or is it really enabled by your chip? It's really enabled by our chip, because the way AI technology works is that it's trained on real-world data to make those decisions. So instead of being pre-programmed for a decision, it's trained. And that makes it more accurate, right? More responsive. But also, over time, it can train itself even more: the more bananas I put here, the better the banana recognition will get, right? So it only gets better with time as well. So each chip gets better, or the whole training? The model gets better. Yeah. It gets shared to all the chips somehow. Yeah. I mean, there are a lot of models out there that get trained, for sure. But generally you do it on the chip, and these are also connected, right? We have things like Ethernet, we have Wi-Fi protocols, so all of these things can talk to each other and improve together as well. Nice. Yeah, it's very exciting. We're super, super excited to be here. Nice. Is there a bunch more you want to show? I think we hit all the key ones. So it goes all the way around your booth, and the rest is meeting areas. Yeah, you'll start running into some meeting areas and coffee. But if everybody wants to see me drink some coffee, that's an option too. Cool. All right. We've seen the cool stuff. Thanks a lot. I appreciate it. Thank you for coming by. And again, this is Ardyn Maginski with Texas Instruments here at Embedded World 2023.