From around the globe, it's theCUBE, with digital coverage of AWS re:Invent 2020. Sponsored by Intel and AWS.

Hello, and welcome back to theCUBE's live coverage of AWS re:Invent 2020. We're not in person this year, we're virtual. This is theCUBE Virtual. I'm John Furrier, your host of theCUBE. Roger Barga, the general manager of AWS Robotics and Autonomous Services, and a lot of other cool stuff, was on last year. Always, you know, the speed racer, you've got the machines. Now you have real-time robotics hitting the scene. Andy Jassy laid out a huge vision, data points, and announcements around industrial IoT. It's kind of coming together, Roger. Great to see you, and thanks for coming on. I want to dig in and get your perspective. Thanks for joining theCUBE.

Good to be here with you again today.

All right, so give us your take on the announcements yesterday and how they relate to the work you're doing on the robotics side at AWS. Where does this go from, you know, fun, to real world, to societal impact? Take us through how you see that vision.

Yeah, sure. So we continue to see the story of how processing is moving to the edge, and cloud services are augmenting that processing at the edge with unique new services. Andy talked about five new industrial machine learning services yesterday, which are very relevant to exactly what we're trying to do with AWS RoboMaker. A couple of them: Monitron, which monitors equipment for anomalies, and it's a whole solution, from an edge device to a gateway to a service. We also heard about Lookout for Equipment, which, if a customer already has their own sensors, is a service that can analyze the data from those sensors to, again, identify anomalies or potential failures. And we saw Lookout for Vision, which allows customers to use their own cameras and build a service to detect anomalies and potential failures.
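To make the equipment-monitoring idea concrete, here is a minimal sketch of the kind of anomaly detection described above. This is a toy rolling-baseline detector written for illustration, not the actual internals of Monitron or Lookout for Equipment, and all names in it are hypothetical:

```python
from collections import deque
from statistics import mean, stdev

class VibrationAnomalyDetector:
    """Toy stand-in for managed equipment monitoring: flag a sensor
    reading as anomalous when it deviates sharply from the recent
    rolling baseline. (Illustrative only; not an AWS API.)"""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # rolling window of recent samples
        self.threshold = threshold            # z-score cutoff for "anomaly"

    def observe(self, value: float) -> bool:
        """Return True if `value` is anomalous relative to the window."""
        anomalous = False
        if len(self.readings) >= 10:          # need a baseline first
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.readings.append(value)
        return anomalous

detector = VibrationAnomalyDetector()
for v in [1.0 + 0.01 * (i % 5) for i in range(50)]:  # steady vibration signal
    detector.observe(v)
print(detector.observe(5.0))  # a spike well outside the baseline -> True
```

In the managed services, of course, the model is learned from the equipment's own history rather than a fixed z-score rule; the point is the shape of the loop, with sensing at the edge and analysis as a service.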
Well, at AWS RoboMaker we have ROS cloud service extensions, which allow developers to connect their robot to these services. So that's an increasingly powerful combination: being able to put sensors and processing at the edge, then connecting back to the cloud, where you can do intelligent processing and understand what's going on out in the environment. Those were exciting announcements, and that story is going to continue to unfold, with new services and new sensors we can put on our robots to, again, intelligently process the data and control these robots.

You know, this brings up a great point, and I wasn't kidding when I said fun to real world. I mean, this is what's happening. The use cases are different. You mentioned Monitron and Lookout, but there's also the Panorama appliance. You had computer vision, machine learning. These are all new, cool, relevant use cases, but they're not static, and they're mostly purpose-built for the edge piece. So it's not like you can build a product and say, okay, it fits everywhere. Talk about that dynamic and why the robotics piece has to be agile. And what are you guys doing to make that workable? Because, you know, you want purpose-built, but purpose-built implies a supply chain planned years in advance. It implies slow. And, you know, how do you get the trust? How do you get the security? Take us through that, please.

So to your point, no single service is going to solve all problems, which is why AWS has released a number of primitives. Just think about Kinesis Video Streams, where I can stream raw video from an edge device and build my own machine learning model in the cloud with SageMaker to process it, or I could use Rekognition. So we give customers these basic building blocks, but we also think about working backward from the customer.
What is the finished solution we could give a customer that just works out of the box? The new services we heard about yesterday were exactly in that latter category: they're purpose-built, ready to be used or trained end-to-end by a developer with very little customization necessary.

But the point is that for customers working in these environments, the business questions change all the time. So the need to reprogram a robot on the fly, for example with a new mission to address a business need that just arose, is a dynamic we've been very tuned into since we first started with AWS RoboMaker. We have a fleet management feature which allows a developer to choose any robot out in their fleet, take a new software stack that's been tested in simulation, and redeploy it to that robot, so it changes its mission. This is a dialogue we've seen coming up over the last year, where roboticists are starting to educate their companies that a robot is a device that can be dynamically reprogrammed at any point in time. They can test their application in simulation while the robot's out in the field, verify it's going to work correctly, and then change the mission for that robot dynamically.

One of the customers I'm working with, the Woods Hole Oceanographic Institution, is sending autonomous underwater robots out into the ocean to monitor wind farms. And they realized the mission may change based on what they find at the wind farm with their equipment and their autonomous robot. The robot itself may encounter an issue, and that ability, because they do have connectivity, to change the mission dynamically, first testing it, of course, in simulation, is completely changing the game for how they think about robots. It's no longer a static thing you program once and have to bring back into the shop to reprogram. It's now a dynamic entity you can test and modify at any time.
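The test-in-simulation-then-redeploy loop described here can be sketched in a few lines. This is a deliberately simplified model of the workflow, with hypothetical class and method names, not the actual AWS RoboMaker fleet-management API:

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    name: str
    stack_version: str = "v1"  # software stack currently running in the field

@dataclass
class Fleet:
    """Toy model of the deploy-after-simulation workflow: a new stack is
    validated in simulation first, and only then pushed to a chosen robot
    in the field. (Hypothetical API, illustration only.)"""
    robots: dict = field(default_factory=dict)

    def register(self, robot: Robot) -> None:
        self.robots[robot.name] = robot

    def simulate(self, stack_version: str) -> bool:
        # Stand-in for running the stack in a simulated 3D world;
        # here we simply treat any non-empty version as passing.
        return bool(stack_version)

    def deploy(self, robot_name: str, stack_version: str) -> bool:
        """Deploy only if simulation passes; return True on success."""
        if not self.simulate(stack_version):
            return False
        self.robots[robot_name].stack_version = stack_version
        return True

fleet = Fleet()
fleet.register(Robot("auv-1"))              # e.g. an underwater survey robot
fleet.deploy("auv-1", "v2-new-mission")     # retask it without bringing it in
print(fleet.robots["auv-1"].stack_version)  # v2-new-mission
```

The design point is the gate: nothing reaches a field robot without first passing the simulated trial, which is what makes "the robot is a dynamically programmable device" a safe claim rather than a risky one.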
You know, I'm old enough to know how hard that really is to pull off, and this highlights how exciting this is. Just think about the idea of hardware being dynamically updated with software, in real time or near real time, with new stacks. That's just unheard of, because purpose-built has always meant you lock it in, you deploy it, and you send the tech out when it breaks. There's kind of a break-fix mindset. This changes everything, whether it's space or underwater; we're seeing it everywhere. It's a software-defined, software-operated model.

So I have to ask you, first of all, that's super awesome anyway. What's this like for the new generation? Because Andy talked on stage, and in my one-on-one with him, referring to Lambda and some of these new things, about a new generation of developer. You've got to look at these young kids coming out of school. They don't understand how hard this is; they just look at it as lingua franca, software-defined stuff. So can you share some of the cutting-edge things coming out of this new talent, these new developers? Because I'm sure the creativity is off the charts. Can you share some cool use cases, share your perspective?

Absolutely. I think there are a couple of interesting cases to look at. One is that roboticists have historically thought of all the processing as living on the robot, and if you said cloud and cloud services, they just couldn't fathom that the processing could be moved off the robot. Now you're seeing developers look at the cloud services we're launching, and our cloud service extensions, which give you a secure connection to the cloud from your robot, and they're starting to realize they can move some of that processing off the robot, which can lower the BOM, the bill of materials, the cost of the robot.
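The edge-to-cloud split being described, where the robot keeps only a thin client and the heavy model runs remotely, can be sketched as follows. Everything here is hypothetical: in a real system the "cloud" function would be a managed inference endpoint reached over a secure connection, not a local function call.

```python
# Sketch of moving inference off the robot. The expensive compute lives
# in the (here, faked) cloud function, not on the robot, which is what
# lowers the robot's bill of materials. Names are illustrative only.

def cloud_classify(image: list) -> str:
    """Stand-in for a cloud-hosted vision model; a real deployment
    would call a remote endpoint rather than run in-process."""
    return "obstacle" if max(image) > 0.8 else "clear"

class ThinRobotClient:
    def __init__(self, classify=cloud_classify):
        # The classifier is injected, so the "cloud" side can be swapped
        # or retrained without touching the robot's onboard software.
        self.classify = classify

    def perceive(self, image: list) -> str:
        # No onboard GPU needed: ship the frame up, get a label back.
        return self.classify(image)

robot = ThinRobotClient()
print(robot.perceive([0.1, 0.9, 0.2]))  # obstacle
print(robot.perceive([0.1, 0.2, 0.2]))  # clear
```

The injection point is also where the fleet-wide learning mentioned next comes in: a retrained model can replace the classifier for every robot at once.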
And they can have this dynamic programming surface in the cloud where they can program and change the behavior of the robot. So that's a dialogue we've seen coming over the last couple of years, that rethinking of where the software should live: what makes sense to run on the robot, and what should be pushed up to the cloud. Let alone the fact that if you're aggregating information from hundreds of robots, you can build machine learning models that identify mistakes a single robot might make across the fleet, and use that insight to retrain the models, push new applications down, push new machine learning models down. That is a completely different mindset. It's almost like introducing distributed computing to a roboticist, so they can start thinking about this fabric of robots.

Another, more recent trend we're seeing, and where we're listening very closely to customers, is the ability to use simulation and machine learning, specifically reinforcement learning, for a robot to try out different tasks. Simulations have gotten so realistic, with the physics engines and the rendering quality, that the rendered scene is nearly indistinguishable from reality to a camera, and the physics are real-world physics. So you can put a simulation of your robot into a 3D simulated world and allow it to bumble around and make mistakes while it's trying to perform tasks that, frankly, you don't know how to write the code for, they're so complex. And through reinforcement learning, giving reward signals when it does something right, or negative reward signals when it does something wrong, the machine learning algorithm will learn to perform navigation and manipulation tasks which, again, the programmer simply didn't write a line of code for, other than creating the right simulation and the right set of trials.

So it's like reversing the debugging protocol.
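The reward-signal idea can be shown end to end in a toy form: tabular Q-learning in a one-dimensional corridor, where a simulated robot learns to reach the goal purely from rewards, with no hand-written navigation policy. This is a generic reinforcement learning sketch for illustration, not a RoboMaker feature:

```python
import random

# A robot in a corridor of 5 cells learns to reach the goal at the right
# end purely from reward signals: +1 at the goal, a small penalty per
# step. Nobody writes the policy; Q-learning discovers it in simulation.
random.seed(0)
N_STATES, ACTIONS = 5, [-1, +1]          # actions: move left / move right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2    # learning rate, discount, exploration

for episode in range(200):
    s = 0                                 # start at the left end
    while s != N_STATES - 1:
        a = (random.choice(ACTIONS) if random.random() < epsilon
             else max(ACTIONS, key=lambda act: Q[(s, act)]))
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else -0.01   # reward / step penalty
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS)
                              - Q[(s, a)])
        s = s2

# After training, the greedy policy steps right from every non-goal state.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)  # [1, 1, 1, 1]
```

Scaled up, the same loop runs against a physics-accurate 3D simulation with photorealistic rendering instead of a five-cell corridor, which is exactly why simulation fidelity matters so much for this approach.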
It's like, hey, run the simulations and the code writes itself. You debug it on the front end, rather than writing code, compiling it, debugging it, and working through the use cases. I mean, it's pretty different.

It is. It's really a new persona. When we started out, not only were we taking that roboticist persona and introducing them to cloud services and distributed computing, but now you're also seeing machine learning scientists with robotics experience rising as a new developer persona that we have to pay attention to, and we're talking to them right now about what they need from our service.

Well, Roger, I'm getting tight on time here, so one final question before we break. How does someone get involved with Amazon? Whether it's robotics or new areas like space, which is emerging, there's a lot of action, a lot of interest. How does someone engage with Amazon to get involved, whether I'm a student or a professional who wants to code? What does that look like?

Absolutely. So certainly at re:Invent we have several sessions on AWS RoboMaker, and our team is there presenting and talking about our roadmap and how people can get engaged. There's of course the re:MARS conference, which will hopefully be happening next year, as another place to get engaged. Our team is active in the ROS open source community and in ROS-Industrial, which is happening in Europe later in December, but also happens in the Americas, where we're present, giving demos and hands-on tutorials. We're also very active in the academic research and education arena. In fact, we just released open source curriculum, freely available on GitHub to any developer, covering robotics and ROS as well as how to use RoboMaker. So there are a number of touch points, and of course we welcome anyone who wants to learn more, or just engage with our team, to reach out to our field.
Roger Barga, general manager of AWS Robotics and the autonomous systems group at AWS, Amazon Web Services. Great stuff, and really awesome insight. It's also candy for the developers, the new generation of people who are going to sink their teeth into some new science and some new problems to solve with software. Again, distributed computing meets robotics and hardware. It's an opportunity to change the world, literally.

It is an exciting space. It's still day one in robotics, and we look forward to seeing what our customers do with our service.

Great stuff. Of course theCUBE loves this content. We love robotics, we love autonomous systems, we love space, programming, all this stuff. Totally cutting edge: cloud computing changing the game at many levels with digital transformation. This is theCUBE. Thanks for watching.