From San Francisco, it's theCUBE, covering Accenture TechVision 2020. Brought to you by Accenture.

Hey, welcome back everybody, Jeff Frick here with theCUBE. We're high atop San Francisco on a beautiful day at the Accenture San Francisco Innovation Hub, on the 33rd floor of the Salesforce Tower, for the Accenture TechVision 2020 reveal. It's where they come up with four or five themes to really look forward to, a little bit innovative, a little different than "cloud will be big" or "mobile will be big." And we're excited to have really one of the biggest brains here on the 33rd floor. She's Teresa Tung, the Managing Director of Accenture Labs. Teresa, great to see you.

Nice to see you again.

So I have to tease you, because last time we were here, everyone was bragging on all the patents that you've filed over the years. So congratulations on that. It's almost like a who's who roadmap of what's happening in tech. I looked at a couple of them. You've got a ton of stuff around cloud, a ton of stuff around Edge, but now you're getting excited about robots and AI.

That's right. That's the new passion.

That's the new passion. All right, so robots. One of the five trends was "robots in the wild." So what does that mean, robots in the wild, and why is it something people should be paying attention to?

Well, robots have been around for decades, right? If you think about manufacturing, you think about robots. But as your kid probably knows, robots are now programmable. Kids can do it, so why not the enterprise? And now that robots are programmable, you can buy them and apply them. We're going to unlock a whole bunch of new use cases, beyond just those really hardcore manufacturing ones that are very strictly designed in a very structured environment, to things in unstructured and semi-structured environments.

So does the definition of robot begin to change? We were just talking before we turned on the cameras about, say, a Tesla.
Is a Tesla a robot in your definition, or does that not quite make the grade?

I think it is, but we're thinking about robots as physical robots. Sometimes people think about robotic process automation and AI as robots, but here I'm really excited about the physical robots: the mobile units, the gantry units, the arms. This is going to allow us to close that sense-analyze-actuate loop. Now the robot can actually do something based off of the analytics.

Right. So where will we see these autonomous robots operating in the wild, versus, as we said, a classic manufacturing instance where they're bolted down and do a step along the process? Where do you see some of the early adoption? Are we going to see them on the streets, or where will we see them?

Well, you probably do see them on the streets already. You see them for security use cases, or maybe mopping up a store after hours so the employees can actually focus on the customers while the robots handle the restocking. We see them in the airports. If you pay attention to modern airports, you see robots bringing out the baggage and doing some of the baggage handling. So really, the opportunities for robots are jobs that are dull, dirty, or dangerous. These are things that humans don't want to, or shouldn't, be doing.

Right. So what's the breakthrough tech that's enabling the robots to take this next step?

Well, a lot of it is AI, right? The fact that you don't have to be a data scientist, and you can apply these algorithms that do facial recognition, that can actually help you find your way around. And it's the automation that's programmable. As I was saying, kids can program these robots, so they're not hard to do. And if a kid can do it, maybe somebody who knows oil and gas, insurance, or security can actually do the same thing.

Right.
So a lot of the AI stuff that people are familiar with is things like photo recognition in Google Photos, so I can search for my kids, I can search for beach, I can search for things like that and it'll come back. What are some of the types of AI and algorithms that you're applying with this robot revolution?

It's definitely things like the image analytics, and it's for the routing. Let me give you an example of how easy it is to apply. Anybody who can play a video game can use a video-game-style controller. So when your kids are playing games, they're actually building skills for these jobs, right? You map a scene by using that controller to drive the robot around, like around a factory or around the airport. And then the AI algorithm is smart enough to create the map. From that, we can use the robot just out of the box to navigate, and you can give it a destination: starting from Teresa here, it might be able to go and, you know, get us a beer, right?

Maybe we should have that happen. They're setting up right over there. That's right.

Well, it's kind of like the revolution of drones, which some people might be more familiar with because they're very, very visible. When you operate a DJI drone now, you don't actually fly the drone. You're not controlling pitch and yaw and those things. You're just telling it where you want it to go, and it's the AI under the covers that's making those adjustments to thrust and power and angle. Is that a good analogy?

And so the work that we would do now is much more about how you string it together for the use case. If a robot were to come up to us now, what should it do, right? If we're here, do we want the robot to even interact with us to get us that beer? Robots don't usually speak. Should speaking be an option for it? Or should it maybe just gesture, and have a menu so we would know how to interact with it?
So a lot of that human-robot interface is some of the work that we're doing. That was kind of a silly example, but now imagine that we were surveying an oil pipeline, or we were actually part of a manufacturing line. In this case it's not getting us a beer, but it might need to do the same sort of thing: what sort of tool does Teresa need to actually finish her job?

Yeah. And then the other one is "AI and me." You just said that AI is getting less complicated to program, these machines are getting less complicated to program. But I think most people are still kind of stuck in the realm of: we need a data scientist, there are not a lot of data scientists, they've got to be super, super smart, you've got to have tons and tons of data, and these types of factors. So how is it becoming AI and me, Jeff, who's not necessarily a data scientist and doesn't have a PhD in molecular physics? How's it going to happen?

I think we need more of that democratization for the people who are not data scientists. Data scientists need the data, and a lot of the hard part is getting the data about how it should interact, right? In that example, we were asking: how do Teresa and Jeff interact with the robot? The data scientist needs tons, thousands, tens of thousands of instances of those data types, to actually make an insight. So what if instead, when we think about AI and me, we think about, again, the human, not the data? Well, data scientists are people too, but let's think about democratizing for the rest of the humans, saying: how should I interact with the robot? A lot of the research that we do is around how you capture this expert knowledge. Then we don't actually need tens of thousands of data points; we can pretty much prescribe it. We don't want the robot to talk to us; we want it to give us the beer. So why don't we just use rules like that? We don't have to start with all the data.

Right, right.
So I'm curious, because there's a lot of conversation about machines plus people being better than one or the other. But it seems like it's much more complicated to program a robot to do something with a person, as opposed to just giving it a simple task, which is probably, historically, what we've done more of. The robot could just do that task; people are not involved in it, so it doesn't have to worry about the nuance, it doesn't have to worry about reacting, reading what I'm trying to communicate. So is it a lot harder to get these things to work with people, as opposed to working independently on a carved-off, special job?

It may be harder, but that's where the value is. If we think about the AI of, let's say, yesterday, there's a lot of dashboards. With the pure data-driven approach, the pure AI operating on its own, it's going to look at the data and give us the insight, but at the end of the day, the humans are going to need to read, let's say, a static report and make a decision. Sometimes I look at these reports and I have a hard time even understanding what I'm seeing, right? When they show me all these graphs, like I'm supposed to be impressed, I don't know what to do. Versus, I use TurboTax as an example. When you're following TurboTax, there's a lot of AI behind the scenes, but it's already looked at my data. As I'm filling in my return, it's telling me, maybe you should claim this deduction. It's asking me yes-or-no questions. That's how I imagine AI at scale being in the future, right? Not just for TurboTax, but for everything we do. So with the robot, in the moment we were describing, maybe it would see that you and I were talking and it's not going to interrupt our conversation. But in a different context, if Teresa's by herself, maybe it would come up and say, hey, would you like a beer? I think that's the sort of context: like a TurboTax, but more sexy.

Right, right.
So I'm just curious, from your perspective as a technologist, again looking at your patent history, a lot of stuff on cloud, a lot of stuff on Edge: we've always kind of operated in this new world of, if you had infinite compute, infinite storage, and infinite bandwidth, which takes another big giant step with 5G, what would you build, and how could you build it? You've got to be thrilled, as all three of those vectors are just accelerating and giving you basically infinite power in terms of tooling to work with.

It is. I mean, it feels like magic. I watch things like Harry Potter, and you think about how they know these spells and can get things to happen. I think that's exactly where we are now. I get to do all these things that are magic.

And are people ready for it? What's the biggest challenge on the people side, in terms of getting them to think about what they could do, as opposed to what they know today? Because the future can be so different.

That is the challenge, right? Because I think people, even with processes, think about the process that exists today, where you're going to take AI and even robotics and just make that process step faster. But with AI and automation, what if we jump that whole step? As humans, if I can see everything, because I have all the data and I have AI telling me these are the important pieces, wouldn't you jump towards the answer? A lot of the processes that we have today are meant so that we actually explore all the conditions that need to be explored, that we do look at all the data that needs to be looked at. So you're still going to look at those things, right? Regulations, rules, that still happens. But what if AI and automation check those for you, and all you're doing is checking the exceptions? It's going to really change the way we do work.

Very cool. Well, Teresa, great to catch up, and you're sitting right in the catbird seat.
So exciting to see what your next patents will be, probably all about robotics, as you continue to move the train forward. Thanks for the time. All right, she's Teresa Tung, I'm Jeff Frick, you're watching theCUBE. We're at the Accenture TechVision 2020 release party on the 33rd floor of the Salesforce Tower. Thanks for watching. We'll see you next time.