Live from Las Vegas, it's theCUBE, covering AWS re:Invent 2018. Brought to you by Amazon Web Services, Intel, and their ecosystem partners.

Okay, welcome back everyone. We're live here in Las Vegas at AWS re:Invent, our sixth year. I'm John Furrier with Dave Vellante. Dave, six years, two sets, people rolling out of the keynote, so much action, and we've got another day coming tomorrow. We've got two great guests here: Dr. Vasi Philomin, general manager of machine learning and AI at Amazon Web Services, and Dr. Taha Kass-Hout, senior leader for healthcare and AI at Amazon. Guys, welcome to theCUBE.

Thank you so much.

I'm so excited that you're here, because I've been waiting to have this conversation. Dave and I just did our analysis of the announcements, and we dug into the stack around machine learning. So much value is now coming online that's been in the works around AI, really mainly machine learning that's creating AI-like benefits. Andy spent almost a third of his keynote on AI capabilities and how Amazon integrates them, from chipsets to Elastic Inference. It's just good stuff, so congratulations. What does it mean for customers right now who want to grok what's going on with Amazon and AI? Is it new sets of services coming online? How long has it been in the works? Explain.

Our mission at AWS has always been to take technologies that were traditionally available only to a few special technology companies and make them available to all developers. We've done that fairly well when it comes to compute, storage, databases, and analytics, and we're doing the same thing for machine learning and AI. Because it's a new field, we've had to innovate at three layers of our stack.
So the bottom-most layer, as you saw in the keynote earlier, has to do with frameworks and infrastructure. This is for the people who fully understand how to deal with machine learning models and like to go in and tweak those models. The middle layer is for everyday developers and data scientists, and that's where SageMaker fits in. And finally, the top layer of the stack is where we have our application services. This is meant for developers who don't want to get into the weeds of machine learning but still want to make use of all of these technologies to make their applications smarter.

So they get the insights out of the data without getting down in the weeds.

Exactly. And people who want to get down in the weeds can get down and dirty with all the other stuff. Typically, at the top layer of the stack, we try to solve really hard problems, so customers can take advantage of it because we've solved it for them, and they can just take that and integrate it into their applications.

Real quick, what was the hardest problem you guys solved?

Traditionally, speech recognition is a very hard problem; that's one of them. The other one is NLP, natural language processing. But I would say speech recognition is probably the hardest. We just launched streaming transcription, so you can now transcribe live as somebody speaks. And of course you can connect it to Translate and translate it live as well.

Great for our CUBE interviews, looking forward to having that online. As a healthcare practitioner, how does this all apply to that industry? What kind of projects are you working on in that regard?

Of course, yeah. We continue to innovate on behalf of our customers across all layers of the stack in machine learning, and in particular this week we launched Amazon Comprehend Medical.
It addresses a particularly hard problem: the majority of healthcare data is captured as conversations and observations in an unstructured format. Petabytes of data are stored across the entire healthcare system in unstructured form. To drive actionable insights, to find the right elements to treat patients, to manage a population, or even to do accurate billing, it's been really important that we empower our customers with building blocks to build the right solutions to take advantage of that. Amazon Comprehend Medical is able to understand medical language and context, similar to how clinicians understand medical language and context. For example, if you're looking at a patient's medical note, Amazon Comprehend Medical can extract, with high accuracy, medical conditions, medications, tests, and procedures being done on the patient, as well as the relationships between them, understanding that this condition and this treatment go together, along with the nuances. For example, a patient has no family history of X, or there's no smoking history; all of those are things in relation to the past, the future, or other family members. That's what we're really proud about in launching Amazon Comprehend Medical.

Talk about how it works, because healthcare has been a great field for AI going way back. In old-fashioned AI, say when I was doing it in the '80s and early '90s, ontologies were really popular, and the linguistics is kind of known, but you needed a linguistics guru to do that. You mentioned streaming and Transcribe; you've got metadata. How do you get this kind of benefit when the ball's moving so fast in rapidly changing verticals like healthcare? Because healthcare has a big problem, like other verticals: there are so many notifications, what do I pay attention to? So much data. How do you put the puzzle together?

Let me first give you some context here.
As you're probably aware, at last re:Invent we launched Amazon Comprehend. Comprehend is a text analytics service; it helps you look into text and understand what's in there. We started out with general things we could detect, like people, places, things, sentiment, the language the text is written in, and so on. Customers picked it up and are using it a lot, but as they kept using it, they came back to us and said, hey, it's great that you're giving us the capability to understand general language, but some of our domains have special language.

Like jargon.

Yeah. Take the legal domain, for example. It's got judges and defendants and very particular things that are relevant to the legal domain. So they were asking us for a capability to extend Comprehend to include their custom domain terms and phrases as well. Last week we actually launched a custom entities feature that allows them to bring their custom domain into Comprehend, so that Comprehend can be extended to include their domain. Legal language is difficult to understand, but medical language, on the other hand, is even harder.

Yeah, exactly. Acronyms, jargon, what does an entity look like, extracting that.

And extracting entities alone is hard, misspellings, right? But relating those entities together is super important, because in one clinical note you could have multiple drugs with different dosages and different frequencies. You need to be able to relate those entities together, and that's the sort of thing Comprehend Medical allows our customers to do to solve some really big issues.

So you're doing a lot of that entity extraction under the covers. Is that right? How does Comprehend Medical work? Out of the box, do you have to train it?

There's no training needed, no machine learning expertise needed.
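Conceptually, the custom entities idea described here layers domain-specific terms on top of general-purpose extraction. The following is a toy local sketch of that concept only, not the Amazon Comprehend API; the legal terms and labels are made-up illustrations.

```python
# Toy illustration of the "custom entities" concept: layering custom
# domain terms (here, hypothetical legal jargon) on top of a general
# entity extractor. This is a local sketch, not the Comprehend API.

def extract_custom_entities(text, custom_terms):
    """Return (term, label) pairs for custom domain terms found in text."""
    found = []
    lowered = text.lower()
    for term, label in custom_terms.items():
        if term in lowered:
            found.append((term, label))
    return found

# Hypothetical custom domain vocabulary for the legal example above.
legal_terms = {
    "defendant": "LEGAL_ROLE",
    "plaintiff": "LEGAL_ROLE",
    "injunction": "LEGAL_ACTION",
}

print(extract_custom_entities("The defendant requested an injunction.", legal_terms))
```

The real feature trains a recognizer from examples rather than matching a fixed term list; this sketch only shows why a general-language model needs the extra domain vocabulary.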
The algorithm extracts these entities as well as the relationships between them, and it also extracts any attributes that might be related, such as negation, past versus future, or what anatomy of the body it relates to. All of that is done out of the box. And that's super important. You want to know whether the patient stopped taking a medication, right? Negation, things like that. That gives you the context; just getting the terms alone doesn't really tell you much.

Andy Jassy had a great video about the F1 analytics. Imagine having that for a person: you're not doing well right now, take a break. I feel like we're now just scratching the surface of healthcare transformation. Think about the healthcare industry. For years it's been compliance-driven, whether it's HIPAA, the Affordable Care Act, EMR, and meaningful use. But the industry hasn't been dramatically transformed and disrupted, and it kind of needs to be. How do you see that evolving? I feel like you're now beginning to see that sea change, and it's going to take a while; it's a high-risk business, obviously. But what's your prognosis for that transformation, and what's the vision for the outcome?

Yes, that's a really great question. One great thing that's happened over the last decade is the digitization of the medical record. That's really wonderful, because before, it was all paper-based primarily, unless you were in an acute setting. Now, across the majority of the US, and globally, there's been huge adoption and propagation of electronic medical records. The issue that remains is that the majority of that data is observations and conversations, and it's unstructured, and that creates a different kind of roadblock for our customers.
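The out-of-the-box entity, relationship, and trait output described above can be consumed with a few lines of code. Below is a minimal sketch that post-processes a response shaped like Comprehend Medical's documented detect-entities output; the sample dict is illustrative, not real service output, and an actual call would go through the AWS SDK (e.g. boto3) with proper credentials.

```python
# Sketch: post-processing a response shaped like the Amazon Comprehend
# Medical detect-entities result. The sample below is illustrative, not
# real service output; field names follow the documented response shape.

def active_medications(response):
    """Return medications that are not negated, with their attributes."""
    meds = []
    for entity in response["Entities"]:
        if entity["Category"] != "MEDICATION":
            continue
        # Skip medications the note says the patient is NOT taking
        # (the NEGATION trait discussed above).
        if any(t["Name"] == "NEGATION" for t in entity.get("Traits", [])):
            continue
        # Attributes relate dosage, frequency, etc. back to this drug.
        attrs = {a["Type"]: a["Text"] for a in entity.get("Attributes", [])}
        meds.append({"name": entity["Text"], **attrs})
    return meds

sample = {
    "Entities": [
        {"Text": "metformin", "Category": "MEDICATION", "Traits": [],
         "Attributes": [{"Type": "DOSAGE", "Text": "500 mg"},
                        {"Type": "FREQUENCY", "Text": "twice daily"}]},
        {"Text": "aspirin", "Category": "MEDICATION",
         "Traits": [{"Name": "NEGATION", "Score": 0.98}],
         "Attributes": []},
    ]
}

# Keeps metformin (with its dosage and frequency) and drops the negated aspirin.
print(active_medications(sample))
```

The point of the sketch is the one the doctors make in the interview: the terms alone aren't enough; the traits and attribute relationships are what carry the clinical context.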
And this is what we're hoping for with a service like Amazon Comprehend Medical, which is HIPAA-eligible and helps our customers meet their compliance needs: we can remove the heavy lifting of this undifferentiated task, where a large amount of time is spent analyzing this text and extracting very little. With Amazon Comprehend Medical, we can really fast-track that and elevate it.

You hit the nail on the head with the undifferentiated heavy lifting. That's the ethos of DevOps.

Exactly, yeah. Let me give you some stats, actually. There are 1.2 billion medical documents generated every year in the US, and 80% of them are unstructured text. Making sense of that is going to enable our customers to do some really amazing things. One of the use cases we see is clinical trial recruitment. Fred Hutchinson, one of the nation's top cancer research centers, recruits patients for clinical trials. If you go to clinicaltrials.gov, you'll see something like 290,450 clinical trials open, and typically, from history, we know that most of these clinical trials don't end up meeting their recruiting goals, because it's very hard to figure out which patients fit the clinical trial you're actually trying to perform. Comprehend Medical helps these customers narrow that down very quickly.

Yeah. Expand on the involvement of people in the community. You mentioned Fred Hutch, and Roche has also been involved, from what I heard. Who was involved in this project? It sounds like it was a collaboration. Take a minute to explain that.

Right. It's very similar to a lot of other services we put into the market. We collaborate a lot with customers; 90% of what we do really comes from customers.
So we've collaborated with people like Fred Hutch and some of the nation's top institutions to help us validate the service we've built, to make sure it's meeting the requirements for the use cases they're thinking of. We collaborated closely with them to get the service to where it is today, and we announced it as generally available yesterday.

Okay, so what's the use case? Oh, go ahead.

Yeah, I can expand a little bit on that. Some of the customer use cases range from hospital systems that want to take advantage of their unstructured text, for things such as identifying people who are due for follow-up appointments or have stopped treatment, or finding alternative treatment routes, to billers trying to identify which procedures were actually done, so they can account for all the procedures and all the billing, which is oftentimes hidden in that unstructured text and requires a lot of manual process. And oftentimes rules can't really scale to things such as clinical trial recruitment. Take the Fred Hutchinson Cancer Institute use case, for example: how do you identify a patient and match them to the right clinical trial? These patients oftentimes have a Harry Potter's worth of clinical notes on them, and they've been on a journey from one institution to another to another. It's no longer a needle in a haystack; it's like a needle at the bottom of the Atlantic Ocean. Being able to do that match, going from hours and months down to a few seconds, that's really the beauty of the service.

John likes to talk about the 20-mile stare, and I wonder if we could just look ahead. How far can we take AI and machine learning in healthcare, and how far should we take it? Maybe a more specific question: as a practitioner, when do you think machines might make better diagnoses than doctors, if ever? How do you feel about that?
Where do you see this all going?

I think the whole idea, the beauty, of machine learning, similar to how the stethoscope was introduced, similar to how the thermometer was introduced in medicine, is that these are tools we use to our advantage to provide better care and better outcomes. That's really the mission that our health IT customers are driving towards. Machine learning can do a lot of great things for routine tasks, so human beings can focus their attention on other things. Take the Fred Hutchinson example: instead of mining for these diagnoses in massive amounts of data, machine learning will be able to identify them so the clinical staff can focus on care. Over the next decade or so, we're going to see a lot of advancement in these building blocks, as well as in what Amazon is offering in forecasting, prediction, and algorithms, and we'll be able to fine-tune our capabilities to help customers achieve even precision medicine.

It's real-world impact, because you're changing the workflow. I mean, someone could be in the wrong line or the wrong process based on their history. HIPAA requirements have made a lot of this record sharing a problem, from what we've been reporting over the years, and this is kind of a solution to that. So if I move to a new medical service, do my records move with me? Is that how you see it going? And the other regulations that have been holding things back, the blockers, is that clear now? How does this solve the industry's privacy challenges?

If you look at the healthcare system today, there are lots of inefficiencies in there, right? In the end, this is all about improving patient outcomes and making sure we reduce costs. That's what this boils down to, and these are tools that allow our customers to do exactly that.

Well, guys, thanks for sharing this insight.
Comprehend Medical is a really awesome opportunity. I think it's early days, day one, as you guys say. There's so much more that could be there. I'd love to see the industry, just from a societal standpoint, get all these hurdles out of the way: get the data out there, expose the data, check the privacy box. That would be good, right? This is going to change the game.

Yeah, maybe we should say a little bit about how we built the service in that regard. As you know, at AWS, security and privacy are number one for us. This service is HIPAA-eligible, and it's a stateless service. What that means is nothing gets stored; the data is not used to improve the models or anything like that. The only one who can actually see the data is the customer. They've got the keys. They're the only one sending the data to the endpoint, and whatever comes back, only they can decrypt. So we've taken care to remove some of those hurdles people have always been worried about.

Well, doctors, thank you so much for sharing.

Thank you so much for having us here.

All right, we're bringing you all the action here from AWS re:Invent. As compute power has increased and software is written into new apps, AI is changing the game, and of course theCUBE shoots a lot of video; we're going to need some of these services to do these transcriptions on the fly. Thanks for coming on, I really appreciate it.

Thank you so much.

Back with more after this short break.