[The opening minutes of the transcript are garbled and unintelligible. From the surrounding context, the speaker is introducing the toaster analogy: Thomas Thwaites' project to build a toaster entirely from scratch, which took enormous time and effort compared with simply buying a commodity toaster.]
[Transcription garbled for the first part of this section; the thread picks up with the commoditisation of the compute stack.] The OS is now all cloud managed and we care more about containers and runtimes. That trend is now continuing with the advent of serverless technologies. Things like containers are going to become less important. We're going to care more about putting code into production and care more about consuming commoditised services as a way of building our systems. So we're going up an abstraction, and more and more commoditisation. This commoditisation is happening for AI and machine learning as well as other areas of the stack. So, if you can't read the... That's one of my favourite Far Side cartoons, by the way. If you can't read the caption there, it says, "Ha, Webster's blown his cerebral cortex." And I think if you come at machine learning from scratch and you try and do everything from scratch and get into everything, you're in danger of blowing your cerebral cortex. There's a lot to get your hands around, there's a lot to get into your brain.
So what I'm suggesting is that it doesn't always make sense to try and build our own toasters, and it doesn't always make sense to try and build our own models if commodity components are available that we can use to generate results quickly. Which is why we wrote this book, AI as a Service. It's available from Manning Publications. And what we tried to do here was to distill some of our learnings into an engineer's guide for how you get on board with AI and machine learning without necessarily needing to have a PhD in the subject. There's a discount code there if anyone would like to take that, DTX19. And I've got a few copies to give away if anyone wants to ask me nicely after the talk. So this is a prediction, I have a prediction for you: serverless computing will become the de facto standard for enterprise platform development over the next three years. By that I mean systems constructed entirely using cloud native services, and this is a move towards fully utility computing, and that means increasing commoditisation and consumption of commoditised services to solve business problems. Increasingly these platforms will incorporate more AI, and these AI components will be built by using commoditised components, combining them, and tuning them for our particular use case. So we need to understand how to get on board with these tools, because they give us results faster. So I've done an exercise recently, in writing the book, to go across the three major cloud providers. Apologies if you're a provider of a different cloud or if you have a different preference. These are the three market leaders, so I stuck to these three. If you just look at the counts of services across a number of vectors, so compute, data storage, network, developer, AI and machine learning and so on, there's a huge range of services available. The numbers in green are the delta from doing this exercise in 2018. Now of course there's a bit of hype around this.
Some of the services that they count are a little bit wafer thin, but there really is a bit of an arms race going on between the cloud providers to deliver these commodity services that we can take advantage of. So if we dig in to the AI and ML category, you'll see there's a whole bunch of services for image recognition, recommendation systems, voice, chatbots, prediction, language and training. So for example, AWS have Rekognition, Google have Google Vision and Google Video Intelligence, Azure have Face detect and Video Indexer. So really, to get on board with this, it's a good idea to just familiarise yourself with the range of AI and ML services that are available. And this is changing quite rapidly, so it is a moving feast. When should we use it? Well, I'm sure there are some people sitting here that say, I'm not using anyone else's services, I can do it better myself. Maybe you can. And in some instances maybe you should. But increasingly, you should be looking to use commodity services. So when the problem is not well understood, and we need to actually go off and do some research, then we need to use the standard kind of data science toolchain, modelling with TensorFlow, all of that kind of stuff. When the problem is more well understood, or elements of the problem are well understood, we should be looking to use commodity services, combine them together or consume them. Or maybe there's a hybrid solution which uses some custom models and then also uses the cloud serverless model to operationalise those and combine them together. So in writing the book, it's fine just to understand the service catalogue and what's available, but you really need to put that into a context, into a framework. So it was very interesting to hear one of the speakers this morning talk about how the custom model you build is only a small part of the picture. There's much more around it that you need to put together to actually operationalise your model and bring it to an expectant public.
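To make the "consume a commodity service" idea concrete, here is a minimal sketch (not from the talk) of calling the AWS Rekognition image-labelling API via boto3. The bucket and object names are hypothetical, and the client is passed in so the helper can be exercised without AWS credentials.

```python
# Minimal sketch of consuming a commodity image-recognition service
# (AWS Rekognition via boto3). Bucket/key names are hypothetical.

def label_image(rekognition, bucket, key, min_confidence=75.0):
    """Return (label, confidence) pairs for an image stored in S3."""
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=min_confidence,
    )
    return [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]

if __name__ == "__main__":
    import boto3  # imported lazily so the helper stays testable offline
    client = boto3.client("rekognition")
    for name, conf in label_image(client, "my-images-bucket", "photo.jpg"):
        print(f"{name}: {conf:.1f}%")
```

The point is how little code sits between an engineer and a production-grade model: no training, no model hosting, just an API call.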
So this is our attempt, sorry, this is the AWS version, I couldn't do it for all three clouds, so I just picked the market leader. This is our attempt to create a kind of thought framework around how these services fit together. So at the top of the stack we've got our web application layer, services like API Gateway and web application firewalls. Then we have two categories of services, synchronous and asynchronous, typically using functions as a service, so in this case that would be AWS Lambda. Synchronous means request response; asynchronous is more fire and forget. Underneath that we have a layer of communication services and utility, so utility really around securing our platform, and then a tier of AI services, so that's including all of the stuff that we just saw in that catalogue. So image recognition, SageMaker, speech to text, text to speech, underpinned by a layer of data services. So by that I mean cloud storage, I mean serverless databases like Aurora, DynamoDB and so on. We also need to bear in mind that it's not just about the running platform; we need to consider the development support services, because once we put our models and our application into production we need to continually update it and build new versions of it, so we need to treat all our infrastructure as code. So we're looking at things like CloudFormation in an AWS context, CodePipeline, CodeBuild. And on the other end, operational support: just because our application is in production doesn't mean we're done; we still need to monitor it, we still need to alert, we still need to understand the operational parameters. So when we were writing the book we figured, let's take this architectural context and see if we can build a system really quickly with it. So we decided we'd build a cat detection system. Can we build a cat detection system in a day? The answer is just about; it took us probably about two days to build this, right? So that's the UI on this side here. It allows us to put in a URL.
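The synchronous request/response tier described above is typically just a small Lambda function behind API Gateway. As a rough sketch (the event fields and the "url" parameter are hypothetical, following the API Gateway proxy integration shape), the handler that accepts a URL for analysis might look like this:

```python
import json

# Sketch of a synchronous (request/response) Lambda handler sitting
# behind API Gateway. In the real system the accepted URL would be
# enqueued for the crawler; here we just validate and acknowledge.

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    url = body.get("url")
    if not url:
        return {"statusCode": 400,
                "body": json.dumps({"error": "url is required"})}
    # Hand-off to the asynchronous part of the system would happen here
    # (e.g. pushing the URL onto a work queue).
    return {"statusCode": 202, "body": json.dumps({"accepted": url})}
```

Note the design choice: the synchronous handler does almost nothing itself, it only validates and hands off, which keeps the request/response path fast while the heavy lifting happens asynchronously.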
Once we submit that URL to the system it'll go off, run some image analysis on it, give us a word cloud so that we know what that page is about, and then return some pictures with a confidence level. So this is built using synchronous services for that API piece, talking to a work queue, which then has a crawler service that fetches the images and then feeds those into the recognition system. So the AI piece of this is in the AI services here, just there. Results are put into an S3 bucket. So that's an example of how you can build a system very, very quickly that is usable, externally calling and consuming a commodity AI service. We then wanted to build a much more complete example, so we took the use case of social CRM. Imagine you've got a number of products that you're selling in a number of territories. Those products are related to departments. You're getting feedback constantly, all the time. It's a real problem to say, how do we handle that feedback and route it to the appropriate party? So the pipeline is: we need to detect the language, we need to do automatic language translation, do sentiment detection, figure out what department it's for and route the message onwards. Can you do that in a serverless way using commodity services? The answer is yes, you can. So again, this is the AWS realisation of that. So we've got a number of input channels at the start of the pipeline coming in through an API gateway into a stream. We then run our commodity language detection and translation services on top of that. Forward it on into a standard off-the-shelf sentiment analyser. Throw away the positive, keep the negative. And from there then we can do classification to figure out what department this is for. In this case what we did was to use Amazon's Comprehend service and do a little bit of transfer learning on it so that it understood our context.
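The language-detect, translate, sentiment-filter steps of that pipeline can be sketched with the AWS Comprehend and Translate APIs. This is an illustrative sketch, not the book's code; the clients are injected so the routing logic can be tested without AWS credentials.

```python
# Sketch of one feedback-pipeline step: detect the language, translate
# to English if needed, run sentiment analysis, and keep only negative
# messages (positive feedback is discarded, matching the talk).

def triage_feedback(comprehend, translate, text):
    """Return the English text if sentiment is NEGATIVE, else None."""
    langs = comprehend.detect_dominant_language(Text=text)["Languages"]
    lang = max(langs, key=lambda l: l["Score"])["LanguageCode"]
    if lang != "en":
        text = translate.translate_text(
            Text=text, SourceLanguageCode=lang, TargetLanguageCode="en"
        )["TranslatedText"]
    sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")
    return text if sentiment["Sentiment"] == "NEGATIVE" else None
```

Each call here is a commodity service; the only custom part left is the department classification, which the talk handles with a lightly retrained Comprehend model.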
It's still a commodity service that can be trained using an API that most engineers can use, without actually having to understand the underpinnings of what's going on there, to get results. So here's a real world example, a real project that we did in 2017 around KYC, know your customer. So KYC is an interesting area. And what the company we were working with at the time was trying to solve was: how do you automate the process of taking a utility bill and automatically pull off things like name, address, MPRN and other details, so that it doesn't have to be done by a human? So back then, way back in 2017, which is not that long ago, we built this using an off-the-shelf open source OCR library. But then we had to do a whole bunch of maths to figure out the boxing, and then once we'd figured out what those boxes were and what those text groupings were, we had to then feed that through a classifier to say, is this a name, is this an address, and so on. Today, if we were to build that same service, we wouldn't do that. We wouldn't need to do that. There is an offering from AWS called Textract, from Azure called Form Recognizer, and from Google called Cloud Vision OCR. So this is a great example of commoditisation. We just wouldn't build it that way today. We would just plug into that service, and that would allow us to build that piece of functionality for our client cheaper, a lot, lot faster, and if I'm honest probably better, because there's more smart people working for those companies than you can necessarily harness in that context, right? Here's another example from a project we've done in AgriTech for Newland. Some guys from Newland are here today; they're doing awesome work in optimising fertiliser usage. So there are EU restrictions that say you can only put this much fertiliser on your soil, you want to maintain the soil quality and so on.
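To show what "we would just plug into that service" looks like in practice, here is a sketch of the OCR step using AWS Textract: the service returns detected text as typed blocks, which replaces the hand-rolled OCR-plus-boxing-plus-classifier pipeline described above. Bucket and document names are hypothetical.

```python
# Sketch of the commoditised OCR step with AWS Textract. The service
# does the text detection and grouping itself; we simply read out the
# LINE blocks from the response.

def extract_lines(textract, bucket, document):
    """Return the detected text lines of a document stored in S3."""
    response = textract.detect_document_text(
        Document={"S3Object": {"Bucket": bucket, "Name": document}}
    )
    return [block["Text"] for block in response["Blocks"]
            if block["BlockType"] == "LINE"]
```

Identifying which line is a name or an address would still need a downstream step, but the boxing and grouping maths from the 2017 version disappears entirely.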
So what the system does is place a sensor in the field, really in the field, in an actual field, and it pulls back data on nitrate levels, rainfall, all of that kind of stuff, along with images. So we feed those images then into a SageMaker model. So this is a great example of how you can take a smart model that you've trained yourself and operationalise it. So the challenge was, taking this model, how quickly could we get it into a production system we could start to use? Using entirely serverless technology we could do this in two weeks. That represents a significant cost reduction to the guys we were working with, to our clients. So the point of this is: yes, you can do custom, but for all of the operational pieces around the outside, the developer flow, monitoring, getting data in and out of the system, you can look at using commodity serverless components to accelerate that transformation and accelerate that time to market. So in summary, we believe serverless computing will become the de facto standard, increasingly incorporating more AI components, customising, combining and consuming off-the-shelf services, and developers will increasingly use these commodity services without necessarily requiring the deep level of expertise that all of you guys in the room have today. I've got 16 seconds. Did I mention the book? Yes I did. Thank you very much.
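Operationalising a custom model like this typically means deploying it to a SageMaker endpoint and calling it from the rest of the serverless pipeline via the runtime API. The sketch below illustrates that call; the endpoint name and payload shape are hypothetical, and the runtime client is injected for testability.

```python
import json

# Sketch of calling a custom model deployed to a SageMaker endpoint
# from elsewhere in a serverless pipeline (e.g. a Lambda function fed
# by the field-sensor data stream).

def predict(runtime, endpoint_name, features):
    """Send one feature record to the endpoint and return its prediction."""
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps({"instances": [features]}),
    )
    return json.loads(response["Body"].read())
```

This is the hybrid shape the talk advocates: the model itself is custom, but everything around it, hosting, invocation, scaling, is a commodity managed service.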