Welcome. My name is Murray Woodman, and this evening I'd like to talk about how to train a QuickChat AI powered chatbot with Drupal. I'll be walking you through a number of different technologies. First off we have Drupal, so I'll be showing a QuickChat module that we've been working on here at Morph. The QuickChat module integrates with the QuickChat chatbot. QuickChat is a SaaS startup which provides a chatbot and a trainable model that you can use for the chatbot. And that chatbot is in turn sitting on GPT-3, which is a very powerful language model developed by OpenAI. So those are the three technologies we'll be talking about this evening.

The first concept I'd like to cover, and anyone who has seen my talks knows I always seem to start off with this, is Drupal as a content experience platform. It really is one of the main ways I'm thinking about content management systems these days. When we think of a CXP, or a content management system at least, what are we thinking about? You could say that a CXP is really a CMS that behaves in an omnichannel way; that is, it's able to store, manage, and deliver content to a number of different platforms. Typically when we think about the different platforms we're developing for or producing content for, we may be thinking about websites, decoupled websites, third parties such as native applications, in-store displays, and a number of other applications that are able to consume that data. But another way we can think about things is that the CMS can be used to deliver content to different systems, not end users necessarily, but different systems that are able to make use of that data and add value to it, so that we can then serve it back to the users.
So for example, there are different recommendation engines where we can use Drupal for storing the content and delivering recommendations to users. There are many SaaS services out there for search, where the search index can be consumed and smart search results delivered. And in this particular case this evening, we're going to be talking about chatbots, so why not store the content and the knowledge base for a chatbot inside Drupal, and have that content available to editors when they are editing? The aim here is to provide a great editor experience where the editors can manage the knowledge base right alongside the content they would be working on for the website or for these other services on the left-hand side. I don't want to labour the point, but this is a really key insight: the CMS can grow and have more utility when we start thinking of it in terms of serving different machines out there, as well as users.

How did I get here this evening? This is Leta, and Leta is an avatar produced by a company called Synthesia. Synthesia generates animated avatars that are able to speak a script. Dr. Alan Thompson, an Aussie based in Perth and a very smart and inspirational person, has done a lot of really interesting YouTube videos around AI and embedded intelligence, and he's got this whole series of 50 or 60 videos now where he has a conversation with Leta. He will ask a question, Leta will have an answer, and it's the GPT-3 model sitting underneath that is powering Leta. When I first came across these videos, I was astounded firstly at the breadth of knowledge of Leta and GPT-3, but also at the conversational nature of the exchanges.
We're not really talking canned responses here. We're talking very natural responses, where ideas and thoughts are put together in new ways. If you watch any of those videos, you really cannot help but be amazed at what is possible these days. I've got a few links at the end where you can see more about Leta and Dr. Alan Thompson, but yeah, very inspirational work. So the main idea that spurred me on to the next thing was this: we have these amazing conversational interfaces powered by GPT-3, so it's a very natural conclusion to build a chatbot based on them.

So let's have a little look at GPT-3. Firstly, it's a language model developed by OpenAI, and it is probably off the charts in terms of scale and size for most of us to comprehend. It has something like 175 billion parameters, so as a model it is incredibly complex. And the data it has consumed is just awe-inspiring as well. We can see the number of tokens there: Wikipedia, encompassing most of human knowledge you could argue, is only 3 billion. And then there's all this other information it has consumed: a lot of the web, a lot of popular academic journals, newspapers, and books as well. The amount of information it's consumed is truly staggering, and this gives GPT-3 an incredible breadth, not only in knowledge, but in awareness of the different capabilities and things that are out there. So if we come across and have a look at GPT-3: you can go to OpenAI, and the cool thing is the GPT-3 language model is publicly available.
It's been around for a couple of years now, and there's a public API that you can easily sign up for and start using. Pretty cool. If we have a look at some of the examples, just to give you an idea of what is possible with GPT-3, we have a number of different use cases here. I'll just scroll through slowly, but you can see there's a whole lot of different things you can do with GPT-3. It ranges from creating prose, to writing code, to classifying content, summarizing content, and sentiment analysis, those kinds of things. Yes, it can even write SQL queries for you if you want. So that's right, look out, Margie, all of our jobs are on the line.

For those of us interested in content, you may think of classification: you're able to classify different content into discrete categories. This is just a little example here, where we may want to classify FedEx as a delivery company, or Facebook as a social media company. The whole idea is that you give a prompt, and that prompt is then used to drive the response. So that's classification. Keywords would be similar, but possibly even more impressive, in that you're able to give it a string, a summary, a description if you will, and it will produce a nice set of keywords. I think anyone working in the content management game would appreciate the ability to generate keywords like that.

If we come down here to conversation, which is a little bit more relevant to what we're talking about this evening, we've got an example here of Marv, the sarcastic chatbot. The whole concept is that you're able to seed GPT-3 with a prompt, and that prompt sets the responses you're going to get, right?
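Just to make the classification idea concrete: it boils down to a few-shot prompt where the examples drive the response. Here's a rough sketch of building such a prompt in Python; this is my own illustration, not code from the talk or from OpenAI's docs, and the helper name and wording are made up:

```python
def classification_prompt(examples, item):
    """Build a few-shot classification prompt: a list of known
    (company, category) pairs, then the item we want classified,
    left open for the model to complete."""
    lines = ["The following is a list of companies and the categories they fall into:", ""]
    for company, category in examples:
        lines.append(f"{company}: {category}")
    lines.append(f"{item}:")  # the model completes this line
    return "\n".join(lines)

prompt = classification_prompt(
    [("FedEx", "Delivery"), ("Facebook", "Social media")],
    "Tesla",
)
print(prompt)
```

The string returned here would be sent as the prompt in a completion request; the model's continuation after `Tesla:` is the classification.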
So just by saying that Marv is sarcastic and reluctantly answers, you're going to get a whole bunch of smart alec comments from Marv, because this is how you've seeded him with the prompt. And this is the incredible thing about GPT-3: the response you get really depends on the prompt. Many people are saying now that this is actually a new kind of job description that's evolving. You won't necessarily have coders of computer code anymore; you'll have coders of GPT-3, people who understand how to craft prompts to get the responses they want out of a language model.

The last one I'd like to show you, just from a content perspective, is transformation. It can take text and transform it. I love this "summarize for a second grader" one: you can take quite a complex bit of text, use "summarize for a second grader" as the prompt, and you get a beautiful, simple summary. You can see this would be a perfect description for SEO, or maybe for a teaser or something like that. So of course there's incredible application here for content management systems as a whole.

The other thing I'd like to move on to with GPT-3 is the fact that you can customize it. We have this amazing model that's consumed all those billions of tokens, with 175 billion weights inside the system, but the very cool thing is that you can easily fine-tune it. You can customize GPT-3 for your application. If you want to use the OpenAI API, you can upload a training file. If we have a quick look at the training file, I think it's down here somewhere, or maybe on the next screen, but essentially you can upload a list of prompts and responses to train GPT-3. So for example, you could train a tone of voice. You could train a knowledge base.
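For anyone curious what that training file looks like: the GPT-3-era OpenAI fine-tuning endpoint expected JSON Lines, one prompt/completion pair per line. A minimal sketch of producing one (the example questions and file name here are made up for illustration):

```python
import json

def write_finetune_file(pairs, path):
    """Write prompt/completion pairs as JSONL: one JSON object per
    line with "prompt" and "completion" keys, the format the legacy
    OpenAI fine-tuning API expected for its training file."""
    with open(path, "w") as f:
        for prompt, completion in pairs:
            f.write(json.dumps({"prompt": prompt, "completion": completion}) + "\n")

write_finetune_file(
    [("Where is the Sydney meetup held?", " In Newtown, each month."),
     ("Who is the meetup for?", " Drupal enthusiasts.")],
    "training.jsonl",
)
```

That file is then uploaded via the API, and the fine-tune job is started against it.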
You could train it to handle a certain way of speaking. And this is basically how different companies are now building applications on top of GPT-3 and OpenAI. I'm not going to dwell on it too much, but there's a whole bunch of different applications people are building, whole business models based around this kind of concept. If you come across to the fine-tuning part of the documentation, you can see that one of the benefits of fine-tuning the model is that you don't have to rely just on the prompt. We've seen the prompt before, which is just a few words, but now you're able to train with many more examples. And this is really the amazing thing with GPT-3: you're able to fine-tune it with relatively few data points. We're only talking 100 or so, and you can basically bring your own flavor sitting on top of GPT-3. That is what's underlying the chatbot we'll be looking at this evening.

So moving on to QuickChat AI. We've seen what the possibilities are for GPT-3, how it's a very broad and deep model, and how it's very conversational. We've also seen the API provided by OpenAI and how we're able to fine-tune models. What QuickChat have done is take that underlying foundation and build a SaaS service sitting on top of it. QuickChat basically provides a UI so that editors, or administrators at least, are able to manage a knowledge base that sits on top of GPT-3. So I'm going to come across and have a look at the back end of QuickChat. Here we are: we've signed up for a QuickChat subscription, and I'm actually looking at the Morph subscription here. So yeah, that's right, we're looking at production data at the moment. Please forgive me, Margie.
We can come in and have a look at the knowledge base here. The beauty of QuickChat is that it allows you to manage a knowledge base just through a series of statements. How many have we got there? I don't know, 100 or 150 different statements. These are just little snippets of text, and I think as long as they're around a thousand characters or so, we're able to push them over into QuickChat. If you were coming at this directly as a user of QuickChat, you can come in here, enter your content, and retrain your chatbot. This is effectively fine-tuning GPT-3 underneath: you're sending your content over and training up the information there. So that's essentially the QuickChat model.

We found this really exciting, and what I wanted to do was hook it up with Drupal. Unfortunately, the API that QuickChat offered at the time did not allow us to push a new model over. But we entered a conversation with QuickChat, and they're incredibly flexible, and they extended the API to allow us to push the training set. So that's what we're doing. We've built the QuickChat module, and we can push a model, the knowledge base, over to QuickChat via Drupal. That's what I'm going to show you now.

Well, firstly, before we do that: here we have the QuickChat module. It's available now, it's all open source, it's contrib, and it's got a very nice set of documentation with very simple instructions. So it's all ready to go. I'd like to do a super big shout out to Elio, Naveen, and Tanishin, who've been working on this module. They're the brains behind the implementation. So let's jump across to the Morph website. This is even more production data, my apologies again, Margie. Here we are on the About page for Morph, and I'm going to click Edit here.
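Since each knowledge-base entry is a short snippet with a rough thousand-character ceiling, longer page content would typically need chunking before being pushed over. Here's a minimal sketch of one way to do that; this is my own illustration, not code from the module, and the character limit and helper name are assumptions:

```python
def split_into_statements(text, limit=1000):
    """Split a body of text into knowledge-base statements, breaking
    on sentence boundaries so each statement reads on its own.
    (A single sentence longer than the limit is kept whole.)"""
    statements, current = [], ""
    for sentence in text.replace("\n", " ").split(". "):
        sentence = sentence.strip()
        if not sentence:
            continue
        if not sentence.endswith("."):
            sentence += "."
        if current and len(current) + 1 + len(sentence) > limit:
            statements.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        statements.append(current)
    return statements
```

Each resulting statement is then one row in the knowledge base, small enough to push to QuickChat individually.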
One of the key insights, going back to what I was saying about the CXP before, is that we have this concept of an editor-focused experience. The editors are able to manage the knowledge base on exactly the same page where they manage the node, the content. So this is the About page for Morph, and we've added a knowledge base to it. We're able to copy and paste a whole bunch of content from up here and put it straight into the knowledge base. So for instance, we can add a statement saying the Sydney meetup is held in Newtown each month for Drupal enthusiasts, and we can come in and save that. This is now part of the knowledge base that's actually sitting inside Drupal.

If we come into the content area of the site, we now have this QuickChat knowledge base section. We've configured a back end, which is the Morph production back end here. What we're looking at now is a view of all the data we have, which is essentially a knowledge base field across a number of content types. We're able to update the model, and this will push the data over into QuickChat. Hopefully, with the demo gods on side, if I just come back here, refresh this page, and go to the knowledge base: Sydney meetup, there we go. I still haven't removed the old one there that I was testing with, but you can see I've pushed this one across. Basically we're just exercising the QuickChat API at this point. And then we can rebuild the model, which is once again another API request, essentially retraining or, as I discussed before, refining the model in GPT-3. Okay, so that gets pushed across. We also support a block, so the module comes with a block.
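So there are two API interactions in that demo: pushing the statements over, then triggering a retrain. As a very rough sketch of what those requests might look like, and to be clear, the endpoint paths, field names, and auth header below are my assumptions for illustration, not QuickChat's actual API:

```python
# Hypothetical shapes of the two QuickChat API calls the module makes.
# Everything about the URLs and payload keys here is assumed.
def build_push_request(api_key, scenario_id, statements):
    """Build the request that uploads knowledge-base statements."""
    return {
        "url": f"https://app.quickchat.ai/api/scenarios/{scenario_id}/knowledge",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {"statements": statements},
    }

def build_retrain_request(api_key, scenario_id):
    """Build the follow-up request that retrains the model."""
    return {
        "url": f"https://app.quickchat.ai/api/scenarios/{scenario_id}/retrain",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {},
    }
```

The module fires the first call whenever the knowledge base view is pushed, and the second when you rebuild the model.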
So yeah, nothing too amazing, but we've got our little QuickChat block here, wherever that is, there we go, the QuickChat one. If I configure it, you can see it takes a public scenario ID, and it just puts a block on the page. When we have a look at the site, we now have the chatbot there. Of course, this is probably not going to work, but proof is in the pudding and all that kind of stuff: where is the Sydney meetup? It should be Newtown, but I've got no idea, I honestly haven't tested this one. Okay, so it doesn't know that, but it does know we have a Drupal meetup here in Newtown. So anyway, we now have this chatbot using the knowledge base that we've got inside QuickChat and OpenAI.

I'll just come back and touch on a couple of takeaways, moving right along. We've discussed Drupal as a CXP, and we're really taking an editor-centred (if I spelled that right) approach. We really want the editor to be able to manage the knowledge base right next to the content; I think that's the main emphasis there. We really are standing on the shoulders of giants when we use something like GPT-3 and OpenAI. The knowledge it has consumed is staggering, and the number of GPUs and CPUs, I can't remember exactly, but it's in the order of tens of thousands, churning away for many months in data centers to build these models. It's absolutely staggering. And then we're able to leverage something like QuickChat, which has provided a convenient way to host the actual chatbot and to manage the knowledge base. So they're all the things we're trying to do. What are the strengths? I really like the CXP approach that we've taken.
In terms of how this particular chatbot works, its language abilities are very advanced and natural, and I think that's a really strong point. You obviously have a huge breadth of knowledge sitting underneath. And of course you can train different models sitting on top, so if your organization has a certain set of knowledge you wish to upload, or a tone of voice, or a certain style of language, you're able to train the model like that.

I don't want to say there are weaknesses, but certainly there are caveats, and I would say the chatbot is quite focused. It is possible to ask GPT-3 very philosophical questions and get philosophical answers, and if you look at any of the AI videos, which I'll link to at the end, you'll be amazed at how fluid the connections can be. It's astounding. But QuickChat AI is much more focused; I think the prompt they're using is really driven to stay on the knowledge you're uploading. Also, OpenAI in general does not want people to use GPT-3 for life-and-death situations. It doesn't want people to provide medical or health responses, or financial advice, based on GPT-3. It's not for everything, so there are a lot of use cases out there where it's not appropriate. Where is it appropriate? I think where the kind of information you want to serve is informational and conversational.
That's where it's really good. But you really have to consider that what you're uploading is a knowledge base, not necessarily a database. It's going to be great at answering text-based, prosaic kinds of queries, but it's not really going to be good at answering something like "what is the weather at this particular location?" It's not set up or designed to wire into a database at the end; it's really a knowledge base of content.

Okay, so that's it. We've got some links for you here. I really recommend Life Architect and Leta down the bottom, and the YouTube channel there; that's a very inspirational place to get started. And of course we've got some API material and some examples there. A really huge shout out to the QuickChat AI team: they were very responsive, extended their API to work with us to bring the Drupal module to fruition, and they've just released a blog post as well on how to add that to Drupal. All right, so that concludes the session this evening. Thank you so much, everyone, for coming along, and I'm happy to take questions. Thank you.