Hello and welcome to GAAMA: Go Ahead, Ask Me Anything, put together by our team at IBM Research AI.

The problem that GAAMA addresses is answer finding. Say you have a document and a question you want to ask about it, such as the example on the right, where we have a sublease agreement and the question "Are cats and dogs allowed?" How do you find that answer? You could open the document in a reader and Ctrl+F (or Cmd+F if you're on a Mac) to search for a phrase. You could try "are cats and dogs allowed," "cats allowed," "dogs allowed," but this is a slow, manual process, and you may not use exactly the right words; here we can see that "pets" is used instead of "cats" or "dogs." To speed things up, you could try automated keyword-based search, which can try many different keyword variants quickly, but it still may not use the exact words you need to find your answer.

What GAAMA provides is answer finding by machine reading comprehension, or MRC for short, which has better accuracy than either of the previous methods and is a middle ground between them on speed, and which finds the correct answer: "No pets of any kind are permitted."

Our approach uses a machine reading comprehension model built on BERT-based transformer architectures (BERT, RoBERTa, ALBERT, etc.). As the diagram on the right shows, we take a question, pair it with a candidate paragraph, and run the pair through the network to predict both answerability, that is, whether the question can be answered from the document, and the start and end positions of the correct answer within the document. To train this, we start with a language model pre-trained on unlabeled text. We then fine-tune it on Stanford's Question Answering Dataset (SQuAD), and further fine-tune it on Google's Natural Questions (NQ) dataset to produce our final model.
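To make the start/end prediction step concrete, here is a minimal sketch in plain Python of how per-token start and end scores from an extractive MRC head are typically turned into an answer span, with answerability judged by comparing the best span score against a null score. This is an illustration of the general technique, not GAAMA's actual code, and all names, thresholds, and toy values below are our own.

```python
# Sketch: turning per-token start/end scores into an answer span.
# Illustrative only; not GAAMA's implementation. Logits are hypothetical.

def best_span(start_logits, end_logits, max_len=30):
    """Pick the (start, end) pair maximizing start+end score, with end >= start
    and a cap on span length, as is common in extractive QA decoding."""
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best, best_score

def is_answerable(span_score, null_score, threshold=0.0):
    """Compare the best span score to a null (no-answer) score, SQuAD-2.0 style.
    The threshold here is a hypothetical tunable parameter."""
    return span_score - null_score > threshold

# Toy example: the span covering tokens 3..4 should win.
start = [0.1, 0.2, 0.0, 3.0, 0.5]
end   = [0.0, 0.1, 0.2, 0.4, 2.5]
span, score = best_span(start, end)
print(span)  # (3, 4)
```

In a real system the two logit vectors would come from the transformer's span-prediction head, and the null score from the [CLS] position.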
We also enhance the model's attention with attention-over-attention (Cui et al., 2017). This NQ-trained model is currently in second place on the Natural Questions leaderboard, with 61 F1 on short answers. We were previously in first place on this challenge for five months before being knocked down to second place.

Now we're going to take a quick look at the GAAMA demo in action. For this first example, we have the Wikipedia passage on Charles Bridge and some questions. Here we ask, "What type of bridge is Charles Bridge?" and the system responds that it is a Gothic bridge. Next, we take a look at a sublease agreement, the same example we saw earlier, and ask if cats and dogs are allowed. As we saw before: "No pets of any kind are permitted." Finally, we look at an example listing the top-grossing movies of all time and ask, "In which year did the fifth-highest-grossing movie come out?" We can see here that the fifth-highest-grossing movie is Avengers: Infinity War, which made a little over $2 billion and came out in 2018. When we ask the system, we can see that it answers 2018, just as we would expect.

Thank you. This has been a demo of GAAMA: Go Ahead, Ask Me Anything, put together by our team at IBM Research AI.
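For readers curious about the attention-over-attention mechanism mentioned above, here is a small NumPy sketch of the computation from Cui et al. (2017): pairwise match scores between document and query tokens are softmax-normalized column-wise (query-to-document attention) and row-wise (document-to-query attention), the latter is averaged, and the two are combined into a single attention weight per document token. The variable names and toy shapes are our own; this illustrates the mechanism, not GAAMA's implementation.

```python
import numpy as np

def attention_over_attention(doc, query):
    """Attention-over-attention (Cui et al., 2017), illustrative sketch.
    doc: (n, d) contextual embeddings of document tokens.
    query: (m, d) contextual embeddings of query tokens.
    Returns an (n,) attention distribution over document tokens."""
    M = doc @ query.T                                  # (n, m) pairwise match scores
    # Column-wise softmax: query-to-document attention; each column sums to 1.
    alpha = np.exp(M - M.max(axis=0, keepdims=True))
    alpha /= alpha.sum(axis=0, keepdims=True)
    # Row-wise softmax: document-to-query attention; each row sums to 1.
    beta = np.exp(M - M.max(axis=1, keepdims=True))
    beta /= beta.sum(axis=1, keepdims=True)
    beta_avg = beta.mean(axis=0)                       # (m,) averaged query attention
    return alpha @ beta_avg                            # (n,) "attended attention"

# Toy usage with random embeddings (7 document tokens, 3 query tokens, dim 4).
rng = np.random.default_rng(0)
weights = attention_over_attention(rng.normal(size=(7, 4)), rng.normal(size=(3, 4)))
print(weights.shape)  # (7,)
```

Because both softmaxes produce proper distributions, the output is itself a distribution over document tokens (non-negative, summing to 1), which can then weight span or token predictions.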