Hi, I'm Eric. I'm Connor. And I'm Eric. I would have followed up with "I'm Connor." Yeah, we couldn't catch it in time. But yeah, this is the ENC recipe generator.

So for years, possibly even decades, humans have been eating food. And usually when they eat, they want to eat good food. There's a whole industry of food scientists and professional chefs trying to figure out what kind of food is good and how to prepare that food. But we were wondering if we could replace all of that with a single model. So that's the basis for our project.

For our project, we took a dataset of over two million recipes scraped from across the internet and ran it through our machine. The purpose was to create a model that could take an input of a few ingredients and expand it into a whole recipe. By a whole recipe, I mean it had to have a complete list of ingredients, so not only what we input but also anything else that was necessary, with appropriate measurements. It had to have a followable set of instructions, meaning you could actually carry them out if you had the correct equipment. And it had to have a relevant title. The ultimate goal for all of the recipes was to generate something that could be categorized as food, and ideally it would be delicious. Connor will go into a little more detail about how the model actually does that.

Yep. So, to give an overview of the model: we start with a raw input, which is a comma-separated list of ingredients. We then add some special tokens to that input, as you can see in the example here, with an input-start token as well as an ING tag to separate ingredients. This input is encoded using the Hugging Face GPT-2 tokenizer and given to our model, which uses a GPT-2 component for both the encoder and the decoder. The output of the model also includes some special tags, such as tokens for the title and the instruction start, and separators between instructions. The output is then decoded with the Hugging Face GPT-2 tokenizer and finally formatted; that last step sits outside of the model structure and is part of the demo. (A rough sketch of this preprocessing appears below, after the demo.)

Yeah, so our task was taking that small list of ingredients and comparing the generated output against the associated full recipe. We trained for about 80,000 iterations over a couple of hours and got a final BLEU score of about 65. It's a little high, some would call it suspiciously high, but we think our number might have been slightly inflated by the number of special tokens we had (the second sketch below illustrates why). But as you can see on the right, comparing the output of our model against another model doing the same thing, I think we're about on par, possibly even ahead, in terms of coherence and relevance. Given an input of flour, butter, egg, and tomato, there's a recipe there; I don't know if everyone can read it. And again, because I said the ultimate goal was actually cooking food, I was so confident in our model's output that I made the recipe shown in the slides. I would like to invite the moderator to give it a taste. I feel like we did a really good job. Well, works for me then.

So, while Eric eats this creation, here's our demo. I'll just choose some ingredients. This is some sausage and Monterey Jack. Yeah, that's going to be disgusting. So let's throw some chicken, some potatoes, and some cheese in here, and let's see what we get. It takes about 10 to 20 seconds (the last sketch below shows this step). Yeah, I just want to emphasize again, that was delicious.
Nice. So, we have Royal Chicken: put the chicken in a baking dish, cover it with potatoes and cheese, then bake at 350 for 30 to 40 minutes or until done. Not too bad.
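Below is a minimal sketch of the preprocessing step described in the talk: wrapping a comma-separated ingredient list in special tokens and encoding it with the Hugging Face GPT-2 tokenizer. The exact token strings (<INPUT_START>, <NEXT_INGR>, and so on) are assumptions; the talk only mentions an input-start token and an ING separator.

```python
# A minimal sketch of the preprocessing described in the talk. The exact
# special-token strings are assumptions: the talk only mentions an
# "input start" token and an "ING" separator tag.
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
# Register the recipe control tokens so each one encodes as a single ID.
tokenizer.add_special_tokens({
    "additional_special_tokens": [
        "<INPUT_START>", "<INPUT_END>", "<NEXT_INGR>",
        "<TITLE_START>", "<TITLE_END>", "<INSTR_START>", "<NEXT_INSTR>",
    ]
})

def encode_ingredients(raw: str) -> list[int]:
    """Turn 'chicken, potatoes, cheese' into model-ready token IDs."""
    ingredients = [item.strip() for item in raw.split(",")]
    tagged = "<INPUT_START> " + " <NEXT_INGR> ".join(ingredients) + " <INPUT_END>"
    return tokenizer.encode(tagged)

print(encode_ingredients("chicken, potatoes, cheese"))
```

Because these tokens are added on top of the pretrained vocabulary, the model's embedding matrix also has to grow to match, for example with model.resize_token_embeddings(len(tokenizer)).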
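The talk notes that the BLEU score of about 65 may be inflated by the special tokens. Here is a toy illustration of why, assuming the same token strings as above: the tags sit at fixed positions in both references and hypotheses, so they contribute guaranteed n-gram matches.

```python
# A toy illustration (not the actual evaluation code) of how leaving the
# special tokens in the BLEU computation inflates the score: the tags appear
# at fixed positions in every reference and every hypothesis, so they count
# as guaranteed n-gram matches. Token strings are the assumed ones above.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

smooth = SmoothingFunction().method1

reference = ("<TITLE_START> royal chicken <INSTR_START> bake at 350 "
             "<NEXT_INSTR> serve").split()
hypothesis = ("<TITLE_START> baked chicken <INSTR_START> bake at 375 "
              "<NEXT_INSTR> serve").split()

def strip_tags(tokens):
    return [t for t in tokens if not t.startswith("<")]

with_tags = sentence_bleu([reference], hypothesis, smoothing_function=smooth)
without_tags = sentence_bleu([strip_tags(reference)], strip_tags(hypothesis),
                             smoothing_function=smooth)
print(f"BLEU with tags:    {with_tags:.2f}")   # noticeably higher
print(f"BLEU without tags: {without_tags:.2f}")
```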
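Finally, a sketch of the demo's generation step under the same assumptions. The checkpoint path is hypothetical, and a single fine-tuned GPT2LMHeadModel is used here as a simplified stand-in for the GPT-2 encoder/decoder pair described in the talk.

```python
# A sketch of the demo's generation step, assuming the tokenizer and a
# fine-tuned checkpoint were saved to ./recipe-model (a hypothetical path).
# The talk describes GPT-2 components for both encoder and decoder; a single
# GPT2LMHeadModel is used here as a simplified stand-in.
import re
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("./recipe-model")
model = GPT2LMHeadModel.from_pretrained("./recipe-model")

input_ids = tokenizer.encode(
    "<INPUT_START> chicken <NEXT_INGR> potatoes <NEXT_INGR> cheese <INPUT_END>",
    return_tensors="pt",
)
# Sampling a few hundred tokens is roughly what makes the demo take
# 10 to 20 seconds.
output_ids = model.generate(
    input_ids, max_length=512, do_sample=True, top_k=50, top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
raw = tokenizer.decode(output_ids[0])

# The formatting pass lives outside the model: pull fields out from between
# the control tags (tag names assumed, as above) and pretty-print them.
title = re.search(r"<TITLE_START>(.*?)<TITLE_END>", raw, re.S)
print(title.group(1).strip() if title else raw)
```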