But I guess that most of you didn't understand her signing. And for this reason, I will talk about my project, WebSign. WebSign is a project that we conducted at the University of Tunis, at the research laboratory of technologies of information and communication, and its goal is to build a system that automatically translates written text into sign language.

You know, when we want to communicate with deaf people, we should use sign language, and if we do not know sign language, we need someone who translates. And writing a text does not help either: according to the World Federation of the Deaf, most deaf people cannot read and write, because most of them are illiterate. For this reason, we need automatic translation, and our system provides it.

So we can write a text, and you will see the demonstration here. For example, I can choose the sign language; here I choose French Sign Language, because sign language is not universal. I write a text saying "this project is dedicated to the deaf," and the avatar automatically translates what we write in the box. Here, for example, is the translation of the text into sign language.

Of course, the translation requires many steps of linguistic processing, and the key step is the construction of the dictionary of signs. For this reason, we built an interface for creating signs, because building the dictionary of signs is a huge task. Thanks to this interface, as you see here, we can make any sign just by using the mouse, and the program automatically generates the program that animates the avatar. For example, I can choose the hand configuration: I have a set of initial hand configurations, and I can choose the configuration of the left hand just with the mouse. I click, I click, and after this, I can play the sign. And when I save it, the interface automatically generates the animation program.
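The core idea described above, looking each word up in a dictionary of signs and falling back to fingerspelling for unknown words, can be sketched roughly as follows. This is a minimal illustration, not WebSign's actual implementation; every name and keyframe label here is a hypothetical placeholder.

```python
# Minimal sketch of dictionary-based text-to-sign translation.
# All names and keyframe labels are illustrative assumptions,
# not WebSign's real data model or API.

SIGN_DICTIONARY = {
    # Each entry maps a word to a sequence of animation keyframes
    # (hand configuration, location, movement), here just labels.
    "project": ["P-handshape", "move-forward"],
    "deaf": ["index-finger", "touch-ear", "touch-chin"],
}

def translate_to_signs(text, dictionary=SIGN_DICTIONARY):
    """Tokenize the text and map each known word to its sign keyframes.

    Unknown words fall back to fingerspelling, one keyframe per letter.
    Returns a list of (word, keyframes) pairs for the avatar to play.
    """
    animation = []
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in dictionary:
            animation.append((word, dictionary[word]))
        else:
            animation.append((word, ["fingerspell-" + c for c in word]))
    return animation

# Usage: feed the demo sentence through the lookup.
for word, frames in translate_to_signs("This project is dedicated to deaf"):
    print(word, frames)
```

A real system would of course insert linguistic processing (reordering, grammar) between tokenization and lookup, as the talk notes; this sketch shows only the dictionary step.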
Here is an application of WebSign that we developed: a generator of courses to teach and learn sign language. For example, here is a multimedia course. You can choose a sign; here I choose "town," to see how we say "town," so I click on "town." After this, we can do exercises. For example, here I show a sign and ask: what is the sign? Here is another exercise: what is this sign, a car or a bike?

Another application is the screen reader. For blind people, a screen reader uses audio, but deaf people cannot hear, so our screen reader uses the avatar's translation. As you see here, when the mouse points at anything on the screen, the avatar automatically translates it into sign language. This is very important for teaching deaf people, especially kids, how to use the computer, and for eradicating ICT illiteracy among deaf kids.

Another application, and this is our favorite, is the MMS sign: how to make mobile phones accessible to deaf people. Our program automatically translates an SMS into a multimedia MMS. For example, suppose I want to tell a deaf friend about an appointment, to say that I will be late, but he cannot hear a call. So I write an SMS, "I will be late," and send it to him, and our tool translates the SMS into a multimedia animation, an MMS, as you see here. So you send the text in your language, and he receives it in his language: a multimedia message containing "I will be late" in sign language. And this is a tool that we have a patent on.

Another application is a cash machine: how to make cash machines accessible to deaf people, and especially illiterate deaf people. Here you see the cash machine; everything is both in text and in sign language. So all the guidelines for using the cash machine can be found in text and also in sign language, thanks to the avatar and thanks to the WebSign tool.

And here I am glad to show you the recent development we did a few weeks ago.
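The SMS-to-MMS flow described above can be sketched as a small self-contained function: take the incoming text, translate it into a sequence of sign-animation frames, and attach those frames to the outgoing message. Everything here, the sign labels, the function names, the message format, is an illustrative assumption, not WebSign's real pipeline.

```python
# Hedged sketch of the SMS-to-MMS idea: translate the text into sign
# keyframes and package them with the message. All names are hypothetical.

SIGNS = {
    "i": ["point-self"],
    "will": ["w-hand-forward"],
    "late": ["tap-wrist"],
}

def text_to_animation(text):
    """Map each word to sign keyframes; unknown words are fingerspelled."""
    frames = []
    for word in text.lower().split():
        frames.extend(SIGNS.get(word, ["spell-" + c for c in word]))
    return frames

def sms_to_mms(sender, recipient, text):
    """Model the SMS-to-MMS step: the hearing sender's text is kept, and the
    deaf recipient additionally gets the rendered sign-language frames."""
    return {
        "from": sender,
        "to": recipient,
        "text": text,
        "sign_frames": text_to_animation(text),
    }

# Usage: the example from the talk.
message = sms_to_mms("alice", "bob", "I will be late")
print(message["sign_frames"])
```

In a deployed system the frame sequence would drive the avatar renderer to produce an actual video payload for the MMS; the dictionary lookup stands in for that rendering step here.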
So we included facial expressions, because sign language is not just the movement of the hands and the body, but also facial expression. Here you see the new avatar we generated; he looks like a human, and we are really glad about this new achievement. This is the new avatar.

And finally, this is a picture of the team of the research laboratory, UTIC. Thank you.

So, thank you very much. Finally, I will say that I am really glad to say that we would be more than happy to share this product with partners from all over the world, to make the quality of life better for people with disabilities, especially for deaf people, and to break the silence of the deaf. Thank you very much.