Good morning, good afternoon, good evening, wherever you are. This is Abhinav Joshi. I'm a senior manager in the OpenShift group at Red Hat, and I have over 20 years of industry experience in a number of roles on both the customer side and the vendor side. In my current role at Red Hat, my team focuses on developing and evangelizing the value of the Red Hat OpenShift hybrid cloud platform for cloud-native workloads such as AI/ML, data analytics, databases, and programming languages such as Java, Python, and so on. Joining me today is Enant Cakilolu, Director for AI, Data Analytics, and Data Science at Turkcell. Operating within Turkey and internationally, Turkcell currently serves close to 48 million customers with a wide range of communications and digital service offerings. Over to you, Enant, to tell us how Turkcell democratizes data science and accelerates AI innovation to transform the customer experience. Hello, my name is Enant Cakilolu. As Abhinav said, I work as Director of Data Science, Artificial Intelligence, and Analytical Solutions at Turkcell. It has already been a 22-year career at Turkcell. Today I'm going to talk about Turkcell's AI use cases, what we accomplished and how we accomplished it, and about our partnership with Red Hat and what we gained from it. Talking about data science and AI, Turkcell has actually been working with data since it started its operations. We have been using analytical and predictive models for the last two decades, basically for core operations. As you can imagine, for any kind of activity around our customers we can utilize our models: new acquisition, churn, propensity, product switch, risk score calculations, and so on. We are also trying to make sense of our customer journey across the channels with data. This is the data science part, as we call it. When it comes to the industrial AI side, the story is rather new.
Three years ago we established our core AI team, which was intended to focus on industrial AI solutions. What do I mean by industrial AI solutions? We actually group our solutions into several pillars. One main pillar is computer vision. By computer vision I mean both still and streaming images, that is, videos, pictures, and photo analysis. NLP technology is another important pillar for us. Under NLP technologies we include text analytics and speech analytics, with speech-to-text and text-to-speech models. We are actually very strong in the Turkish language: we created a quite powerful NLP engine in Turkish, and we even created our own AI voice. Chatbots are also a part of NLP technologies. We created a chatbot, and even a platform for the creation of new chatbots. We have smart RPA solutions. We are active in autonomous cars and in-vehicle AI applications. And the health industry is also one of the key pillars we are focusing on. Today I'm going to talk about some concrete, tangible examples and use cases where we apply our AI technologies, and about how we utilize our platforms for those use cases. Here it comes. This is digital onboarding, one of the major use cases where we apply our AI technology. We have a self-service application called DO. It is a customer self-service application, and it enables our customers to perform any kind of subscription activity and service operation related to their life cycle and their subscriptions. We are also utilizing our self-service application to allow new customers to start their subscription without any involvement on the call center side or the dealer side. How does it work? Our AI automatically detects the ID of the customer.
The ID is captured by the camera, and the AI captures the photo on the ID as well as the related fields on the ID, like name and surname. Then we ask our customers to take a selfie on their phone, and we match this selfie with the identity photo. Using OCR we get the name and surname information, and then we ask our subscriber to confirm their subscription by saying, "Okay, I confirm the subscription." We analyze the voice, and then we confirm the confirmation of the subscription. Which kinds of AI services are we using, which AI capabilities are we applying? Computer vision, OCR, a speech-to-text model, and NLP technologies are used for that specific use case. Another lovely use case where our AI and vision technologies are applied is a special campaign. We call it in Turkish "gülümse gülümset," which means in English "smile and make others smile." In that campaign, our AI service detects your face, detects your smile, rates your smile, and based on that rating it gives you something. It can be a data package, it can be a donation for animals or for tree plantation, anything, depending on the use case. We did something special for Abhinav: we took his face photos and applied our technology. This is a nice smile, but we had a better one. This smile is worth a donation of three gigabytes, and this is the best one; our AI application rates it as five gigabytes. This is just- Make me smile more. Yeah, you should. Yeah. I will talk about it later. The essence behind this technology is that for each use case we don't develop the same code and the same scenario over and over again. This is a cloud-native application. It is running on the public cloud, and it is available anywhere in the world.
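Turkcell doesn't share implementation details here, but the selfie-to-ID matching step in onboarding flows like this is typically done by comparing face embeddings produced by a vision model. A minimal sketch of that comparison, using cosine similarity over toy vectors; the embeddings, the hypothetical `faces_match` helper, and the threshold value are all illustrative assumptions, not Turkcell's actual code:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def faces_match(id_embedding, selfie_embedding, threshold=0.8):
    """Decide whether the ID photo and the selfie show the same person.

    In a real system the embeddings would come from a trained
    face-recognition model; the threshold here is illustrative,
    not a production value.
    """
    return cosine_similarity(id_embedding, selfie_embedding) >= threshold

# Toy vectors standing in for real face embeddings:
same_person = faces_match([0.9, 0.1, 0.4], [0.88, 0.12, 0.41])
different = faces_match([0.9, 0.1, 0.4], [0.1, 0.9, 0.2])
```

The same pattern generalizes: OCR fields from the ID and the spoken confirmation would each be separate model calls feeding the same decision flow.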
We started this kind of service in Turkey; then, using OpenShift and our cloud-native platform, we opened this capability to other countries, to other subsidiaries of Turkcell. Another scenario is the lifebox service. This is a service where you can upload documents, videos, and photos to the cloud; it's quite similar to Google's or Apple's cloud storage. Turkcell has such a service, and we created a nice feature for it. This feature rates your photos and selects the ones that could get higher ratings if you upload them to the internet or Instagram. Another use case is fraud detection in online trainings and online exams. As you all know, our lives changed quite a lot during the pandemic; many of our daily activities were altered and digitized. We thought about how we could apply our AI technology to trainings and exams, and we found out like this: we apply our AI-based recognition technology and computer vision capabilities. The system detects your photo, takes your selfie, matches it with your identification, and during the training or the exam it continues to check your availability and your presence, whether you are online, connected, and doing the desirable things. If something undesirable is detected, the system creates an alarm and reports about your activity on the exam, and it also creates a detailed report about the desired and undesired moments. We are trying to apply our AI technology and AI capabilities to any kind of business in which Turkcell is active. Actually, I should say first that as Turkcell, we no longer define ourselves as a telco type of company. We rather express ourselves as a technology and digital service provider. In line with that new positioning, we have several digital services.
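The proctoring flow described above can be sketched as a simple per-frame check: each frame of the exam stream is classified upstream (is a face present, does it match the enrolled candidate), and undesired moments are collected into a report. This is an illustrative sketch, not Turkcell's implementation; the boolean frame flags are assumed to come from the computer vision models mentioned in the talk:

```python
def review_exam_session(frames):
    """Scan per-frame detection results and report undesired moments.

    Each frame is a dict with boolean flags that, in a real system,
    would be produced by face-detection and face-matching models.
    Returns (alarm_raised, list_of_flagged_frame_indices).
    """
    flagged = []
    for i, frame in enumerate(frames):
        present = frame.get("face_present", False)
        matched = frame.get("face_matched", False)
        if not (present and matched):
            flagged.append(i)
    return (len(flagged) > 0, flagged)

session = [
    {"face_present": True, "face_matched": True},
    {"face_present": False, "face_matched": False},  # candidate left the camera
    {"face_present": True, "face_matched": True},
]
alarm, moments = review_exam_session(session)
```

The flagged indices are what would feed the "detailed report about the desired and undesired moments" the talk describes.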
We have our own TV service, our own music service, and an online magazine service, and all these services contain content. We are trying to enrich our services with AI capabilities. One example is the NLP and text-to-speech capabilities. As I said at the beginning of the presentation, we created our own AI voice. It is artificial, but pretty close to a real human voice; it is not like an airport announcement. Our AI voice can vocalize any kind of written message into spoken format. We applied this technology in the digital magazine service as well as in IVR announcements, and now we have the ability to convert any kind of text into speech in the Turkish language. Another nice example is AI-based HR interviews. Turkcell is a very active company in Turkey's job market, and each year our HR department analyzes and evaluates more than 50,000 applications. That is a lot of effort, in terms of human effort and in terms of making the right analysis. In order to help them in that process, we created an AI service based on computer vision and speech technology. Our AI service analyzes the interview video shot by the candidate, and it automatically creates a profile of the applicant: the emotional status, the choice of words, like how much did you talk about technology, how much did you talk about the customer, et cetera. Based on this automatic evaluation of the interview, our AI service creates a report which helps the HR department evaluate the candidate. Next, the recommendation engine. It is actually a very early and very popular application of AI. As I said on the previous slide, digital services are a key part of our strategy and our customer value proposition, and a key part of the digital services is content. You have to get your customers to consume your digital content as much as possible.
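The "choice of words" profiling mentioned for the HR interviews, i.e. how much a candidate talks about technology versus the customer, can be illustrated with a keyword-frequency counter over the interview transcript. A minimal sketch with made-up topic lexicons; a real service would rely on the Turkish-language NLP models the talk describes, not simple keyword matching:

```python
from collections import Counter

# Illustrative topic lexicons, invented for this sketch.
TOPICS = {
    "technology": {"technology", "software", "ai", "cloud", "data"},
    "customer": {"customer", "service", "experience", "support"},
}

def profile_transcript(text):
    """Count how often each topic's keywords appear in a transcript."""
    counts = Counter()
    for word in text.lower().split():
        for topic, keywords in TOPICS.items():
            if word.strip(".,!?") in keywords:
                counts[topic] += 1
    return dict(counts)

profile = profile_transcript(
    "I love working with data and cloud technology to improve customer experience."
)
```

The resulting counts are the kind of signal that, combined with the emotion analysis, would go into the candidate report.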
Recommendation engines are a key mechanism for getting your customers to consume more content. We actually work in a couple of different ways. One popular way is to profile your taste and also profile the content, and then make suggestions based on the match between your taste and the content. Another popular approach is collaborative filtering: suggest content which profiles similar to yours already enjoy but which you haven't touched yet. Chatbots, everybody knows about these. We also have several chatbots that we are actively using. There are two key things I can say about them. One is that rather than creating a different chatbot each time, we created a chatbot platform on which you can easily create and configure your own chatbots. That is one essential point. These chatbots are basically built on NLP technology in the Turkish language, so once we created them on the OpenShift environment, you can easily build different services based on this technology. The second thing is that we are trying to make our chatbots as proactive as possible: not just to answer your questions about your topics, but also to suggest new content, a new package, or a new tariff based on your profile. Social responsibility is another area where we are using AI technology actively. We have an application for autistic children to support their learning process and help them understand human emotions through gamification. We are now finalists for the GSMA awards for that use case. The Turkcell playground is a great example of data democratization throughout the organization. Actually, the story started like this: three years ago, we at Turkcell decided to provide online trainings, in cooperation with a university, to more than 1,000 employees. We trained 1,000 employees in data science and the different fields of analytics and artificial intelligence.
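The collaborative-filtering idea described here, suggesting content that similar profiles already enjoy but you haven't touched, can be sketched with a tiny user-based example: find the most similar other user and recommend what they rated that you haven't seen. The ratings table and names are toy data; recommenders at Turkcell's scale would use matrix factorization or comparable techniques rather than this brute-force approach:

```python
import math

# Toy user -> {item: rating} table standing in for real viewing data.
ratings = {
    "alice": {"movie_a": 5, "movie_b": 4, "movie_c": 1},
    "bob":   {"movie_a": 5, "movie_b": 5, "movie_d": 4},
    "carol": {"movie_a": 1, "movie_c": 5, "movie_d": 5},
}

def similarity(u, v):
    """Cosine similarity over the items both users rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user):
    """Items the most similar other user rated that `user` hasn't seen."""
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: similarity(user, u))
    seen = set(ratings[user])
    return sorted(i for i in ratings[nearest] if i not in seen)

suggestions = recommend("alice")
```

Here alice's tastes align with bob's, so she is offered the item bob enjoyed that she hasn't watched.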
The training part is not a big deal, but the question was what would happen after all these people got their training. They needed an environment to polish their skills, to apply what they had just learned, to do data science without worrying about the infrastructure, environment, data access, data preparation, and the necessary software libraries. Therefore we created a data science playground. It's a great data democratization example throughout the organization. It's a cloud-based environment that contains all the necessary software packages and libraries any of our data scientists need, and it's also integrated with all our internal data sources. The environment is GPU-supported, and the GPU resources can be shared automatically based on the load. This is the high-level solution architecture. It runs on OpenShift 3.11, and JupyterHub and Jupyter Notebook are supported. OpenShift is responsible for running the Jupyter Notebook assigned to each user in an isolated environment, and we are mainly using Python in the Jupyter Notebooks. Active Directory is also supported for identity management. Persistent volume integration exists to retain data over a long period of time, and we also support the system with a Git plug-in to share your code and versions easily. All the resources, whether RAM, CPU, or GPU, can be allocated on a per-user basis, GPU virtualization is in place, and we moved all our analytical models into that environment. It is an open source environment, and it is now used by more than 200 employees. We have talked about some use cases; now the topic comes to the environment that enables all of them. What did we need to create such use cases, and what was our long-term focus? We had some things we wanted to do and some things we didn't want to do.
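The "GPU resources shared automatically based on the load" behavior can be illustrated with a toy proportional allocator: when demand exceeds the pool, each user's request is scaled down proportionally. This is purely an illustration of the idea; on OpenShift this is actually handled by Kubernetes resource requests/limits and the GPU device plugin, not by application code like this:

```python
def allocate_gpus(total_gpus, demands):
    """Split a GPU pool across users in proportion to their demand.

    `demands` maps user -> requested GPU count. Shares are fractional
    for illustration; real schedulers deal in whole devices or
    time-sliced/partitioned GPUs.
    """
    total_demand = sum(demands.values())
    if total_demand == 0:
        return {user: 0.0 for user in demands}
    if total_demand <= total_gpus:
        return dict(demands)  # everyone gets what they asked for
    scale = total_gpus / total_demand
    return {user: round(d * scale, 2) for user, d in demands.items()}

# 4 GPUs in the pool, 8 GPUs' worth of demand -> each request is halved.
shares = allocate_gpus(4, {"ana": 4, "ben": 2, "can": 2})
```

When the pool is large enough, requests are granted as-is; only under contention does the proportional scaling kick in.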
Starting from the things we didn't want to do: we don't want to develop and deploy all code from scratch and repeat the development, training, and deployment process each and every time. We don't want to deal with the infrastructure and environment necessities over and over again for each use case. We want to create a microservice-based service architecture which is very easy to integrate and to manage. We want to dynamically manage the software and hardware resources and capacity. We want to globally implement identity and access management. We want to integrate all our internal data sources into the system. We want our developers and data scientists not to deal with the environment, libraries, software, and hardware, but just to do their job. And we want to create AI services which are necessarily cloud native and accessible anytime, anywhere in the world. So this is the high-level structure of our solution for the AI cloud-native platform. As you can see, we utilize various technologies. I listed our reasons and needs, why we did this, on the previous slide. Actually, the primary answer to our needs was Kubernetes. We decided, we had to use Kubernetes, we said. One of our major aims was to provide our developers the same comfort they have working in public cloud environments, and since OpenShift is the Red Hat-supported version of Kubernetes, we decided to go for Red Hat OpenShift. At the bottom level, Red Hat OpenStack provides the virtualization and software-defined infrastructure. As I said on the previous slide, we use GPU-based services, and the resources can be allocated for each use case and each user. Red Hat OpenShift is positioned as the main container platform, running Kubernetes.
All our models are Python-based, and they utilize public open source libraries such as TensorFlow. Technologies like Jira and Jenkins are also in place. As I mentioned previously, it is always best to develop cloud-native platforms containing AI services which can be easily deployed, and which are flexible and scale on the cloud. With our AI platform, our goal was to gather different product families under one roof and to make them available to all our customers as software as a service. By doing this, we aimed both to share our work in the field of artificial intelligence and to create new revenue channels. We basically focused on three challenges. One, our solutions should work on GPU. Two, gather different products under one roof and share the GPU resources efficiently. And three, for all of this, our solutions should be containerized solutions on OpenShift. With these containerized solutions on OpenShift, we created reusable images and a similar structure for different services, and we achieved a quite scalable, flexible, efficient, and high-performance platform. Now I give the word to you, Abhinav. All right, thanks a lot, Enant. That's a lot of great work for social good, and those are very good use cases for the AI technology. Before I talk more about Red Hat OpenShift, I wanted to share some market data on the value of containers for improving AI workloads. The results that you see on the screen are from a recent survey conducted globally by the industry analyst firm 451 Research, with hundreds of companies that are implementing AI. What you see on the left of the screen is that more than 94% of the AI adopters indicated that they are using or plan to use containers for their AI projects within the next one year.
And the top benefits highlighted, listed on the right-hand side of the screen, were increased scalability, faster deployment time, improved performance, lower cost, and a few others that you can see on the screen. Can you go to the next slide, Enant? Red Hat OpenShift is the industry-leading container- and Kubernetes-powered hybrid cloud platform to help accelerate the AI lifecycle, be it in your data center, public cloud, or edge. It provides a lot more than the fundamental value you get with containers and Kubernetes, because of the many capabilities we have added into OpenShift. What OpenShift does is simplify the deployment, scaling, and lifecycle management of containerized AI/ML toolchains such as Jupyter Notebook, Python, and TensorFlow, or even commercial software from our partners like H2O.ai, Starburst, Presto, Seldon, and so on, by being able to automate the day-1 to day-2 operations with these tools and ensure faster time to value. Now, the integration with hardware accelerators such as NVIDIA, yeah, can you go back to the previous slide? Okay, okay. Yeah, the integration with hardware accelerators such as NVIDIA GPUs ensures that the modeling and inferencing work can seamlessly consume GPUs from OpenShift. And OpenShift, with both the self-managed and the public cloud hosted options, provides a consistent way to perform the day-1 to day-2 operations, regardless of the location where your developers and data scientists are working. They get consistency and portability of the modeling and application development workflows, be it data gathering, preparation, modeling, or deployment.
And then finally, OpenShift helps extend the value of DevOps to the entire machine learning lifecycle, enabling MLOps and the much-needed collaboration between the various teams in your AI project. This helps ensure that machine learning models can be easily integrated into the continuous integration and continuous delivery of your intelligent, AI-powered applications. Finally, OpenShift is a fully integrated platform that includes key capabilities like self-service, monitoring, automation, and the DevOps toolchain, and it's all built on open source technology. This helps drive innovation without any kind of vendor lock-in. Can you go to the next slide, Enant? Yeah, what you see on the screen is that we have many organizations across the globe, in many industry verticals, that have accelerated their AI projects from pilot to production with Red Hat OpenShift as the underlying hybrid cloud platform. You see healthcare organizations such as HCA Healthcare, which has been able to achieve data-driven diagnostics, and automotive companies such as BMW speeding up their autonomous driving initiatives. Financial organizations such as RBC and Discover Financial are speeding up the delivery of intelligent applications for various use cases. The Department of Agriculture, Food and the Marine is speeding up farmer grant applications. So there are a number of use cases, and we heard from Enant as well on the customer experience use cases. So back to you, Enant, for the closing comments and the lessons you learned that others can benefit from. Thank you. Okay, final recommendations and key takeaways. Organizations can be different, and different organizations have their own styles of doing things.
But I can suggest and recommend that, in order to have successful results with good use cases inside the organization, your artificial intelligence, data science, and infrastructure teams should be working in a collaborative environment. They should be collaborating together for a successful result. It is not the job of just one team: you can have a very powerful AI team, but without including the infrastructure teams in the environment, you cannot have successful results. The other thing I repeat again and again: okay, for some use cases it is good to create tailor-made services, but for most use cases it is best to create reusable services rather than developing one-time solutions over and over again. So whatever you build, you should build it as cloud native. You also need a containerized architecture in order to have the best scalability, the best maintenance, performance, efficiency, and service access. Platforms like OpenShift are quite successful for this, because this is the world of open source, and in an open source environment you also need good support from a successful vendor, a successful partner. Our partnership with Red Hat and the utilization of Red Hat technologies was the safe way for us, and I can suggest that you stick with such a way: get a good partner. For us it is Red Hat, and since we are talking about OpenShift, we can recommend it. It is an end-to-end integrated software package with the many benefits that we talked about in this presentation, so we can recommend it and say that you can stick with it. Thank you. All right, thanks a lot, all. Have a great rest of the day and enjoy the show. Thank you. Bye-bye.