Hey everybody, welcome to the session about Apache Kafka and Kubernetes in the telco industry. My name is Kai Waehner from Confluent and today I will talk about real-world use cases in the telco industry with cloud-native event streaming. We will see use cases for 5G, multi-access edge computing, and different business-unit use cases for OSS, BSS and OTT scenarios. Let's get started. This is a packed agenda for the next 30 minutes, so we will dive right in with the evolution of the telco industry, before we explain what data in motion is, how it works, and how it changes the thinking in the different business units of the telco industry. Well, if you're working in the telco industry, then you're probably aware that the telco industry needs to transform everything. This is not just a simple technology change, like migrating from one version to the next or from one technology to the next. It's really a complete mindset shift in how you build scenarios. This might be for technical scenarios like monitoring the network infrastructure, but also for business innovations, for providing new services to the customers. So no matter what business unit you're working in, the industry is really changing, and this is where Kubernetes and Kafka often help, and that's what I will discuss today. A key part that is changing this innovation in most telcos is 5G. This is because it's much more reliable and has much lower latency, but especially also because of network slicing, where you can provide different SLAs for specific use cases. If you have a customer in the manufacturing space, for example, they can only leverage 5G if they get guaranteed SLAs for it. And with this you can really innovate your business, provide new services, and of course increase revenue with these new services for your customers. And these use cases for 5G exist everywhere.
It really doesn't matter what industry you're working in or what partners you have. Here are just a few examples. Take the first one on the left, the stadium, where you work with sports partners to provide a great experience for the customers in the stadium. This has to be reliable networking, this has to be low latency, and this has to work even with thousands of people in the audience. And this is really not just about having internet access for your mobile browser; it's about providing new, innovative services. We will come back to this specific example later in the presentation, just to give you a concrete idea of the use cases, but it's relevant across all these industries, no matter what use case you're working on. From a technical perspective, the success criterion for 5G is that it's cloud native. This doesn't mean that everything is in a cloud; it means that everything is elastic, scalable and flexible, which is what cloud native actually means. And when you think about that, this is where Kubernetes and Kafka come into play. On the one side you need to be elastic and scalable, and that's what Kubernetes provides, no matter where you run it: at the edge, in the data center, or in the cloud. On the other side you need to act on data in real time, because in most of these new scenarios real-time data beats slow data, and that's where Kafka comes into play. Therefore Kubernetes and Kafka are the perfect combination for building a cloud-native 5G network for all the use cases you have seen on the last slide. One great example for that is Dish Network in the US. Dish has the huge benefit that they can start greenfield, building a new 5G network. And if you read the articles where they talk about this, they are really starting from scratch in a cloud-native environment.
So they heavily partner with a cloud provider, in this case AWS, and also leverage open source frameworks like Kubernetes and third-party open source frameworks, tools and products, so that they can build a cloud-native 5G network. They want to focus on the business problem; that's what they point out all the time. So they take the right serverless offerings and cloud-native technologies so that they can focus on building their 5G network. And if you're interested in this topic, then I really recommend reading the Dish articles, both on a business and on a technical level. There is so much to find on the internet about how they build their 5G network in a truly cloud-native manner, in the cloud but also at the edge where it's needed. In general, going beyond Dish now, the next-generation telco architecture has to be more flexible and open. This is what I hear from all the telco customers I work with across the globe, not just in the US but also in Europe and in Asia. A key challenge is also to be cost-efficient and standardized, so that you can build a cloud-native architecture. And it really doesn't matter if you're talking about building a new 5G infrastructure, focusing on business problems in the BSS or OTT space, or monitoring and building your infrastructure at the OSS level. For all of that, the target architecture has to be more open and cloud native. And this is actually the motivation for this talk: I want to introduce the concept of data in motion with event streaming, and how that helps with building these new innovations in the telco industry. We already heard a lot about cloud native, which is the future of the data center. I think everybody agrees that your future infrastructure needs to be more elastic and scalable. And once again, cloud native doesn't mean that everything has to run in the cloud. It can also be on premise or at the edge.
But still you need the elasticity and scalability of cloud-native frameworks like Kubernetes and containers. In a similar way, event streaming is the paradigm shift for data. Instead of storing data at rest and analyzing it too late, we can now continuously process events in motion. That's what event streaming is, and it is really a paradigm shift for building new innovative use cases. Because if you think about your use case, real-time data beats slow data in almost all cases, and that's what we will explore today with several different examples. Let's start with a high-level overview of data in motion in the telco industry, with one example to give you a first impression. Here is an OSS infrastructure example. On the left side we see all the telemetry data: from syslog, from the firewalls, from all the interfaces you connect to. Then you need to process the data to normalize it, to aggregate all the different technologies and data formats. And then you can continuously and proactively do network monitoring in real time, and also act on incidents to manage them, no matter if it's a human interaction or automated, depending on the use case. And last but not least, you also do some reporting on all the events that happened in the past. The key difference with data in motion is that most of this happens in real time, so that you can act while the data is hot and in motion. The reporting, for example, is still a batch process; this is business intelligence, and you can run a report overnight, which is totally okay. But the difference with data in motion is that the heart of the infrastructure, and that's why it's powered by Apache Kafka, is real-time, even at scale, for gigabytes per second and petabytes of data. And this is the key difference to using a data lake, where you store the data at rest.
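To make the normalization and monitoring steps just described a bit more tangible, here is a minimal sketch in plain Python. The field names, severity scale and alert threshold are illustrative assumptions, not part of any real telco schema or Kafka API; in a real pipeline this logic would run in a stream processor consuming from Kafka topics.

```python
from collections import Counter

def normalize(raw: dict) -> dict:
    """Map vendor-specific telemetry fields onto one common schema.
    The source field names here are made up for illustration."""
    if "syslog_host" in raw:  # syslog-style event
        return {"source": raw["syslog_host"], "severity": raw["level"]}
    # firewall-style event
    return {"source": raw["fw_name"], "severity": raw["sev"]}

def alert_sources(events, threshold=3):
    """Continuously count high-severity events per source and flag
    sources that cross the threshold -- the 'act while the data is
    hot' idea from the slide, reduced to a toy."""
    counts, alerts = Counter(), []
    for e in map(normalize, events):
        if e["severity"] >= 4:
            counts[e["source"]] += 1
            if counts[e["source"]] == threshold:
                alerts.append(e["source"])
    return alerts
```

Feeding a mix of syslog-shaped and firewall-shaped events into `alert_sources` raises an alert as soon as one source produces its third high-severity event, while low-severity noise is ignored.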
And in many cases you act too late on the data. With data in motion, you can act in real time while the data is interesting. And that's actually what Apache Kafka is: a platform for data in motion. Kafka is a real-time messaging platform at scale. As I said before, you can process high volumes of data, like gigabytes per second, and with low latency, like 10 milliseconds for processing the data end to end. But Kafka is much more than just a messaging layer; there have been other messaging systems in the market for many years. Kafka is an event streaming platform, so that you can also continuously process the data. The real added value does not come from sending data from A to B, like sending it into your data lake; that's okay for some reporting use cases. The actual value comes when you process and correlate the data in real time, and that's what Kafka is. It provides features for data integration, with the high-volume Syslog connector, for example, to ingest events, and on the other side features to correlate the data with stream processing, with technologies like Kafka Streams or ksqlDB. So the overall platform is much more than real-time messaging, and it's important to know that the open source framework Apache Kafka is not just messaging but much more than that. It's also a storage layer, so that you truly decouple the high volumes of sensor data from the BSS and CRM systems that are not built for such volumes. Kafka can decouple them with its storage, and you can still act in real time where needed. And when we're talking about the telco business, Kafka is actually much more than just a cloud service or an open source framework. The important thing is that you can provide the cloud-native experience everywhere. In this example here I talk about Confluent, the company I work for, powered by Apache Kafka. In the public cloud, well, you typically don't want to manage the infrastructure.
That's why you go to the cloud: to be more elastic and flexible, and to use consumption-based pricing and serverless offerings. That's why most of our customers leverage Confluent Cloud there, where you have a truly serverless offering. But on the other side, in many cases you cannot be in the cloud: if you run in your own data centers, or if you deploy closer to the edge, like in a cell tower, or even closer, like customers that run Kafka in a vehicle, for example, to do diagnostics. There you have to run edge workloads, and in many of these cases you still want the cloud-native elasticity of Kubernetes and run on that. In other cases you have more of an embedded use case, where you embed a Kafka broker into a small server or system. The point is that you can leverage Apache Kafka everywhere in a cloud-native way, automated, with decoupled microservices and containers. And that's what we provide everywhere. Of course, this here is the Confluent solution, but you can do the same with open source Apache Kafka. One key point I did not focus on too much yet is that Kafka really is not just a real-time messaging layer at scale. It's also a storage layer, and with that you truly decouple the different systems. This is super important in the telco industry, where you have a lot of legacy and monolithic technologies that are proprietary, but on the other side also open standards that you use for future products, like in the cloud. With Kafka you can really connect all these different systems to each other, even though they leverage different technologies and different communication paradigms. Kafka does true decoupling. It handles slow consumers. It does the preprocessing when sensors continuously produce high volumes of data, but another consumer can only act on the processed data, because it's not built for high volumes, or maybe not built for real-time but for batch or request-response.
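That true-decoupling idea can be illustrated with a toy append-only log in Python. This is not the real Kafka protocol or client API, just the core concept: every consumer tracks its own offset into a shared, durable log, so a slow batch consumer never holds back a fast real-time one, and any consumer can rewind and reprocess history.

```python
class Log:
    """Toy Kafka-style partition: an append-only list of records."""
    def __init__(self):
        self.records = []

    def append(self, record):
        self.records.append(record)

class Consumer:
    """Each consumer keeps its own offset into the shared log,
    independent of every other consumer."""
    def __init__(self, log):
        self.log, self.offset = log, 0

    def poll(self, max_records=10):
        # Read at most max_records from the current offset onward.
        batch = self.log.records[self.offset:self.offset + max_records]
        self.offset += len(batch)
        return batch

    def seek(self, offset):
        # Rewind (or fast-forward) to reprocess historical data.
        self.offset = offset
```

A fast consumer can drain everything in one poll while a slow one takes records one at a time, and `seek(0)` replays the full history -- the storage-backed decoupling the slide describes.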
With Kafka in the middle you have the true decoupling for connecting all these different systems. And the difference to a data lake or a database is that the heart of the infrastructure is still real-time, scalable and reliable. That is what makes Kafka so unique compared to these other systems. And if we map this idea to the telco business, that's what you see here. Instead of having a monolithic and proprietary architecture, you can have a more elastic and flexible microservice architecture. This is true for all the OSS and BSS components, where you can still buy products where it makes sense, of course, which is then a more monolithic component, but you can connect it to all the rest. This is true for the software and applications on top, which can even be digital services you buy as software-as-a-service in the cloud. But on the other side this is also true for network functions virtualization, so for the core infrastructure of the telco networks, like the 5G we talked about before. And this is really where Kafka helps: by truly decoupling the different systems and providing the real-time, scalable infrastructure that's needed at the heart of this. And in the telco business, well, we see use cases for event streaming with Kafka everywhere. Here you see a few examples. This can be at the infrastructure level, like proactive monitoring of the networks. This can also be customer-facing, like building innovative new services for the contact center. And it can even be automated data processing with natural language processing and machine learning, like chatbots or even speech translation. So sometimes it's more on a technical level, about the integration layer; sometimes it's about building innovative new business applications or modernizing existing use cases. So in the telco industry, no matter what business unit you're working in,
Kafka, especially on Kubernetes in an elastic way, can help you everywhere. Let's now talk a little bit more about the architecture before I dive into the use cases. As I said before, you can run Kafka everywhere. This can be at the edge, like in a cell tower, or even embedded into small hardware like a Raspberry Pi, maybe for development, but then also into OEM hardware boxes. In this case, for example, we're partnering with Hivecell; those are the yellow boxes here that you can deploy everywhere and then run your workloads at the edge, even in a disconnected way. So that's the edge deployment of Kafka, even as a single broker if you don't need high availability. In most cases in the telco industry you have more of a hybrid architecture, where you run some workloads at the edge and some in a data center or in the cloud. Kafka, and especially Confluent, is not just about running Kafka, but also about hybrid integration in real time at scale, leveraging the Kafka protocol, even for bidirectional replication of data between different Kafka clusters, on-prem, at the edge and in the cloud. And as you see here, you can still integrate with proprietary telco systems, like Amdocs or Ericsson solutions. With that you're totally free in how you define your architecture, no matter if you're operating in just one country or deploying globally or across a continent, and no matter if you're running in the cloud only or at the edge, in a connected or disconnected way. It's totally up to you and depends on your SLAs. There are many different architectures, more than can be covered in a 30-minute talk. But you have to understand that Kafka can be used for many different use cases, including analytical scenarios that are not that mission-critical, but also for transactional data.
That's where you really talk about zero downtime and zero data loss, like you did in the mainframe scenarios 10 or 20 years ago. Here's one specific telco example we've built, where we worked together with AWS and its Wavelength product team to deploy a solution with AWS Wavelength, which is based on AWS Outposts. In this case we are also working with the telco providers; this example was built together with Verizon in the US. We deploy Confluent at the edge in a 5G Wavelength zone, for low-latency scenarios where you need to process high volumes of data in milliseconds, reliably, leveraging 5G, but then also integrate with Confluent Cloud, where we're integrating and operating the rest of the IT infrastructure, like the data lake and machine learning. So this is just one example to show you how you can leverage Kafka everywhere in hybrid scenarios, for MEC use cases. And to give you a specific example, this is a hybrid retail architecture, exactly what I talked about on the last slide. Here we see a lot of IT scenarios in a normal cloud, where we leverage Confluent Cloud as a fully managed Kafka service, in AWS in this example, and we're connecting to a CRM system like Salesforce and other business applications. At the bottom you see that we also deploy Kafka at the edge; in this case we need to process the data for location-based services in the retail store. And if we go deeper into the retail store, in one scenario this could be an AWS Wavelength zone, where we connect to the retail store over 5G. But many of our customers in the retail space say: no, we don't have connectivity over 5G yet, and our network does not work well in general, so we need a disconnected service that can do edge computing without a connection to the internet, like for the point-of-sale integration.
That's really transactional data, but it's also for diagnostics of the cameras, for analytics data. And therefore in this case, staying in the AWS scenarios, we deploy on AWS Outposts, really running at the edge in the retail store, to do edge computing with low latency but also reliably. And then, when there is internet connectivity, like overnight when nobody is using the wifi, you can replicate the transactional data into the cloud for further analytics and aggregate it across different retail stores. A huge benefit here is that you really can deploy Kafka even in disconnected scenarios. In this retail case it's more about customer experience and transactions; in other cases it's really about air-gapped environments and cybersecurity, like in manufacturing, for example, or in the shipping industry, where we have customers that deploy Kafka on every ship for diagnostics and edge analytics, for cybersecurity scenarios, and also for other scenarios like predictive maintenance. The key benefit is that Kafka is also storage, so that, where it's needed or possible, you can process data in real time in memory, but it's also stored on disk, so that you can correlate historical data with new information. And that's the unique difference of Kafka compared to other messaging systems: it's built not just for real time, but also for reprocessing historical data. Now I want to walk you through a few scenarios quickly. As you see here: OSS, BSS and OTT, so no matter what business unit you're working in in the telco space. In OSS, a great example for Kafka is Open Source MANO, a standard open source framework built for network functions virtualization management and orchestration, which many vendors are leveraging. The heart of this infrastructure for the orchestration has to be real-time, scalable and reliable, and this is why the heart of this framework
is using Kafka, for exactly the reasons I explained earlier in this talk. It's open, it's scalable, it's interoperable, and you can integrate with both open standards and proprietary technologies in your infrastructure. That's why this is a perfect example, no matter if you're talking about building a new 5G network or integrating more traditional legacy infrastructure. From an end-user perspective, a perfect example for using Kafka in the OSS space is NTT Communications out of Asia. They had the challenge of not being able to process high volumes of data in real time at scale, and that's why they made Apache Kafka the heart of their infrastructure, for things like capacity planning and traffic engineering. With this, also from a business perspective, they could optimize resources and detect revenue leaks. So there are plenty of different scenarios, and of course cybersecurity is important too: you need to correlate the data in real time to provide situational awareness and threat intelligence, and that's where Kafka is the heart of such an infrastructure at NTT. In this case, they are combining it with Apache Druid, a time-series database, for real-time analytics. That's also important to understand: Kafka is of course not solving every problem. You can use it for data integration and data processing in real time, but then you complement it with other real-time analytics tools, or you ingest the data into a data lake or data warehouse. In the BSS business, talking more about the customer perspective, you also have plenty of examples where Apache Kafka is leveraged. Compax is a well-known BSS and CRM solution, and if you take a look at their website, they explain how they built it. No surprise: it's powered by a cloud-native Kafka backbone. This means that they
under the hood provide a microservices-based BSS product. It's flexible, it's elastic, it's open, and with that you really can use these modular components as an end user like you need them, and scale them like you need them, because under the hood there is a reliable, scalable, real-time infrastructure. This is a great example, from the telco product perspective, of how companies use Kafka under the hood to build these telco services. And here's a more end-user-facing customer 360 application: 8x8. They are leveraging Confluent Cloud to build a cloud-based voice and contact center application. They wanted to leverage a cloud-native service so that they can process all the data in real time at scale, and that's why they are leveraging Confluent to build a multi-tenant software-as-a-service offering for their contact center solution. It's also very important in the telco space, when you talk about customer data, that this is GDPR-compliant, and that privacy and cybersecurity are covered. All of that requires that the data can be processed reliably, at scale, and in real time. Some of the use cases 8x8 has built are things like a great customer experience, proactive alerts, and prediction of staffing needs. And if you think about this scenario once again: real-time data beats slow data in most of the use cases, and that's why 8x8 is leveraging Kafka, in this case powered by Confluent, for this scenario. And once again, the benefit of using Kafka is that it's not just a real-time system; it's also a storage system. All of the events are appended to the log, and then you can consume the data again and again, be it with real-time systems, batch systems, or request-response web services, for example from a mobile app. Like in this omnichannel example: it doesn't matter if the customer is using different devices and interfaces.
And it doesn't matter if some of these events happened many months ago. You can still correlate the data, like in this example, where some data was sent in a newsletter from the CRM system, and many days later the customer configured, in this case, a car that they want to buy. Then you can provide real-time, location-based services, like when the customer is entering a dealership, so that the salesperson directly sees the correlated information about the historical data of this customer, in real time and context-specific, when you need it to make the right decision: selling a car to the customer in this example. In addition to all of the real-time correlations, other data teams can also use the data, and a lot of that is often not real time, like reporting and business intelligence, or the data science team that does machine learning with the data. Of course, in a privacy-compliant way, so that you only get access to the data you are allowed to access from a legal and GDPR perspective. But the point is: data that is created once can be processed by many different consumers, in real time, at scale. The OTT business, where we see many telcos going more and more, is also where real-time data beats slow data, and where our telco customers build services on top. Intrado, in this example, has built tech-enabled real-time communication solutions. The story is the same as for many other scenarios: instead of using a monolithic and proprietary middleware for the integration, they leverage event streaming, powered by Kafka and Confluent, to provide personalized experiences, from market intelligence and social sensing. As you see in this case, they also integrate with other systems, of course, like ServiceNow as a cloud service, but they also leverage Kafka Streams and ksqlDB to do continuous stream processing of the data with Kafka-native technologies.
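The kind of continuous stream processing that Kafka Streams or ksqlDB performs can be approximated in a few lines of plain Python. This sketch implements a tumbling-window count of events per key; the event shape, key names and one-minute window size are assumptions chosen for illustration, not part of any Kafka API.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms=60_000):
    """Count events per (key, window); events are (timestamp_ms, key) pairs.
    Roughly equivalent in spirit to a ksqlDB query like:
      SELECT key, COUNT(*) FROM stream
      WINDOW TUMBLING (SIZE 1 MINUTE) GROUP BY key
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Each event falls into exactly one non-overlapping window,
        # identified by the window's start timestamp.
        window_start = ts - (ts % window_ms)
        counts[(key, window_start)] += 1
    return dict(counts)
```

In a real deployment this aggregation runs continuously as events arrive and emits updated results downstream; here it is condensed into a single pass over a finite list to show the windowing logic itself.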
Another great example is Hotstar. Disney+ Hotstar is providing an OTT service for millions of cricket fans. In this case it's not just about the video stream, that's not Kafka, but about providing additional services, like for gaming and betting. Disney+ Hotstar talked a lot at past Kafka Summits about their architecture, where they heavily rely on Kafka and Kafka Connect. Last but not least, let me conclude the session with a case study for infrastructure modernization. That's super important in the telco industry, like when you want to migrate away from a mainframe, for example. The point is that it's a step-by-step approach, right? You cannot shut down everything from the beginning, so you do it step by step. In this example, we have our legacy application, like a customer service system from the 80s. First we integrate it with other applications via Kafka. It's still decoupled, and it can leverage different technologies. Then, over time, we can build new cloud-native services, with Kafka-native technologies or with other services, but all of them can also connect to the legacy systems. And then, over time, we can even replace the legacy system, for example with a new customer service system from 2020, maybe even as a software-as-a-service offering. This is the approach most of our customers take. It's not greenfield; it's step by step on brownfield. And that's exactly why customers come to us. The rise of data in motion started many years ago, when Kafka was open sourced. Then, when Confluent was founded, it made Kafka enterprise-ready. And now almost every bigger company, and this is also true for the telco space across the globe, is leveraging Kafka, and most of them also Confluent, because we help with the projects. It's not just about having a great technology and product; it's about the services and support, for starting small.
But then rolling out the mission-critical workloads in hybrid and global scenarios, for mission-critical transactional and for analytical use cases. And that's, in one slide, what Confluent is doing. We are not just building a car engine; that's the open source framework Apache Kafka, where we do over 80% of the commits, and you can use it by yourself for free. With Confluent we provide a complete car that's safe and secure, that provides the operations and monitoring tools, including a Confluent Operator for Kubernetes to run at the edge and in hybrid scenarios. And in the cloud we also provide the self-driving car, level five, so that you get a truly serverless, consumption-based offering with mission-critical SLAs across all major public clouds. And with that, I hope this was a good overview in 30 minutes about Kafka and Kubernetes in the telco industry, and how you can leverage this to innovate your business, no matter if you're talking about infrastructure at the OSS level, or about BSS and OTT customer services. Thanks a lot for watching, and feel free to reach out and connect with me on LinkedIn and Twitter. Goodbye.