Good morning! I bet if I asked each of you what 5G means to you that I would get almost as many different answers as there are people in this room. And many of you would think of 5G as something that's way off in the future. But from an infrastructure perspective, 5G is actually happening right now and, if anything, I believe we need to go even faster. I don't have the opportunity to ask each one of you what you think about 5G, so we conducted a not-so-scientific survey of the average person on the street and asked them what they thought about 5G, what 5G means to them, and this is what they said. I'm Sarah Sweeney and we're here in New York's Grand Central Station asking real people what they know about 5G and some of the innovations it will bring. Let's go! Can I talk to you for a second about 5G? Sure. Cool, tell me about it. I have no idea what it is. Can you tell me what 5G stands for? 5 grand. What's 5G? So 5G as in... The good thing about 5G is it puts a lot into a small package. 5G, it's, um, I know that, like, it has to do with Wi-Fi, right? Is that the 5G that they do for the phone or the network or something like that? There he is. I say 5G, you say... Internet. No idea what it is. No, the next thing past 4G. Do you know what the G stands for? I don't. Generation. Yes, you are the first guy today. Good job. Thank you. If you have 4G, that's the fourth generation. So 5G is... The fifth generation. The next generation. It's generation, it's the fifth generation. Oh my God. Technology. Oh my God, that's why we're in the fourth generation. It is the new fifth generation wireless communications technology that has new spectrum approved by the FCC in the 28, 37 and 39 gigahertz bands. Yeah. You heard me ask somebody else and you looked it up on Wikipedia, didn't you? Yeah. Have you ever heard about Intel? Let me spit about it. It's 5G technology and here's what's legit about it. I believe it's fast, correct? Reliable? If it would become real.
If it would become real, I would be really happy. 5G. Internet. You got it. Guys, let's wrap it. We got this one. So 5G equals 5 grand. That's my personal favorite, because I think that's the way my mother would have answered that question. So when we look out into the future, we're talking about 5G because we see a future that is all about data. Of course, a lot of that data will be generated by humans: we believe that in the next five years over 70% of the human population will have or use a smartphone, and over 90% of everyone over the age of six will have some kind of mobile device. But beyond people, the next data revolution is really about the things. In fact, by 2020 we believe that over half of all the data generated on the network will come from machines. To put that into context, a user today, you and I, generates on average about one and a half gigabytes of data per day. But that pales in comparison to what a connected car will generate. A connected vehicle will generate 2,500 times as much data, up to four terabytes a day. A connected plane, five terabytes a day, and a smart factory up to one petabyte of data per day. And one petabyte, for any of you keeping track, is a thousand terabytes, or one quadrillion bytes. To handle all of that data, we need a fundamentally different approach to how networks are architected and deployed, and we believe 5G is an answer to that problem. Looking backwards now, we see that over time there have been innovations that have been recognized as general purpose technologies. These are innovations and inventions like the printing press, the steam engine, the telegraph and the internet.
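As a back-of-the-envelope check, the data volumes quoted above can be put side by side in a few lines of Python. The figures are the talk's own estimates, not measurements:

```python
# Daily data volumes quoted in the talk (estimates, not measurements).
GB = 10**9
TB = 10**12
PB = 10**15

user_per_day = 1.5 * GB      # average smartphone user
car_per_day = 4 * TB         # connected vehicle
plane_per_day = 5 * TB       # connected plane
factory_per_day = 1 * PB     # smart factory

# Ratio of a connected car's daily output to a single user's
car_vs_user = car_per_day / user_per_day
print(f"car vs. user: {car_vs_user:,.0f}x")   # ~2,667x; the talk rounds to "2,500 times"

# "one petabyte ... is a thousand terabytes or one quadrillion bytes"
print(f"1 PB = {PB // TB} TB = {PB:,} bytes")
```

Note that 4 TB against 1.5 GB actually works out to roughly 2,700 times, slightly above the quoted "2,500 times", so the speaker's figure is, if anything, conservative.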
All of these are in an elite class of socioeconomic mainstream technologies known as general purpose technologies, because they have been pervasively deployed across many, many industries and they have become catalysts for transformative changes that raise the level of human productivity and improve all of our lives. In fact, we believe that 5G has the capacity to become one of these general purpose technologies. If we look at 5G and all of the different use cases that will be enabled by the increased capacity and coverage and distributed intelligence across the entire spectrum of the infrastructure, we know some of these today and some are yet to be invented, but they can be categorized into three different areas. The first is massive machine type communications, and these are indicative of the types of technologies and capabilities that are enabling the Internet of Things: sensor data in the smart appliances in your home, smart lighting and smart infrastructure in a smart city, and of course all the infrastructure in a smart factory. They don't require large amounts of bandwidth and they are not sensitive to latency, but there are many, many millions of these sensors that need to be aggregated so that you can collect the data and turn it into valuable insights and information. A second use case is enhanced mobile broadband. In fact, many of the trials that we're participating in today fall under this category of capability. The 5G Technology Forum with Verizon is focused on not just the immersive experiences you're trying to enable in the home, but that last mile, that last few hundred yards or few hundred meters of connectivity. That's what the innovations in the new wireless technology will enable for enhanced mobile broadband. And the third category is ultra-reliable, low-latency types of use cases.
And one of the best examples of this particular use case is the connected car, where you need a very fast response time for functions like collision avoidance, overtaking a vehicle or lane changing. We know that with today's 4G network, the round-trip response time between the connected car and the network typically takes about 15 milliseconds. But in order to deliver on the collision avoidance promise that we have with connected cars, you really need to take that down to 3 milliseconds. And with 5G, that response time is shortened down to just one millisecond with 99.99% reliability. So if we look at what is fundamentally different about the network infrastructure needed to support these very broad and varying use cases, we know that we need a different approach to the network architecture. We need a network infrastructure that is much more intelligent, not just in the core, but in the edge and in the access portions of the network, with distributed, intelligent, programmable and scalable capabilities throughout that spectrum of infrastructure. We also know that when we introduce programmable computing in parts of the network where it never existed before, you are able to harness a lot of the data that you collect and turn it into valuable information that you can monetize, creating new services that are available both upstream to the content providers and downstream to the consumers of that data, whether they're people or things. And of course, open source and open standards represent a critical capability for accelerating that distributed intelligent network architecture, because open source and open standards lower the barriers to entry and accelerate the pace of innovation. They allow many more new use cases and capabilities, and overall more choice and lower cost in the market.
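To make the latency numbers concrete, here is a small worked example of how far a vehicle travels during one network round trip. The latency figures are the ones quoted above; the highway speed of 120 km/h is an assumption for illustration:

```python
# How far a vehicle moves before a network response arrives, at an assumed
# highway speed of 120 km/h. Latencies are the talk's figures.
speed_kmh = 120
speed_mps = speed_kmh * 1000 / 3600   # ~33.3 metres per second

for label, latency_s in [("4G (~15 ms)", 0.015),
                         ("target (3 ms)", 0.003),
                         ("5G (1 ms)", 0.001)]:
    travelled_cm = speed_mps * latency_s * 100
    print(f"{label}: vehicle travels ~{travelled_cm:.0f} cm")
```

At that speed, the car covers about half a metre in 15 ms but only a few centimetres in 1 ms, which is why the shorter response time matters for collision avoidance.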
So let me step through a few of the use cases that are being enabled today by this intelligent infrastructure and that will be enhanced by 5G. We'll start with drones in the access network, and I love drones. Drones are fun, but they're also quite useful. Today, they're used for photography and surveying. They're used for inspections, and they're even used to create an on-the-fly cellular transmission capability. So imagine a natural disaster where your infrastructure is wiped out. You can actually recreate that infrastructure, that wireless access capability, through a drone network. In fact, this is being trialed today by AT&T in a technology called Cell on Wings, or COW. So there is a flying COW in this environment. But if we look beyond wireless tower inspections and this recreation of a cellular infrastructure, we can look at some of the examples that are happening in the enterprise. We've been working with Airbus. Airbus performs a number of inspections on their aircraft throughout the year. Typically, this is the job of two people hoisted on cherry pickers who inspect the airplane end to end, and that process takes about two hours. With a drone, that process takes only 15 minutes. But you have added capabilities with a drone, because you can begin to collect data on those inspections, build a database of flaws and faults and diagnoses of problems with airplanes, and compare what you see with the 360-degree cameras on the drone against the database stored back in the cloud. That combination of local value creation and local data collection, the inference capability provided by the drone locally and the scoring capability that happens in the cloud, is an example of where artificial intelligence and machine learning are being used to improve a capability that today takes humans to do.
In fact, one of the other ways this helps the industry is that Airbus has found this to be such a great improvement in efficiency and data collection that they're now also providing it as a service to other aircraft companies. So they see an opportunity to turn valuable data into services and monetization opportunities. When we look at 5G intelligence in the edge of the network, we look at retail as another example. Retailers today have a lot of challenges competing with, or complementing, the online retail experience. But one of the biggest problems they have is something called inventory distortion. This is the ability to accurately assess what their inventory position is and to avoid stockouts or overstocks, or something they call shrinkage, which is really just a nice word for theft. This inventory distortion problem represents over $1 trillion of negative impact for retailers globally. And in the U.S. alone, that shrinkage, that theft problem, represents about $42 billion of negative impact, because you can't trace or track what is actually happening with your inventory. We know that with 5G technology, and with sensor data and RFID and NFC communications, you can improve the accuracy of your inventory to virtually 100%. And with every 3% improvement in your knowledge of your true inventory position, you actually increase revenue by 1%. Today, the best that most retailers can achieve is somewhere between 65 and 70% accuracy. But again, with 5G technology, we see that increasing to almost 100% accurate information about an inventory position. And then you extend that beyond just the inventory to the actual experience that you're trying to enable for the user coming into the store: immersive experiences where you can try on different outfits and see how they look on you virtually in a mirror.
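The rule of thumb above (every 3-point gain in inventory accuracy adds roughly 1% of revenue) can be turned into a quick calculation. The helper function below is purely illustrative, not a model any retailer actually uses:

```python
# Rough revenue uplift implied by the talk's rule of thumb:
# every 3 percentage points of inventory-accuracy gain adds ~1% revenue.
def revenue_uplift_pct(accuracy_today, accuracy_new, pct_per_3_points=1.0):
    """Return the implied revenue increase in percent (illustrative only)."""
    points_gained = (accuracy_new - accuracy_today) * 100
    return points_gained / 3 * pct_per_3_points

# Going from ~65% accuracy today to ~100%: about 35 points gained
print(f"{revenue_uplift_pct(0.65, 1.00):.1f}%")   # ~11.7% more revenue
```

So closing the gap from today's 65-70% accuracy to near-100% would, by this rule of thumb, be worth on the order of a tenth of a retailer's revenue.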
You can also track the behavior of the consumer, suggest complementary items to the things they've selected, and collect data to try to ensure that you have the right products to satisfy your users in that store at any point in time. Moving to the core of the network and greater intelligence there, events are a great example of a use case that we're enabling through technology and that 5G enhances. Today at events and concerts, we know that the distribution of the data and the load on the network is very spiky. If you look at something like the Super Bowl, the peak traffic for content actually isn't during touchdowns. It is during the halftime show, when you see people taking selfies and photos and videos and uploading the Beyonce halftime show or the Lady Gaga halftime show. What that represents to the network provider is, of course, a great load on their network. One of the ways of overcoming that is by bringing more intelligence from the core to the edge of the network, extending that capability and building a Cloud RAN capability that allows you to cache content locally and satisfy the user's experience without the backhaul costs and latency that you would normally incur. With 5G, that capacity and coverage increases anywhere from 100 to 5,000 times, and it does so in a way that is much more cost effective. Again, bringing that intelligence from the core to the edge to the access network, and having a consistent architecture by which you deliver these new experiences to all of the users at exactly the same time. Finally, you can't talk about next generation networks and network requirements without talking about video. Video is the workload that every single network operator has to contend with. It represents over 80% of all data traffic at peak times, over 75% of mobile traffic, and over 60% of the traffic generated by the Internet of Things.
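The edge-caching idea behind Cloud RAN can be sketched in a few lines. This is a conceptual toy, not actual Cloud RAN or CDN code; the class and content names are invented for illustration. Popular content is served locally, and only a cache miss goes over the backhaul to the core:

```python
from collections import OrderedDict

# Toy edge cache: serve popular content locally, fetch over backhaul on a miss.
class EdgeCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.backhaul_fetches = 0

    def get(self, content_id):
        if content_id in self.store:
            self.store.move_to_end(content_id)   # hit: mark as recently used
            return self.store[content_id]
        self.backhaul_fetches += 1               # miss: fetch from the core
        data = f"<bytes of {content_id}>"
        self.store[content_id] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict least recently used
        return data

# Spiky event traffic: thousands of users at the venue request the same clip
cache = EdgeCache(capacity=10)
for _ in range(10_000):
    cache.get("halftime-show-replay")
print(cache.backhaul_fetches)   # 1: one core fetch serves everyone locally
```

The point of the sketch is the ratio: ten thousand local requests cost a single backhaul transfer, which is exactly the load-flattening effect described above.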
And increasingly that video experience is becoming more immersive through augmented reality and virtual reality, or even merged reality, the idea that you would place a physical object in a virtual world or a virtual object in your physical world. This requires a lot of low-latency processing and large amounts of bandwidth. You see 3D experiences with VR glasses, or experiences enabled by Facebook 360 or Google Street View. All of these require much higher bandwidth and much lower latency to deliver that immersive experience to the end user. So Intel's view is that 5G truly is an end-to-end opportunity, where you have distributed, intelligent, programmable networking from the device through the access to the core and up into the cloud. And it's a circular cycle of goodness in terms of the content creation, the ingestion into the network, the processing in the cloud, and then back out to deliver that experience to the users. In that stadium example, we know that, for instance in the Olympics that we're looking at for 2018 and 2020, opportunities exist to actually create a virtual ticket to an event and have a courtside view of the actual race or competition or match when you are half a world away. All of that requires innovations that 5G is enabling. So when you look at all of those varying use cases and all of the different capabilities that we need to deliver from a network infrastructure, we know that we need an infrastructure that is highly composable, one that can create, on the fly and in real time, an experience that is tailored to that end user or that end use case without over-provisioning the network resources, or under-provisioning and then impacting the quality of experience.
And those requirements will vary greatly across just the examples I gave: the high bandwidth and low latency you need in some cases, the low energy and low bandwidth for the Internet of Things, and then the ultra-high bandwidth examples of immersive video and of course connected cars. All of that capability is delivered with a logical slice served up by the network that is highly tailored and highly tuned to that particular use case. And of course, what we need here is not just resource orchestration, but services orchestration to manage all of that capability. And here to share a little bit more of their insights on the network of the future and how 5G meets some of those requirements is Chris Rice, Senior Vice President of AT&T Labs. Please welcome Chris. Hi, Chris. Thanks for coming. So I've talked a lot about 5G and IoT, and I wanted to ask: how does AT&T see 5G and next generation networks impacting your capabilities and what you're offering to your customers? So think of it as we're creating a VIP network. VIP. VIP. Virtual, intelligent, and programmable. Not exclusive to VIPs. Not exclusive to VIPs, but virtual, intelligent, and programmable. And so for us, you know, we have at the core now tens of millions of mobile subscribers on a virtual evolved packet core. That's already happening now. At the edge, we're seeing these low latency use cases that you talked about driving, you know, more compute and more intelligence at the edge, which again you just discussed. And both at the edge and in the core, you know, what's really key is open interfaces, because those open interfaces allow us to get data, allow us to do analytics, and drive an unprecedented amount of automation into our network going forward. So open interfaces. Yes. Open source. You made some big announcements recently. Right. ONAP. Tell us a little bit more about what you think ONAP will deliver, not just to AT&T, but really to the industry at large. And I promised you I'd be brief on this.
So, you know, ONAP is a Linux Foundation project. It's a combination, a harmonization, of what we were doing in open source ECOMP and OPEN-O. Today, the service providers that are members of ONAP account for about 38% of worldwide mobile subscribers, so about 1.8 billion. There's more on the way. In fact, we have 27 members now, and we had five new members just this week. Let's see: Wind River, Ciena, Reliance Jio, Microsoft, and we even had the ONF. Pretty broad. The Open Networking Foundation joined, right? Yeah. I mean, so that's a very diverse group. Very diverse group. Right. And I think the Open Networking Foundation is really interesting in the sense that, and we talked about this on your panel the other day, it's kind of driving a directed innovation. And so I think that harmonization that we wanted to do is paying off, because, you know, just like iOS and Android became de facto operating systems for the mobile phone, we'd like to see ONAP become the de facto network operating system for SDN automation. And we think that's a really good harbinger of that. And an invitation to everyone to participate in that. An invitation to join, yes. So let me ask you finally, then: when you look at that re-architecture of the network, what role does data analytics play, either as a way to lower costs or perhaps as a vehicle for monetization of new services? Yeah. So I think if you think about analytics, you've got to go two steps back, because analytics really requires data, and data requires open interfaces. And so I've said that a lot, but I'm going to say it again for emphasis: the real key for us in analytics and data is open interfaces. And that's open interfaces at the edge and at the core. And really, to be honest with you, what we're trying to do is get more and more capability to collect, analyze, and act on data. And once you can do that, you can learn from it. That's where the machine learning comes in.
And once you can learn from it, that drives literally a whole new level of automation, something we call hyper automation. So it totally changes the way we run and manage our networks. And so are 5G and the foundations of 5G happening now at AT&T? And is that really part of the journey that you're on in terms of the network transformation that you're driving? Yeah. I like the way you described it. When people buy that first phone and it's a 5G phone, they kind of feel like that's the first day. But there was a lot of work going on before that. Just like when a baby arrives, there was a lot of work right before that. So we're excited about that. And yeah, we're well on our way. And we certainly thank you and others for the help in making that happen. The 5G baby. I'll have to think about that. There you go. Take care. Thank you very much. So from a management and orchestration perspective, clearly we have a lot of work to do, but we are well underway with ONAP and with other open source networking projects. In fact, two that I wanted to call out here today are DPDK and OPNFV. DPDK, the Data Plane Development Kit, is a set of optimized libraries and device drivers that Intel invented back in 2010 to provide high-performance packet processing functions on general purpose CPUs: functions like queue and buffer management, quality of service and flow classification, all of which you can run on the general purpose CPU that is probably already running the application and control plane functions on your server platform. Intel invented it in 2010. It was actually one of the key enabling technologies included in the original ETSI NFV white paper in 2012. We open sourced it through dpdk.org in 2013, and over the years we have added multi-architecture and multi-vendor support.
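The defining pattern in DPDK is polling the NIC and pulling packets in batches rather than taking a per-packet interrupt. DPDK itself is a C library, so the toy Python loop below is only a sketch of that poll-mode, batch-dequeue pattern; none of these names are DPDK's real API (the analogous C call is `rte_eth_rx_burst`):

```python
from collections import deque

RX_BURST = 32   # packets pulled per poll, analogous to an rx-burst size

def rx_burst(ring, max_pkts=RX_BURST):
    """Dequeue up to max_pkts packets without blocking (poll, don't interrupt)."""
    batch = []
    while ring and len(batch) < max_pkts:
        batch.append(ring.popleft())
    return batch

# Pretend NIC receive ring with 100 queued packets
ring = deque(f"pkt-{i}" for i in range(100))

processed = 0
while ring:
    for pkt in rx_burst(ring):       # classify / forward each packet here
        processed += 1
print(processed)   # 100
```

Amortizing the per-call overhead across a burst of 32 packets, instead of paying an interrupt per packet, is a large part of why this style of processing is fast on general purpose CPUs.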
And earlier this week, on Monday, Jim Zemlin announced that DPDK will now be hosted by the Linux Foundation as a way to broaden the community of developers contributing to this innovation. So we're very excited about DPDK now being hosted by the Linux Foundation. Another big announcement that we made just yesterday is with OPNFV, the Open Platform for NFV. OPNFV is an open source project that brings together all of the elements that you need to deliver and accelerate the commercialization of NFV solutions in the market. We are on our fourth release. We're two and a half years old, and this was our fourth major release. We work with the upstream communities like OpenStack, OpenDaylight and Open vSwitch, and then of course we work within the project to harden a lot of the capabilities, to automate functions of NFV onboarding, to include capabilities in this latest release for a faster data path through the integration of the FD.io stack, and also some key features that we've introduced like service function chaining and support for IPv6. So OPNFV is a catalyst to accelerate the network transformation of which, of course, NFV and SDN are a critical part. So in summary: 5G is happening now. This idea of an intelligent, scalable, programmable network from the core to the access to the edge is indeed the hallmark of the work we're driving in network transformation. And when you have that programmable computing capability in all areas of your infrastructure, you are able to provide data analytics and turn so much of that data into valuable services that you can monetize with your upstream and downstream customers. Open source and open standards, and all of you, are a critical element and accelerator of that 5G vision. On behalf of Intel, I invite all of you to join with us to accelerate the 5G future and deliver on the promise of 5G. Thank you. And I think we have time for a few questions, yes.
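Service function chaining, one of the OPNFV features mentioned above, is simply the idea of steering a packet through an ordered list of virtual network functions. The minimal sketch below illustrates the concept only; the function names and the toy firewall policy are invented for illustration, not taken from any OPNFV code:

```python
# Toy service function chain: a packet passes through an ordered list of VNFs.
def firewall(pkt):
    if pkt.get("port") == 23:            # illustrative policy: drop telnet
        return None
    return pkt

def nat(pkt):
    return dict(pkt, src="203.0.113.1")  # rewrite the source address

def apply_chain(pkt, chain):
    for vnf in chain:
        pkt = vnf(pkt)
        if pkt is None:                  # a function in the chain dropped it
            return None
    return pkt

chain = [firewall, nat]
print(apply_chain({"src": "10.0.0.5", "port": 80}, chain))
print(apply_chain({"src": "10.0.0.5", "port": 23}, chain))   # None: dropped
```

In a real NFV deployment the orchestrator composes such chains per service (or per network slice), which is exactly the composability argument made earlier in the talk.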
So if you have a few questions, I think there are mics in the audience. There's one right there. Good morning. Hi, good morning. Cliff Grossner, IHS Markit. Just one question. I'm wondering if you have any thoughts about hardware accelerators for machine learning and AI, and do these find their way into this network? Yes, that's an excellent question. Hardware acceleration is a way to perform very, very fast deterministic functions in a highly cost-effective, low-power way. And it can take any number of forms. Within a CPU or a chipset, in a lot of the technology that we provide to the market, we have crypto and compression capabilities for hardware acceleration that we integrate into the standard platform, with the common tool chain and all of the capabilities that the ecosystem and the developers have to take advantage of that. But beyond that, we know that ASICs actually provide a lot of excellent functionality, and Intel made an investment in an ASIC company that specializes in artificial intelligence, Nervana, and we are integrating that now into our tool chain so that you have that same interface whether it's running natively in the CPU through hardware accelerator blocks there or in an ASIC. And then finally, FPGAs are another way that we can provide hardware acceleration. FPGAs in particular are very valuable in the wireless network infrastructure because so many of the standards and so many of those networking stacks, particularly in the case of 5G, are evolving over time. So the FPGA gives you that programmability. But the key here, and certainly Intel's strategy, is to provide a common programming interface so that the developers themselves don't necessarily have to care how that hardware acceleration is delivered, whether it's in the CPU, in the FPGA, or in the ASIC. It's a common interface, and it allows them to develop solutions much more quickly. Hello, Sandra. Don Clarke. Hello, Don.
Just a bit of information, really: there is another joint operator white paper on 5G, published in February, which outlines what the operators consider to be the priorities for NFV to support 5G. That includes things like network slicing and cloud-native design. So that's available, and in fact I'm talking about it tomorrow morning. And the second thing, about open standards collaboration: we've just finished an ETSI NFV joint session with 3GPP SA2, which is about trying to align the 3GPP network architecture to consume the NFV work that we've been developing over the last four years. Just a bit of information. I know this is mainly an open source software group, so of course I also welcome the open source people participating in that collaboration as well. Well, actually, it's a valuable comment, Don, because we see an increasing interdependency between open source and open standards, one helping the other to really go faster. So thank you. Okay, I think we're out of time. Thank you so much. Have a good day.