from theCUBE Studios in Palo Alto and Boston, it's theCUBE, covering IBM Think, brought to you by IBM.

Welcome back, everybody. This is Dave Vellante of theCUBE, and you're watching our continuous coverage of the IBM Think Digital 2020 Experience, and we're really pleased to have Rob High here. He's not only an IBM Fellow, but he's the vice president and CTO of the IBM Edge Computing Initiative. Rob, thanks so much for coming on theCUBE. Good to see you. Wish we were face-to-face, but...

Yeah, thanks, I appreciate that. It's time to be safe and healthy, I guess.

Indeed. So edge, obviously a hot topic. Everybody has a sort of point of view. I'd be interested in how IBM looks at edge, how you define it, and what your thoughts are on its evolution.

Yeah, well, you know, there are really two fairly distinct ways of thinking about the edge. The telcos, you know, are creating edge capabilities in their own network facilities. We call that the network edge. The other side of the edge, the one that I think matters a lot to our enterprise businesses, is those remote on-premises locations where they actually perform the work that they do, where the majority of the people are, where the data that actually gets created is first formed, and where the actions that they need to operate on are being taken. There is a lot of interest there, because if we can move IT workloads to where that data is being created and where those actions are being taken, not only can we dramatically reduce the latency of those decisions, we can also ensure continuous operations in the presence of, perhaps, network failures. We can manage the growth of increasing demand for network bandwidth as more and more data gets created, and we can optimize the efficiency of both the business operations as well as the IT operations in support of that. So for us, edge computing at the end of the day is about moving workloads to where the data is being created and where the actions are being taken.
Well, so this work from home, as a result of this pandemic, is kind of creating new stresses on networks, and people are pouring money into beefing up that infrastructure as sort of an extension of what we used to think about as edge. But I wonder if you could talk about some of the industries and the use cases that you guys are seeing, notwithstanding, as you say, the work-from-home pivot.

Yeah, absolutely. So, I mean, look, we have seen the need for placing workloads close to where data is being created and where actions are being taken in virtually every industry. The one that is probably easiest for us to think about, and more common in terms of our mindset, is manufacturing. Think about all of the things that go on on a factory floor: the need to be able to perform analytics on the equipment and on the processes that are performed on the manufacturing floor. Think, for example, of production quality. If you've got a machine that's putting out parts, and maybe it's welding seams on metal boxes, you want to be able to look at the quality of that seam at the moment it's being formed, so that if there are any problems, you can remediate them immediately, rather than having that box move on down the line and finding that the quality issues created earlier on have now been exacerbated in other ways.
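The in-line quality check described here can be sketched in a few lines. Everything in this sketch is an illustrative assumption: a real system would score camera frames with a trained vision model at the edge rather than compare raw width readings against a fixed tolerance.

```python
# Hypothetical sketch of an in-line weld-seam quality check at the edge.
# The tolerance band and function names are assumptions for illustration.

SEAM_WIDTH_MM = (1.8, 2.4)  # assumed acceptable seam width range

def inspect_seam(width_readings_mm):
    """Flag a part the moment any reading falls outside tolerance,
    so it can be remediated before the box moves down the line."""
    lo, hi = SEAM_WIDTH_MM
    defects = [w for w in width_readings_mm if not (lo <= w <= hi)]
    return {"pass": not defects, "defect_readings": defects}

result = inspect_seam([2.0, 2.1, 2.9, 2.0])
if not result["pass"]:
    # remediation happens here, at the station, with no round trip to the cloud
    print("halt station: seam out of tolerance", result["defect_readings"])
```

The point of running this at the edge, per the discussion above, is that the decision is made in milliseconds at the station instead of after a round trip to a central data center.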
So production quality inspection, production optimization. And in our world of COVID-19, with worker safety, getting workers back to work, and ensuring that people are wearing masks and exercising social distancing on the factory floor, worker insight is another major use case that we're seeing surface suddenly, with a lot of interest in using, whether that's Bluetooth beacons or infrared cameras, any variety of devices that can be employed in the work area to help ensure that factories are operating efficiently and that workers are safe. And whether that's in a factory situation, or even in an office situation, or in a warehouse or distribution center, in all these scenarios the utility that edge computing brings to those use cases is tremendous.

Now, a lot of these devices are unattended or infrequently attended. I always use the windmill example. You don't want to have to do a truck roll to figure out what dynamics are going on at the windmill, so you instrument it. But what about the management of those devices from an autonomous standpoint? Are you doing anything in the autonomous management space?

Yeah, in fact, that's really kind of key here, because when you think about the scale, the diversity, and the dynamism of equipment in these environments, and, as we point out, Dave, the lack of IT resources and IT skills on the factory floor, or even in the retail store or a hotel or distribution center, in any of these environments the situation is very similar: you can't simply manage getting the right workloads to the right place at the right time with the traditional approaches.
You have to really think about an autonomous approach to management, letting the system decide for you what software needs to be placed out there, which software to put where, and, if it's an analytic algorithm, what models to associate with that software, and getting it all to the right place at the right time. That is a key part of what we do in this thing we call IBM Edge Application Manager. And it's that product that we're really bringing to market right now in the context of edge computing that facilitates this idea of autonomous management.

Yeah, I wonder if you could comment, Rob, on the approach you're taking with regard to providing products and services. I mean, we've seen a lot of situations where people are essentially packaging traditional compute and storage devices and sort of throwing them over the fence at the edge and saying, hey, here's our edge computing solution. Now, I'm not saying there's not a place for that. Maybe that will help flatten the network and provide a gateway for storing and maybe processing information. But it seems to us that a bottoms-up approach is going to be more appropriate. In other words, you've got engineers who really understand operations technology, and maybe a new breed of developers emerging. How do you see the evolution of products and services and architectures at the edge?

Yeah, so first of all, let me say IBM is taking a really pretty broad approach to edge computing. We have what I just described as IBM Edge Application Manager, which is, if you will, the platform or the infrastructure on which we can manage the deployment of workloads out to the edge. But then add to that, we have a whole variety of edge-enabled applications being created. Our global services practices and our AI Applications business are all creating variations of their products specifically to address and exploit edge computing and to bring that advantage to the business.
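The autonomous management idea described here is, at its core, a matching problem: nodes advertise properties, workloads declare constraints, and the system, not an operator, decides placement. Below is a minimal sketch of that idea, loosely modeled on the constraint-and-property approach behind IBM Edge Application Manager and Open Horizon; the field names and data shapes are illustrative assumptions, not the product's actual API.

```python
# Minimal sketch of policy-based autonomous workload placement.
# Node names, properties, and constraint keys are hypothetical.

def matches(node_properties, constraints):
    """A workload is placed on a node only if every one of its
    constraints is satisfied by the node's advertised properties."""
    return all(node_properties.get(k) == v for k, v in constraints.items())

nodes = {
    "camera-7":  {"arch": "arm64", "gpu": False, "site": "factory-a"},
    "gateway-2": {"arch": "amd64", "gpu": True,  "site": "factory-a"},
}
workload = {
    "name": "seam-inspector",
    "constraints": {"gpu": True, "site": "factory-a"},
}

# The system decides where the workload runs; no operator picks nodes by hand.
placed_on = [n for n, props in nodes.items()
             if matches(props, workload["constraints"])]
print(placed_on)
```

At the scale Rob describes, with thousands of unattended devices per site, this kind of declarative matching is what replaces hand-managed deployment.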
And of course, then we also have global services consulting, which is a set of skilled resources who not only understand the transformations that businesses need to go through when they want to take advantage of edge computing, and how to think about that in the context of both their journey to the cloud and, now, the edge, but also how to go about implementing and delivering that, and then further managing it.

Now, couple that with the fact that at the end of the day you're also going to need the equipment, the devices, whether that is an intelligent automobile or other vehicle, whether that is an intelligent robot or intelligent camera, or, if those things are not intelligent but you want to bring intelligence to them, how you augment them with servers and other forms of cluster computing that reside co-resident with the device. All of those are going to require participation from a very broad ecosystem. So we've been working with partners, whether those are vendors who create hardware, enabling and certifying that hardware to work with our management infrastructure, or whether those are people who bring higher-order services to the table that provide support for, let's say, data caching and facilitate the creation of applications, or whether those are device manufacturers that are embedding compute in their equipment. All of that is part of our partnership ecosystem.

And then finally, I need to emphasize that the world we operate in is so vast and so large. There are so many edge devices in the marketplace, that's grown so rapidly, and there are so many participants in it. Likewise, there are a lot of other contributors to this ecosystem that we call edge computing. And so for all of those reasons, we have grounded IBM Edge Application Manager on open source. We created an open source project called Open Horizon, and we've been developing that now for about four and a half years.
Just recently, the Linux Foundation accepted Open Horizon as a stage one project in LF Edge, the Linux Foundation Edge umbrella that also hosts projects like EdgeX Foundry. And so we think this is key to building out an ecosystem of partners who want to both contribute and consume value, and to creating ecosystems around this common idea of how we manage the edge.

Yeah, I'm glad you brought up the ecosystem. I mean, it's too big for any one company to go it alone. But I want to tap your brain on architectures. There are so many diverse use cases. We don't necessarily see one uber-architecture emerging, but there are some characteristics that we think are important at the edge. You mentioned real-time or near real-time; in many cases it has to be real-time. You think about autonomous vehicles. A lot of the data today is analog, and maybe it doesn't have to be digitized, but much of it will be. It's not all going to be sent back to the cloud, and it may not all have to be persisted. So we've envisioned purpose-built architectures for certain use cases that can support real-time, that maybe have Arm-based processors or other alternative processors that can do real-time analytics at the edge while sending portions of the data back. How do you see the architectures evolving, from a technologist's perspective?

Well, certainly one of the things that we see at the edge is a tremendous premium being placed on things like energy consumption. So architectures that are able to operate efficiently with less power certainly have an advantage over the other architectures being brought forward. Clearly, x86 is a dominant architecture in any information technology endeavor. More specifically at the edge, we're seeing the emergence of a lot of Arm-based architecture chips out there. In fact, I would guess that the majority of edge devices today are now being created with Arm architectures.
But some of this is about the underlying architecture of the compute, and also the augmentation of that compute, the CPUs, with other architectures and other types of processing units. Those might be GPUs; of course, we're seeing a number of GPUs being created that are designed to consume little power and have a tremendous amount of utility at the edge. There are also alternate processing unit architectures that have been designed specifically for AI and model-based analytics, things like TPUs and NPUs, et cetera, which are very purpose-built for certain kinds of analytics. And we think that those are starting to surface and become increasingly important. And then on the flip side of this are the memory, storage, and network architectures, which aren't exotically different, but at least in terms of capacity have quite a bit of variability. Specifically, though, 5G is emerging. And while 5G is not necessarily the same thing as edge computing, there is a lot of symbiosis between edge and 5G, and the kinds of use cases that 5G envisions are very similar to those we've been talking about in the edge world as well.

Rob, I want to ask you about this notion of programmability at the edge. I mean, we've seen the success of infrastructure as code. How do you see programmability occurring at the edge in terms of fostering innovation, and maybe new developer models, or maybe existing developer models, at the edge?

Yeah, we've found a lot of utility in leveraging what we now think of as cloud computing models. The idea of containerization extends itself very easily to the edge, whether that is running a container in a Docker runtime, let's say on an edge device, which is resource-constrained and purpose-built and needs to focus on a very small footprint, or on edge clusters, edge servers, where we might be running a cluster of containers using our Kubernetes platform, OpenShift.
And then of course there are the practices of continuous integration and continuous delivery, what we might otherwise think of as DevOps. And of course the benefits that containerization brings: the idea of component architectures, the idea of loose coupling, the separation of concerns, the ability to mix and match different service implementations to compose your application. These are all ideas that were matured in the cloud world but have a lot of utility in the edge world. Now, we actually call it edge-native programming, but you can think of that as being mostly cloud-native programming, with the further caveat that there are certain things you have to be aware of when you're building for the edge. You have to recognize that resources are limited. Unlike the cloud, where we have this notion of infinite resources, you don't have that at the edge; you have very confined and constrained resources. You also have to worry about latencies, and the fact that there is a network separating the different services, and that network can be unreliable. It can introduce its own forms of latency, and it may be bandwidth-constrained. Those are issues that you now have to factor into your thinking as you build out the logic of your application components. But I think by building on the cloud-native development paradigm, you know, we get to exercise all the skills that have been developing and maturing in the cloud world, now for the edge.

Yeah, it makes sense. My last question is around security. I mean, I've often sort of tongue-in-cheek said, you know, building a moat around the castle doesn't work anymore. The queen, i.e. the data, has left the castle. She's everywhere. So what about the security model? I mean, I feel like the edge is moving so fast. Do you feel confident, or what gives you confidence, that we can secure the edge?
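One of the edge-native concerns named here, the unreliable, possibly bandwidth-constrained network between services, can be sketched concretely. This is a minimal illustration, not any IBM API: the function names, payload, and retry parameters are all assumptions, and a production system would persist the local buffer and cap its size.

```python
# Sketch of edge-native handling of an unreliable network link:
# timeouts/retries with exponential backoff, falling back to local
# buffering so the device keeps operating through an outage.
import time

def send_with_retry(send_fn, payload, retries=3, backoff_s=0.01):
    """Try to deliver a reading; on repeated failure, return None so the
    caller can buffer locally instead of blocking the control loop."""
    for attempt in range(retries):
        try:
            return send_fn(payload)
        except ConnectionError:
            time.sleep(backoff_s * (2 ** attempt))  # exponential backoff
    return None  # caller queues payload until the link recovers

calls = {"n": 0}
def flaky_send(payload):
    # Simulated link that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("link down")
    return "ack"

result = send_with_retry(flaky_send, {"temp_c": 41.5})
print(result)
```

In cloud-native code this pattern is an optimization; in edge-native code, as the discussion above notes, it is a baseline requirement, because continuous operation during network failure is one of the reasons the workload was moved to the edge in the first place.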
Yeah, the edge does introduce some very interesting and challenging concerns with respect to security, because, frankly, the compute is out there in the wild. You know, you've got computers in the store. You've got people walking around the kiosks. At the manufacturing site, you've got workers in the midst of all of this compute capability. And so the attack surfaces are substantially bigger. And that's been a big focus for us: how to ensure not only that we validate the integrity of the software that gets placed out there, but also that we take advantage of one of the key characteristics that edge computing brings to the table. If you think about it, when you've got personal and private information being entered into, quote, the system, the more often you move that personal and private data around, and certainly the more you move it to a central location and aggregate it with other data, the more of a target it becomes, and the more vulnerable and exposed that data becomes. By using edge computing, which moves the workload out to the edge where that data is being created, in some sense you can process it right there, and then you don't have to move it back to any central location, and you don't have to aggregate it. And that in itself is a counterbalance to all the other security issues we described: essentially not moving the personal and private data, and protecting it by keeping it exactly where it began.

You know, Rob, this is an exciting topic. It's a huge opportunity for IBM. Ginni and Arvind talk about the trillion-dollar opportunity in hybrid cloud. The edge is a multi-trillion-dollar opportunity for IBM, so you've just got to go get it done. But I really appreciate you coming on theCUBE and sharing your insights. Awesome topic. And best of luck to you.

I appreciate it, David. Yeah, thank you. Thank you for the opportunity to talk about this stuff. And stay safe.

And thank you for watching, everybody.
This is Dave Vellante for theCUBE. This is our coverage of IBM Think 2020, the digital experience. We'll be right back after this short break.