Hi, I'm Peter Burris. Welcome to Action Item. Every week I bring the Wikibon research team together to talk about some of the issues that are most important in the computing industry, and this week is no different. This week I'm joined by four esteemed Wikibon analysts: David Floyer, Neil Raden, Jim Kobielus, and Ralph Finos. What we're going to do is talk for a few minutes about some of the predictions that we did not get into our recent predictions webinar. So I'd like to start off with Jim Kobielus. Jim, one of the things that we didn't get a chance to talk about yesterday in the overall predictions webinar was some of the new AI frameworks that are on the horizon for developers. So let's take a look at it. What's the prediction?

The prediction for 2018, Peter, is that the AI community will converge on an open framework for developing, training, and deploying deep learning and machine learning applications. In fact, in 2017 we've seen strong momentum in this direction. If you were at AWS re:Invent just a few weeks ago, you'll have noticed that on the main stage they discussed what they're doing in terms of catalyzing an open API for building AI, an open model interchange format, and an open model compilation framework. And they're not the only vendors behind this. Microsoft has been working with AWS, as well as independently and with other partners, to catalyze various aspects of this open framework. We also see Intel and Google and IBM and others marching behind a variety of specifications such as Gluon, Keras, ONNX, NNVM, and so forth. So we expect continued progress along these lines in 2018, and we expect that other AI solution providers, as well as users and developers, will increasingly converge on this: basically an abstraction framework that will make it irrelevant whether you build your model in TensorFlow or MXNet or whatever, because you'd be able to compile it and run it on anybody else's back end.
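To make the interchange idea concrete, here is a deliberately tiny sketch of the pattern Jim describes: a model captured once as a framework-neutral graph that any back end can then execute or compile. Real interchange formats like ONNX and NNVM are vastly richer than this; the graph structure, op names, and both "backends" below are invented purely for illustration.

```python
# Toy sketch of an open model-interchange format: the model is an ordered,
# framework-neutral list of (op, constant) steps, and two independent
# "backends" both consume it. Real formats (ONNX, NNVM) are far richer.

model_graph = [
    ("mul", 2.0),    # y = x * 2
    ("add", 1.0),    # y = y + 1
    ("relu", None),  # y = max(y, 0)
]

def backend_interpret(graph, x):
    """Backend A: directly interpret the interchange graph."""
    for op, c in graph:
        if op == "mul":
            x = x * c
        elif op == "add":
            x = x + c
        elif op == "relu":
            x = max(x, 0.0)
    return x

def backend_compile(graph):
    """Backend B: 'compile' the same graph into a callable ahead of time."""
    steps = []
    for op, c in graph:
        if op == "mul":
            steps.append(lambda v, c=c: v * c)
        elif op == "add":
            steps.append(lambda v, c=c: v + c)
        elif op == "relu":
            steps.append(lambda v: max(v, 0.0))
    def run(x):
        for step in steps:
            x = step(x)
        return x
    return run

# The point of the abstraction: both backends agree on the same model.
assert backend_interpret(model_graph, 3.0) == backend_compile(model_graph)(3.0) == 7.0
```

The value of the convergence Jim predicts is exactly this decoupling: the team that authors the model and the vendor that runs it no longer need to share a framework, only the interchange format.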
So Jim, one question, and then we'll move on to Neil really quickly. One question that I have is that the relationship between tool choice and role in the organization has always been pretty tight; roles change as a consequence of the availability of tools. Now, we talked in some of the other predictions about how the data scientist role is going to change. As we think about some of these open AI development frameworks, how are they going to accommodate the different people that are going to be responsible for building and creating business value out of AI and data?

Peter, you hit on another level that I didn't raise in my recent predictions document, but I'll just quickly touch on it. We're also seeing the development of open DevOps environments within which teams of collaborators, data scientists, subject matter experts, data engineers, and so forth, will be able to build, model, train, and deploy deep learning within a standard workflow, where each one of them has task-oriented tools to enable their piece, but they all share common governance around the models, the data, and so forth. In fact, we published a report several months ago on Wikibon talking about DevOps for data science, and this is a huge research focus for us going forward, and really for the industry as a whole. It's the productionization of AI, in terms of building and deploying the most critical and most innovative applications now in business.

Great, Jim, thanks very much for that. So Neil, I want to turn to you now. One of the challenges that the big data and computing industry faces overall is this: how much longer are we going to be able to utilize the technologies that have taken us through the first 50 years at the hardware level? And there's some promise in some new approaches to thinking about computing. What's your prediction?
Well, in 2018 you're going to see a demonstration of an actual quantum computer chip that's built on top of existing silicon technology and fabrication. Now, this is a really big deal, because what this group at the University of New South Wales came up with was a way to layer traditional transistors and silicon on top of those wacky quantum bits to control them. I don't want to get too technical about it, but the point is that quantum computing has the promise of moving computing light years ahead of where we are now. We've managed to build lots of great software on bits that are either on or off, and quantum computing is much more than that. What you're going to see in 2018 is a demonstration of actual quantum computing chips built on this, and the big deal is that we can take these existing machines and factories and capital equipment, designed for silicon, and start to produce quantum chips without basically developing a whole new industry. Now, why is this important? It's only the first step, because these things are not going to be based on the existing Intel x86 instruction set. All new software will have to be developed, and software engineers are going to have to learn a whole new way of doing things. But the possibilities are endless. If you think about drug discovery, curing disease, dealing with the climate, or new forms of energy to propel us into space, that's where quantum computing is likely to take us.

Yeah, quantum computing, just to put a fine point on it, allows the machine at any given time to be in multiple different states. And it's that fact that allows, in many respects, a problem to be attacked from a large number of directions at the same time, with each of them then tested out. So it has a natural affinity with some of the things that we think about in AI. It's going to have an enormous impact over the course of the next few years, and it's going to be interesting to see how this plays out.
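The "multiple states at once" point can be illustrated with a few lines of plain state-vector simulation. This is ordinary classical arithmetic mimicking one qubit, not real quantum hardware: a qubit's state is a pair of complex amplitudes for the basis states |0> and |1>, and a Hadamard gate turns a definite |0> into an equal superposition of both.

```python
import math

# Minimal one-qubit state-vector sketch. A state is a pair of amplitudes
# (a, b) for |0> and |1>; measurement probabilities are |a|^2 and |b|^2.
# This is a classical simulation for illustration only.

def hadamard(state):
    """Apply a Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

ket0 = (1.0, 0.0)        # a classical-like, definite |0> state
sup = hadamard(ket0)     # equal superposition: both outcomes present at once

# Measuring the superposed qubit gives 0 or 1 with 50/50 probability.
p0, p1 = abs(sup[0]) ** 2, abs(sup[1]) ** 2
assert abs(p0 - 0.5) < 1e-9 and abs(p1 - 0.5) < 1e-9

# A second Hadamard interferes the amplitudes back to exactly |0>; this kind
# of constructive/destructive interference is what quantum algorithms exploit
# to "test many directions at once" and amplify the right answer.
back = hadamard(sup)
assert abs(back[0] - 1.0) < 1e-9 and abs(back[1]) < 1e-9
```

With n qubits the state vector holds 2^n amplitudes evolving together, which is where the "attack a problem from many directions at the same time" intuition comes from, and why simulating large quantum machines classically becomes intractable.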
So David Floyer, I now want to turn to you. We're not likely to see quantum computing at the edge anytime soon, by virtue of some of the technology constraints we face; more likely there'll be specialized processors up in a cloud service provider in the near term. But what are you going to talk about when we think about the role that the edge is going to play in the industry, and the impact it's going to have on, quite frankly, the evolution of de facto standards?

Well, I'd like to focus on the economics of edge devices, and my prediction is that the economics of consumer-led volume will dominate the design of IoT devices at the edge. An IoT device is made up of sensors, advanced analytics and AI, specifically designed compute elements, and the physical setup for fitting it into wherever you're going to put it; together, that is the overall device that will be placed at the edge. And that's where all of the data is going to be generated. Obviously, if you generate data somewhere, the most efficient way of processing that data is at the edge itself, so you don't have to transport huge amounts of data. So the prediction is that new vendors, with deep knowledge of the technology itself, using all the tools that Jim was talking about, and with deep knowledge of the end-user environments and the specific solutions that they're going to offer, will come out with much lower-cost solutions than traditional vendors. To put a little bit of color around it, let's take a real-world example where this is already in place in the consumer world and will be the basis of solutions in the enterprise. The iPhone X has facial recognition built in on Apple's A11 Bionic chip; it has GPUs and neural network hardware all on the chip itself, and the total cost of that solution is around $100 in terms of the piece parts, and that includes the software.
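As a quick back-of-envelope on the numbers David cites: the roughly $100 piece-parts cost comes from the discussion, while the roughly 3x consumer markup and the order-of-magnitude multipliers below are illustrative assumptions used only to show the shape of the comparison, not measured figures.

```python
# Back-of-envelope sketch of consumer-led edge economics.
# Only the ~$100 parts cost is from the discussion; the multipliers
# below are illustrative assumptions, not data.

consumer_parts_cost = 100                   # iPhone X-class AI hardware, piece parts
consumer_price = consumer_parts_cost * 3    # assumed consumer-volume markup: ~$300

# "At least an order of magnitude, probably two" more expensive:
traditional_vendor_price = consumer_price * 10   # illustrative: ~$3,000
in_house_build_cost = consumer_price * 100       # illustrative: ~$30,000

assert consumer_price == 300
assert traditional_vendor_price / consumer_price == 10
assert in_house_build_cost / consumer_price == 100
```

Even with generous error bars on the assumed multipliers, the gap is wide enough that the sourcing decision, buy consumer-derived silicon versus build bespoke, is dominated by volume rather than by engineering preference.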
If we take that $100 in piece parts and translate it into what the device would actually be priced at, that's around $300. That's a much, much lower cost than a traditional IT vendor could ever achieve, and at least an order of magnitude, probably two orders of magnitude, cheaper than an IT department could produce for its own use. So that leads to the conclusion that there are going to be a lot of new vendors: people like Sony, for example, Hitachi, Fujitsu, Honeywell, possibly people like Apple and Microsoft and NVIDIA and Samsung, and many companies that we predict are going to come out of India, China, and Russia, which have strong mathematical education programs. So the action item for CIOs is to look carefully at the projects you're considering and determine: do I really have the volume to be unique in this area? If it's a problem which is going to be industry-wide, the advice we would give is to wait for that device to come out from a specialized vendor rather than develop it yourself, and to focus investment on areas where you have both the volume of devices and the volume of data that will allow you to be successful.

All right, David, thank you very much. So let me wrap this week's Action Item, which has been kind of a bridge: we've looked specifically at some of the predictions that didn't make it into our recent predictions webinar. If I want to try to summarize, or to bring all these things together, I think what we'd say is, number one, that the development community has to prepare itself for some pretty significant changes as a consequence of having an application development environment that's more probabilistic, driven by data and by AI and related technologies. We think that new frameworks will be deployed in 2018, and this is where it's going to start; it will mature over the next few years, as we heard from Jim Kobielus.
We've also heard that there's going to be a new computing architecture that's going to drive change, perhaps for the next 50 years. The whole concept of quantum computing is very, very real, and it's going to have significant implications. Now, it will take some time to roll out, but again, software developers have to think about the implications of some of these new architectures on their work, because not only are they going to have to deal with technology approaches that are driven by data, they're also going to have to look at entirely new ways of framing problems, because computing will be about something different than it is today. The next thing we need to think about is that the economics of computing are still going to ultimately shape how all of this plays out. David Floyer talked specifically about the edge, where Wikibon believes the economics will have an enormous impact on the true cost of computing and on how well some of these complex problems actually find their way into commercial and other domains. So against the background of those three things, consider this an addendum to the predictions that we've made. Once again, I'm Peter Burris. Thank you very much for joining us for Action Item, and we look forward to working with you more closely over the course of the next year, 2018, as we envision the new changes and the practice of how to make those changes a reality. From theCUBE Studios in Palo Alto, this has been Action Item.