Please, a warm welcome for Harry Forbes. Welcome, Harry. Thank you so much, Steve. Thank you very much, Steve. I'm honored to be here today. I'll try to keep my presentation right on time and make it worthy of your time and attention. I was very honored when the Open Group invited me to speak at this event, so immediately I said, yes, I'd be happy to come and speak. And they said, good, you can speak right after Peter Martin. And speaking after Peter Martin is kind of a challenge, so please bear with me as I try to rise to that challenge. It's even more of a challenge in that the message I'm bringing today is a little bit difficult. As a wise rabbi once said, no prophet is without honor except in his own town. Well, I'm in my own town in a way, in that I'm with people I've known for decades in this industry, including, I should say, Peter Martin. Furthermore, I would not claim the title of prophet, but the message I have today might be a little bit difficult. And that message is: the end is near. This is always an unpopular message, but we always hear it. There's no need to argue the point, I think, in this group, that the end is near for the traditional process automation supplier business model. Peter just mentioned that in the question-and-answer session. Automation hardware and software at the control level, that is, at ISA-95 levels 1 and 2, has not changed fundamentally since the introduction of PLCs and DCSs in the 60s and 70s. And again, this is something that Peter just mentioned. Then as today, the industrial automation market revolves around bundled hardware and software, where each supplier bundles the two much like the minicomputer market of the 1970s, where suppliers like Digital Equipment, Data General, Prime, HP, et cetera, each came to market with their own proprietary set of application software and partners.
The formation of the Open Process Automation Forum and its support by much of the process automation industry indicates an awareness already that this highly vertically integrated business model is running out of time. The end is also near for the traditional market for embedded systems and embedded system software development. We're all tired of foolish talk about the so-called smart refrigerator, but the fact is that networked embedded systems, especially industrial embedded systems, need to change the way they are developed and managed through the life cycle. At present, much of the software is proprietary. Firmware and software updates are very difficult, although they're very important for security reasons. Not surprisingly, therefore, security is a chronic problem, especially in the consumer segments. Applications are inflexible, tool chains are fragmented, development is slow, hardware-software integration is perilous, and worse, there's a relatively small base of developers who are experienced in these things. Other than that, it's great. Now, the point is that embedded systems are very, very critical in the industrial internet of things and in industrial automation. Well, it's one thing to say that the end is near; it's another thing to say what's coming. And that's the prophetic aspect, and I certainly do not claim a prophetic vision. Rather, let me just point out some observations and state how I would interpret them and what I see for both embedded systems and for the industrial automation market. But I'll stick with my conclusion: the end is near. I had a discussion recently with a C-level executive who's not a member of this forum, and this is the reason he gave. He said, our customers do not want to live in a plant where they have to manage their application software across hundreds or thousands of devices from many different suppliers.
He expanded that it was difficult enough to manage a plant's automation across a small number of controllers made by only a couple of vendors, and managing an application across a much greater supplier community seemed to be a bridge too far, too difficult. And yet, not only the industrial automation industry but several other industries are creating reference architectures or even specific solutions for highly distributed automation and functionality, and these have many things in common. For instance, the automotive industry is in the midst of developing a reference architecture that will encompass on-car safety, autonomy, remote services, infotainment, and convenience. And part of that architecture is the definition of a specific set of Linux software known as Automotive Grade Linux. In telecommunications, another absolutely massive industry, companies are developing models for digitizing both the cell tower equipment and the central office. These are called network function virtualization, or NFV, and CORD, the Central Office Re-architected as a Datacenter. The huge challenges of digitalization in industries like automotive, telecommunications, and industrial automation will require management of remotely deployed software on a large scale and over a long period of time, decades. And this will require breakthroughs, not incremental improvements but absolute breakthroughs, in the way industrial software is developed, deployed, and maintained on all types of hardware, including embedded devices. The happy news is, I'm here to say, I think these breakthroughs are already in existence or in sight and can be adapted quickly. The capability to deploy and manage widely distributed applications in real time is available now in the form of some of today's open source tools for container management and orchestration.
I'm going to make the case for why it would be wise for the Open Process Automation Forum to take a look at these technologies and give them extremely serious consideration when they choose and develop their specifications. Historically, we've identified three different classes of software development, which had some overlap but each required unique forms of development and tooling: enterprise software, which ran the business; embedded software, which ran the stuff, or the things; and most recently cloud software, which runs on third-party resources. My bet is that over the next five years or so, cloud software development will come to dominate all these forms of software development, and they will more or less converge. Furthermore, the convergence will take place not at the glacial pace that we're accustomed to in some of our industries, especially industrial automation, but at an open source kind of speed. The first phase is the development of so-called cloud-native software, and the second phase will be the scaling down of this to encompass embedded systems, which traditionally are much smaller. We can see this already in the industrial automation market, and I'll get to that in a few minutes, where a lot of products coming into the market now include open source software for application management. Finally, an emerging cloud technology called unikernels combines the discipline of embedded systems with cloud software technology. This area is receiving venture investment, and the targeted markets are not only the enterprise but also the industrial internet of things. And finally, cloud computing is such a big industry, about a $250 billion industry, with massive amounts of R&D funding and huge companies supporting it, such as Amazon, Microsoft, Google, and the likes of Alibaba.
These giant firms are competing fiercely for a huge and growing business, so I expect very rapid development there. So software is going to have a food chain, and this is what the food chain is going to look like. If software is eating the world, the software that's eating the software development world is cloud software, and even our insular and specialized world of embedded software and industrial embedded software, which has been kind of a haven, is going to be overtaken and swallowed by current and future cloud software technologies. Now, please note, I'm not saying that all applications will run in the cloud, but I am saying that software development is going to be totally driven by the kinds of technologies that are used for the cloud. Why? Let me give two reasons. First, as I mentioned, it's a big, big industry with a lot of big companies working to improve their services and compete for a very rapidly growing market. The second reason is the important one for the internet of things and for industrial automation: it can scale down. This kind of software development technology scales down to very small systems, and I'll provide some examples. Now, I'll just make a remark here that this is an interesting assembly of people. It's IT people and OT people. I'm an OT person, so if I'm talking about IT, I don't know what I'm talking about, but if I'm talking about OT, I've got very deep domain knowledge. What I would suggest is that during coffee breaks, maybe some of the OT and IT folks could talk to each other, even though they're not our kind. I think some of these things would be helpful to share among IT and OT people. Maybe you can ask the IT or OT people: what's your organization doing now? What are your major initiatives? What are your challenges? Because this segment is really for the OT people, I'll be talking about IT.
So I'm sort of obsessed with software container technology. What's its significance? I can see why these are important for the cloud, people tell me, but why are they important for us in things like embedded systems and industrial automation? So let's back up and review the history of containers. Cynical analysts will say, oh, there's nothing new here, this has been around for 20 years, it's the same old thing. No, it isn't. Container technology started when companies like Sun Microsystems were building large canoe-anchor servers, and from the standpoint of administration of applications, it was easier to partition these machines into smaller machines and run things together in a smaller space. The concept behind containers initially was to partition applications from each other to make management of these large servers simpler. The downside, of course, was that it required huge amounts of highly skilled Unix system administrators and administration, which was not a problem for Sun Microsystems or Oracle or SAP or companies like that. But what changed? What changed was that as container technology was made available in Linux, venture firms developed software tools that greatly simplified the creation and management of containers. The software they created was released to the open source community about five years ago, and one of the principal components of it is a thing called Docker. So during the last five years, container interest has exploded in the enterprise space and the cloud space. Furthermore, during the last couple of years in the industrial automation space, many industrial products have been delivered supporting container technology for application deployment. And I'll bring some examples later.
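[Editor's note: to make the simplified container workflow concrete, here is a minimal, hypothetical Dockerfile; the image, file names, and application are illustrative only, not from the talk.]

```dockerfile
# Package a hypothetical monitoring application into a self-contained image.
FROM python:3.11-slim

# Copy the application and its dependency list into the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY monitor.py .

# The same immutable image runs unchanged on a cloud server or an edge device.
CMD ["python", "monitor.py"]
```

Built once with `docker build -t monitor .` and started with `docker run monitor`, the application carries its entire runtime environment with it, which is exactly the simplification of creation and management the talk describes.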
Now, earlier I mentioned NFV and CORD for the telecommunications industry. This industry digitization moves the processing of analog signals to the very edge of the network, and software defines all the services, which are digital from the cell tower through the central office. The advantage of that for telecommunications companies is a highly flexible infrastructure, flexible services, and new services that can be added without changing the hardware. So how do network operators envision being able to manage such a massive software infrastructure with real-time requirements? During the coffee break, ask them. Ask them what the role of container software is in their plans, and I believe you'll find the answer is that it's absolutely critical to their vision. And as I said, what makes it important is that it scales down from these massive telecommunications central offices and data centers to very small systems, and I'll get to that point. As an added bonus, the open source world has converged pretty much on a single container orchestration tool. Orchestration, for you OT folks, is a $50 word, maybe a $64 word, for software that manages large installations of containers. And the anointed orchestration tool is named Kubernetes. It's a measure of its reliability and success that Kubernetes is perhaps the hottest open source project in the space today. And so perhaps some of the folks in the Open Process Automation Forum could discuss this with some of the IT folks during the coffee break, or maybe the cocktail break. So now in 2018, container technology is ubiquitous and easily available. Why then should it receive serious consideration, I'm having a problem with that word, from an organization like the Open Process Automation Forum? The simple answer is that Docker and the like are not merely technology for building and managing containers. They are a technology for delivering applications, and that's their strong suit.
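[Editor's note: to give OT readers a concrete picture of what "orchestration" means, here is a minimal, hypothetical Kubernetes Deployment manifest; all names and the image reference are illustrative assumptions, not from the talk.]

```yaml
# Ask Kubernetes to keep three replicas of a containerized app running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: monitor
spec:
  replicas: 3
  selector:
    matchLabels:
      app: monitor
  template:
    metadata:
      labels:
        app: monitor
    spec:
      containers:
        - name: monitor
          image: example.com/monitor:1.0   # hypothetical container image
          resources:
            limits:
              memory: "128Mi"
              cpu: "250m"
```

Applied with `kubectl apply -f monitor.yaml`, this declares the desired state, and the orchestrator continuously reconciles the running containers against it, restarting or rescheduling them as needed, whether the cluster is a telecom data center or a handful of small edge nodes.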
As I said earlier, going forward, networked devices that are embedded systems are near the end of the way they've been run before. Going forward, high-value devices and services will need to be remotely managed, expandable, and updatable, and their software will need to be completely updated and redone, and this needs to be done with an extremely high level of automation. So why should this concern the Open Process Automation Forum? Because in the world of industrial automation, especially process automation, application changes occur on a daily basis now. With our existing DCSs, people make daily changes to configuration or add new applications. That's the stuff of work within the automation space. New measurements are added, new controls are tested, tuning parameters are adjusted, alarm limits are changed, new control schemes, et cetera. Never a day goes by in a process plant where changes aren't made. And that is why process industry companies, oil companies, chemical companies, et cetera, have so much intellectual property invested in these systems. These systems represent to them literally man-years of work in engineering and adjusting the control systems to match the physical plant and its operating objectives. But unfortunately, that intellectual property is captured in largely non-readable and highly proprietary forms of control languages and idioms. Yet with today's DCSs, our people are able to make these changes with ease. A successful Open Process Automation technology will need to support at least the same level of functionality, and probably much higher, with respect to changes. So deploying applications on target systems, especially on smaller target systems as are envisioned in Open Process Automation, is going to be a critical task, and it will be a task that is executed on a daily, maybe an hourly, basis for the installed life of some of these systems.
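[Editor's note: for readers wondering how such daily application changes could be pushed out without stopping a running system, container orchestrators already support declarative rolling updates; this hypothetical fragment of a Kubernetes Deployment spec sketches the idea.]

```yaml
# Replace containers gradually so the application never stops serving.
spec:
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0   # never take a replica down before its replacement is ready
      maxSurge: 1         # bring up at most one extra replica during the update
```

With a strategy like this, changing the container image version in the spec triggers an automated, zero-downtime rollout, and `kubectl rollout undo` can revert it; that is the kind of routine, high-frequency change management the process industries would demand.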
This is a figure from the Open Process Automation technical reference model snapshot showing the solution life cycle and work processes for an automation system. And here I'm highlighting the steps in that life cycle that pertain to application deployment. As you can see, they are the critical ones that actually touch the Open Process Automation components. Now let me ask you a question. If there's a software deployment technology in existence for application deployment that is highly standardized, widely used, available in open source, works on a variety of platforms, and has been field-proven in installations from the very largest, say Google, to very small ones executing on single Raspberry Pis, why would someone choose, or God forbid, create a different deployment technology for industrial automation? Please remember that the stated goal here, and a very wisely stated goal of the Open Group in general, I believe, but certainly of the Open Process Automation Forum, was not to invent standards, but rather to reference existing standards where they are established and relevant. And are these relevant? Absolutely, I would say. This slide shows the architecture-defined interfaces for Open Process Automation from the view of the distributed control node. And notice that there are three interfaces that I think are particularly important here. The first is the configuration management interface. I don't know if I can work the laser. Maybe not. It's too small. I should have brought my own. The configuration management interface defines the services used to manage DCF configurations by outside tools. Second is the application management interface, which defines services and information models used by applications in the framework. And third is the application services interface, which defines services and information used by applications to access framework services.
Now my suggestion, and this is a serious suggestion for the Open Process Automation Forum, is that as you begin to flesh out and define these interfaces, take your highest-level understanding of them and your early thoughts about the required functionality, and then compare these point by point with the functionality available within the container software that's available in open source. I think that would be a very valuable exercise, and I suspect that much of the required functionality will be available for containerized applications using these software tools. Now here's another point. If you choose not to employ these container technologies, then you will choose, by default, to compete against suppliers who will pick up and use these very same tools. Remember that image of the unfortunate fellow standing on a beach in front of the huge tsunami on that tragic day when so many people lost their lives. Business positioning is being in the right place at the right altitude at the right time. So please make sure that you're not going to be on the beach where that guy was standing in a few years. And I'll point to some examples of this in a couple of slides. Here's another twist, an interesting twist, on the problem of DCS obsolescence. A few weeks ago at the Honeywell User Group meeting for the Americas, Honeywell introduced something called the ELCN. And I could give you the whole name, but it trips right off the tongue. It's called the ELCN 501.1 or something like that. Now, I don't mean to criticize the marketing, but we have some really funny names for stuff in industrial automation. This is a combination of emulation and virtualization that emulates a 1980s Honeywell LCN, or Local Control Network. The ELCN enables legacy applications, software, and management tools to interface with a modern execution platform, including on-premises and cloud systems. So the LCN controller, the original one, dated from the 1980s.
And there are many of them still in existence today. You can ask the OT people here about them. I would guess that there are probably more than a thousand of those controllers, and this is just a guess, not based on evidence, within ExxonMobil alone, and I've seen many others in plant tours and other plants. So this ELCN, which was just introduced, represents kind of a classic lift-and-shift solution to obsolete hardware. Note the quote from ExxonMobil. This has reset the odometer, if you will, of the obsolescence of these systems. In other words, the installed base of the Honeywell TDC 3000, a DCS, now has the option to move forward to much more modern hardware while maintaining their existing, although archaic, configuration and management software. The problem alleviated here is that there's no longer hardware available to create replacements for these old devices. But the problem is solved, if you will, through this emulation and virtualization. Now, what I suspect is that end users of Honeywell TDC 3000 LCNs would also like to upgrade their software, but on the other hand, they no longer see themselves driving off a cliff with respect to, I'm not going to be able to get parts for systems I need for my plant. So they have an alternative path now to migrate forward. My suspicion, although I have absolutely no evidence or knowledge to support it, is that most of the other DCS companies, both members of the Open Process Automation Forum and others, have similar development programs now in progress. I'm willing to take the bet that Honeywell is not the only company pursuing this kind of lift-and-shift solution for old system obsolescence. In fact, for suppliers of more modern DCSs, this kind of solution would be even simpler to implement, and they already offer virtualized controllers as part of their project services portfolios. So this would include companies like ABB, Emerson, Schneider Electric, Siemens, and Yokogawa.
The implication of this for the Open Process Automation Forum is that products designed to your specifications will be competing against virtualized legacy products. That's going to be one aspect of the competition that you're going to face in the market for open process automation products and systems. You'll need to provide a higher business value than mere virtualization of legacy systems. The other implication is that the users of legacy systems will likely have alternative upgrade paths provided by their incumbent suppliers. That raises the bar for you as the Open Process Automation Forum, and you'll need to see that clearly in order to succeed in the market. So existing suppliers in this market are not standing still, and neither are they fools. What I would say is that this gives the Open Process Automation Forum some breathing room to do things right the first time. And if you do that, then you'll bring the right weapons to the right war rather than the old weapons to the new war. Or, to use a sports cliche, you'll be moving to where the puck is going to be, not where it is now. Now, I've laid out this scenario of new software technology, especially containers, offering the possibility to revolutionize software within industrial automation. And so far I've done so without providing much solid evidence. So let me now briefly, very briefly, survey a few products that we've seen in the last couple of years that I believe provide some evidence that this use of software containers will be the case. First, Cisco, not an automation supplier, but Cisco introduced in 2016 the so-called IOx application environment, which basically was a Linux environment running within their infrastructure. To make this more attractive, two years later Cisco added container technology to IOx, which greatly eased the level of effort for partners and end users to add applications into this environment.
A second example is a company called Phoenix Contact, a German company, really not a major automation supplier in the strict sense, but many of their customers require some level of automation in their solutions. And Phoenix Contact introduced something called PLCnext, which enables control to be developed using a number of different software tools, not only the common IEC 61131, but also packages such as MATLAB and scripting languages and programming languages such as C and C++. I don't believe this product uses containers yet, but watch the space, it's inevitable. A third illustration is the General Electric Industrial Internet Control System. This emerged a few years ago. Again, I'm not an expert on the internals of this, but the stated property is painless software deployment at a fleet level. GE customers tend to operate a lot of similar assets, and so I'm willing to take a pretty big bet that containers are part of this offering. Fourth, a European company again, a company called Hilscher, which is a supplier of integration products, has recently introduced an edge computing platform called netIOT, which includes support for any number of industrial protocols, which is Hilscher's stock-in-trade, along with support for Docker, so that end users can add containerized applications to the product for their own edge computing needs. Fifth, a company called TTTech, which was one of the leaders in the development of time-sensitive networking, a technology still in development that provides network quality of service, which is very important for a number of applications, including industrial ones. TTTech has recently introduced an automation product at the edge that combines both infrastructure and compute capability. An end user can configure IEC 61131 applications as well as containerized applications, or even build and add virtual machines to the device. Sixth, a venture-stage firm, Resin.io.
Resin.io ported the Docker software to small single-board ARM computers, and they've also developed an updateable Linux operating system. This enables end-user applications to run in containers on very, very small computers, I'd almost call them toy computers. In effect, users can now run containers on even the smallest and most inexpensive single-board computers. This kind of equipment is not designed for heavy lifting, but on the other hand, the Open Process Automation Forum's envisioning of DCNs being very small, potentially very small, doing just a single control loop, is not designed for really heavy lifting either. A seventh example is a company called Opto 22, which is well known again for quality products, mostly in connectivity and infrastructure. Opto introduced a PLC-like product last year with a set of third-party software, like the Ignition SCADA software and CODESYS IEC 61131 control software, and also provided a Linux environment for the user, albeit at the time without containers. Again, watch the space. It will not take them long, in my opinion, to add containers and Docker to such a package. The point I'm trying to make by showing you these products is that new product entries into the industrial automation market are already using software container technology to enable end users to develop, deploy, and manage applications on these platforms, and they're doing it for a reason: it makes things much simpler and easier for end users or partners. This, I believe, is what the future of industrial automation is going to look like, and what one part of future competition will look like. So my advice to Open Process Automation is to be ready for it. There's no reason to be surprised, and there's no reason to bring a knife to a gunfight.
Finally, my thesis is that the tools developed in the cloud will be usable for almost all types of software development and deployment, and once these tools are applied effectively to industrial automation, end users will have entirely different and much higher expectations. Management at scale and management over a long life cycle are requirements that are shared by many types of software: cloud, industrial IoT, and open process automation. It's containerization that enables management at scale and over time, and that's why it's important. The fact is that OPAF products and members will have to compete in a different and more highly competitive automation space than exists today. So if you want to succeed in that future market, and I'm sure you do, then you must enable superior solutions for the fundamental needs of end users. Being merely as good as what's on the market will not incent users to shift to new technology and products. They need to be better. They need to be better at delivering business value to process operations. And I think you can do this, but to do so, you have to keep that competition in mind at all times. Thank you so much for your time and attention, and I look forward to our discussions now and this week during coffee breaks and during the cocktails. Thank you. Harry, please take a seat for a few questions right away, and then I'm sure you'll get more in the break. Oh, yes. You've got a few. Yes, we've got a few. We've got a few, so thank you very much for that. So as these distributed, software-defined-everything systems become more prevalent, how do you see the semantics of information flow and decision control cascading being impacted by the lack of platforms and standards? That's a good question. I don't think I know the answer to that. I really don't think I know, and I'm just going to leave it right there, because I think that's one of the challenges we're going to face, and it's very true in the industrial automation space.
A lot of things that we have done for software development have been very constrained by limitations we had for latency or network capability or footprint or other kinds of constraints, and that's always been the way that people have developed software in industrial automation and probably in embedded systems as well. So it is a big challenge, when some of these constraints start to come off, for people to think about how to operate and how to deliver value in a much less constrained environment. Okay. To date, only two pure-play software process automation suppliers have thrived to reach a size somewhat comparable to DCS suppliers. The examples given are OSIsoft and Aspen Technology. Exactly, I agree 100%. And remained independent. Yes. So with the end approaching, how can automation software suppliers provide investors with sustainable growth? Automation software suppliers, with the end approaching, are going to have to, again, this is my opinion, adopt more of the business model of a combination of consultancies and IT services. This will be not traditional IT services but specialized IT services. The services part is required because the systems that will be managed are going to be much more like the IT infrastructures that we have now than the distinct and different infrastructures that people have for automation. So that's why you need the systems administration expertise. The consultancy part is going to come because that's the value that Peter was talking about, in terms of how you deliver value to somebody who's doing a specific kind of manufacturing. And that's really understanding their world of manufacturing, whether they're making ethylene or whether they're making some specialty chemical. That's something that's delivered in many ways by one of those companies you mentioned, AspenTech, and by other specialized consultancies.
So I think in the long run those are going to be the differentiators for the automation space, and you'll find that it's more the latter, the consulting, where you can differentiate from other companies that might enter the space with the knowledge and ability to manage large IT infrastructures. Okay. With suppliers including containers as part of their solution suite, what level of security controls and privilege limits are to be set for the consumers? Where do you see the balance between suppliers' ability to assure predictability and the consumers' need for business agility? Oh, I'm not an expert in network security and cyber security. It's a very serious concern. I don't think that the adoption of any particular technology, like containers, is going to make those problems go away. I think the fundamental difficulty is that the threat vector has changed a lot. When I first heard someone talk about cyber security for industrial control systems, I thought he had gone mad, because cyber criminals would go after banks. But now the threat vector has changed to things like foreign... What do we call them? Well, yeah, we don't call them nation states. We call them foreign actors or something like that. But we mean armed forces of nation states, which is a different situation. So I think it's kind of a push, in that this isn't going to solve that kind of problem. The one thing that I'm a little bit optimistic about, though it's very far off, is that these unikernels basically take a segment of the Linux kernel, so they use only what's required. These applications have some advantages in that they're small, they boot up very quickly, and they have a single address space, but they also don't have a lot of attack surface, which is a nice part. Of course they have disadvantages too, but the small attack surface is, I think, a big advantage. Okay, we'll take one more that's about containers.
If the tools we have today only help with delivering containers, what are your thoughts on what kinds of products will become available to manage containers? Secondly, how do we solve the existing non-managed containers of today in the future? That's another one that's hard for me to answer, because again, I'm an OT person. What I would say is, if you look at the amount of human effort of very talented people and the investment that's going into these questions, what I'm expecting is that that investment is going to flow into all kinds of areas, from big software companies to very small venture-stage firms that are looking to address those kinds of questions for particular customers. So what I would say is that a lot of investment money, combined with a lot of people who have both broad knowledge and/or some specific knowledge of particular customers and needs, can address that over time. It's basically a flow of investment, and there's not a shortage of that in this kind of business right now. All right, Harry, we'll leave it there. Thank you very much. My pleasure.