Hi, I'm Peter Burris, and welcome to Wikibon's Action Item. Once again we're broadcasting from theCUBE Studio in beautiful Palo Alto, California. Joining me here in the studio are George Gilbert and David Floyer, both Wikibon analysts, and joining us remotely, Neil Raden and Jim Kobielus. This week we're going to talk about something that's actually quite important: one of those examples of an innovation in which technology that is maturing in multiple domains is brought together in unique and interesting ways to potentially, dramatically revolutionize how work gets done. Specifically, we're talking about something that we call augmented programming. The notion of augmented programming borrows from some of the technologies associated with newer declarative, low-code development environments, machine learning, and an increasing understanding of the role that automation is going to play, specifically as it pertains to human-augmented activities. Now, low-code programming has been around for a while, machine learning has been around for a while, and some of these notions of automation have been around for a while, but it's how they are coming together to create new approaches and new possibilities that can dramatically improve the speed of systems development, the quality of systems development, and, ultimately and very importantly, the ongoing manageability of those systems. So Jim Kobielus, let's start with you. What are some of the issues associated with augmented programming that users need to be focused on?

Yeah, well, the primary issue, or really the driver, is that we need to greatly increase the productivity of developers, because they're required to build applications faster with fewer resources, deploy them more rapidly in DevOps environments, and manage and optimize that code for ten zillion downstream platforms, from mobile to web to the Internet of Things and so forth.
So they need power tooling to be able to drive this process. Now, the whole low-code space has been around for years, and it very much evolved from what used to be called rapid application development, which itself evolved from the 4GL languages of decades past. Looking at it now, as we move toward the end of the second decade of this century, the low-code development space has been rapidly merging with BPM and orchestration modeling tools on the one hand, and robotic process automation on the other, to enable the average end user or business analyst to quickly gin up an application by wiring together UI components fairly rapidly and driving it from the UI on in. What we're seeing now is that more and more machine learning is being used in the low-code development of applications. Machine learning is being used in a variety of capacities, one of which is simply to infer the appropriate program code from external assets like screenshots and wireframes, but also from database schemas and so forth. So a lot of machine learning is coming to this space in a major way.

But it sounds as though there's still going to be some degree of specialization in the nature of the tools that we might use in this notion of augmented programming. So RPA may be associated with a certain class of applications and environmental considerations, and there will be other tools, for example, that might be associated with different application considerations and environmental attributes as well. But David Floyer, one of the things that we are concerned about: a couple of weeks ago we talked about the notion of data-aware middleware, the idea that increasingly we'll see middleware emerge that's capable of moving data in response to the metadata attributes of the data, combined with visibility into the application patterns.
But when we think about this notion of augmented programming, what are some of the potential limits that people have to think about as they consider these tools?

So, Peter, that's a very good question. The key for all of these techniques is to use the right tools in the right place. A lot of the leading edge of this environment assumes an environment where the programmer has access to all of his data, he owns it, and he is the only person there. The challenge is that in many applications you are sharing data: you're sharing data across the organization, and you're sharing data between programmers. Now, this introduces a huge amount of complexity, and there have been many attempts to tackle it. There have been data dictionaries, there has been data management, ways of managing this data. They haven't had a very good history; the efforts involved in trying to make those work within an organization have been at best spasmodic. Spasmodic, good word. So when we go into this environment, I think the key is to make sure that you apply these tools initially to areas where somebody does have access to all the data, and then carefully look at it from the point of view of shared data, because you have a whole lot of issues in stateful environments which you do not have in stateless environments. The complexity of locking data, the complexity of many people accessing that data, requires another set of tools. So I'm all in favor of these low-code type environments, but you have to make sure that you're applying the right tools to the right type of applications. And specifically, for example, a lot of the metadata that's typically associated with a database is not easily revealed to an application developer, nor to an application, so you have to be very, very careful about how you exploit that.
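To make Jim's earlier point concrete, the kind of schema-to-code inference he describes could look, in a deliberately toy form, like the sketch below. The schema, the widget names, and the output format are all invented for illustration; real low-code tools do far richer inference, often with ML rather than a lookup table.

```python
# Illustrative sketch only: inferring a UI form definition from a
# database schema, the kind of inference ML-assisted low-code tools aim
# to automate. Everything here (schema, widgets, format) is hypothetical.

SCHEMA = {
    "customer": {
        "name": "text",
        "signup_date": "date",
        "active": "boolean",
    }
}

# Map column types to hypothetical UI widgets.
WIDGETS = {"text": "TextInput", "date": "DatePicker", "boolean": "Checkbox"}

def generate_form(table, columns):
    """Emit a simple declarative form definition for one table."""
    lines = ["form " + table + ":"]
    for column, col_type in columns.items():
        lines.append("  " + column + ": " + WIDGETS[col_type])
    return "\n".join(lines)

print(generate_form("customer", SCHEMA["customer"]))
```

Note that the generator only sees structure, not meaning: exactly David's caution that the metadata a tool can reach determines what it can safely infer.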
Now, Neil Raden, as David mentioned, there have been a number of passes at doing this over the years that didn't go so well, but there are some business reasons to think that this time it might go a little bit better. Talk a little bit about some of the higher-level business considerations on the table that may catalyze better adoption of these types of tools this time.

Well, one thing is that no matter what kind of an organization you are, whether you're a huge multinational or an SMB or whatever, all of these companies are really riddled with what we call shadow systems. In other words, companies have applications that do what they do, and for what they don't do, people cobble things together; the vast majority of them are done in Access and Excel, and even in advanced organizations you still find this. So if there's a way to eliminate that, because it's a real killer of productivity, then that's a real positive. I suppose my concern is that when you deal at that level, how are you going to maintain coherency and consistency in those systems over time without any orchestration of those systems? And what David is saying, I think, is really key.

Yeah, go ahead, sorry, Neil. No, that's all right. What I was going to say was that a crucial feature of this is that a lot of the time the application is owned by a business line, and the business line presumes that they own their data. They have modeled those systems for a certain type of work, for a certain volume of work, for a certain distribution of control, and when you reveal a lot of this stuff, you sometimes break those assumptions, and that can lead to really serious breaks in the system.

Yeah, well, they're not always evil, as we like to characterize them. Some of them are active, well-thought-out, and really good systems, better than anything they've been given by the IT organization.
But the point is they're usually pretty brittle, and they require a lot of effort from the people who developed them to keep them running, because they don't use the kinds of tools, approaches, platforms, and methodologies that lend themselves to good-quality software. So I think there's real potential for RPA in that area.

I think there are also some interesting platforms arriving to help in this particular area, particularly for applications which go across departments in an organization. ServiceNow, for example, has a very powerful platform for very high-level production of systems, and it's being used much of the time to automate procedures that cross different departments. So I think there are some extremely good tools coming out which will significantly help, but they help more with serial procedures than with concurrent procedures.

And there are also some expectations about the type of tools you use and the extensibility of those tools, et cetera, which leads me, anyway, George, to ask the question about some of the machine learning attributes of this. We've got to be careful about machine learning being positioned as the panacea for all business problems, which too often seems to be the case. But it's certainly reasonable to observe that machine learning can in fact help us in important ways in understanding how patterns in applications and data are working, and how people are working together. Talk a little bit about the machine learning attributes of some of these tools.

Well, I like to say that every few years we have a technology we get so excited about that we assume it tastes like chocolate, costs a dollar, and cures cancer, and machine learning is that technology right now.
The interesting thing about robotic process automation and many low-code environments is that they're sort of inheriting the mantle of the old application macros, and even cross-application macros, from the early desktop office wars. The difference now is that back then there were APIs those scripts could talk to, so they could treat the desktop applications as an application platform; as David and Neil said, we're going through application user interfaces now. And when you want a low-code programming environment, you often want to program by example, but then you need to generalize parts. When you move this thing to this place, you might then want to generalize that, and that's where machine learning can start helping take literal scripts and add more abstract constructs to them. So you're literally digitizing some of the primitives that are in some of these applications, and that allows you to reveal data that machine learning can use to make observations and recommendations about patterns, and actually do code generation. And I would add one thing: it's not just about the UI anymore, because we're surfacing, as we were talking about earlier, the data-driven middleware. It's another way of looking at what used to be the system catalog, when we had big applications all talking to a central database. Now that we have so many repositories, we're sort of extricating the system catalog so that we can look at and curate data in many locations, and these tools can access that because they have user interfaces as well as APIs. And in addition, you don't have to go against a database that is unprotected by an application's business logic; more and more we have microservices and serverless functions that embody the business logic, and you can go against them and they enforce the rules as well.

That's great. So, David Floyer... hold on, Jim.
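George's point about program by example, taking literal recorded scripts and lifting the varying parts into more abstract constructs, can be sketched in a toy form. The step format and the simple windowed comparison below are invented for illustration; real RPA tools record far richer events and use ML rather than a structural diff.

```python
# Toy "program by example" generalizer: compare two recorded passes of
# the same routine and lift the literals that differ into a parameter
# slot. The (action, target, value) step format is hypothetical.

recorded = [
    ("type", "invoice_field", "INV-001"),
    ("click", "submit_button", None),
    ("type", "invoice_field", "INV-002"),
    ("click", "submit_button", None),
]

def generalize(steps, window=2):
    """Split the recording into two passes of `window` steps, keep the
    shared structure, and replace differing literals with a slot."""
    first, second = steps[:window], steps[window:2 * window]
    template, params = [], []
    for (a_act, a_tgt, a_val), (b_act, b_tgt, b_val) in zip(first, second):
        if (a_act, a_tgt) != (b_act, b_tgt):
            return None, None  # structure differs; cannot generalize
        if a_val == b_val:
            template.append((a_act, a_tgt, a_val))
        else:
            template.append((a_act, a_tgt, "{param}"))  # abstract slot
            params.append([a_val, b_val])
    return template, params

template, params = generalize(recorded)
print(template)
print(params)
```

The two literal invoice numbers collapse into a single `{param}` slot, turning a one-off recording into a reusable, parameterized routine.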
David Floyer, this is not a technology set that is suddenly emerging on the scene independent of other changes. There are also some important changes in the hardware itself that are making it possible for us to reveal data differently, so that these types of tools and technologies can be applied. I'm specifically thinking about something as mundane as SSD, flash-based storage, and other technologies that allow us to do different things with data so that we can envision working with this stuff. Give us a quick list, down in the infrastructure, of some of the key technologies making this possible.

So when we look at systems architectures now, what we never had was fast memories, fast storage. We had very, very slow storage, and we had to design systems to take account of that. What is coming in now is much, much faster storage, built on things like NVMe over Fabrics, which really gets to any data within microseconds as opposed to milliseconds. That's thousands of times faster. So the access density that you can achieve to the data is much, much higher than it was; again, many thousands of times higher. That enables you to take a different approach to sharing data. Instead of having to share data at the disk level, you can now, for example, take a snapshot of the data. You can let that snapshot serve as the snapshot for, say, the analytic system, on the hour or on the day or whatever timescale you want, and then in parallel you can run huge amounts of analytics against that snapshot while the same operational system keeps working. So there are some techniques there which I think are very exciting indeed. And the other big change is that we're going to be talking machine to machine. Most applications were designed for a human to be the recipient at the other end.
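The snapshot technique David just described can be illustrated in miniature: analytics reads a stable point-in-time copy while the operational store keeps taking writes. This is a sketch only; a real system would snapshot at the storage layer (for example, copy-on-write on flash), not with an in-memory deep copy.

```python
# Minimal sketch of snapshot-based data sharing: the analytic workload
# sees a frozen point-in-time view; the operational system writes on.
import copy

operational = {"orders": [100, 250, 75]}   # live operational data

snapshot = copy.deepcopy(operational)      # hourly/daily point-in-time view

operational["orders"].append(999)          # live writes continue in parallel

# Analytics runs against the stable snapshot, unaffected by new writes.
total_at_snapshot = sum(snapshot["orders"])
print(total_at_snapshot)  # 425, even though the live store has moved on
```

The point of the fast-storage argument is that taking and serving such snapshots becomes cheap enough to do on the hour or on demand, rather than overnight.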
One of the differences when you're dealing with machines is that now you have to get your code done in microseconds as opposed to seconds; again, a thousand times faster. This is a very exciting area. But when we're looking at low-code, for example, you're still going to need well-crafted algorithms, well-crafted, very fast code, as one of the tools of programmers. So there's still going to be a need for people who can create these very fast algorithms. An exciting time all the way around for programmers.

What were you going to say, Jim? And then I want to come back and have you talk about DevOps for a second.

Yeah, in fact I was going to add to what David was just saying. Most low-code tools are not entirely no-code, meaning they auto-generate code pursuant to some declarative business specification, and actual professional programmers can then go in and modify that code, tweak it, and optimize it. And I want to tie in now to something George was talking about: the role of ML in this process. ML can make a huge mess, in the sense that ML can be an enabler for more people who don't know a whole lot about development. They're going to build stuff willy-nilly, so there's more code out there than you can shake a stick at, and there are no standards. But I'm also seeing, and I saw this just this past week, that MIT has a project, and they already have a tool that's able to do this: it can use ML to take a segment of code out of one program, transplant it into another application, and modify it so it fits the context of the new application along various attributes and so forth.
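Jim's point that low-code tools auto-generate real code from a declarative specification, code that professional developers can then open and tweak, might look like this in miniature. The spec format and the generated class are invented for illustration; commercial tools target whole applications, not single classes.

```python
# Hedged sketch of declarative low-code generation: a small spec drives
# generated source that remains ordinary, editable code. Hypothetical
# spec format, for illustration only.

spec = {"entity": "order", "fields": ["id", "amount"]}

def generate_class(spec):
    """Auto-generate a plain Python class from the declarative spec."""
    name = spec["entity"].capitalize()
    args = ", ".join(spec["fields"])
    body = "\n".join("        self.%s = %s" % (f, f) for f in spec["fields"])
    return (
        "class " + name + ":\n"
        "    def __init__(self, " + args + "):\n"
        + body + "\n"
    )

source = generate_class(spec)
print(source)                # ordinary source a developer can tweak

# The generated code is executable as-is.
namespace = {}
exec(source, namespace)
order = namespace["Order"](1, 9.99)
print(order.amount)          # 9.99
```

Because the output is plain source rather than an opaque runtime artifact, a professional developer can optimize it by hand, which is exactly what distinguishes low-code from fully no-code tooling.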
What I'm getting at is that ML, according to what, say, MIT has done, can be a tool for enabling reuse, recontextualization, and tweaking of code; in other words, ML can be a handmaiden of enforcing standards as code gets repurposed throughout these low-code environments. So ML is a double-edged sword in terms of enabling stronger or weaker governance over the whole development process.

Yeah, and I want to add to that, Jim, that it's not just that you can enforce, or at least reveal, standards and compliance; it also increases the likelihood that we become a little bit less tool-dependent, and, going back to what you were talking about, David, it increases the likelihood that people are using the right tool for the right job, which is a pretty crucial element of this, especially as we drive adoption. So, Jim, give us a couple of quick observations on what a development organization is going to have to do differently to get going with these technologies. What are the top two or three things folks are going to have to think about?

Yeah, well, first of all, in the low-code space there are general-purpose tools that can bang out code for various target languages and various applications, and there are highly special-purpose tools that can go gangbusters on auto-generating web application code, mobile code, and IoT code. First and foremost, you've got to decide how much of the ocean you want to boil in terms of low-code. I recommend that if you have a requirement for accelerating, say, mobile code development, then go with low-code tools that are geared to iOS and Android and so forth as your target platforms, and stay there. Don't feel like you have to get some monster suite that can do everything. That's one critical thing.
Another critical thing: it needs to be more than just a development tool. It needs to have capabilities built in that help your team govern those code builds within whatever DevOps, CI/CD repository you have inside your organization; make sure the tool you've got plays well with your DevOps environment, with your workflows, and with your code repositories. And then number three, and people keep forgetting this: front-end development is still not a walk in the woods. In fact, specifying the complex business logic that drives all this code generation is, more often than not, a job for professional developers. These are complex tools. Even RPA tools, quite frankly, are not as user-friendly as they potentially could be down the road, because you still need somebody to think through the end-to-end application and then specify, at a declarative level, the steps that need to be accomplished before the RPA tool can do its magic and build something that you might then want to crystallize as a repeatable asset in your organization.

So it doesn't take the thinking out of application development. Oh no, no, no, no.

All right, so let's do this. Let's hit the action items and see what we all think folks should do next. David Floyer, let me start with you. What's the action item out of this?

So the action item is horses for courses: the right horse for the right course, the right tools for the right job. Understand where things are stateless and where things are stateful, and use the appropriate tools. And, as Jim was just saying, make sure that those tools are integrated into your current processes and procedures for coding.

George Gilbert, action item.

Building on that, start with pilots that involve one or a couple of simple applications, or, I should say, one or a couple of enterprise applications but with less of the branching, if-then type logic built in; it could be hardwired. So, simple flows.
Simple flows, so that over time you can generalize and play with how the RPA tools or low-code tools generalize their auto-generated code.

Neil Raden, action item.

Yeah, my suggestion is that if you involve someone who is going to learn how to use these tools and develop an application or applications for you, make sure that you're dealing with someone who's going to be around for a while, because otherwise you're going to end up with a lot of orphan code that you can't maintain. We've certainly seen that before.

That's great. Jim Kobielus, action item.

Yeah, the action item is to approach low-code as tooling for the professional developer, not necessarily as a way to bring in untrained, non-traditional developers. Like Neil said, make sure that the low-code environment itself is there for the long haul. It'll be managed and used by professional developers, so make sure that they are provided with a front-end visual workspace that helps them do their jobs most effectively and is user-friendly enough for them to get stuff done in a hurry. And don't worry about bringing freelance, untrained developers into your organization or somehow retasking your business analysts to become coders. That's probably not the best idea in the long run, for maintainability of the code if nothing else. Certainly not in the intermediate term.

Okay, so here's our Wikibon action item. As digital business progresses, it needs to be able to create digital assets that are predicated on valuable data, faster, in a more flexible way, with more business knowledge embedded and imbued directly in how the process works. A new class of tools is emerging that we think will allow this to happen more successfully. It combines mature knowledge from the application development world with new insights into how machine learning works and a new understanding of the impact of automation on organizations. We call these augmented programming tools.
Essentially, we call them augmented programming because, in this case, the system is taking some degree of responsibility on behalf of the business to generate code, identify patterns, and ultimately do a better job of maintaining how applications get organized and run. While these technologies have potential power, we have to acknowledge that there's never going to be a one-size-fits-all. In fact, we believe very strongly that we're going to see a range of different tools emerge that will allow developers to take advantage of this approach given their starting point, the artifacts that are available, and the characteristics of the applications that have to be built. One that we think is particularly important is robotic process automation, or RPA, which starts with the idea of discovering something about the way applications work by looking at how the application behaves on a screen, encapsulating that, and generalizing it so that it can be used as a tool in future application development work. We also note that these application development technologies will not operate independently of other technology and organizational changes within the business. Specifically, on the technology side, we are encouraged that there's a continuing evolution of hardware technologies that will take advantage of faster data access, utilizing solid-state disks, NVMe over Fabrics, and new types of system architectures that are much better suited to rapid shared-data access. Additionally, we observe that new classes of technologies are emerging that allow a data control plane to operate based on metadata characteristics, informed by application patterns, often through things like machine learning. One of the organizational issues that we think is really crucial is that folks should not presume that this is going to be a path for taking anybody in the business and turning them into an application developer.
You still have to be able to think like an application developer and imagine how you turn a business process into something that looks like a program. But another group that we think has to be considered here is not just the DevOps people, important as they are; go down a level to the good old DBAs, who have always suffered through new advances in tools that assumed the data in a database is always available and that they don't have to worry about transaction scaling or the way the database manager is set up. It would be unfortunate if the value of these tools from a collaboration standpoint, working better with the business and with other programmers, ended up failing because developers continued to pay no attention to how the underlying systems that currently control a lot of the data operate.

Okay, once again, we really appreciate you participating. Thank you, David Floyer and George Gilbert, and, on the remote, Neil Raden and Jim Kobielus. We've been talking about augmented programming. This has been Wikibon Action Item.