Hello, I'm Steve Nunn, President and CEO of The Open Group. Welcome to Toolkit Tuesday, where we highlight the various components and leading experts of the Architects Toolkit, a collated portfolio of the most pertinent technology standards for enterprise architects. During the series, I'll be calling on a number of recognized experts who will bring their particular insights on how to most effectively use the various tools in the Architects Toolkit. We'll have a mix of interviews, panel sessions, and pre-recorded presentations along the way. While all standards of The Open Group are designed so they can be adopted independently of one another, the greatest value for an organization is derived when they're used in unison: the whole should be greater than the sum of its parts. In the Architects Toolkit, we have collated a portfolio of the most pertinent ones for architects, together, all in one place. For most of these tools, certification from The Open Group is also available, so practitioners can demonstrate that they have the skills required, and recruiters can take the guesswork out of the recruitment process, all backed up by our Open Badges program. The River of Architecture. In enterprise architecture, we're very familiar with considering as-is and to-be states. We do this so we can look for the gaps, or the deltas, between them. Indeed, we often use multiple time horizons, approximating some form of continuum. So as well as as-is, or the here and now, we might consider multiple to-be states: plus six months, plus 18 months, or plus three years, depending on the organization and context, of course. Now, I could just talk about what we mean by as-is; it's always moving, like a river. I could equally ask: what is our to-be state? Do we actually ever get there, or is it just a North Star? But I'd like to introduce the concept of what-was, and what-was architecture.
Surely, to understand trajectory, you need to know where something has come from, and at what velocity and angle. But we have tended to completely disregard yesterday's estate. Isn't it just as vital for planning, though? I suggest that the River of Architecture requires what-was, as-is, and to-be states. Welcome back to Toolkit Tuesday, everyone. We had a brief hiatus for a few weeks, but it's great to be back, and I'm glad that so many of you have registered and are attending today. I hope you're well, wherever you are in the world. We've been doing these a while now, and we've got some great additions lined up in the coming weeks. The reason we took a couple of weeks out, as some of you will know because you were there, is that we held an Open Group Enterprise Architecture Practitioners event: a hybrid event, with an in-person element in London, England, and virtual attendance as well. It was a great event, and we were delighted to announce, among other things, the launch of the TOGAF Standard, 10th Edition. If you haven't heard about that, do go to our website and find out more. It's an exciting development, and there's more to come, so you'll no doubt be hearing more about it from us in these events, but do go and check it out. It's a significant development and really adds some great additional content and utility to the TOGAF Standard. A couple of housekeeping items, please. Many of you will be familiar with the WebEx tool, but if you want to ask a question of today's guest speaker, please use the Q&A channel, not chat. If you can't see the Q&A channel, go to the three dots in the bottom right-hand corner of your screen, click on those, and you'll get the option to click on Q&A. Please use that for asking questions. Please use the chat channel to chat amongst yourselves and say where you're joining us from.
I can see that that's starting up. We love to see where you're all joining us from, because it is a very international audience here on Toolkit Tuesday, something that we enjoy very much. That's about it for the preamble. I'm going to dive straight in, because we have some great content today. We're going to be talking about automation, and specifically automation of analytics and enterprise architecture roadmaps. And to guide us through that, we have no one more appropriate: Andrew Luthwaite, software consultant and support engineer at Avolution UK. Andrew joined Avolution in 2013 and is a highly regarded consultant within the Abacus community; Abacus is a world-leading enterprise architecture solution from Avolution. He specializes in a range of EA disciplines, and he's worked closely with banking, insurance, and financial brands, helping organizations and teams implement and deploy successful business and IT strategies using Abacus. Without further ado, I'm going to hand over to you, Andrew, and we'll come back for Q&A. Please use the Q&A channel for any questions that you might have for Andrew. Over to you, sir. Thank you very much, Steve. And yes, welcome, everyone, to Toolkit Tuesday. As Steve mentioned, today's title is Automating Analytics and Enterprise Architecture Roadmaps, but there are a couple of things to go through first, before diving into the core content around today's particular topic: more so, other trends around automation. Automation generally is a term that hopefully most of you on the call today have come across, and it can mean different things to different people. We've maybe heard of automation around process automation, and we've maybe heard of automation in line with things like AI and machine learning.
But really there are different trends within this automation space, across organizations and across various industries as well. Keep in mind that, for all of these trends, none of these innovations really exists in isolation. Certainly things like cross-company collaboration tools, which have come out in terms of democratizing data, go hand in hand with some of the other trends that we'll go through today in a bit more detail. The other one to quickly mention here is digital twins, hopefully a term that you've also come across. It's certainly not a new term. Digital twins, I guess, stem from the ability to link up IoT devices to physical devices and physical machinery, let's say within manufacturing industries, so that you could represent them digitally elsewhere. So it's certainly not a new trend, but it's one that's gaining a lot more pace. These days, from the perspective that we come from, we might talk about digital twins of the organization, rather than of the actual physical object or system that we're looking to define. So certainly not new around physical assets, but definitely something that's interesting from an organization and enterprise perspective: building a digital twin of that particular system. Composable enterprise is maybe another term you've heard. Gartner has come up with this term, which is really trying to define the way that an organization can be pieced together; more so, I think this is around the way we can bundle up specific digital capabilities within an organization. From our perspective, we can leverage that ability to, let's say, slice and dice the organization and build that within the graph database within our tool. And from a composable perspective, we might also think of this as driving down into things like microservices, from more of a software or technical background.
But certainly from a business perspective, we'd also think of these as decomposable capabilities or processes internally as well. And I guess it wouldn't be an automation webinar without mentioning machine learning, or AI in particular. So we'll be looking at some specific machine learning capabilities that exist today, and also the way we're pushing the envelope around machine learning and AI in general. Before doing that, it's probably worth giving you a quick overview of who we are as a company. Avolution is the company; Abacus is the product. Like Steve said, I've been at Avolution for nearly nine years or so, working with customers across various industries and various regions globally, making sure that people can develop something like an EA platform within our tool, and helping customers try to achieve specific objectives within their organization. Those objectives, or use cases, span many different industries and sectors: all the way from application portfolio management and technical debt to managing things like regulatory compliance metrics within the organization, all of which, hopefully, is something we as enterprise architects will have the capability of doing. And as it mentions here, we support multiple frameworks out of the box. I'll come back to that towards the end, specifically to talk about the Open Group Toolkit itself and how it can help leverage some of the capabilities around automation within enterprise architecture. So for today, a few areas to quickly go over: streamlining data maintenance, automating the repository, looking at how we might move beyond just diagramming, and then how we can communicate all of this throughout the wider organization.
That's a critical point I think you will find throughout today's webinar: when we build content, whether it's visualizations, data points, or KPIs, we have to make sure we're building it for the right stakeholder at the right time. And certainly, as you'll see, diagramming has had its place, and there is potentially a time and place for diagramming, but more so these days we really want to be looking at the analytics side of things: the metrics, and how we actually measure things within the organization. So, firstly, around streamlining data maintenance, there are a couple of aspects. The first is that for most organizations we speak to, and certainly in my experience, the data might exist in multiple systems. It's very rare that we come across an organization that uses one single system for all of its content. They're using content from suppliers like ServiceNow or Workday, or Confluence and Jira, and Excel and PowerPoints and Visios. That data is strewn across the entire organization. So what we actually want to achieve is not necessarily to go through the process of extracting that data and loading it into another tool, but rather to treat tools as a platform that can connect these data sources together. Within Abacus, we tend to leverage the ability to connect to these other sources, whether that data is mastered in something like ServiceNow or in Excel. Mastering data in Excel might not always be the best way forward; however, if that is still the case, we want to make sure that we can automate some of that content being brought into the tool. Again, a phrase you hear very often here is "single source of truth": we want to make sure that we can leverage all data points within the organization and make that a connected model within the tool. The other key thing then is the maintenance of that data.
So if it's coming from other systems, we can certainly integrate and automate that, but we also want to make sure that we provide data ownership across the enterprise, and that means people being involved in contributing content. When we think of building things like application registers, technology portfolios, or capability maps, we want to make sure that there are owners across that data, and we want to make sure that they then have the power to maintain it themselves. It's very likely, if we took something like an application portfolio, that 80% of the attributes we're managing might come from an external system; however, maybe 20% of them are specific to certain scenarios that have been modeled. Whether it's current states or future states that we'll be modeling in Abacus, we want to make sure that people can directly update that content. The other thing around the data is that, of course, it doesn't just have to be list-based views. As useful as list-based portfolio views are, dynamic charts, dynamic visualizations, and diagrams are another great way of allowing people to control the data that they own and of making sure that there's a collaborative approach to automating some of that input as well. Now, we might classify this as manual data maintenance; there are other areas that we should also consider. When we look at integrations, one of the things that we really want to emphasize is the ability to leverage something like an API. We're certainly not going to go through all of the intricate details of an API; however, keep in mind that the important aspect here is a read-write API. It's useful, whatever system you're using, to be able to see what data is in that system and pull it out so you can visualize that content. It's more important that you can then write back to that system.
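As a sketch of that read-write pattern, a client might pull a mastered record from an external system, merge in locally owned attributes, and write them back. This is a minimal illustration only: the `ExternalSystem` class below is an in-memory stand-in for a real HTTP API (such as a CMDB with `GET`/`PATCH` endpoints), and the record names and fields are invented, not Abacus's actual API.

```python
# Minimal sketch of a read-write integration. ExternalSystem stands in
# for a real REST-style master (e.g. GET /applications/{id} and
# PATCH /applications/{id}); all names here are illustrative.

class ExternalSystem:
    """In-memory stand-in for an external master such as a CMDB."""
    def __init__(self):
        self.records = {
            "app-42": {"name": "Payments Gateway", "owner": "Finance IT"},
        }

    def get(self, record_id):
        # Read side: pull the mastered attributes out of the system.
        return dict(self.records[record_id])

    def patch(self, record_id, updates):
        # Write side: push locally maintained attributes back.
        self.records[record_id].update(updates)


def sync_application(system, record_id, local_attributes):
    """Read a record, merge locally owned attributes, write them back."""
    record = system.get(record_id)
    record.update(local_attributes)
    system.patch(record_id, local_attributes)
    return record


cmdb = ExternalSystem()
merged = sync_application(cmdb, "app-42", {"criticality": "high"})
print(merged)  # mastered fields plus the locally owned attribute
```

The point of the two-way shape is that locally owned attributes (the "20%") flow back to the master rather than drifting out of sync.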
So certainly one of the things that we would look to provide is that read-write API, where users can pull data into Abacus from those external systems, leveraging external master sources while at the same time maintaining that data within the current platform. A two-way API is a big advantage these days for allowing communication across different systems and for allowing users to actually understand that data within the product itself. Now, there are many different languages, let's say, that we can build the API on, and certainly when we think of things like .NET APIs and REST APIs, there are specific notations that we might use to bring that data in. More so, we want to make sure that we're bringing that data into a platform or tool that can really mould itself around external sources: not necessarily using fixed structures, but making sure that we can customize those structures depending on the external data. Now, when we do bring all of that data into a tool, the manual maintenance approach is very useful from an ownership perspective, and the integration from other tools is also key; however, from an automation perspective, we want to start looking at some of the more cutting-edge platforms out there. Machine learning, again, isn't something that's new within enterprise architecture, and it's certainly not new across many other organizations. Amazon uses it for product reviews; hotels use it for understanding customer preferences; some customers use it for strategic pricing throughout the models that they build. Machine learning is really there to help fill in the blanks, and the key thing around machine learning is to leverage existing data.
So what we essentially allow customers to do is, once we bring that data into a platform, whether from external sources, maintained manually by users, or via the APIs, run some kind of machine learning algorithm against it. It means taking the guesswork, hopefully, out of any of the blank values that might be in our system portfolio. This is important specifically around things like application portfolio management. If a specific application has a high criticality value and is supporting very critical processes internally, machine learning might suggest that the disaster recovery rating for that particular application should be gold, or high. Likewise, if there are applications owned by specific owners and contributing towards processes run by, let's say, the HR department, then the machine learning algorithm will learn that other applications by the same owner may likely also contribute towards that same department. So it really does take the guesswork out of the content that you want to build, but it's also worth remembering that, as architects, we make the final decision. We have to make sure that we can accept or reject the proposed values, so they're not imposed upon us. The algorithm will give us some kind of confidence score for each output, and we as architects then have to evaluate those outputs, see if they're appropriate, and decide to approve or reject them. We're certainly leading the market here to some extent on machine learning, and hopefully in the future we'll see entire architectures potentially being suggested through machine learning. For now it's around the attributes and metrics we're trying to automate, but we can certainly see how that can be pushed even further to automate specific scenarios as well. And then we have automating analytics, diagramming, and roadmapping.
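The "fill in the blanks with a confidence score" idea described above can be sketched with a deliberately simple majority vote over similar existing rows. A real EA platform would use a proper trained model; the portfolio data, field names, and acceptance threshold below are all invented for illustration.

```python
# Sketch of attribute suggestion with a confidence score: propose a
# value for a missing field by majority vote among rows that share the
# known attributes, and let the architect accept or reject it.
from collections import Counter

portfolio = [
    {"owner": "HR",      "criticality": "high", "dr_tier": "gold"},
    {"owner": "HR",      "criticality": "high", "dr_tier": "gold"},
    {"owner": "Finance", "criticality": "low",  "dr_tier": "bronze"},
]

def suggest(rows, target, field):
    """Suggest a value for `field` based on rows matching the known attributes."""
    known = {k: v for k, v in target.items() if v is not None and k != field}
    matches = [r[field] for r in rows
               if all(r.get(k) == v for k, v in known.items())]
    if not matches:
        return None, 0.0
    value, count = Counter(matches).most_common(1)[0]
    return value, count / len(matches)      # confidence = vote share

new_app = {"owner": "HR", "criticality": "high", "dr_tier": None}
value, confidence = suggest(portfolio, new_app, "dr_tier")

# The architect makes the final call: auto-accept only above a threshold.
if confidence >= 0.8:
    new_app["dr_tier"] = value
print(new_app["dr_tier"], round(confidence, 2))
```

The threshold step is the important part: the suggestion is proposed, scored, and only applied once a human (or an agreed confidence bar) approves it.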
Again, when it comes to the actual metrics within the organisation, it is important to look at specific metrics. We might think of things like total cost of ownership; we might be looking at costs aggregating up and down the model; and we might be building specific algorithms in Abacus to automate some of that work. Specifically around roadmapping, we should also consider things like lifecycle dates throughout the organisation, because of course that's going to be important when we actually build different projects or programmes internally, and those roadmapping capabilities are just going to extend throughout the organisation as well. I mentioned at the beginning that diagramming, of course, is important as an enterprise architect, but hopefully, from what you've seen so far, if we can import data and automate the data imports and data maintenance through things like machine learning, then perhaps there's no need to draw diagrams manually. We should be looking at leveraging the graph structures provided, to have automatic visualisations that we can drill down into, and if there's ever a need to actually draw something, then we should leverage some of the automatic layout methods that are also available. So again, we're trying to shift the conversation away from drawing things to actually analysing the data that we have. Now, there's always going to be a time and a need to draw things, of course, and these days customers are seeing more of a need to support different types of notations, outside, of course, of the very good notations that The Open Group provides: notations that might be specific to a particular domain. Whether it's customers who want to build content using something like the AWS libraries, or those who want to build content using things like Microsoft's Azure library, there's a whole range of different notations out there around cloud architectures that should also be leveraged when we come to build these types of views.
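The cost-aggregation idea mentioned above, rolling total cost of ownership up and down a graph model, can be sketched as a recursive walk over dependencies. The model, element names, and cost figures here are invented; a real repository would hold this in a graph database rather than a dictionary.

```python
# Sketch of automating a cost roll-up over a graph model: the total
# cost of ownership of a capability is its own cost plus the cost of
# everything it depends on. All names and figures are illustrative.

model = {
    # node: (own annual cost, [supporting nodes])
    "Payments capability": (0,      ["Payments app"]),
    "Payments app":        (50_000, ["Database", "Hosting"]),
    "Database":            (20_000, []),
    "Hosting":             (10_000, []),
}

def total_cost(node, seen=None):
    """Aggregate cost up the dependency graph (cycle-safe)."""
    seen = seen if seen is not None else set()
    if node in seen:            # avoid double counting in a cycle
        return 0
    seen.add(node)
    own, deps = model[node]
    return own + sum(total_cost(d, seen) for d in deps)

print(total_cost("Payments capability"))  # 80000
```

The same traversal shape works for other roll-up metrics, such as earliest end-of-life date along a dependency chain for roadmapping.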
And then one of the key points, which I mentioned at the beginning, is how we actually integrate this data. Communication is of course key throughout the entire enterprise, throughout the entire organization, and we should really make sure that we can embed the analytics that we build and automate into the existing systems that are being used. A lot of users might use Teams and Jira and Confluence, so make sure that you're able to dashboard that data: build a specific view for a specific stakeholder and then embed it within those areas. Make sure that the data is filterable; of course, when we provide a view for a stakeholder, we're providing them something that they can dynamically drill down through. At the surface level, we've highlighted the analytics and metrics we want to show, with the option for users to keep drilling down through that content, collaborating with other users, and actually building ownership around the data; and ultimately, as one of our customer quotes (which you'll see) puts it: they don't actually need to know that they're using an EA tool. I think this is an important point, especially for a lot of tools, and maybe even for the concept of enterprise architecture generally; maybe less so these days in terms of the alignment between business and IT, which I think is one thing that's being pushed, and which is very good. But they don't need to know that they're using an EA tool. An EA tool should be an enabler, allowing people to build confidence in the outputs that you're producing. It should be open to the entire organization: if we're building views of the impact of this change, or the impact of this regulation that's coming in, we need to make sure that everyone in the organization is aware of that.
So, move away from back-end tools that may be useful for architectural-style work, and surface that work through different types of views for different types of users. Now, importantly, that leads us to the different standards and frameworks that we can use from the toolkit. Specifically, there are things from TOGAF and IT4IT, and of course the general Open Group library, around O-RT and O-RA, that we can leverage for analytics. For TOGAF, my usual rule of thumb is: search for the term; if it's there, it's obviously going to be very useful to dive into that area of the TOGAF Standard. Certainly, from a TOGAF perspective, we look at the different phases and at how we can perform gap analysis: current states, future states, and how we manage those, evolve those, and then assess the outputs. From an IT4IT perspective, there are different performance indicators that we can use across different domains and different industries: mean time between failures, complexity values, reliability. These are all key when we actually build the services that our customers are then using. And then, more generally around the Open Group library, we can also think very specifically about things like risk analysis. What's the effect of a risk? What's the likelihood? What's the impact of that risk across the entire organization, all the way from legal to applications to technologies to capabilities, and ultimately to the service that we deliver? So, take all of these individual aspects from all of these different areas within The Open Group, and combine them to make sure that you have a fully automated architecture and a fully automated way of producing roadmaps throughout the organization. Now, I appreciate everyone's time today on this particular session, and I hope it's been as useful as I have certainly found the previous Toolkit Tuesday sessions.
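The likelihood-and-impact questions above can be turned into a simple quantitative roll-up: expected loss per risk, attributed to the elements the risk touches. This is only a toy illustration of the idea, not the Open Group risk analysis method itself; the risk register, figures, and the even split across elements are all invented assumptions.

```python
# Sketch of a risk roll-up: expected annual loss per risk
# (likelihood x impact), shared across the elements it affects.
# All entries in the register below are illustrative.

risks = [
    # (risk, annual likelihood, impact per occurrence, affected elements)
    ("Data centre outage", 0.10, 500_000,   ["Payments app", "CRM"]),
    ("Regulatory breach",  0.02, 2_000_000, ["CRM"]),
]

def annual_loss_expectancy(likelihood, impact):
    """Expected yearly loss for a single risk."""
    return likelihood * impact

def exposure_by_element(register):
    """Attribute each risk's expected loss evenly over its elements."""
    exposure = {}
    for _, likelihood, impact, elements in register:
        share = annual_loss_expectancy(likelihood, impact) / len(elements)
        for element in elements:
            exposure[element] = exposure.get(element, 0) + share
    return exposure

print(exposure_by_element(risks))
```

A view like this is what lets a dashboard answer "which application carries the most risk exposure?" rather than listing risks in isolation.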
It is worth quickly pointing out that we also have an EA Summit coming up on the 8th and 9th of June, which we ourselves are hosting, and we are very fortunate to have a special guest from The Open Group, Mark Dixon. He will be presenting what's new and what's next in the TOGAF Standard, 10th Edition. If you'd like more information on that, please feel free to check out the website below, and certainly get in contact with me if there are any more details you'd like to go through. Again, I do just want to thank The Open Group for giving us the opportunity to present today, and, time permitting, we'll be happy to address any questions that have come up on today's webinar. Andrew, thank you very much. You can hear the virtual round of applause from the audience. Great job as always, and an interesting topic. And yes, as Andrew mentioned, there's a summit coming up on June 8th and 9th run by Avolution, so do sign up for that, and you'll hear more about a lot of things, including the TOGAF Standard, 10th Edition, as Andrew just said. So I encourage you to sign up. And to answer a question that we always get on Toolkit Tuesday: yes, you will have access to Andrew's presentation after the event. It takes us a little while to get them up, but you will be notified when they're available; they'll be on The Open Group YouTube channel. So, just a couple of questions, maybe, Andrew, in the time remaining. A lot of folks are interested in Toolkit Tuesday and our practitioners events because they're new to EA, and there's always this big question of: how do we start? How do we actually get going? There's all this potentially useful stuff, but how do I use it? So, any advice for folks who are new to EA about how they would start down the journey of analytics and automation? Yeah, sure. I think there are a couple of aspects, maybe specifically from today, on the analytics side of things.
When we hear the words "machine learning", one of the first things we might think is: OK, well, I need a lot of data to even pass through a system like that. So I think at the beginning it's a case of starting relatively small. It doesn't actually take that much information to build a confidence level within the data, specifically around the machine learning aspect, but for most organizations that are, let's say, new to EA and new to the analytics side of the data, it's important to focus on a few different types of metrics. I would say there are generic metrics that can be used, such as revenue, complexity, and the performance of architectures, and you can take those from other sources: you've got Jira, and the tickets being raised in Jira give you a good level of confidence in the performance of those systems. More importantly, if you go through each of the phases, you'll notice there are specific and key points that you can pull out of them, and you don't necessarily have to use the entire standard. There are critical parts of something like TOGAF and IT4IT where, if you're new to EA, you should be focusing on some of those aspects, whether that's focusing on specific domains at the beginning, capabilities, processes, applications, and then extending that as both the data matures and the team matures. A tool is there to help; it's not a silver bullet, of course. It's something that can help manage that particular progress, that deployment, if you like, of EA within teams, and it's certainly something that we've done for many customers before, with both small-scale and large-scale EA teams. So if you want some more guidance on that, feel free to reach out, and we can certainly guide you in the right direction. Great, thank you. And we have attendees here from all sorts of different industries, so when we talk about analytics, we very soon get into metrics.
Are there, in your experience, specific metrics that are relevant to specific industries, or are there some general ones that are applicable to all? How would you guide people on what metrics to focus on? Yeah, sure. Some generic metrics, I guess, will depend on the domain. If we take something like applications, we might be thinking of things like cost, reliability, and performance; around processes and capabilities, we can use generic ones such as a capability maturity index, or the time it takes for a process to complete. Those span any industry. And then, of course, there are industry-specific ones: manufacturing might be more concerned with the use of different materials; certain industries think more about energy consumption; marketing companies might be thinking more about sentiment and brand awareness; and certainly, from our perspective around financial organizations, we're looking much more closely at new regulations that come into play. Each of those areas will have individual metrics that we can use, calculate, and then hopefully measure. And it comes back to this idea of current states and future states: across any industry, you want a baseline of what you actually look like today, a benchmark, and then you want some kind of target state to try to achieve. It's not always the case that you'll achieve that target state, but what that does do, maybe specifically for the finance industry, is show regulators the progress towards that level of compliance. Whether it was around GDPR, or MiFID II when that came in a few years ago, those specific industries were using tools to understand how they would become more reliable and more compliant, let's say, with those regulations. So: a mixture of generic metrics that are very useful, and industry-specific ones that you can also find as well.
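The baseline-versus-target idea can be made concrete as a small gap calculation: how much of the distance from today's baseline to the target state has been closed so far. All the metric names and numbers below are invented for illustration.

```python
# Sketch of baseline / current / target gap tracking: report the
# fraction of each baseline-to-target gap that has been closed.
# Metric names and values are illustrative only.

baseline = {"capability_maturity": 2.0, "process_cycle_days": 30}
target   = {"capability_maturity": 4.0, "process_cycle_days": 10}
current  = {"capability_maturity": 3.0, "process_cycle_days": 20}

def progress(metric):
    """Fraction of the baseline-to-target gap closed so far."""
    total_gap = target[metric] - baseline[metric]
    closed = current[metric] - baseline[metric]
    return closed / total_gap

for metric in baseline:
    print(metric, f"{progress(metric):.0%}")
```

Note that the same formula handles metrics moving in either direction: maturity should go up, cycle time should come down, and both gaps report the share of distance covered.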
Thank you, Andrew. There's one question that's just come in, which I know is one we hear a lot about, and it's the issue of data quality. The question is: the issue we run into regularly is around data quality; do you have any recommendations around that, specifically for costing data? Yep, it's the age-old thing: bad data in, bad data out, I guess. So there are a few things. The democratising of the data that I mentioned at the beginning becomes quite useful here: the more eyes you have on the data, the more likely you are to notice, I'm going to say mistakes, but data quality issues, that can then be rectified. And certainly around cost, cost is always an interesting one. We often come across customers who think they need very detailed cost information, and if they don't have it, they don't bother entering any cost information at all. But I think it's a scale: we can start with very simple low, medium, and high costs for the systems and services we produce, and when we do have more detailed information, we can bring that in. So it's about not trying to get to 100% too quickly, but taking a phased approach: for the results I'm delivering, for the dashboard I'm building, we're 60% confident in the outputs here, and if we can get more detailed, higher-grade data quality, we can be a bit more confident in the decisions and the outputs we want to achieve. So there are some tips and tricks around that, but certainly don't be afraid of using some basic metrics at the beginning and evolving those over time. Great, very good. And a last word on the summit: a question came in, is this a virtual summit, Andrew?
Yes, it is a virtual summit, so you will find a bit more information on our website about how to attend, and it's a mixture, like I said, of some great guests from The Open Group and also some customers giving their feedback, both on their use of tools and in industry-specific conversations as well. Andrew, we'll leave it there. Thank you very much for your time and expertise today; we look forward to seeing you again soon, and good luck with the summit. No problem, thanks for your time, Steve. Thank you. So, just before we wrap up: we have another Toolkit Tuesday in two weeks' time, continuing with our regular cadence, so please join us on May 31st. We're going to stick with the data theme. Whether data is your new gold or your new oil, it's important stuff, and getting your arms around it, being able to analyze it, and making use of it is important. We have a couple of speakers in two weeks' time who will be talking about the data integration toolkit, and these guys have dedicated their careers to the challenges of data: Ron Schuldt, manager at Data Harmonization LLC, and Dr Chris Harding, who is chief executive of Lacibus Limited, long-time members of The Open Group and real experts in the data area. So please join us in two weeks' time, folks. Meanwhile, be well wherever you are, and thank you very much for taking time out today and joining us on Toolkit Tuesday. Goodbye for now.