From theCUBE studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a CUBE conversation. Hi everybody, welcome back to theCUBE. This is Dave Vellante and you're watching a special presentation, DataOps In Action, made possible by IBM. You know, what's happening is the innovation engine in the IT economy has really shifted. It used to be Moore's Law. Today it's applying machine intelligence and AI to data, really scaling that and operationalizing that new knowledge. The challenge is that it's not so easy to operationalize AI and infuse it into the data pipeline. But what we're doing in this program is bringing in practitioners who have actually had a great deal of success in doing just that. And I'm really excited to have Itumeleng Monale here. She's the executive head of data management for personal and business banking at Standard Bank of South Africa. Itumeleng, thanks so much for coming on theCUBE. Thank you for having me, Dave. You're very welcome. And first of all, how are you holding up with this COVID situation? How are things in Johannesburg? Things in Johannesburg are fine. We've been on lockdown now, I think it's day 33 if I'm not mistaken, I've lost count. But we're really grateful for the swift action of government. I mean, we have less than 4,000 cases in the country and the infection rate is really slow. So I think we've really been able to flatten the curve, and we're grateful for being able to be protected in this way. So we're all working from home, all learning the new normal, and we're all in this together. That's great to hear. Why don't you tell us a little bit about your role? You're a data person, and we're really going to get into it. But share with us, you know, how you spend your time.
Okay, well, I head up a data operations function and a data management function, which really is the foundational part of the data value chain that then allows other parts of the organization to monetize data and leverage it as the use cases apply. We monetize it ourselves as well, but really we're an enterprise-wide organization that ensures that data quality is managed, data is governed, that we have effective practices applied to the entire lineage of the data, that ownership and curation are in place, and that everything else, from a regulatory as well as an opportunity perspective, is then able to be leveraged. So historically, you know, data has been viewed as sort of this expense. It's big, it's growing, it needs to be managed, deleted after a certain amount of time, and then, you know, 10 years ago, the big data movement hit and data became an asset. You had a lot of shadow IT, people going off and doing things that maybe didn't comply with the corporate edicts, which probably drove your part of the organization crazy. But talk about that. What has changed, do you think, in the last, you know, five years or so, just in terms of how people approach data? Well, I mean, you know, the story I tell my colleagues, who are all bankers obviously, is that the banker in 1989 mainly just had to know debits and credits and be able to look someone in the eye and know whether or not they'd be a credit risk, you know, if we lend you money, can you pay it back? The banker of the late 90s then had to contend with the emergence of technologies that made their lives easier and allowed for automation and processes to run much more smoothly. In the early 2000s, I would say that digitization was a big focus, and in fact, my previous role was head of digital banking. And at the time we thought digital was the panacea, the be-all and end-all, the thing that's going to make organizations competitive.
Lo and behold, we realized that once you've gotten all your digital platforms ready, they're just the plate or the pipe, and there's nothing flowing through the pipe and no food on the plate if data is not the main focus. So really, it's always been an asset. I think organizations just never consciously knew that data was that. Okay, so it sounds like once you made that sort of initial digital transformation, you really had to work it. And what we're hearing from a lot of practitioners like yourself is that the challenges related to that involve different parts of the organization, different skill sets, and sort of getting everybody to work together on the same page, et cetera. But maybe you could take us back to sort of when you started on this initiative around DataOps. What was that like? What were some of the challenges that you faced and how did you get through them? I think first and foremost, Dave, organizations used to believe that data was IT's problem, and that's probably why you then saw the emergence of things like shadow IT. But when you really acknowledge that data is an asset, just like money is an asset, then you have to take accountability for it just the same way as you would any other asset in the organization. And you will not abdicate its management to a separate function that's not core to the business. And oftentimes IT is seen as a support or an enabling function, but not quite the main show in most organizations, right? So what we then did is first emphasize that data is a business capability. It's a business function; it resides in business next to product management, next to marketing, next to everything else that the business sees as core. Data management also has to be core to every role and every function, to different degrees and varying intensities. And when you take accountability as an owner of a business unit, you also take accountability for the data in the systems that support the business unit.
For us, that was the first big shift. And convincing my colleagues that data was their problem, and not something they could just leave to us to worry about, was also a journey. But that was kind of the first step in terms of getting the data operations journey going. You had to first acknowledge... Oh, please, carry on. No, you just had to first acknowledge that it's something you must take accountability for as a banker, not just leave to a different part of the organization. That's a real cultural mindset shift. You know, in the game of rock, paper, scissors, culture kind of beats everything, doesn't it? It's almost like a trump card. And so the business has to embrace that, but what did you do to support that? There has to be trust in the data, there has to be timeliness. So maybe you could take us through how you achieved those objectives, and maybe some other objectives that the business demands. So the one thing I didn't mention, Dave, is that obviously they didn't embrace it in the beginning. It wasn't an "oh yeah, that makes sense, do that" type of conversation. What you had was a few very strategic people with the right mindset that I could partner with, who understood the case for data management. And while we had that as an in, we developed a framework for a fully mature data operations capability in the organization and what that would look like in a target-state scenario. And then what you do is you wait for a good crisis. So we had a little bit of a challenge in that our local regulator found us a little bit wanting in terms of our data quality. And from that perspective, it brought the case for data quality management to the fore. So now there's a burning platform, and there's an appetite for people to partner with you and say, okay, we need this to comply, help us out. And when they start seeing DataOps in action, they then buy into the concept.
So sometimes you need to just wait for a good crisis and leverage it, and only do that which the organization will appreciate at that time. You don't have to go big bang. Data quality management was the use case at the time, five years ago, so we focused all our energy on that. And after that, it gave us leeway and license to really bring to maturity all the other capabilities that the business might not understand as well. So when that crisis hit, thinking about people, process and technology, you probably had to turn some knobs in each of those areas. Can you talk about that? So from a technology perspective, that's when we partnered with IBM to implement Information Analyzer for us, in terms of making sure that we could profile the data effectively. What was important for us was to make strides in terms of showing the organization progress, but also being able to give them access to self-service tools that would give them insight into their data. From a technology perspective, that was, I think, the genesis of us implementing the IBM suite in earnest from a data management perspective. People-wise, we really then also began a data stewardship journey in which we implemented business unit stewards of data. I don't like using the word stewards because in my organization it's taken lightly, almost like a part-time occupation. So we converted them, and we call them data managers. And the analogy I would give is that every department with a P&L, any department worth its salt, has an FD, or financial director. And if money is important to you, you have somebody helping you take accountability and execute on your responsibilities in managing that money. So if data is equally important as an asset, you will have a leader, a manager, helping you execute on your data ownership accountability. And that was the people journey.
So firstly, I had kind of soldiers planted in each department, the data managers, who would then continue building the culture, maturing the data practices as applicable to each business unit's use cases. What was important is that every data manager in every business unit focused their energy on making that business unit happy by ensuring that their data was at the right compliance level, at the right quality, with the right best practices from a process and management perspective, and was governed throughout. And then in terms of process, really it's about spreading through the entire ecosystem. Data management as a practice can be quite lonely, in the sense that unless the core business of an organization is managing data, people are worried about doing what they do to make money, and in most business units the data manager will be the only unicorn relative to everybody else who does what they do. And so for us, it was important to have a community of practice, a process where all the data managers across the business, as well as the technology parts and the specialists who are data management professionals, come together and make sure that we work together on specific use cases. So I wonder if I can ask you, the industry sort of likes to market this notion of DevOps applied to data, DataOps. Have you applied that type of mindset and approach, agile, continuous improvement? I'm trying to understand how much is marketing and how much is actually applicable in the real world. Can you share? Well, you know, when I was reflecting on this before this interview, I realized that our very first use case of DataOps was probably when we implemented Information Analyzer in our business unit, simply because it was the first time that IT and business, as well as data professionals, came together to spec the use case. And then we would literally, in an agile fashion with a multidisciplinary team, come together to make sure that we got the outcomes that we required.
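The profiling cycle described here, where business owners define what quality means and the data team encodes those definitions as executable checks whose pass rates are dashboarded back, can be sketched roughly as follows. This is a minimal illustration in plain Python, not IBM Information Analyzer's actual API; the field names and rules are assumptions for illustration only.

```python
# Hypothetical sketch of a data-quality profiling cycle: business owners
# agree on quality rules in plain terms, the data team encodes them as
# executable predicates, and pass rates are reported per rule. The rules
# and record fields below are illustrative assumptions, not Standard
# Bank's or IBM Information Analyzer's actual implementation.
import re

# Business-defined quality rules for client records: (name, predicate).
RULES = [
    ("id_number_present", lambda r: bool(r.get("id_number"))),
    ("id_number_13_digits",
     lambda r: bool(re.fullmatch(r"\d{13}", r.get("id_number", "")))),
    ("email_looks_valid", lambda r: "@" in r.get("email", "")),
]

def profile(records):
    """Run every rule over every record; return the pass rate per rule."""
    results = {}
    for name, predicate in RULES:
        passed = sum(1 for r in records if predicate(r))
        results[name] = passed / len(records) if records else 0.0
    return results

clients = [
    {"id_number": "8001015009087", "email": "thabo@example.com"},
    {"id_number": "800101", "email": "lerato@example.com"},
    {"id_number": "", "email": "no-at-sign"},
]

scores = profile(clients)
for rule, rate in scores.items():
    print(f"{rule}: {rate:.0%} pass")
```

In a real deployment the rules would live in a catalog and run against full client tables on a schedule, but the shape of the loop, business rule in, pass rate out, is the same.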
I mean, to get a data quality management paradigm where we moved from 6% quality on our client data at one point to where we're now sitting at 99%, and that last 1% literally is just a timing issue. To get from 6 to 99, you have to make sure that the entire value chain is engaged. So our business partners were the fundamental determinant of the business rules to apply: what does quality mean, what are the criteria for quality? And then what we do is translate that into what we put in the catalog, and ensure that the profiling rules that we run are against those business rules that were defined up front. So you'd have an upfront determination of the outcome with business, and then the team would go into an agile cycle of maybe two-week sprints where we develop certain things, have stand-ups, come together, and then the output would be dashboarded in a prototype kind of fashion, where business then gets to double-check the outcome. So that was the first iteration, I would say. We've become much more mature at it, and we've got many more use cases now, and there's actually one that's quite exciting that we recently achieved over the end of 2019 into the beginning of this year. So what we did was... Dave, I'm worried about the sunlight coming through the window. It's okay, you look great, it's natural light. Is it fine? I just thought it created glare. The sunlight in South Africa... we've been on theCUBE set sometimes where it's so bright we have to put on sunglasses, but... Okay, all right, no problem, awesome. So the most recent one, which was in late 2019 coming into early this year: we had long since dealt with the compliance and regulatory burning-platform issues, and now we're in a place of, I think, opportunity and luxury, where we can find use cases that are pertinent to business execution and business productivity. The one that comes to mind is, we're 158 years old as an organization, right? So this bank was born before technology.
It was also born in the days of no integration, because every branch was a standalone entity. You'd have these big ledgers that transactions were documented in, and I think once every six months or so, these ledgers would be taken by horse-drawn carriage to a central place to be reconciled between branches, et cetera. But the point is, if that is your legacy, the initial ERP implementations would have been focused on process efficiency based on old ways of accounting for transactions and allocating information. So it was not optimized for the 21st century, and our architecture has had a huge legacy burden on it. So getting to a place where you can be agile with data is something that we're constantly working towards. We have hundreds of branches across the country, all of them obviously selling to clients and servicing clients as usual, and any person leading sales teams or execution teams was not able, in a short space of time, to see the impact of their tactics from a data perspective, from a reporting perspective. We were in a place where, in some cases, based on how our ledgers roll up and how the reconciliation between various systems and accounts works, it would take you six weeks to verify whether your tactics were effective or not, because to actually see the revenue hitting our general ledger and our balance sheet might take that long. That is an ineffective way to operate in such a competitive environment. So what you had were frontline sales agents literally manually documenting the sales that they had made, but not being able to verify whether or not that was bringing in revenue until six weeks later. So what we did then is we sat down and defined all the requirements from a reporting perspective, and the objective was to move from six weeks' latency to 24 hours. And even 24 hours is not perfect.
Our ideal would be that by close of day you're able to see what you've done for that day, but that's the next epic that we'll go through. However, we literally had the frontline teams defining what they'd want to see in a dashboard, the business teams defining what the business rules behind the quality and the definitions would be, and then we had an entire analytics team and the data management team working on sourcing the data, optimizing it, curating it and making sure that the latency was cut down. That's, I think, our latest use case for DataOps. And now we're in a place where people can look at a dashboard on a self-service basis; they can log on at any time and see the sales that they've made, which is very important right now in the time of COVID-19 from a productivity and execution and competitiveness perspective. So those are two great use cases, Itumeleng. The first one, going from 6% data quality to 99%: at 6%, all you do is spend time arguing about the data, and that stalls productivity. At 99% you're there, and you said it's basically just a timing issue with the latency. And then the second one: instead of paving the cow path with an outdated, ledger-based data process, you've now compressed six weeks down to 24 hours, and you want to get to end of day. So you've built agility into your data pipeline. I'm going to ask you then, when GDPR hit, were you able to very quickly leverage this capability and comply, and then maybe other compliance edicts as well? Well, actually, what we just did now was post-GDPR for us. We got GDPR right about three years ago, but literally all we got right was reporting for risk and compliance purposes. The use cases that we have now are really around business opportunity, less so the risk. So we prioritized compliance reports a long time ago.
We're able to do real-time reporting from a single-transaction perspective, suspicious transactions, et cetera, to our reserve bank and our regulator. So from that perspective, that was what was prioritized in the beginning, which was the initial crisis. So what you found was an entire engine geared towards making sure that data quality was correct for reporting and regulatory purposes. But really, that is not the be-all and end-all of it. And if that's all we did, I believe we really could not have said we succeeded, because data monetization is actually the panacea. The leveraging of data for business opportunity is actually what tells you whether you've got the right culture or not. If you're just doing it to comply, it means the hearts and minds of the rest of the business still aren't in the data game. I love this story because to me it's nirvana. For so many years, we've been pouring money into mitigating risk, and you have no choice but to do it. You know, the general counsel signs off on it, the CFO begrudgingly signs off on it, but it's got to be done. For years, decades, we've been waiting to use these risk initiatives to actually drive business value. It kind of happened with the enterprise data warehouse, but it was too slow and too complicated, and it certainly didn't happen with email archiving; that was just sort of a checkbox. It sounds like we're at that point today. And I want to ask you, Itumeleng, we were talking earlier about a crisis sort of precipitating this cultural shift, and you took advantage of that. Now Mother Nature has dealt us a crisis that we've never seen before. How do you see your data infrastructure, your data pipeline, your DataOps? What kind of opportunities do you see in front of you today as a result of COVID-19?
Well, I mean, because of the quality of data that we had, we were able to very quickly respond to COVID-19 in our context, where the government put us on lockdown relatively early in the cycle of infection. And what that meant is it brought a bit of a shock to the economy, because small businesses all of a sudden didn't have a source of revenue for potentially three to six weeks. And based on the data quality work that we did before, it was actually relatively easy to be agile enough to do the things that we did. So within the first weekend of lockdown in South Africa, we were the first bank to proactively and automatically offer small businesses and students with loans on our books an instant three-month payment holiday, assuming they were in good standing. And we did that up front, so it was actually an opt-out process, rather than one where you had to call in and arrange for it to happen. And I don't believe we would have been able to do that if our data quality was not where it's supposed to be. We have since launched many more initiatives to try and keep the economy going, to try and keep our clients in a state of liquidity. And data quality, at a point like that, as this shows, is critical to knowing who you're talking to, who needs what, and which solutions would best be suited to various segments. I think the second component is that working from home brings an entirely different normal, right? If we had not been able to provide productivity dashboards and sales dashboards to management and all the users that require them, we would not be able to validate or say what our productivity levels are now that people are working from home. I mean, we still have essential-services workers who physically go into work, but a lot of our relationship bankers are operating from home. And the baseline and foundation that we set for productivity tracking, for various metrics being able to be reported on in a short space of time, has been really beneficial.
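The opt-out relief Itumeleng describes depends on being able to trust the data well enough to select qualifying accounts automatically. A minimal sketch of that selection logic might look like the following; the segment names, record fields, and the "good standing" criterion are all assumptions for illustration, not Standard Bank's actual systems or rules.

```python
# Hypothetical sketch of the opt-out payment holiday described above:
# qualifying accounts are granted relief proactively, and clients must
# opt out to decline. Segments, fields, and the "good standing" test
# are illustrative assumptions, not Standard Bank's actual rules.
from dataclasses import dataclass

@dataclass
class LoanAccount:
    client_id: str
    segment: str              # e.g. "small_business", "student"
    days_in_arrears: int
    holiday_months: int = 0
    opted_out: bool = False

def in_good_standing(acct: LoanAccount) -> bool:
    # Assumed criterion: no payments in arrears.
    return acct.days_in_arrears == 0

def apply_payment_holiday(accounts, months=3):
    """Proactively grant the holiday to every qualifying account."""
    granted = []
    for acct in accounts:
        eligible = (acct.segment in {"small_business", "student"}
                    and in_good_standing(acct)
                    and not acct.opted_out)
        if eligible:
            acct.holiday_months = months
            granted.append(acct.client_id)
    return granted

book = [
    LoanAccount("C1", "small_business", 0),
    LoanAccount("C2", "student", 0, opted_out=True),  # chose to keep paying
    LoanAccount("C3", "small_business", 45),          # in arrears
    LoanAccount("C4", "home_loan", 0),                # segment out of scope
]
granted_ids = apply_payment_holiday(book)
print(granted_ids)
```

The point of the example is the precondition, not the loop: a query like this only returns the right clients if identity, segment, and arrears data are accurate, which is why the earlier data quality work made the weekend rollout possible.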
The next opportunity for us: we've been really good at doing this for the normal operational and frontline types of workers, but knowledge workers have historically not necessarily been big on productivity reporting. They kind of deliver an output, and the output might be six weeks down the line, but in a place where teams are not co-located and work needs to flow in an agile fashion, we need to start using the same foundation and data pipeline that we've laid down for the reporting of knowledge-worker and agile-team types of metrics. So in terms of developing new functionality and solutions, there's a flow in a multidisciplinary team, and those solutions get architected in a way where data assists in the flow of information so solutions can be optimally developed. Well, it sounds like you're able to map the key metrics that business lines care about into these dashboards. You're using a sort of data-mapping approach, if you will, which makes it much more relevant for the business because, as you said before, they own the data. That's got to be a huge business benefit. Again, we talked about culture, we talked about speed, but the business impact of being able to do that has to be pretty substantial. It really, really is, and the use cases really are endless, because every department finds its own opportunity to utilize data. Also, I think the accountability factor has significantly increased, because as the owner of a specific domain of data, you know that you're not only accountable to yourself and your own operation; people downstream of you depend on you, as a product and an outcome, to ensure that the quality of the data you produce is high. So curation of data is a very important thing, and business is really starting to understand that.
So the card department knows that they are the owners of card data, and the vehicle asset department knows that they are the owners of vehicle data linked to a client profile, and all of that creates an ecosystem around the client. I mean, when you come to a bank you don't want to be known as a number, and you don't want to be known just for one product; you want to be known across everything that you do with that organization. But most banks are not structured that way. They are still product houses, with product systems on which your data resides, and if those don't act in concert, then we come across as extremely schizophrenic, as if we don't know our clients, and so that's very, very important. Itumeleng, I could go on for an hour talking about this topic, but unfortunately we're out of time. Thanks so much for sharing your deep knowledge and your story. It's really an inspiring one, and congratulations on all your success. I guess I'll leave it with, you know, what's next? You gave us a glimpse of some of the things you wanted to do, compressing some of the elapsed times and the time cycles, but where do you see this going in the, you know, midterm and longer term? Well, currently, I mean, obviously AI is a big opportunity for all organizations, and you don't get automation of anything right if the foundations are not in place. So we believe that this is a great foundation for anything AI to be applied to, in terms of the use cases that we can find. The second one is really providing an API economy where certain data products can be shared with third parties. I think that's probably where we want to take things as well. We are already utilizing external third-party data sources in our data quality management suite to ensure validity of client identity and residence and things of that nature.
But going forward, because fintechs and banks and other organizations are probably going to partner to be more competitive, we need to be able to provide data products that can be leveraged by external parties, and vice versa for us. Itumeleng, thanks again, it was great having you. Thank you very much, Dave, I appreciate the opportunity. Okay, and thank you for watching, everybody. We are digging into DataOps. We've got practitioners, we've got influencers, we've got experts, and we're going into the CrowdChat, it's crowdchat.net/dataops. So keep it right there. We're back with more coverage. This is Dave Vellante for theCUBE.