Hi, everybody, we're back. This is Dave Vellante with Jeff Kelly. We're with wikibon.org, and this is the Cube, SiliconANGLE's flagship production. We're here at the MIT Information Quality Conference. It's a symposium, really, for the chief data officer. We're on the MIT campus in the Tang building. It's a two-day event, and the discussion is really around information quality and data governance. It's a topic that's not spoken about a lot in the big data circles the Cube covers, but increasingly, like security, it's becoming a more top-of-mind subject. Tony O'Brien is here. He's a practitioner turned academic: an associate lecturer at Sheffield Business School and a long-time practitioner in the financial services world. Tony, welcome to the Cube.

Thank you very much.

So tell us a little bit about your background and how you ended up in the world of the university.

Well, having spent 40-plus years in finance, over the years you have problems, and eventually I came to the conclusion that a lot of these problems manifest themselves from poor data. It's always bad news, and it always seems to hit the P&L account. Then, about 10 years ago, following a job role change, I was asked to do something about it in finance. And the question was, well, what do you do? Fortunately, at the time I was doing a Doctor of Business Administration degree at a university in the UK, and we ran the data quality improvement program alongside the doctoral degree. One of the things we tried to do was to combine theory and practice, improving managerial and professional practice whilst at the same time informing the academic community. We actually generated quite a bit of success over a period of time, and one of the things we managed to develop was a set of concepts that we felt could apply more widely than just my own organization.
So as a business finance practitioner, you found that poor data was often the root of the problem.

In many instances, yes: wrong decisions. But management tended to blame the application or the systems rather than the data itself.

Ah, okay, so they were pointing the finger at the wrong place. They were blaming the innocent, so to speak.

And certainly the accountants were also blaming the people providing the information.

Well, you're an accountant, so you know how it goes. When things are good, you don't get much of a pat on the back; when things are bad, you hear about it. You run for the hills.

Yeah, that's right.

So talk a little bit about the practical case study. How did you resolve such issues, such problems? Talk about the technology, the people, the organization, the process. Take us through that.

Well, we tackled it really from the softer issues, the non-technical side. The organization implemented an ERP system in the latter part of the 90s. The implementation went well, but we were still having the data problems. So one of the things we realized we needed to do early on was this: if you want to resolve a problem, then you have to start measuring it, because you can't know if you're improving unless you have a benchmark, some form of, in inverted commas, "score", to know whether you're improving or whether you're actually declining.

You had a baseline.

Exactly. And we also decided, initially, to try to cascade it through the organization. The organization comprised 12 different business streams across, at that time, 85 factories in the UK and six business offices, later reduced to 54 factories. We originally cascaded the data quality improvement initiative through the finance side, because that was where we thought the problem lay. And by measuring, we were able to show a very good improvement over a period of time.
However, we then had, for certain reasons, a decline. And really the big breakthrough, what switched on the lights, was that over a period of five months I visited 48 of the 54 factories. We took a bottom-up approach: we organized meetings and focus groups with the people in each of the factories, or in collections of maybe three or four factories, and spoke to the people who processed the orders, who purchased the goods in, who actually manufactured the goods, who dispatched them, who chased the invoices, who paid the suppliers: in fact, the people who processed the actual operations of the organization. And what we found is that these were the people who really wanted to make the improvements.

They'd been screaming for years.

Exactly. And in collaboration with the degree, we used something called action research, rather like Plan-Do-Check-Act: we had a debate, we circulated the notes of the debate, and whilst I was a sort of semi-invigilator, we tried to work it so that one was not directly involved, not leading them, but getting them to come up with the ideas. And I think one of the great advantages of having worked for the organization for 21 years is that you were known, so you had a degree of trust; it wasn't as though a researcher was parachuted in. A lot of the groundswell and support for the data quality improvement initiative came from the very people who, as you say, may well have been screaming for some help. And what we then found is that champions developed. I was maybe, I won't say the project leader, but I tried to initiate it; however, champions emerged from within the businesses who then took it forward. Because, as the first speaker today said, data quality is not a project.
The criterion was that any data quality initiative we had has to be sustainable; in other words, it had to be sustained when I walked away. And what happened is that champions emerged. One factory had a production controller with about six or seven staff, and they were setting their own targets for their own staff, without me asking them, because they saw the improvements taking place. We measured it on a daily basis, so they could measure their improvement by factory and by business, and they then set targets for themselves, because they realized that improving the data made their life better.

So you did these forensics, essentially. You went into 48 of the 54 factories and you were setting these baselines. How did you measure the quality of data?

Well, we had an ERP system. Basically, we start off with a sales order coming through, and at the end of that transaction the customer pays. We may also need raw materials or services, so we've got to interact with suppliers and pay them. So we built performance indicators around those flows. We measured the incidence of credit notes to invoices, the incidence of supplier payment problems, overdue production orders, sales orders and purchase orders, and the incidence of invoices not being generated. We had 11 different indicators in all, each of which reflected an aspect of data quality, because within the ERP system, if you've got, say, a purchase order that hasn't been receipted, or a production order that hasn't been completed, demand is still shown out there. It was the interaction of all these different transactions. But also, the factory manager was able to use this information as a daily expediter, because behind the KPIs there was a list of all those invoices or orders that hadn't been processed.
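The incidence-style indicators Tony describes (credit notes versus invoices, overdue purchase orders, invoices not generated) can be sketched roughly like this. This is only a minimal illustration: the record layouts, field names and the three indicators shown are assumptions for the sake of example, not the organization's actual ERP schema or full set of 11 measures.

```python
from datetime import date

# Illustrative ERP-style records; the field names are assumptions.
invoices = [{"id": 1, "generated": True},
            {"id": 2, "generated": True},
            {"id": 3, "generated": False}]   # shipped, but no invoice raised
credit_notes = [{"id": 1, "invoice_id": 1}]
purchase_orders = [{"id": 10, "due": date(2013, 1, 15), "receipted": False},
                   {"id": 11, "due": date(2013, 6, 1), "receipted": True}]

def incidence_kpis(today):
    """Count data-quality incidences, in the spirit of the indicators
    described in the interview."""
    return {
        # ratio of credit notes raised to invoices issued
        "credit_notes_per_invoice": len(credit_notes) / max(len(invoices), 1),
        # purchase orders past due but never receipted: demand still shows
        "overdue_purchase_orders": sum(
            1 for po in purchase_orders
            if not po["receipted"] and po["due"] < today),
        # completed transactions with no invoice generated
        "invoices_not_generated": sum(
            1 for inv in invoices if not inv["generated"]),
    }

kpis = incidence_kpis(date(2013, 7, 1))
```

Behind each count, as Tony notes, sits the list of the actual offending records, so the same data doubles as a daily expediting worklist.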
So instead of having to go into the system to actually list them, the daily KPIs listed them all. And they were easily accessible in the BI information area, adjacent to all the other factory and business information, so it was there every single morning. These KPIs were as much there to assist day-to-day operations as to measure; they weren't a stick to beat people with.

And those KPIs were developed by some kind of consensus?

They were. And what we basically did was weight them. So if you had a purchase order that was overdue a month, it had one score; if it was overdue over 30 months, that score was multiplied 13 times. So we actually weighted it, and therefore there was a bit of a measure. And what we also found is that there was a consensus across the businesses. Okay, not everybody bought into it. But the other point is that from the start we had the executive support of the finance director. I didn't need to use him all the time, but people knew that he fully supported it.

Yeah, talk a little bit more about that. You mentioned going to the people on the ground, where a lot of these issues are first visible, and then finding some of those people develop into project leaders who really talk up the importance of the initiative, to keep it going as an initiative, not a project: a continuing evolution. So how do you develop some of those people, those roles? How do you identify who those people are going to be? Does it just happen organically? And once you walk away, how do they continue the momentum?

It emerged over the period of, say, four and a half years. On the daily measurement, we had an improvement of about 28% in the first six months, because it's easy. Not easy, but with any improvement program you can get the quick wins, can't you?
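The age-banded weighting Tony outlines, where an order one month overdue scores 1 and one overdue more than 30 months scores 13 times as much, might be sketched as follows. Only those two endpoints come from the interview; the intermediate bands, the month arithmetic and the record fields are illustrative assumptions.

```python
from datetime import date

def months_overdue(due, today):
    """Whole months between a due date and today (0 if not yet due)."""
    return max(0, (today.year - due.year) * 12 + today.month - due.month)

def overdue_weight(months):
    """Weight an overdue item by age: 1 at one month overdue, 13 for
    anything over 30 months.  The intermediate bands are assumptions."""
    if months <= 0:
        return 0
    if months > 30:
        return 13
    return 1 + (months - 1) // 3   # assumed: one extra point per ~3 months

def weighted_backlog_score(orders, today):
    """Sum of age weights over unprocessed orders, so long-standing
    problems dominate the daily KPI instead of counting equally."""
    return sum(overdue_weight(months_overdue(o["due"], today))
               for o in orders if not o["processed"])

orders = [{"due": date(2013, 6, 1), "processed": False},   # 1 month overdue
          {"due": date(2010, 6, 1), "processed": False},   # 37 months overdue
          {"due": date(2013, 6, 1), "processed": True}]    # done, ignored
score = weighted_backlog_score(orders, date(2013, 7, 1))   # 1 + 13 = 14
```

The point of the weighting is exactly what the interview suggests: a clean score trend can be compared across factories and businesses, and clearing old stuck transactions moves the score far more than clearing fresh ones.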
But it was also about publicizing those quick wins to keep the momentum going. We then had a dip because of a restructuring program, and then the champions emerged, because I think people were being switched on. What transpired was that for the last 12 months before I retired, I took a back seat, and my only involvement was to publish the statistics on a monthly basis. By the time I stood back, we'd had an improvement of about 50% over the three and a half years. In those last 12 months before I retired, it improved by a further 6%. So if you've had a 50% improvement over three years and then a further 6% on top of that, not only have you sustained the level, you've actually pushed it forward.

That's the cumulative effect of all that benefit.

Exactly, because the quick wins are easy, aren't they? And it's very easy to fall back into old habits, but it looked as though the guys had taken the technique forward. And after I'd left the organization, the news came back that they'd maintained that level.

So you're now taking this knowledge, this story, and many other stories, I'm sure, and you're bringing them to the university. Yes. How's that going? What's your role there?

Well, what happened is that, having retired, with the doctorate I got a phone call: would you like to work two or three days a week at Sheffield Hallam University, at Sheffield Business School? It's sort of a post-retirement role, but it's challenging, really challenging. You're teaching partly at undergraduate level in the UK and partly on an MBA course, and you're able to relate theory and practice, tell students what happens in real-life business circumstances, and pass on the lessons with stories.
From the things you've learned over that period of time. And what we've also done is collaborate with a couple of organizations to run part of their management development programs, one of which is in data management. So we're now speaking to people in their 30s, 40s and 50s, introducing the concepts of data management to them, and we're actually spelling out some of the horror stories you can find in various research, and lights have been switched on there, too.

Excellent. All right, Tony, well, listen, thanks very much for coming on and sharing your insights and your practical experience. Good luck with the lecture circuit and your retirement. It was really a pleasure meeting you.

Thank you very much. Thank you, guys.

All right, keep it right there. Jeff Kelly and I will be back. This is the Cube, live here at MIT at the Information Quality Symposium. Keep it right there. We'll be right back.