Hello, and welcome to Dataversity Talks, a podcast where we discuss with industry leaders and experts how they have built their careers around data. I'm your host, Shannon Kemp, and today we're talking to David Kowalski, principal consultant at Orteca. More and more companies are considering investing in data literacy education, but still have questions about its value, purpose, and how to get the ball rolling. We're debuting the newest monthly webinar series from Dataversity, Elevating Enterprise Data Literacy, where we discuss the landscape of data literacy and answer your burning questions. Learn more about this new series and register for free at Dataversity.net. Hello and welcome. My name is Shannon Kemp, I'm the Chief Digital Officer at Dataversity, and this is My Career in Data, a Dataversity Talks podcast dedicated to learning from those who have careers in data management, to understanding how they got there, and to talking with people who help make those careers a little easier. To keep up to date on the latest in data management education, go to Dataversity.net/subscribe. Today we are joined by David Kowalski, principal consultant at Orteca, and normally this is where a podcast host would read a short bio of the guest, but in this podcast, your bio is what we're here to talk about. David, hello, and welcome. Thank you, Shannon. Always a pleasure to chat with you. Likewise, you're a longtime Dataversity friend and speaker at our conferences, which we always appreciate. Always great to see you at the events, and so excited that face-to-face events are up and running again, where we do get to see each other. So tell me, David, you're the principal consultant at Orteca. What is Orteca and what is it that you do? Well, Orteca is, we describe ourselves as a specialist consulting company. Our main headquarters are in London, but we also have offices in Nashville and Princeton in the US. 
And we're dedicated to helping companies enable and exploit their data. We're also widely recognized as leading experts in DCAM and CDMC. I'll tell you in a second what those are for anybody who doesn't know the acronyms. The other thing that distinguishes Orteca is that most of our senior staff have actually had hands-on kinds of jobs. We've sat in the executive seats. We've run large portions of large organizations. We're not career consultants who approach things theoretically. We've walked the walk and talked the talk and have the battle scars to prove it. I'm a principal consultant there, which basically means I'm, you know, the senior lead on most of the projects I'm involved with. So I will go in and talk to data management executives at our clients and look at ways that they can come up with better strategies, better policies on how to manage their data. I also act as a subject matter expert internally, especially in DCAM and CDMC. And since that's now the second time I've used those words, I will say that for anyone who's unfamiliar, those are both industry standard frameworks created by the Enterprise Data Management Council, and I've been a major contributor to them. DCAM is the Data Management Capability Assessment Model, which is a framework that is really about best practices in how to manage data. And the CDMC, or Cloud Data Management Capabilities framework, is a more recent extension that builds on the DCAM framework and looks at the kinds of things you need to do additionally to manage data effectively in the cloud. I like that. So, David, tell me, when you were very young, is this what you dreamed of being when you grew up? I'm going to be a data management consultant. Not even close. What did you want to be? 
I mean, in the interest of not embarrassing myself too much, I'll skip over the parts where at the age of 10 I really wanted to be James Bond, and at the age of 16 I wanted to be the next Keith Emerson. But the second one is a little closer to the truth. I actually spent pretty much all of my academic career and most of my early adulthood studying classical music composition. I always assumed I'd end up teaching music at some kind of prestigious institution and that I would basically make my life about teaching people about classical music. As it turns out, I got my doctorate from Princeton in 1985. And in 1985, really, nobody wanted to talk to you unless you had an MBA. It was certainly next to impossible to find teaching positions in the arts. Now, I had done a lot of work with computer analysis and computer modeling of music as part of my dissertation work at Princeton. And AT&T had just been broken up a few months before I got out of Princeton. They were desperate for anybody who even knew how to turn on a computer. And I ended up hooking up with another Princeton grad who was doing a lot of consulting work there. And so for nine years, almost all my time was spent rewriting old systems for AT&T. That's a pretty dramatic change. It really is. Yeah. But, you know, I have to tell you, when I first started to try to talk my way into computer positions, I realized that an academic career was really not in the cards short term. I liked the computer work, so I figured, let's see how I can get my way into that. But this was also pre-dot-com, and people don't remember that. Once dot-com hit in the early nineties, if anybody remembers the old IBM PCs, they had that big red switch to turn it on and off. And we used to say, if you knew what the big red switch did, you could get hired. 
But five years before that, a lot of the big companies, I mean, I'd go to places like IBM and they would tell me, you don't have a computer science degree, you can't possibly work here. So when I hooked up with this other guy from Princeton, we started talking about things. And I'll be honest, I literally did not know what a database was at that point. But then when we started to talk about it, I realized, well, what's a database? One way to describe a database is a bunch of disparate data points upon which structure has been imposed. And what's a piece of music? A bunch of disparate data points upon which structure has been imposed. So I liked playing around with computers, and once I made that kind of connection, it seemed even more relevant. It certainly paid a heck of a lot more than I was going to make at an entry-level teaching position. And so I figured I'd try it out for a few years. And then once I bought my first house, I couldn't afford to take a teaching position anymore. So the rest is history, as they say. You know, I don't think a lot of people understand how much math and science is involved in music. And I love how music brought you to technology. Absolutely. And it's not just the math and science. It's the general thinking about structure. One of the things that I like to think has always been a little bit of out-of-the-box thinking about data for me is this: one of the things you deal a lot with in music, especially classical music, is, if you know anything about fractals, patterns that repeat on larger and larger scales, or, going in the other direction, are embedded at finer and finer levels. That's very much a characteristic of a lot of classical music. But it's also very much the way a lot of data structures work. I mean, the kinds of things that you would do to define an individual data element are in many respects expandable to look at a model of how you would build a table. 
And then from there, how you would build a database. And from there, how you would build a system of databases. And then you start coming out of the technical or architectural part of it, and it really begins to have parallels in how you build the business side of the data management structure. So I have always said that at some points in my life I've been closer and more involved in music than at others, but even when I've not been actively doing music, a lot of musical thinking always informs the way I think. I love it. So you're at AT&T, and then where do you go from there? Well, from there, you know, I started out there under the aegis of a very small consulting firm. And because we were so small, I had to wear a lot of hats. So I was the programmer, I was the database administrator, I was the system architect. More significantly for my longer-term career, I was also the person who could actually talk to the client. I was equally comfortable talking to people on the business side and the technology side, so I could bridge that gap. And that stood me in really good stead. I bounced around from one part of AT&T to another for almost 10 years. Finally, I didn't actually like where my consulting company was going at that point, so I left there and moved into a completely different line of business, still doing system development work, but this time for Church and Dwight, who are better known as the parent company of Arm and Hammer, the baking soda people. Although these days they sell a lot more than baking soda. And, you know, I kind of hit a glass ceiling there and ended up going over to Merrill Lynch. Actually, no, I'm sorry. Before Merrill Lynch was Morgan Stanley. I was there through a consulting company, but it was a consulting company that specialized in working with Morgan Stanley. 
So I got very, very immersed, not only moving more and more to the business side, but specifically into financial services. And by the time I left there, I was really very much branded as a financial services guy, and also still primarily on the technical side. But from there, I moved over to Merrill Lynch, which was in the process of being gobbled up by Bank of America. And shortly after starting there, I completely abandoned the technical side of things and got involved with the enterprise-level group that was writing the policies, writing the strategies, writing the standards for our first data management program. Fascinating. That's interesting. Definitely a lot of data in financial institutions. Absolutely. And the thing is, one of the things I always get reminded of when I go to a Dataversity event and talk to people who are not in financial services is that I sometimes forget just how heavily regulated it is. So we have a different set of challenges there. In a financial environment, very often you have very stringent regulations that you have to comply with. You have federal regulators that you have to keep happy. The good side of it is, because there are huge fines involved in not complying with those things, you don't very often have to justify yourself to management. They know if we don't do such and such, we're going to be paying fines that in some cases literally amount to hundreds of thousands, if not millions, of dollars a day or a week, depending on the degree of infraction. Whereas, like I say, when I was at Church and Dwight, we didn't have any of those kinds of regulations, so it was not so much about data management; the internal people skills became much more important. 
And it became much more a case of talking to people about why this is a good idea just in terms of benefits to our company; you couldn't use that stick of it's going to cost us a bundle in fines if we don't do it. So then, leading into your consulting role, how are you helping your clients with data? What's the thing that you're working on the most, and that's the most critical for your clients? Well, our clients tend to come in two very distinct tiers. We have a lot of regional companies, usually regional banks, who are starting to grow very rapidly and all of a sudden realizing the way we have been doing things just doesn't scale. So in those cases, we very often go in and it's like, okay, let's think strategically about this. Do you have a data management strategy? No, what's that? We have gone in and actually written the drafts of a data management strategy for them. In other cases, we've looked at what they have as the beginnings of one and helped them shape it into something that will actually give them a solid strategic direction. Beyond that, we also build out policies with them. We build out operating models. We sometimes actually get down to helping them write their initial standards. And in a sense, even if they've got a chief data officer, we essentially act as a kind of chief data officer, or chief data officer advocate, for what kinds of things need to be put into place around the company. At the other extreme, we also work with a lot of really large global firms, and it's probably indiscreet for me to name the names, but let's just say we've got a lot of clients that you've heard of. And because we're based both in the US and London, a lot of our clients are based in the EU, and a lot of them are here. Also, as it turns out, Australia is really kind of getting religion about data management, so we're doing a lot of work in Australia right now. 
There's a lot of new regulations coming out there. And with those larger companies, very often what we will do is we come in, and I mentioned DCAM earlier as a framework. It's also got a scoring mechanism involved with it; it's a very effective assessment model of what you're set up to do. We are frequently brought into these large organizations to do DCAM assessments of how established various parts of their data management activities are. Very often we find gaps there, and no matter how long they've been doing data management, it turns out it's like, we never really thought about that aspect of data architecture, or we never really thought about centralizing such and such a kind of activity. I mean, there are a thousand different things that could show up in an assessment like that. And at that point, we then start working with them. These are usually companies that have been doing data management on a large scale for years, if not decades, so they have established strategies and policies, but we will then go in and start doing deep dives with them and look at, okay, what needs to change? What is it that had been working and is no longer working? We had a client last year we were doing a lot of work with; they were really starting to make a strong move into the cloud, and that was creating a lot of problems for them. And so really, it's just kind of looking at that: what are all the pieces you need to get in place in order to make everything function as a unified whole? We do use these EDM Council frameworks, DCAM and CDMC, as a starting point. But as I said, most of us at Orteca have also spent a whole lot of years actually running organizations. So we have a lot of industry experience, which is further augmented by the various clients we've had around the world. And we can go in and talk to people about, this isn't just a good idea. We can show you real-world use cases. Even if they're anonymized, we can really show you. 
It's like, these are the various companies that have had problems with this. We work with big banks that had this problem, and we work with small consumer firms that had this problem, and this is the way they dealt with it. And here's how that looks at those companies six months later, a year later, two years later. So I like to think that gives us a lot of street cred in terms of what we bring. We don't go in there just with an academic model of how things should look. It's really a whole lot of experience in terms of what we know works. That's fantastic. So with that experience, what is your definition of data? Oh, you gave me a heads-up that you were going to ask me about this one. And I think I could probably spend a good hour talking about a really brilliant and insightful definition of it. But most fundamentally, when I think of data, it's really just any kind of measurement or description of some kind of concept or some kind of action or some kind of thing in the real world. And I think it's distinct from information. In fact, a lot of what I do as a consultant is help people turn data into useful, actionable information. I mean, the data is just out there. And at a fundamental level, it's very important to make sure that you understand it, that there is agreement in terms of what a given term means across the organization, and, in terms of how you get to it, whether you have a single source of it or whether that same piece of data is replicated in numerous places around the company. So there's a lot that goes into just making sure that the data, whatever it's describing, whatever it's measuring, is consistent in terms of its appearance, in what the numbers and text are, but also consistent in terms of what people understand it to be. 
And then, of course, the next step is coming up with the ways in which you can ensure everybody does have that common understanding, not just at the individual data element level, but also in terms of what it means in any given context. And that's where you start to bridge the gap into the data becoming information, and the information being actionable, and your organization's leadership actually being able to look at reports and analytics that come out of that data and information and actually trust that it's something they can make decisions based upon. Makes sense. It makes a lot of sense. So do you see the importance of data management and the number of jobs working with data increasing or decreasing over the next 10 years, and why? Oh, I think it's definitely increasing. But I also think the nature of those jobs is going to be changing fairly dramatically, certainly on the technology side, and not necessarily in the ways that people immediately think. I mean, yes, there are constantly new programming languages coming out. There are new kinds of paradigms for programming. But ultimately, programming is programming. Yes, back in the days when I used to write in assembly language and Fortran, we had to specify a heck of a lot more detail than modern languages require. But ultimately, it boils down to being able to describe something in individual discrete steps and then describe that in a way that works for whatever kind of technical platform you're working upon. So those jobs, I think, will change in details, but not necessarily fundamentally. I do think there are some major kinds of shifts coming, though, on the technology side: artificial intelligence and machine learning. We've been talking about those terms, in some cases for decades, but it's now coming to a degree of fruition that most people can't even imagine. 
Folks who aren't in this field have no idea how often they are interacting with some kind of AI. Everything from, you know, the fairly simple act of Amazon telling you, oh, folks who bought this also bought this. I mean, that's been around for ages, since the earliest days of Amazon. But then you've also got more subtle things. Especially in the advertising world, Google is famous, or perhaps infamous, for the fact that they gather information about what you're doing all over the web and start to make predictions: what products are you going to buy? What kind of news stories do you want to read? They start to try to predict things based on your past behavior. There's an awful lot happening in that space. Now, very recently, literally within the last couple of months, at least in the broader literature, you're starting to see a huge focus on intelligent chatbots. And we probably all have some experience with the various kinds of voice systems that we interact with when we call the airlines to get help with our reservation, or when we try to get directed to the right person at a bank or at a utility company. Some of those have been around in very crude ways for ages, where they just do some poor voice recognition and then branch down a tree of possible paths you can take. But now we're seeing true artificial intelligence in there, where they really parse and try to understand what you're saying, and then where they go next isn't fixed in advance; it's all sort of determined dynamically. There's a lot of new kinds of work being done there. And just as some of those chatbots are going to be taking away a certain kind of job, building them is going to take a much greater degree of education and create a lot of new kinds of jobs. 
One of the things, I mean, we're starting to talk about it in Orteca right now: we're seeing these various kinds of AI agents out there that will write code for you, that will write articles for you, that will write your LinkedIn posts for you. And that creates jobs, I think, not just around how you train those bots so they're not just spitting out a bunch of ignorant boilerplate text, but also around ensuring that, ideally, those things have an understanding of data ethics, of data privacy. A lot of these things require a degree of processing power that almost demands they sit out in the cloud. But especially if you're in a highly secure industry like healthcare or finance, these are companies that aren't going to feel too good in general about data going out into the cloud, even with all the guarantees that the cloud gives you. And what do we know about some of these newer bots? I was on a meeting just earlier this morning; one of my guys is doing a study of this. He now has a list of 47 different intelligent chatbots that will do everything from actually chat with you to write articles for you. And he's doing a deep dive on them from the standpoint of how much we can trust these various players. So I think there's a whole series of jobs that are going to be growing in that domain, not just in how you program them, but in how you govern them, and data ethics, I think, becomes a big piece of that. Very often we govern things from the standpoint of what we can legally do. But what can we ethically do? There are some things that are legal, but what I like to say about ethics is, it's the sort of stuff where, if your customer knew you were doing that, how happy would they be about it? And I think there's a lot of spotlight being shined on that lately, but I also think there's a lot more work that needs to be done, frankly, especially in the U.S., in integrating data ethics oversight into AI and ML in particular. 
And I don't want this to become a seminar on ethics in AI and ML, but on the ML side in particular, as you start to build analytical models and you look at the data that you're using to train your models, that in itself brings up a whole realm of things that you need to address in terms of understanding bias and the ethical use of what comes out of those things. Now, that much said, those are things that are somewhat on the technical side, somewhat on the governance side, but I don't want people to lose track of the fact that there are still a lot of real fundamental kinds of things that we will always need people in there guiding us on. We can automate a lot of things about data governance, but ultimately we have to describe how we want things governed. We can define stuff as processes and then automate them, but if we've got garbage processes defined, then all automation is going to do is give us a lot more garbage at a lot faster rate. And so I think there's also going to continue to be a role, especially at senior management levels, for the people who truly understand the ethical as well as the legal ramifications of what we are doing in any of these fields, and who then ensure that any kind of automation, in addition to any kind of manual processes around those things, is in line with that. I mean, really, everybody should have a statement of data ethics, so they know: okay, we can get away with this, but do we want to get away with this? Is this who we are? Is this the face we want to present to the public? Is this what we want our customers to know us as? Ready to share your knowledge and network with your data peers? Join us in San Diego this June for the Data Governance and Information Quality Conference. Five days packed full of new perspectives, new colleagues, and new approaches are yours when you register at dgiq2023west.dataversity.net. 
Lock in early bird savings when you register by May 5th. We'll see you there. That's very, very appropriate and right. We've seen a lot of companies who have stood up machine learning, and stood it up on data that was not prepped and was not of good quality, and it went horribly awry. Exactly, exactly. Yeah, and another thing I want to say about that: it's almost why I think things like psychology become important. You know, as an undergraduate, I primarily studied music, but I also studied a lot of English literature and psychology, and physics for that matter, which are three other fields I continue to dabble in. And I think the psychology part of it starts to become very important as well. In fact, I would actually think anybody who's serious about data management in the future should have at least some kind of psychology background, because you never know how people are going to react, and you need to understand that. But another piece of it, coming back to the question of how you're training your models, is that there's always going to be bias. We talk about how we mitigate for bias in building out an analytics model, but the truth of the matter is, you can never get rid of it. All you can really do is train yourself to become hypersensitive to the assumptions that we make, because really, when you get right down to it, that's all bias is. Be aware of where we're making assumptions, and recognize them as assumptions rather than as just the way it is. If we had had machine learning 50, 60 years ago, it would have been typical to train certain kinds of job placement algorithms on the assumption that women are only going to do a certain kind of work, and only if they have the appropriate male oversight. Nowadays, and rightly so, we bristle at that. But that was just built into the fabric of how people thought. It wasn't even a bias. It's just the way things were. 
And I would posit that 50, 60 years on, it may not be that, but we have equivalent things that 50 years from now will be every bit as appalling when we look back on them. And the more we can actually understand the human element of what goes into data management, the more we can start to come up with solutions that are fair, equitable, ethical. Basically, we can make the machines' behavior better than we as people ever did. For people just getting into data and thinking about these roles that you've been talking about, that are going to become more available as time goes on, what advice would you give to those people? Is it continuing to make sure you understand the human elements? Are there classes? Are there books? What additional advice would you give? Well, I've got to tell you, I have a kind of unique, well, I like to think it's not that unique, perspective. When I read these articles, and I see them everywhere from local newspapers to the Wall Street Journal, that basically say, these are the degrees that will make you the most money, if you want a high-paying job, this is what you should study in school, it's like, no. That, to me, makes sense if you're going to a trade school. If you're going to college, you don't go to college to learn a craft. You go to college to learn how to think. And to me, one of the best ways to learn to think is to study something that you're passionate about. I like to think I'm pretty good at what I do in data management, and I'm doing it all while having an undergraduate degree and three distinct graduate degrees in music. It was something that I have always been passionate about, and I'm still passionate about. And so I would spend hours and hours and hours digging down, learning how things worked, taking stuff apart at the smallest level of detail, until it got to the point where I had that almost automatic recognition of things and could see how they replicated on larger scales. 
So, I mean, I don't know, is advanced basket weaving great training for data management? It might be, in somebody's case. But really, I think that is the key: go to school to learn how to think. And yes, there are certain kinds of things, especially if you don't aspire to advanced graduate work, there are some basic kinds of skills you're going to need. Even if you're not a programmer, there are basic programming concepts you have to understand. You've got to know some basic stuff about how data is structured. You have to understand concepts like data quality and data architecture, not necessarily well enough to do them, but even if you're going into the business side of data management, you really need to be able to talk to those people. So you do need exposure to that. But I think, fundamentally, study the things that will teach you to think. And, more to the point, build the muscle. I always thought that most schools should have a course called BS Detection 101. You want that ability. I'm appalled at what I sometimes see fundamentally intelligent people accepting at face value. You really need to start asking the questions at an automatic level, to the point where it's so fast and automatic you don't even realize you're asking them. Who's saying this? What's their vested interest? What are their sources? Why are they saying it now, in this particular form? All those kinds of contextual things. Don't accept it as fact until you've really done the research. Ask the questions, push back. If you're making assumptions, be aware you're making assumptions. We're back to that again. But as much as you can, don't make the assumptions. Do the research, find out what you can. Use things like AI and ML to do the heavy lifting on things, but don't believe everything that you see. I mean, it's become kind of a joke: well, it must be true, I saw it on the internet. 
But don't forget, not that many years ago, people used to think that about TV. It's got to be true, they said it on the news. Even more so if they said it on CBS or NBC or one of the big networks. Now, you could trust it better back in the days of Walter Cronkite, but now I'm really dating myself. The thing is, even then, we trusted Walter Cronkite back in the 60s, and I will say I'm just barely old enough to remember that. He was trusted because, even if he was saying something that seemed incredible in the moment, we knew from his track record that there was a very high probability that he was giving you an objective, fact-based report on something. And I think as a population overall, we have lost a lot of that ability. We are far too likely to latch on to the opinions that are the same as ours and to automatically reject the things we don't want to hear. And I know this is obviously a conversation that goes way beyond data management. But to bring it back to that topic, I think these are the kinds of things you need to get trained in. I think college is an excellent place to do that, because it's a place of learning not only through classroom activities but outside them as well. It's one of the reasons why I personally am very down on remote college attendance. Because I think of all the arguments and the heated discussions, drinking coffee at three in the morning while trying to debate such and such a point with other graduate students. Whether I agreed with them or I didn't, that kind of passionate level of inquiry was something that came out of it. And it's when you become highly immersed in that degree of inquiry, when you can be that critical, when you know the questions to ask. I think those are all things that ultimately train you to have not just an entry-level position in data management. 
But I think those are the kinds of things that ultimately have you leading. If it's in your career path, or your desired career path, to run a really big company, to manage data for some international firm, that's what gives you the kinds of skills you need.

Makes sense. And just to kind of paraphrase and summarize, and I'm hearing this from a lot of data practitioners: be curious, ask a lot of questions. And then you're adding to that and saying, basically, build context around the data that you're receiving. Data with context, so important. And that's really great advice. I know law schools have had courses in ethics. Of course, they've had to for so many generations now, right? But maybe we do need that additionally. We have a lot of that topic at our conferences now, talking about data ethics. But maybe it needs to be more prominent in a lot more education resources.

Yeah, I completely agree. I mean, just given the nature of the Dataversity events, specifically data ethics. Because yes, almost any large firm has mandatory behavioral ethics training. You don't accept gifts over a certain value. In some companies and industries, you don't accept gifts of any value from a client or a potential client. You do the things that avoid even the appearance of conflict of interest. There are all those behavioral ethical things, and that's very often the kind of ethics that gets taught in law schools. But data ethics really boils down to: what data am I collecting on you? What am I disclosing to you about how I'm using that data? How am I using it? And not only what kind of reports am I running on your data, what kind of things am I trying to infer from your data, but how am I acting on that? Am I selling that data to other companies, or am I only using it for my internal use? What are the internal uses?
And so, even narrowing it down specifically to data ethics, we very often deal with large firms that have a code of ethics in place, and they govern in compliance with that. But you say, okay, well, what about data ethics? And they go, okay, well, what's that? They haven't really thought about it. In a modern society, that is every bit as critical as behavioral ethics, so much so that for some of the people I talk to, it's actually hard for them to think of them as different. It's like, well, I wouldn't do that because that violates these behavioral ethics. Yeah, but let's look at the ways we actually put guardrails in place to prevent people who aren't thinking as carefully about it from accidentally stumbling into these kinds of things.

Absolutely. And the other thing that I heard you say is, be passionate about what you're doing. And that's a common theme as well. So don't just do it for the sake of doing it, be passionate about it. Love what you're doing.

Exactly, yeah. So important. Years ago, I was on a consulting job with somebody from another firm who told me, and I've forgotten exactly how she put it, but prior to becoming a consultant, she had worked in a company where she had a small team reporting to her. And she told me, I hated passionate people. All they did was argue. And I get it. But the thing is, as long as it's controlled and it's not arguing for the sake of arguing... I mentioned I was at Bank of America for several years. I had a boss there for most of the time I was there, and a lot of what we did was write standards and policies and internal white papers. And we would review each other's work and then debate every single thing in there that we disagreed with. But we would never stop the discussion until one of us legitimately came around to the other's way of thinking.
And the thing is, even though he was my boss, he never once said, shut up, this is the way it is. Either he was convinced of my point or I became convinced of his, and it happened about 50-50. And they were always really fruitful discussions that not only led us to produce higher quality work, but I think really gave us both broader understandings of the environment in which we worked.

And that's the other piece of your advice: keep learning, right? Keep thinking, learn how to think, and keep thinking.

Yeah. Which you can't do without debate, and learning from others. And I think that loving to learn is something that's critical, because we are in a time when so much of our industry is changing. I mean, it doesn't happen every day, but it's not uncommon every year or couple of years for something to happen that drastically alters it. What was Zoom in January of 2020? Sure, people knew about it. It was an alternative to Skype. When your friends moved to London, it was how you called them because it was cheaper than using the phone. Who would have thought that it would become so part and parcel to how we operate as a business? And then as data management people and data governance people, how long did it take us to start to realize, oh, wait a minute, this is essentially data sharing? How do we control what gets said on a Zoom meeting? How do we know if somebody is recording a transcript or recording the whole thing? That's when we started to see the various kinds of warnings pop up, not just in Zoom, but in the other video conferencing tools. So there are these game changers that come about at an ever-increasing rate.
And those of us in the field have to be willing to learn about the paradigm shifts. You don't have to learn about the new technologies to the level that you can program them, but you've got to be aware of what they offer and what the ramifications of their abuse are. And so that lifelong passion for learning, I think, stands anybody in very good stead for a career in data management.

That's great advice, David. I love it. Such important concepts, and I'm hearing that a lot from a lot of different people, which is really good. It's fun to see the patterns emerging from these interviews, because I do love data as well. So if somebody were to want to reach out to you or get involved with Orteca, how would they go about doing that?

Well, I don't expect anybody to write it right down, and I don't have it printed out to show you, but david.kowalski at Orteca.com will reach me. You can also find me on LinkedIn. There are a bunch of other David Kowalskis there, and I never remember the format of that personalized address they give you on LinkedIn, but it's whatever it is, LinkedIn.com slash in, I think, and the last part of it is David A. Kowalski.

And we'll get that posted to our page as well, so people can reach you if they want to solicit your services.

Yeah. And I'm always interested in talking to people about this, whether it's somebody who wants to work with Orteca, work for Orteca, or just chat about new concepts. A, because I love learning about it, and B, it is part of my job, but I'm just always interested to hear about emerging trends. And not just the technology trends; I'm also very interested in what kind of business drivers are emerging. We are seeing a lot of businesses that are now being forced to align, even if it's not from regulatory pressures.
I mean, there's a big conversation around data privacy. The U.S., I'm embarrassed to say, is woefully behind the curve internationally in terms of having a good national data protection act. And unfortunately, that means we're now getting into this situation where, one by one, the states are writing their own. And so far, it's, well, this one's a little bit stricter than that one, this one lets you get away with such and such, and these two are almost the same except for this. But eventually, if that continues, we will get to the point where this one says do X and that one says do Y, and you can't do them both because they're mutually contradictory. And at that point, it literally becomes impossible for certain companies to operate in different parts of the United States. And I know there have been efforts to come up with a national standard, and it's still not going anywhere really substantive yet, but those are the kinds of things I watch. In some respects, with people who are tied directly into the federal government, or state governments for that matter, but especially at the federal level, I'm always very interested to find out what kinds of things are happening there. Because ultimately that impacts the businesses, and the businesses are what's really driving how data gets managed in any particular firm, and even across any particular industry.

Absolutely. Very, very important. And I agree; I hope it unifies a little bit here soon. It would make my life a lot easier. Well, David, thank you so much for joining us today. I really appreciate the time. It's really been an enjoyable conversation. And for all of our listeners out there, if you'd like to keep up to date on the latest podcasts and the latest in data management education, you may go to dataversity.net forward slash subscribe. Until next time. Thank you for listening to Dataversity Talks, brought to you by Dataversity.
Subscribe to our newsletter for podcast updates and information about our free educational articles, blogs, and webinars at dataversity.net forward slash subscribe.