Welcome back to Think Tech. I'm Jay Fidel. It's a Monday morning, 10 o'clock, and today on The Middle Way we're going to talk about the future world. We're going to talk about the personalization of automation, and we have Chang Wang and Martin Handman. This is a continuation of the discussion we started before. We're going to talk about how automation will affect society at large and the political structure of society at large. We're going to talk about the use of personal information and preferences and how important they are in determining complex decision-making processes, the role of machine learning, inferences based on predefined persona types, and privacy concerns. We'll talk about that too. There's a lot to talk about, and there's a lot to be concerned about. And the question ultimately is: should the pursuit of automation alert social policymakers, and should the law potentially intervene through regulation? On what legal grounds? What interactions or events should be monitored? These are really difficult questions. So Chang, can you talk about how you see this topic unfolding? And can you reintroduce Martin Handman, please? Absolutely. Martin is an expert in automation and machine learning. He was a senior manager with Thomson Reuters for many years and later with other big companies like 3M. He was my first boss when I entered corporate America, and I learned so much from him. And today, as you can see, he's my boss. I have the privilege to continue to learn from Martin, and I invited Martin to your show to talk about his domain of expertise. I'm fascinated by automation, machine learning, and AI, and I have some burning questions for both of you. First of all, I want to share some of my thoughts from my most recent two-month trip overseas. What struck me is that there's a sharp contrast between the United States and what lies beyond its borders. In this country there has always been a very deep distrust of government. You probably remember Ronald Reagan said that government cannot solve the problem; government is the problem. That statement is completely baffling to non-Americans: what do you mean, government is the problem? But because of this distrust, it's a double-edged sword. On the one hand, we can't really achieve herd immunity and have everybody vaccinated. On the other hand, our privacy is very well protected; you don't even think about that kind of invasion of privacy, about selling your personal data. I had a chance to talk to the general counsel of an international corporation outside America, and he was eager to sell personal data, to commercialize it. It's a complete contrast. They don't really worry about control, dictatorship, whatever. They think personal data is gold; data is the new oil. So I'm just curious: what are your general comments on this deep contrast between the American attitude toward privacy and government control and the attitude in other countries overseas? Go ahead, Martin. Answer that. We only have six hours. Yeah, exactly. Thanks, Chang. I'd better just set the scene here. I'm a product developer by background. I'm not an attorney, and I'm certainly not an expert in privacy, security, governance, or regulation. But I've been working with customer data for over 30 years.
And I remember back to the 1990s, when the internet was really just a stumbling platform, and I was having to take customer data in relational tables and work it, massage it, clean it, manipulate it, and then use it for what was considered a value-added service. Now, fast-forward 30 years: I don't think I could get away with any of those tasks right now. In some fashion that's a good thing, but it's certainly limiting innovation. So from where I'm sitting, the question I think you're asking, Chang, is: what is the role of innovation between those two systems, the US commercially driven exploitation system and then China's system of government? I think both countries are innovative; perhaps they're innovative in different ways when you look through the lens of the privacy of customers' and individual people's data. Would you agree? Yeah, I would agree. And there are definitely different purposes, different ultimate purposes, in how the data is used and in what you ultimately want to accomplish by exploiting or utilizing that data. Now, I'm not an expert in automation and machine learning; that is your expertise. I just feel like we don't really need to commercialize it this heavily. The United States is a big innovator in that. But to me as a customer, as a consumer, most innovations are not impactful; in my view they're unnecessary. So to go back to the US, I think all companies are driven by a concept called digital transformation. And I've got a slide which I like to use, which I think underpins most organizations' investment strategies of trying to get closer to their customers by collecting data on those customers. It goes without saying that the more you know about your customers, the more you can add value or develop products that your customers will want to buy. But there are some digital transformation concepts about the way that you collect that data, the way that you normalize that data, and the way you deploy AI, or what I prefer to call machine learning, on that data, which I think are very much driven by the goals of the organization. And Marc Benioff, who is the CEO of a company called Salesforce, is very much driven by the need to focus in on individual requirements, individual needs. You have to know much more about those individuals than you do at the moment in terms of purchasing behavior. There's a lot of behavioral data in the way that you visit websites and the way you use your mobile phone, which on the surface may not tell you too much about the individual's goals and aspirations. But when you collect it over time, aggregate it, and connect it to other data sets on that individual, whether it be their Facebook page or other social media platforms, you start to get much more of what I'll call a snowflake vision, a snowflake picture, of that person. And then you can ask: what do I know about this person, in terms of providing additional value around the products they already buy or potential products I can develop for them in the future? You know, I get a creepy feeling down the back of my neck with all this. There was a time when privacy, the right of privacy, was a real right, when you could say no, when you could holler and scream and say, get away from my private information. But I think most people have succumbed to that now.
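A minimal, purely illustrative sketch of the kind of behavioral aggregation Martin describes, folding events from several sources over time into one per-person profile. Every name in it (BehaviorEvent, build_profiles, the example sources and categories) is hypothetical, not any vendor's actual system.

```python
# Hypothetical sketch: combine behavioral events into a "snowflake" profile.
from collections import Counter, defaultdict
from dataclasses import dataclass

@dataclass
class BehaviorEvent:
    person_id: str
    source: str      # e.g. "website", "mobile", "social"
    category: str    # e.g. "running_shoes", "jazz", "mortgages"
    weight: float    # how strong a signal this event is

def build_profiles(events):
    """Fold a stream of events into one interest profile per person."""
    profiles = defaultdict(Counter)
    for e in events:
        profiles[e.person_id][e.category] += e.weight
    return profiles

events = [
    BehaviorEvent("p1", "website", "running_shoes", 1.0),
    BehaviorEvent("p1", "mobile", "running_shoes", 0.5),
    BehaviorEvent("p1", "social", "jazz", 2.0),
]
profiles = build_profiles(events)
# The more sources and the more time you fold in, the more distinctive
# (and the more invasive) the resulting picture becomes.
print(profiles["p1"].most_common(2))
```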
They have gone into the forest here knowing that there are traps all around, people gathering data on them, and not only gathering data but selling the data, putting the data out there. So it's like you're under a microscope your whole life, and that phenomenon is increasing. I was telling you guys before the show began that I had an unpleasant experience with Amazon Web Services today. All my experiences with Amazon Web Services are unpleasant. And what I take out of that, by the way, is that you talk about Benioff trying to tailor his pitch to the individual; that's the exception, not the rule. The rule is that I am not a customer. It's not a customer-oriented society anymore; it's a demographically oriented society. In other words, if I have 10 million users, 10 million people signed up, members, what have you, I don't care about the one. I don't care how people feel. If they don't like me, I don't care. All I care about is having the most of the million, and then I'll sell my company, I'll sell my data. It doesn't matter if people like me very much. All I have to do is speak to the masses. The individual becomes less important. I don't know if you guys agree with me, but that's my personal experience. I do not feel that the system we have now, using data, gathering data, using AI, examining hundreds of millions of people and personal data records, is oriented toward the individual. The individual has been lost in all this. I think we're going through the bumpy road of a new industrial revolution, where user data, as Chang was saying, is the new coal or the new oil for the system. We haven't really got any good examples to show that the future is actually bright and does protect the individual. It's very much in the early stages of trying to work out what that industry looks like, behaves like, and how it's regulated. It's very much like the early days of the internet. The same concerns were around back in '93, '94, '95. I had plenty of people telling me that it was both a blessing and a curse, at both extremes. I think you're singing the same tune, Jay. Well, I think in our introduction to this, Chang, you raised the possibility, the need, for regulation. This is a problem in a First Amendment society, theoretically. I think our First Amendment is going away, but hey, it's going away in a lot of places, so I can't get too upset about that. But in a First Amendment society it's very hard to regulate free speech, isn't it? And when you talk about the word regulation, how exactly, in a perfect world, or in a world that we think might be perfect, can we regulate the collection of data and the right of privacy, which is not entirely recognized? How can we have governments step in on this? Because government hasn't effectively, may I say this, may I say this on the air here, gentlemen? Government has not effectively stepped in on this at all. Following up on what Jay just asked, I also have a tough question for Martin. Are all these innovations needed, or are they basically created to generate more desire to buy and to consume? If the innovation is geared toward improving people's lives, making our lives easier, making medicine better, making the environment better, I'm all for it. But if the innovation is mostly to create more desire for the consumer to spend and buy, and more opportunity for big tech to sell, then I'm not sure about that. For example... yes, go ahead, please. Let's talk about a specific example.
I mentioned this on the last call. Autonomous vehicles will have the capability to interact with signs, signage, road markings, and other roadside devices. And a use case that was recently discussed in Minnesota was: what happens when an autonomous vehicle is crossing state lines, say between North Dakota and Minnesota, or Wisconsin and Minnesota? What interactions would various agencies like there to be between an autonomous vehicle and that signage? Now, it's not just directions, obviously. It could be local hotels, restaurants, it could be health providers; the list of companies that would like to be in that conversation goes on and on and on. But from the point of view of the driver, or not the driver but the person in the car, what are the use cases they would find valuable as they go through the Minnesota-Wisconsin state line, or when something happens to them, they break down or there's a medical emergency? What data would they like to transfer between their autonomous vehicle and the IoT device which is going to connect that vehicle to Minnesota's infrastructure? You can see where I'm going with that, right? It's all positive. It's all positive. You know, cars come with that now, right? If your car breaks down, they'll send a message, and before you know it there'll be a tow truck or an ambulance, whatever is necessary. This is wonderful; nobody would argue with that. Where I get stuck is that they get inside your mind, and it goes further than retail. Let me open a subject with you guys and see how you feel about it. There was recently an article in one of the tech journals I get about a social media organization, and I forget which one it was, that played movies. If you click on a movie and you watch it for 10 seconds, or 20, or a minute or two, it keeps a record of that. And the next time you get a message from this particular social media platform, it's going to be based on what you did with the previous movie. It is developing a profile of you, of your tastes: how much interest do you have in this subject? The movies test you, they test you on your subject, so as you go down the path, all the movies you get are based on the movies you've already seen. It knows you. It knows your personality, it knows what you're interested in, and it knows your politics. It knows your philosophy, it knows your ideology. It knows everything about you by virtue of the expressed tastes that you act on when you select these movies. Now, I don't mind retail, as most people don't, but I do mind that they're keeping a book on me, and now they're going to send me, like Vladimir Putin is going to send me, stuff that tends to divide me from others. It tends to play on that taste. And we know already that Mark Zuckerberg has sold information through Cambridge Analytica; we know that he's collecting it in other ways too. He knows too much about me. And it's not just me; remember, this is a demographic experience. It isn't me alone, I'm not consequential, but 300 million people are consequential. And if I know how they feel as a group, if I can do this kind of, what do you want to call it, survey of public opinion, I can shift public opinion, I can sway it, I can create public opinion. That is of much more concern than retail and having someone send a tow truck. There are definitely tools needed to protect your own data.
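A minimal sketch of the watch-time profiling Jay describes: crediting tags in proportion to how much of a video was watched, then ranking what to show next. The function names, tags, and thresholds are hypothetical stand-ins, not any platform's actual recommendation algorithm.

```python
# Hypothetical sketch: turn viewing behavior into a taste profile and rank candidates.
def update_taste(taste, video_tags, seconds_watched, video_length):
    """Credit each tag on a video in proportion to how much of it was watched."""
    completion = min(seconds_watched / video_length, 1.0)
    for tag in video_tags:
        taste[tag] = taste.get(tag, 0.0) + completion
    return taste

def rank_candidates(taste, candidates):
    """Score candidate videos by overlap with the accumulated taste profile."""
    def score(video):
        return sum(taste.get(tag, 0.0) for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

taste = {}
taste = update_taste(taste, ["politics", "documentary"], 600, 1200)  # watched half
taste = update_taste(taste, ["comedy"], 10, 1500)                    # skipped quickly
candidates = [
    {"title": "Another comedy", "tags": ["comedy"]},
    {"title": "Election special", "tags": ["politics"]},
]
# The politics video ranks first: a few viewing choices are enough
# to start steering everything shown afterwards.
print([v["title"] for v in rank_candidates(taste, candidates)])
```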
And there's an article that came out a couple of years ago by Tyler Wellmans from Deloitte that talks about needing to bring in the element of trust in your interactions, whether it be with Netflix or AWS. And you shouldn't have to do that yourself; you shouldn't have to decide whether or not you trust these vendors on an individual basis. You need your own version of automation, which has been watching the way that you react to things, like your Netflix exploits, so you can downgrade them as a trusted vendor if you want to, and then your automation refuses to connect, or even to share your user data with that vendor, the next time you connect. Those tools don't exist right now. You need protection; you need automation to protect your user profile and your data. So in a way it's one-sided right now: any time you interact, there aren't enough regulations to prevent the vendor just taking whatever they like. So you're right, but I think there is a commercial angle to this. What would happen if at some point you felt the problem was so acute that you needed a service to protect you? Do you think there would be vendors interested in developing that service for you, one that would protect your data? And would you buy it? I'm already at that point; I would buy that service today. The problem is that, commercially and conceptually, in the business world you have to somehow insinuate that service between me and all these people who want my data. And I would agree, but how do you actually insinuate it? And, to your point, Chang, is government necessary to facilitate that insinuation, to encourage, support, and incentivize somebody to come along and be my gatekeeper? Interesting. I did ask a friend of mine who is a security expert which area is moving forward with the most regulation around privacy right now, and he pointed me toward the federal regulation of student privacy. There's a federal agency, and there have been a hundred and forty new laws passed in the last eight years protecting student privacy. Is that federal or state, Martin? I'm not familiar with it other than knowing that it's very active, and it puts the responsibility of protecting students and student data on the school district or the school institution. At the federal level it's really only guidance; the laws have been coming out at the local and state level, and it's frenetic. And it's causing problems for the educational technology companies, because they're just not able to get the data now that they were getting five or ten years ago to normalize their own technology and products. So that sector seems to be moving the fastest right now. And if it starts in education, maybe it will spread to other sectors naturally. I think there's money to be made here. If somebody came to me and said, look, we will be the gatekeeper for you, we will protect your privacy, we will let you choose whether they're getting data from you, and for this you have to pay ten dollars a month or whatever, I would pay that handily, because, as I said, I'm getting that creepy feeling. I'm not sure it would work without government intervention. What do you think, Chang? Is government intervention necessary to make the gatekeeper work? It is a terrific question, but I don't think I have an answer for it. In this country, we don't trust government to do this.
We protect ourselves mostly from the government, so it's very hard for the government, from the federal level, to reach down to the state level and the local level. He's right, isn't he, Martin? He's right. Government is not the one to do that. If I'm looking for a gatekeeper, a commercial gatekeeper that I pay, a business, a public company, for example, that operates all over the world and says we will protect your data, it has to be a competitive, effective, high-technology, AI kind of software that will come and protect me. You know what it's like? It's like spam, the spam-filter companies, none of which are that good, by the way. It's that same kind of protection they might offer you, and they've got to be good. Yeah, as a product developer I would like for companies to take the responsible line, to protect data and to promote it as part of the service: we protect your data, and here is the transparency on how we're using your data to provide better services. However, as you may have seen in the news recently, there is this concept of coded bias, where organizations that are not particularly up to speed with machine learning, with how you model data, or with the ethical and moral issues of protecting the individual within these models, are blatantly misusing this data to get the result they're after at the expense of the welfare of individuals. There are recent examples where algorithms have been developed to provide insights into a person's ability to do their job, and these algorithms have been weak in certain areas, and the data has been used to essentially fire what are considered to be underperforming individuals. But actually, when they look through the records of the individual, the algorithm is saying they are underperforming while the actual evidence suggests otherwise. Or it turns out to be racist; there's been a lot of press about that. Exactly, exactly. So these tools, like any tool, can be used for good or bad, and in the hands of people who don't understand the data, and it comes down to the data at the end of the day, mistakes can be made, and they can have costly consequences. So in those instances I think there is a role for the government or state legislatures. Well, the thing is, you talk about an algorithm. An algorithm could be millions of lines of code, and you can easily see how it could spin into millions of lines of code. How can you get anyone, a regulator, an individual, the government, somebody, to look at the algorithm? First they have to have access to it, because Mark Zuckerberg is not going to give you his algorithm voluntarily. But somebody should look at it and see if it's racist or unfair, or if it's selling conclusions that are destructive. How do we open the algorithm up? It's actually possible, but you use training data. These algorithms don't do anything on their own; they have to use data. So what you tend to do is take a model where the data has been corrupted on purpose, and you test the algorithm against the corrupted data to see the output, both from a positive and a negative point of view. So those algorithms can be tested, as long as the data is transparent and understood (a simplified sketch of that kind of test appears below). Where it gets a bit tricky is if these algorithms are running on what I'll call unstructured data, where the data is not really understood.
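One simplified way to do what Martin describes, testing an algorithm by feeding it deliberately altered data and watching the output, is a perturbation or counterfactual audit: run the same records through the model with only one attribute changed. The model, the attribute names, and the bug it contains are all hypothetical.

```python
# Hypothetical sketch: audit a scoring model by perturbing one attribute on purpose.
def performance_score(record):
    """Toy stand-in for a vendor's 'employee performance' model."""
    score = 0.6 * record["tasks_completed"] + 0.4 * record["peer_rating"]
    # A biased term a reviewer might not notice without testing:
    if record["site"] == "B":
        score -= 10
    return score

def audit_attribute(model, records, attribute, values):
    """Run the same records through the model with only `attribute` changed."""
    for rec in records:
        outputs = {}
        for v in values:
            probe = dict(rec, **{attribute: v})  # copy of the record, one field altered
            outputs[v] = model(probe)
        if max(outputs.values()) - min(outputs.values()) > 1e-9:
            print(f"{attribute} alone changes the score: {outputs}")

records = [{"tasks_completed": 50, "peer_rating": 80, "site": "A"}]
audit_attribute(performance_score, records, "site", ["A", "B"])
```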
With unstructured data it's more difficult, because you can't really reverse-engineer back into the data; the algorithms are trying to make sense of the data, if you like, and then using predictive algorithms to come up with an outcome based on the logic of how they're making sense of it. But you can test algorithms which have been organized to interrogate structured data, data which is understood. I want to ask you both this question. We need to address this, I think; I think that's unanimous. But is it too late? Is it too late, for example, to deal with people who use this kind of data in a political environment, as a political, may I say weaponized, as a political weapon, for elections, to change minds, to create divisiveness? Because it's hard to reach them. It's hard to lobby against a company that has billions to spend lobbying against you. It's hard to change the law in favor of the public good when there are companies that are so big, so wealthy, so cash-loaded, that you can't compete with them in a legislative forum. How can we make progress on this? What's the leverage against a company that is so big, or a bunch of companies that are so big, that they're virtually untouchable? I don't think it's too late. I think we've just started the journey of knowing what our individual digital identity is in an interconnected ecosystem. Social media is part of the ecosystem, but there are other data-driven platforms, like government platforms and professional network platforms, which I think will ultimately supersede all of the things that are going on that we don't agree with. I go back to the start of the publishing industry in the 17th century. Sorry if I told this story on the last talk, but after the Great Fire of London in 1666, the insurance markets were completely reinvented around the coffee houses, because they were the first to get back into business after London was burnt to the ground. And those coffee houses had a strict policy about who could go in. You couldn't just go in as a member of the general public; you had to be a professional broker or a professional underwriter to go into the coffee house and interact with other professionals in the insurance industry. And that's essentially the beginning of Lloyd's of London, which is a national institution for large-scale insurance contracts. Which has actually been in trouble in recent years, hasn't it? Yeah. The point that I'm trying to make is that when there is a lack of trust, you will find new networks emerge to re-establish trust amongst their members. And I think that's what will happen in this case. And it's really about transparency; you get to trust through transparency. So, for example, why can't I get on the internet and ask Amazon Web Services for all the data they have on me, everything, every piece of data they've ever collected on me? And the same thing with Google, which probably has much more data than Amazon does, and the same thing with Facebook and social media. I want it all, and, you know, I may be lazy and not go through it all, but at least I'd have it all. Could we get to a point where they would do that voluntarily? Yes, I think they have to. I think there also has to be a red delete button.
If you feel that, for whatever reason, they've got too much or they're using it in the wrong way, what would happen if they said: fine, you can delete it, here's the button. But before you press that delete button, here are the services which will no longer function on your behalf. You'll become, essentially, somebody with no credit history. You'll have to build up all this user data again to get the kind of predictive algorithms to work in your favor. Maybe you're not impressed with those services, so you'll go straight to that delete button and press it every couple of weeks. But then the challenge is on Google or AWS to provide services that keep you from wanting to delete that user data, because otherwise they won't work. Yeah. Well, on the one hand you may not have a tow truck coming your way, but on the other hand you might achieve something we all envy, and that's, what was it, Thoreau and Walden Pond: living quietly, isolated from society. Oh, God, if we could only get back to that now. It's no longer available anywhere in the world. Just to give you a small story from my own recent history: I've started coaching at the University of Minnesota; I'm an assistant rowing coach. I've been rowing for 40 years, so it's something I've always wanted to do as I get into retirement. And I was contacted by the university office to provide data on my COVID boosters. And not just that: there's also something called U.S. SafeSport, which needs at least a 15- to 20-year history of interactions with teenagers. Well, I've been in corporate America; I have no such interactions. A red flag went up, and they asked me: if I'm so passionate about coaching teenagers, essentially 18- and 19-year-olds, how is it I've not had any U.S. interactions? So I'm in this red-flag situation. And I am actually a qualified high school teacher in the UK; I have had interactions with students. But it's not in the U.S., so it doesn't count. So the fact that there's an absence of data has created a red flag attached to my University of Minnesota employee record. You might somehow get a knock on the door from the absence-of-data police; they may be coming for you. What is wrong with Martin? So let me offer this, though. Right now we have all this strange stuff going on around vaccines and masks. Some of it is political, arguably; some people truly, sincerely believe in some religious exemption, or they believe in liberty and freedom from vaccines and all that. But the better common policy is that if you want to save, you know, seven or eight hundred thousand lives, then you have to do something, you have to require things, and you have to make sure that everybody follows the science. And people don't want their medical data collected, either individually or as a demographic, anonymously. They don't want it collected because they have political or religious objections to that. It seems to me that there is no question: if you're going to save seven or eight hundred thousand lives, you have got to collect that data. In a way, I would say you have to collect it individually; that's the whole point of testing, to know individually who may be exposed. So where are we going with that, Martin? It seems to me there are situations, especially in public health, where we really need to have it. We need government to get it.
Yeah, I mean, this is a great topic. I wish I had something insightful to say; I'm watching from the sidelines as well, Jay. I think it's going to be fascinating to see how the US government squares the circle between all of the freedoms that we expect to have around our data and the public health debates. I mean, it's still in progress, right? This is still game on. It's still in progress while people are dying. So the question there, again, is: do we have the time? Right. Now, where should it sit, Jay? Should it be at the federal level, or should individual states weigh in on this? Well, I think one of the failures of the Constitution, sorry, was this thing about federalism. And I think the GOP now is driving a truck through that, has been driving a truck through it for the past 50 years. Federalism is just an escape from the common good. So my own view is that if we had a constitutional convention, or a reform of our Constitution, that would be one thing I would look at right away. I don't like it anymore, if I ever did. There are two foreigners on the call here, Jay, who have a ton of respect for America's Constitution, so I don't know about you, Chang, but I couldn't criticize anything that's going on, could you? Well, on a positive note, I just want to tell you that I do not share the deep distrust of the government. I'm not a big fan of the government, but I do respect the government, and I can tell you from my personal experience how it feels when the government works very well. When I landed at San Francisco International Airport on a flight from Hong Kong, I walked out, took my luggage, and walked up to a kiosk for Global Entry, the Trusted Traveler program. It recognized my face and printed out a little slip; I handed the slip to the officer nearby and just walked out, took a cab to my hotel. From touchdown to walking into my hotel room: 45 minutes. That is what the future world should look like to me, where you have total confidence in government, or total confidence in technology. I'm not sure whether that is just a short-lived daydream or whether that is our future, but I'm going to bet on the positive side of it. That actually takes me to a thought I'd like to leave you both with. I've always felt, and this goes to the practice of law, that you could have a judge in a little black box about the size of a Rubik's Cube. You feed in the evidence, you feed in the arguments, and the little black box makes an unerringly correct decision. I know that doesn't take into account the frailties of humanity and all the special aspects of our species, but you could build that in. You could make an algorithm with millions of lines of code, and that little black box would give you a better percentage of correct decisions. Okay, take the same idea to government. Now I have a bigger black box, and I put it in Washington, DC, and I feed in all the data, including the data that's appropriate from our transactions and our lives, and I have it make policy. And I think, A, it would be able to do that, and right now we have trouble doing that, and B, it would be correct most of the time. If you put the norms in the box, the box will give you the norms back. Don't you think that's the future? Martin, what do you think? I do, if it's a process that can be written down, if there are rules which can be defined around the process; then what you're describing is absolutely possible.
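A tiny sketch of Martin's caveat: a decision "black box" is only as good as the rules someone has managed to write down, and anything the rules don't anticipate falls through to a human. The rule names, fields, and outcomes are hypothetical examples, not a real legal system.

```python
# Hypothetical sketch: a rules-based decision engine with a human fallback.
from typing import Callable, List, Tuple

Rule = Tuple[str, Callable[[dict], bool], str]  # (name, condition, outcome)

def decide(case: dict, rules: List[Rule], default: str = "refer to a human"):
    """Return the outcome of the first rule whose condition matches the case."""
    for name, condition, outcome in rules:
        if condition(case):
            return outcome, name
    return default, "no rule matched"

rules: List[Rule] = [
    ("complete filing", lambda c: c["documents_complete"] and c["fee_paid"], "approve"),
    ("missing fee", lambda c: not c["fee_paid"], "request payment"),
]

print(decide({"documents_complete": True, "fee_paid": True}, rules))
# An unanticipated combination falls outside the written rules:
print(decide({"documents_complete": False, "fee_paid": True}, rules))
```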
Wow. And, Chang, that sounds great to me; we're looking forward to that. Yeah, what about your law practice? Yeah, I know AI is replacing law practices in some areas, but there are two areas where I have total confidence that AI will never replace lawyers, and Martin can correct me. One is constitutional law, and one is immigration law, because both areas need a very high level of empathy, which machine learning hasn't been able to achieve. The machines have a very high IQ, but they don't have a lot of EQ. So wherever you need EQ, the machine is at a disadvantage. But I'm going to leave that to Marty. The human condition can never be fully automated, I think, is what you're saying, if you agree. Martin, can't we? Empathy is rational. Empathy is something you can actually write down, what's empathetic and what's not. Why can't I put that in the algorithm? If there are rules associated with the decision process you make when you empathize, then that's true. But the question is whether the level of complexity around these cases is something that can always be written down, something you see repeat over and over again, or whether it's always something that comes up at the last minute, a new variable or a new piece of information, which changes everything. I keep thinking of the words of HAL in 2001: A Space Odyssey: I can't do that for you. You remember. Okay, I think it's time to summarize. That's not going to be easy for you. And then close the show, Chang. Well, I will leave it to Martin. Martin, you do the summary. So I think we're starting a really fascinating journey, where there's been innovation coming from new technology, collecting data in new ways, access to data which has never been possible before. And then society at large has to prioritize: how are we going to use these new tools to benefit us as a society, as a collective, while maintaining that personal freedom at the heart of it? I don't think there are any immediate answers right now. It's going to be fascinating to see whether it comes from the educational angle or from health care. There's going to be some event, and I don't think it's actually COVID; there's going to be some other event which brings us into that new industrial revolution we've been promised by Tesla and others, but we're not there yet. So stay tuned, I guess, is the summary, and see how the world unfolds over the next couple of years in this space. Yeah, and we'll see how Amazon Web Services treats me the next time I wait on hold for an hour. Okay, up to you, Chang; now you can close. Well, I have just one thing to say: the best is yet to come. Let's wait for it. Wait for it. Okay. Thank you, Chang Wang. Thank you, Martin Handman. I really appreciate this discussion. I really enjoy working with you guys on these things. Appreciate it. Thank you, Marty.