So Matt, everyone knows what a credit score is, right? You need it to get a loan or buy a house. And we generally hope the companies that are creating them don't spill our data onto the internet. But China is kind of taking things to the next level with something called a social credit score. Yeah. So these large tech companies like Tencent or Alibaba are actually creating their own credit scores. Alibaba's is called Sesame Credit. And they don't just look at the bills you pay, they actually look at the degrees you might hold or even the friends you have. Yes, Matt, I hope you've been paying your bills. But the thing we cannot really overstate about these Chinese social credit scores is just how much data companies like Alibaba and Tencent have. You know, in America, you're used to companies knowing about a segment of your life. Uber knows where you've traveled, for example. Google knows what web searches you've done. And Kroger knows what groceries you've bought. But in China, these tech giants are so big, they really have information on absolutely everything you do. For that reason, these social credit scores are kind of like FICO scores, but thrust into a big data age. Yeah, I'm sure libertarians are up in arms over this. But right now, it's relatively benign. Your example of the Kroger Club card works perfectly. It's a voluntary system. And the benefits are just minor. Things like, oh, if you have a great score, you can rent a rental car without a deposit. Yeah. So the scores are voluntary right now, but I would point out they originally came from a 2014 Chinese government dictate. So in 2014, the Chinese government came out with a white paper, which you can read online, where they basically said they wanted to create a social credit system for individuals, businesses, and even government officials. So Alibaba and Tencent ran two of the eight pilot programs that were created to kind of fine-tune these scores.
And you actually see that the Chinese government has started to become more involved in these programs. So for example, they've integrated a blacklist of people who have defaulted on court fines into the Sesame Credit database. So if you didn't pay your court fine, you'll likely have seen your Sesame Credit score tank. And you may have lost a bunch of friends as they've seen their scores go down too. Yeah, you know, I kind of see this downward spiral as similar to something that happens in the US system with a FICO score, whereby you file for bankruptcy and your score goes down, and it's very difficult to actually get the credit you need in the future to improve your financial situation. So in this regard, I thought, it's pretty similar to the system we have. On the second point about blacklisting somebody: in one of the articles we saw, a journalist was actually blacklisted by the system and was unable to travel throughout the country. The only thing I would say there is, obviously, that's not a great system. I don't love that. But the Chinese government can do and has been doing these types of things anyway. I'm not sure that the Sesame Credit score made it easier to do this. Yes, but the Chinese government is sort of known for being able to deal with journalists or political dissidents in any manner they see fit. The one thing I would say about the kind of change a social credit score brings is that you move from a very heavy-handed authoritarianism of maybe throwing somebody in jail to a kind of soft authoritarianism where you can control people's actions by adjusting the privileges they're given as a result of their social credit scores. And for that reason, I kind of see this as a big sea change for China.
And if you look at quotes from people at Alibaba, they really push the social engineering angle in a way that you don't see with, say, the American credit system: they say that these scores are for good people to be able to do what they want to do, and for bad people to be blocked from doing it. Yeah, I think there's a lot of the nudge theory of behavioral economics going on here, but I think a question we should ask is why people in China might want this system. And there are a lot of good reasons for it. If you are a farmer or even a factory worker, and you don't have a bank account or access to traditional credit, then this system could be great for you, because all of a sudden banks and other financial institutions are willing to lend you money, when before they didn't really have any proof that you were able to repay a loan or whatever it may be. Now, of course, the other side is that China doesn't have a great system of legal recourse today, which means that if you are someone who has been harmed by a government official or by a company, it's very difficult to actually get that company to give you recompense for what they've done. And so the good news is that in this system, government officials seem to already be held to task over what they've done, and companies are being held accountable for their actions too. So I could actually see it being a quite positive thing for China. Oh, definitely. I certainly agree about the benefits of the social credit system. I mean, like you said, you have these government officials who have essentially been labeled as corrupt, and it's public knowledge now that they should probably be avoided if possible. And certainly, if you're a farmer looking to get credit who didn't previously have access, this thing is really useful for you.
I think the counter to this is to say that even if people weren't in favor of the social credit system, there's not a whole lot they could do about it. And so as this program evolves and changes, and more data is included and it becomes more all-encompassing, I think people might become less favorable towards it, but there's still not much they can do about it. I think what would concern me is that when I compare it to the American credit score, there you have a set number of variables that are included. So in your credit score in America, you'll have: have you defaulted on a loan? Have you paid your bills on time? Have you ever filed for bankruptcy? And everyone thinks, you know, I may not like my particular score, but the variables that go into it are pretty reasonable. But with a social credit score, you have a much broader array of things that can be included. So not just did you pay your bills on time, but things like your race or your ethnicity, or the social credit scores of your parents and your friends, like we've mentioned. And there are other things, like your behavior, or even whether you're a potential political dissident, that could be included in these social credit scores but that I don't think, from a Western point of view, we would really deem acceptable or necessary. Yeah, there's certainly a Minority Report angle to, hey, we predict you're going to be a dissident and so we've downranked you a bit. That is worrisome to me. But overall, I would say, again, this is a question we struggle with in AI in the US all the time, even with a Google search. Should it use certain features, such as ethnicity, when it ranks results? The reality is, this is a question we'll struggle with a lot, both in the US and in China.
I'm sure we'll come to some sort of understanding about what's the right set of features to be used. But if we throw that aside, I think it could potentially shine a light on some oddities of the US credit system. One of the things we often don't think about is how odd our own system may be, right? We do these exact same things, where we take people and say, oh, well, your parent signed you up for a credit card when you were a child and didn't pay it, and so now you have a bad credit score. We seem to have no issue with this in the United States, and yet we want to call China to task over doing a pretty similar thing. A kind of funny anecdote I thought of for this story is that you could potentially wind up with some sort of feudal system, where you have this landed nobility, if you will, who sign up to have people become their friends and improve everyone else's scores despite not doing much themselves. Yeah, there's definitely sort of a black market of friend trading that could go on to boost your score. But I do think what the social credit scores bring attention to is the broader debate on privacy. So when you look at a social credit world, you're essentially taking every action in your life, every piece of data, even data about your friends and people around you, and combining these in a giant algorithm to create a single score about you that is used to decide if you can get a rental car or a bank loan. But I think a lot of people would want a higher degree of privacy than that. They'd say, you know, not every element of my life needs to be recorded and tracked and used to decide future benefits I may attain. And in America, I think we have a lot of debates about this kind of privacy: we debate privacy versus security, for example, or privacy versus convenience.
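To make the "giant algorithm" idea concrete, here is a minimal sketch of how a composite score like this might work: many normalized signals get combined by a weighted sum into one number. Every feature name and weight here is hypothetical; Sesame Credit's actual model is proprietary, and only the 350 to 950 score range is drawn from public reporting.

```python
# Hypothetical sketch of a composite social-credit-style score.
# Feature names and weights are invented for illustration only.

def composite_score(features, weights, base=350, scale=600):
    """Combine normalized features (each in [0, 1]) into a single score.

    The result falls in [base, base + scale], here 350-950.
    """
    total_weight = sum(weights.values())
    weighted = sum(weights[k] * features[k] for k in weights)
    return base + scale * weighted / total_weight

# Financial history dominates, but social ties also count a little.
weights = {"bills_paid_on_time": 5, "loan_defaults_avoided": 4,
           "education_level": 2, "avg_friend_score": 1}
person = {"bills_paid_on_time": 0.9, "loan_defaults_avoided": 1.0,
          "education_level": 0.5, "avg_friend_score": 0.6}

print(round(composite_score(person, weights)))  # prints 855
```

Note how `avg_friend_score` pulls your score down when your friends default, which is exactly the "losing friends" dynamic mentioned earlier: their scores are inputs to yours.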
And, you know, privacy versus security has come up a lot since 9/11, when we've had to decide how much surveillance we're willing to accept in exchange for a slightly smaller terrorist threat, and we have these minor inconveniences of things like airline security that really add up over time. It's really this big debate where we have to try and find a middle ground between privacy and security. Yeah, absolutely. I for one am completely tired of taking my shoes off at the airport. It feels ridiculous. But I have to say, if the U.S. put it to a vote of how much privacy you'd give up to get more security, I would not want to be part of that vote, because I think people are more than willing to give up privacy to get more security. And we actually saw this topic come up a lot in our last book review, which was Nick Bilton's American Kingpin. That looked at Ross Ulbricht, the leader of the Silk Road, an anonymous marketplace that allowed you to buy absolutely anything on the internet. And this is perhaps one of the downsides of privacy: because it relied on anonymous transactions, you could sell pipe bombs and heroin to kids. And that's absolutely something I think most people would agree is not a great thing. On the other side, as you mentioned, there's convenience. So oftentimes we're willing to give up our privacy to companies like Uber, because we want them to be able to find out where we are, and we're absolutely okay with them using that information as long as it makes our trip just a little bit quicker. Yes, I personally would probably be kind of embarrassed to admit just how much privacy I'm willing to give up in exchange for just a teeny bit more convenience.
And you certainly see that with Uber, where I know they can track my location. There have been stories about them tracking your location even when the app is closed, and they have things like God View, whereby a group of engineers could potentially be watching your movements at every second. Obviously it's more concerning if you're a really prominent celebrity, but still, it's somewhat disconcerting to know that they have these kinds of features built into the app. But we're still kind of willing to use them, because they're absolutely convenient and it would be a real hassle to try and live without them. Getting back to the social credit system, I think the thing we have to remember is that this is sort of just theoretical at this point; it's limited to a trial basis. But when you actually roll out an algorithm this complex into a society, you have to think about the unintended consequences that could result. And we've seen that again in America when we look at the algorithms that have been applied in public policy. There's the COMPAS algorithm, for example, which has been used to estimate the recidivism of offenders, in other words, how likely they are to re-offend when released from jail. And it's run into all kinds of issues: there have been findings that it's racially biased against certain groups, and there was also the embarrassing finding that it could not beat a simple linear regression when researchers tested it against one. Yeah, you know, another fancy machine learning model that loses to a simple linear regression. But regardless, you often see this applied today in the realm of predictive policing in the United States, whereby we have models that tell police, you should go to these neighborhoods because there's lots of crime being committed. Unfortunately, this becomes a feedback loop, whereby the police respond to those predictions and go to the neighborhoods.
The system then detects more crime there, because the police are actually making more arrests there. And you wind up with a pretty nasty system that's biased in a way that I don't think any of us really want. Yes. If you have a system where the algorithm is essentially affecting its own training data, you can end up with all kinds of nasty feedback loops and unintended consequences. And as the social credit system becomes more and more widespread, and there has been talk of a full rollout by 2020, you kind of wonder if there are going to be things going wrong that the designers did not originally intend. I think it's also interesting to talk about the designers themselves. These algorithms, at least not yet, do not build themselves. They need designers to train them on data, validate results, make all kinds of tweaks, and check that everything is working properly. And if you're a top AI or deep learning researcher, I think you sort of need to be responsible about what you're working on. I mean, you could really work on some exciting object detection algorithms, but if these are being used for things like surveillance, and to potentially imprison political dissidents, well, then your life's work is maybe not the best thing to have done. Yeah, this is absolutely an issue that data scientists are going to have to grapple with as they think about how these algorithms are applied. And we actually got into this topic a little bit in our review of Nick Bostrom's Superintelligence. But I will remind you, one of the takeaways from that was that scientists are motivated to see changes happen in their lifetime. And so if they can build something today, they're pretty much going to do it. Unfortunately, I do accept that if it can be built, it probably will be built, but that doesn't mean I have to like it.
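The predictive-policing feedback loop we just described can be shown with a toy simulation, purely illustrative and not based on any real system: two neighborhoods have the same underlying crime rate, but crime is only recorded where police patrol, and patrols follow the recorded counts. One early arrest is enough to lock the model onto one neighborhood forever.

```python
import random

random.seed(0)

# Toy model of an algorithm polluting its own training data.
# Both neighborhoods share the SAME true crime rate; only recording differs.
TRUE_CRIME_RATE = 0.3
recorded = {"A": 1, "B": 0}  # neighborhood A got one early arrest by chance

for day in range(1000):
    # The model sends police wherever the most crime has been recorded.
    patrolled = max(recorded, key=recorded.get)
    # Crime occurs in both places, but is only recorded where police are.
    if random.random() < TRUE_CRIME_RATE:
        recorded[patrolled] += 1

print(recorded)  # A's count keeps growing; B's stays at zero
```

Because neighborhood A starts one arrest ahead, it is patrolled every single day, its recorded count climbs toward the true rate times the number of days, and B's recorded crime never budges, even though both neighborhoods are identical underneath. That is the feedback loop in miniature.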
And I would just urge researchers to take a second and think about the ethics of what they're doing, and whether it's really going to be beneficial for society at large. But of course, there's a ton of money riding on this AI research, with multi-billion-dollar contracts being handed out by the Chinese government to firms like Baidu. And for that reason, the research is going to continue at an absolutely breakneck pace. But to take a step back and summarize the segment as a whole: we just talked about how the social credit system combines an awful lot of the data on you that Chinese tech giants like Alibaba and Tencent have access to. And while it's kind of a gimmicky thing right now, since Sesame Credit is opt-in and really just used for things like a better deal on rental cars, it has the potential to become a much more ingrained part of Chinese society, especially since it has the government's backing. Now, I look at this as a potentially nefarious thing, where it could be used to engineer society and suppress dissidents in ways that are less heavy-handed than just sending them to jail. But I think you did a nice job of pointing out the positive aspects, like giving access to credit to people who didn't previously have a way to get it, and just kind of increasing the level of trust in society. Yeah, absolutely, Adam. So next week, we're going to stick with this Chinese theme when we review a book found on Chinese leader Xi Jinping's bookshelf, The Gray Rhino. Now, this is not a story of going on safari. It's actually a response to another book, The Black Swan. So thanks for tuning in. Please subscribe on YouTube. This is Random Talkers.