Good morning, John. In 1911, in a very early science fiction novel, Hugo Gernsback became the first person to predict personalized news. He wrote that "the morning newspapers were transmitted to the sleeping subscribers by wire at about 5 a.m. The newspaper office, notified by each subscriber what kind of news is desirable, furnished only such news." More than 100 years later, that future has more than arrived.

Our personalized news sources would more accurately be called information and entertainment feeds, and they contain everything from pictures of your cousin's baby to the unfiltered thoughts of the President of the United States. But in Gernsback's future, it was you who decided what news you wanted to see, and even that idea on its own may be troubling. Now, though, who decides what ends up in your feed isn't you. It isn't even a person. It's a computer program, and increasingly a program that built itself, in ways that no one fully understands, to satisfy not your needs but the needs of the platform. These machines are given desired outcomes, but how they get there is up to them, and their goals are tightly held industry secrets.

But two things are very clear. One: the information in your feed, and how it is served to you, can have a lasting effect on who you are and how you feel. Two: the business model of social media relies on selling that power for a profit.

Due to a failure in Facebook policy and enforcement, Cambridge Analytica was able to create psychological profiles of tens of millions of Facebook users. In their own words: "We have somewhere close to four or five thousand data points on every adult in the United States." With those profiles, they could create not just advertisements but news stories, and even entire news outlets, specifically to amplify and exploit the fears and anxieties of voters. This was a violation of Facebook's rules, but not of any laws, because in the US there are so few laws regulating these companies.
In fact, Facebook doesn't want anyone else to have your data, because your data is their most valuable asset. But as artificial intelligence gets better, even small amounts of data will likely be enough; smaller and smaller signals are becoming clearer and clearer. Machines can now tell, with 91% accuracy, a man's sexual orientation from profile photos alone. If you're wondering why Facebook is worth half a trillion dollars, it's because of its ability to capture your attention and because of what it knows about you. Trained on enough data, machines can determine shocking detail from tiny amounts of input, and they will use what they know about you, your worldview, and even your emotional state to help others manipulate you, as long as those people have the money to spend.

This is not a secret conspiracy; it's their business model. Facebook isn't the product that's being sold. Facebook is free. You are not the customer. You are the product. On the social internet, your attention is their dollars, and for them that means anything that gains attention is good. Whether they decide to have any regard for truth, for the stability of society, or for the health of their users is, at the moment, entirely up to them.

And while some of us will dive into these issues to at least try to understand them, most people won't have the time to do that. And ultimately, we need Facebook. It's socially expected, it's necessary, and it's a useful platform. We're trapped. So the only thing to do is to remember, and to help others learn: if you don't control the feed, the feed is controlling you.

John, I'll see you on Tuesday.

John, our new media literacy series on Crash Course with Jay Smooth is so good, and so fascinating. And I want to tell anyone who goes to watch it that it's going to change the way you see the world around you forever. It might be a little less cut and dried, but it's going to be a lot more accurate.