Computers make a lot of decisions for us. Sometimes it's as simple as guessing what news we want to read. And sometimes it's as complex as guessing whether somebody will go on to commit a violent crime. When the algorithm used to make those decisions isn't transparent to the user, we call it a black box.

Episode one: profiling. I'm Julia Angwin, senior reporter at ProPublica. Over the course of the next few weeks, we're going to show you some of the most common black box algorithms that you interact with on a daily basis, and what they might mean for our future.

First up: are you being optimized or monetized? Two-thirds of Americans own a smartphone. One in five Americans owns a smart TV. One in five Americans say they go online almost constantly. Over a billion people across the world log on to Facebook every day. That's a lot of data. And the companies that build our favorite digital tools don't just collect data. They monetize it. Companies are competing to see who can gather the most data on you and then offer it to advertisers. You are not their customer. You are what they sell.

Case in point: Facebook knows who all your friends are, has seen all your photos, uses facial recognition, knows all your devices, knows where you live, and on and on. But it's not enough. They still buy data from data brokers about your offline life to enhance their file on you, like what car you drive, the cost of your mortgage, and what you buy at the supermarket. Even though this data is often sloppy or inaccurate, it's a multi-billion-dollar industry.

Why do this? To serve you targeted ads, of course. They call this the optimization of your Facebook experience. But maybe more accurately, it's the monetization of your behavior.

We live in an era where more data is available about human behavior than ever before. As computers take in all this personal information, they increasingly spit out predictions about our lives. This can create what we call machine bias.