Obviously, Facebook can manipulate its algorithms to attract users, and I guess my question would be: do you feel, in your humble opinion, that simply maximizing profits, no matter the societal impact, is justified? And then the short question, to which I think I know the answer: what impact would it have on Facebook's bottom line if the algorithm were changed to promote safety, changed to save the lives of young women rather than putting them at risk?

I'm learning about the talk button. Facebook today makes approximately $40 billion a year in profit. A lot of the changes I'm talking about are not going to make Facebook an unprofitable company; it just won't be a ludicrously profitable company like it is today. Engagement-based ranking causes those amplification problems that lead young women from innocuous topics like healthy recipes to anorexia content. If it were removed, people would consume less content on Facebook, but Facebook would still be profitable. I encourage oversight and public scrutiny into how these algorithms work and the consequences of them.

I appreciate that. Obviously, I think the Facebook business model poses risk to youth and to teens. You compared it to cigarette companies, and I thought rightfully so. I guess the question is: is this level of risk appropriate, or is there a level of risk that would be appropriate?

I think there is an opportunity to reframe some of these oversight actions. When we frame them as a trade-off, that it's either profitability or safety, I think that's a false choice.
In reality, the thing I'm asking for is a move away from the short-termism Facebook has run under today, being led by metrics and not by people. With appropriate oversight and some of these constraints, it's possible that Facebook could actually be a much more profitable company five or ten years down the road, because it wasn't as toxic and not as many people quit it, but that's one of those counterfactuals we can't actually test. Regulation might actually make Facebook more profitable over the long term.

The idea that 20 percent of your users could be facing serious mental health issues, and that's not a problem, is shocking. I also want to emphasize for people that eating disorders are serious. There are going to be women walking around this planet in 60 years with brittle bones because of choices that Facebook made around emphasizing profit today.