What we're seeing more and more is that states recognize that so much debate, reporting, and dissent takes place on social media platforms, and they're often going to those platforms directly, outside of any legal process, to ask them to take down content. We should be okay with platforms that, for example, take down child sexual exploitation or incitement of violence. I think it's completely fair for the companies to regulate that kind of content, and they have a role, I think, in ensuring that their platforms don't become cesspools of hatred, because that in itself interferes with the ability of people to express themselves.

But when the companies are moderating content, that is, when they're deciding what kind of content is appropriate and inappropriate, I think we expect a few things. We expect that they're transparent about their process for deciding what rules are going to be applied, transparent about what those rules look like and how they're made, clear about the nature of those rules, and that they provide a kind of appeals process and accountability for wrongful takedowns of content.