With information technology we've gone from a world that was private by default to a world that is public by default. In a pre-digital world our lives were largely private by default: most of our communications were not mediated by tools of mass communication, our conversations were bounded by physical location, and it took extra resources and effort to publish something publicly. Now it takes extra effort to keep things private. The massive amount of data generated by people can of course be used for beneficial or for detrimental outcomes. Social data can be used by researchers to understand society better, for legitimate security purposes, or to enhance public services. But there are growing concerns that our data is, or might be, used against us in a multiplicity of ways. Today our platform societies are engaged in a dubious relationship with their IT platform providers. If the interests of the platforms that control this data were always aligned with the interests of the user, there would be no great concern; the issue arises when the two are misaligned, which often happens given that these platforms are private organisations. Individuals hand over their information willingly to online platforms in exchange for the services they provide, but few fully appreciate the negative externalities of this when taken in aggregate. The ubiquity and complexity of surveillance are very difficult for people to grasp. For example, one of the largest data brokers, Acxiom, claims to have some 1,500 pieces of information on 200 million Americans, while the company Hunch claims it can predict people's consumer preferences from just five data points about them. Our private information is being traded all around us, and ever more sophisticated technology is being used to predict and alter our behaviour without our even knowing it.
As we become more digitalised we leave an endless trail of data dust behind us that is hoovered up by companies and used to predict and alter our every next step. With the next generation of technologies, the internet of things, advances in big data storage, advanced analytics and smart systems, this data economy will greatly expand, and so too will the predictive capacities of these organisations, creating a significant imbalance of power. The first question, though, is why we should care about privacy at all. Social systems always engender a complex dynamic between the group and the individual, between the public and the private. Functioning social systems require both the differentiation of the individual and the commonality of the whole: diverse individuals who are able to come together, find commonality and coordinate. Individuality and diversity require subjectivity, which is inherently private; it has to be developed autonomously. This requires individual space and privacy in which to develop ways of being that others may not like or that may not be aligned with the group. Social dynamics emerge out of the interaction between the private and the public. It is an inherent part of how social systems evolve that individuals develop new solutions in private which go on to overturn commonly accepted norms. This is how we got rid of slavery and achieved women's rights. When these movements started, the society of the time would have rejected them if they had not had a private space in which to be incubated. Without that privacy there would be no incubation of new solutions, limited diversity, and societies would stagnate. The same is true today for gay marriage and marijuana, currently in the process of being legalised around the world; we would never have got there in a world of perfect information and complete surveillance.
Frank Rieger of the Chaos Computer Club states it clearly when he says: if you have your privacy guaranteed, technically or politically, then you can come up with political dissent, ways to change the world; when you don't have privacy, when the things you do are completely transparent, knowable and predictable, then that ceases at that point. People need places where they can be free from the judgement of others in order to develop undeveloped subjective dimensions of themselves, to develop into new people, or for societies to develop into new societies. Privacy is a space where creativity, exploration and dissent can thrive. When we do away with privacy, we limit those valuable resources required to sustain a society. A measure of how well a society is doing is not how well it treats its favoured, compliant and obedient citizens, but how well it treats its dissenters and rebels. Mass surveillance dampens our freedom in all sorts of systemic ways that go largely unnoticed. It removes many behavioural choices without our even really being aware that they have been excluded from our options. There is plenty of research to corroborate the fact that when people are in a public setting where their behaviour is being observed, or where they know they might be watched in some way by others, their behaviour is markedly more conformist and compliant. Human shame is a very powerful motivator, shaping behaviour towards conformity. When people are being watched, they make decisions that are not necessarily the product of their own innate agency but are in fact more an expression of the will of others and of their society's orthodoxy. Surveillance creates an invisible set of constraints that pushes behaviour towards conformity. The idea that government regulation is somehow going to solve the privacy equation is somewhat naive given the current context.
It's important to appreciate that the power of the technological process of change that is underway far outstrips our existing institutional capacities to deal with it. Professor José van Dijck of the University of Amsterdam notes this when she says: What's at stake here is not one platform or one thing, it's the credibility of the system, in which commercial and state public interests are becoming increasingly intertwined and very hard to discern. The core of the problem, the paradox, is that public values are no longer rooted in public institutions. Deinstitutionalization, deregulation and globalization have really caused the erosion of what public value is all about. I think, and I regret, that public institutions are alarmingly underprepared for the questions raised by this global information influx. If you are serious about finding solutions to issues such as privacy, then you really need to be working with the technology and not against it. There is clearly a shift that needs to take place in data ownership and privacy for the platform economy to arrive at a more sustainable model. The current model, where data by default becomes the property of private organizations and is stored in centralized data centers, needs to change to one where data is the property of the individual and is public only to the extent that it needs to be, thus reducing risk and negative externalities. The blockchain can, and probably will, play a central role in this. The next round of internet applications built on the blockchain can enable the creation of distributed social platforms without centralized management, where data is secure and owned by the user, with it then being up to them when and how they share that data. The blockchain gives us the technological means to build platforms that would take us to a much more sustainable data future.
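The core property being relied on here can be made concrete with a toy sketch. The following is not any real blockchain implementation, and all the names in it (ConsentLedger, research_app, and so on) are hypothetical; it only illustrates, using Python's standard library, the mechanism the argument assumes: an append-only, hash-chained record of a user's consent grants, where any attempt to silently rewrite past entries is detectable.

```python
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents (sorted keys for stability)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


class ConsentLedger:
    """A toy append-only ledger: each entry records a user's grant or
    revocation of access to some of their data, chained by hash so that
    past entries cannot be rewritten without breaking the chain."""

    def __init__(self):
        # A fixed genesis block anchors the chain.
        self.chain = [{"index": 0, "prev": "0" * 64, "event": "genesis", "ts": 0}]

    def append(self, event: dict) -> dict:
        block = {
            "index": len(self.chain),
            "prev": block_hash(self.chain[-1]),  # link to previous block
            "event": event,
            "ts": time.time(),
        }
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        """Recompute every link; any tampering breaks at least one link."""
        return all(
            self.chain[i]["prev"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )


ledger = ConsentLedger()
ledger.append({"user": "alice", "grant": "photos", "to": "research_app"})
ledger.append({"user": "alice", "revoke": "photos", "to": "research_app"})
assert ledger.verify()  # intact chain verifies

# Rewriting history is detectable: altering an old entry invalidates the chain.
ledger.chain[1]["event"]["to"] = "ad_broker"
assert not ledger.verify()
```

In a real distributed platform the chain would be replicated across many nodes and entries would be signed with the user's private key, so no central operator could forge or erase a consent record; this sketch shows only the tamper-evidence half of that design.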
The algorithmic regulation currently emerging in contemporary democracies seems to provide a one-way mirror that allows institutions to look down and survey those below, while those below have no real prospect of peering into, let alone understanding and challenging, the algorithmic black boxes that increasingly regulate our lives. But it doesn't necessarily have to be like this. As Kevin Kelly of Wired Magazine says, our central choice now is whether this surveillance is a secret, one-way panopticon or a mutual, transparent kind of coveillance that involves watching the watchers. Through open source software and blockchain technology, we can shift this balance of data power from centralized institutions to individuals, building systems that put data in the hands of people, keep it secure and make it accessible only with their consent. We can now build systems that have a two-way transparency and accountability to them.