We in Brussels know how impressive the work of the Institute is as a thought leader in politics and international relations, and therefore it is a great honour for me to be here. I am a member of the judiciary, but I am not here to share legal comments with you. I have two basic messages for you today. The first is that we cannot approach the challenges of the future with the tools of the past. The second is that new rules to harmonise the European Union legal framework are needed, but much more is needed outside the legal dimension.

Let me start with a personal comment: I have had the honour to be in the business of regulating data processing for over 20 years, as an academic, part-time and full-time, as a regulator and enforcer, and now at EU level. I have had the unique privilege of being part of different generations of data protection approaches, and what I am saying builds on this experience. For most of this time, so more than 20 years, data protection was treated as a technical abstraction, and therefore kept on the margin of political discourse. Since yesterday, I am sure you would like to speak about the judgment and ruling by the European Court of Justice in a case concerning the very well-known Commission decision on Safe Harbour. Perhaps less known in its legal details, this data-sharing arrangement made headlines in every major news outlet around the world. I do not recall EU data protection ever commanding such a level of global attention. So the timing, and also the location, of this discussion is more than impeccable. I am convinced that we are now at a crossroads, a point in time where law, technology and public opinion seem to converge around a set of core principles and values. I will not introduce this discussion with detailed comments about the Safe Harbour judgment, but let me say that I suspect that, for many of you, the judgment was not a big surprise.
The Schrems case needs to be situated in a distinct line of jurisprudence which has been putting living flesh on the sturdy skeleton of the Charter of Fundamental Rights. Legislators, governments, businesses and citizens can be in no doubt that the rights to privacy and to data protection are important and cannot be ignored. And please do not read the judgment alone: two other important cases are to be considered. They come from two judgments released last week, in the Weltimmo and the Bara cases, focusing on transparency and on the challenge of ensuring proximity and protection for citizens in a globalised world.

But I would like to touch mainly on three aspects of the digital ethics which we want to promote at the European Data Protection Supervisor. This is why I adopted an opinion in September, before travelling to Silicon Valley, and some of the comments I am making now build on the observations and impressions I brought back. The first is: what is the ethical dimension of data processing, and why is it important? The second: what might digital ethics look like? And the third: how do we build the consensus to get there?

But to explain why we are speaking about the ethical dimension, let me begin with a very brief overview of the institution I represent. I lead a very small institution, currently composed of no more than 50 employees; we are likely to grow because of the reform. Like the organisation which Helen Dixon, here in the audience, leads, we are an independent data protection authority. So we are part of the group of regulators, but we are an independent institution. We are responsible, under a specific EU Regulation adopted in 2001, for monitoring and enforcing compliance around the processing of personal data by all European Union institutions, bodies, offices and agencies, large and small entities included, for instance the EU agency which is established here in Ireland.
We are also increasingly responsible for advising the EU institutions, particularly the Council, the Commission and the Parliament, on all matters concerning the processing of personal data. More specifically, we have to ensure that, when introducing soft and hard legislation, they take on board existing data protection principles. My colleague and I were appointed in December last year with a specific remit of being constructive and proactive, and this is why, in our first 88 days, this was our priority: we published in March this year a five-year strategy with our vision for the EU to be a beacon of best practice and forward thinking in how people, and information about them, should be treated with respect and accountability in the digital society. My colleague and I would like to be transparent and predictable, and we would like to be evaluated five years from now on the basis of objective key performance indicators about the results we will have achieved.

This allows me to come back to the main takeaway of my contribution today: what is the ethical dimension of data processing, and why is it important? As with the industrial revolution of the 18th and 19th centuries and the genesis of the human rights movement, there is in my view a similar imperative now, in the age of big data, to safeguard dignity in the digital revolution. Dignity was present in the Treaty of Rome as amended in Amsterdam and Maastricht, but in the Lisbon Treaty it is placed in a very different position: it has been upgraded. Dignity is considered by many as one of the fundamental rights and freedoms, but law professors may say that it is not exactly a right or a freedom; it is something more. Fundamental rights and freedoms may be subject to interferences, which have to be necessary and proportionate in a democratic society to preserve or promote other fundamental rights and public interests. But the Lisbon Treaty does not speak about any interference with human dignity, and we would like to reflect on this.
We know that ethics is an established factor in medicine, and it is also driving notions of corporate social responsibility and environmental responsibility towards future generations. My sense is that scientists and entrepreneurs are realising that the explosion of personal information and the power of computing, artificial intelligence, robotics and virtual reality now raise more profound questions about what it will mean to be human in the future. What will it mean to respect the individual? Who will be responsible and liable for decisions which affect people in their daily lives, in a world of ubiquitous computing where we all use similar platforms and similar devices? Even Pope Francis has written about the so-called mental pollution of the mere accumulation of data divorced from genuine human exchange and self-reflection.

In practical terms, I see now in cloud services a general reluctance for anyone to take responsibility for what happens to the data: nobody wants to hold the encryption keys, and many data controllers claim that they simply do not know where the data is. Surveillance is the business model of the internet, but most people have not subscribed to this. The European Union is now approaching a new strategy concerning the digital single market, and some players seem tempted to import the West Coast approach, where there is no principle of data minimisation or purpose limitation, and where any obstacle to data flows is considered bad for innovation. Meanwhile there is a sort of security ratchet, where it is now assumed that law enforcement and the security services, to keep pace with terrorists and cybercriminals, must simply collect, and store indefinitely and indiscriminately, all the data available. We see this with data retention. We see it now with PNR, which will be the subject tomorrow of an important meeting of the European Council in Brussels.
Security, which used to be a concept closely connected to privacy, is now deliberately left vague and undefined, so that it can be used to justify more and more intrusions into the private lives of more and more people. In my view, an ethical approach looks at the longer-term implications of this trend for society and individuals, and seeks to identify new norms to prevent unintended consequences.

But what might a digital ethics look like? The EDPS 2015 opinion on the ethical dimension outlined what we call a big data protection ecosystem, a sort of interdependency consisting of regulators, businesses and organisations processing information, engineers and designers, and also individuals themselves. It was a recognition that good laws are, as I say, necessary but unfortunately insufficient. Controllers need, in our view, to be aware of, and much more accountable for, the impact of their business decisions on individuals. Think for a second about the Google Spain judgment of last year. Google was not providing a public or charitable service; it was making business decisions through its algorithms, driven by legitimate business interests. These legitimate interests can of course be considered, but they cannot take precedence over the fundamental rights of the individuals concerned by the data, including the right to be informed. Technology, in a nutshell, is not value-neutral: it is the result, in my view, of human ingenuity, reflective of the value system of those men and women who build it. It is only now that people are beginning to realise the prescience of Lawrence Lessig's declaration, around 15 years ago, that code is law. In other words, the rules and standards which are devised to govern cyberspace, like the anonymity or traceability of individuals, are at least as powerful as the formal legal framework applied by the courts.
The problem is that the internet which has emerged was devised by brilliant designers, scientists and technicians who did not necessarily understand, reflect on, or translate into practice fundamental values like human dignity, privacy and freedom of expression. We are starting to change that: bringing together legal experts and engineers will be crucial in the long term, an exercise in the sustainability and competitiveness of the digital market in the EU. Our motto is that not everything which is technically feasible is, as such, morally tenable. So we are thinking about how we can introduce more transparency and how we can empower individuals: in a world where we would like not to slow down information and innovation, we want to allow people to be more in control of their information. We would like to see people treated more as individuals, as human beings, and less as users, consumers and subscribers. People are prosumers of content online, and therefore structures must be in place to address power and information imbalances.

But how do we build the consensus to get there? We aim simply to give a kick-start by pulling together a data protection ethics advisory board at our institution. This panel will operate transparently, will be a resource for the EU generally, and will include visionaries from outside the data protection community. I really count on having preliminary results in no more than a year and a half, and on organising an international event in 2017 to enlarge the discussion. Data protection authorities will play an important but not exclusive role in this area, because they will have more resources and, of course, more expertise. Therefore we have to promote more accountability, independence and transparency, and our capacity for forward thinking needs to be enhanced. We have to speak more with one voice as data protection authorities, and we must be freed, as data protection authorities, from unnecessarily prescriptive procedures.
We are insisting strongly on de-bureaucratising, as much as possible, the requirements in the reform. We think that flexibility is needed in terms of formalities: an end to bureaucratic privacy. Let us make a distinction between safeguards, to be implemented in an innovative way, and formal prerequisites which are not essential as such. And on a global scale we need partnership at the international level. We need to build bridges with other regions and countries where the approach may be different but the values are shared. Of course, this week the focus is largely on the EU and the US. One commentator in The Guardian on Wednesday said the Atlantic just got wider, but in fact we still have a great deal in common. The US is still a strategic partner for the EU, and therefore I think that, building now on this important judgment, we should start a new process which could be a platform for the rest of the world. A few weeks ago I counted over 120 countries with data protection laws, so Europe is leading by example, but we are a minority. At the same time, those countries are increasingly following the European approach: not in the letter of the law, but in the approach to the protection of individuals. So this, more or less, is the challenge, and therefore it is simply a pleasure to look forward to our discussion. I thank you for your attention.