It's a fascinating subject, and I think it's one we're just starting to get our minds around. It's driven in large part by the fact that the technology is changing so fast: the ability to store, compute, and link data has become so easy and so inexpensive. There are a lot of risks associated with it too, such as the invasion of privacy and the misuse of data. So the regulator, or the policymaker, to the extent they actually have the authority to control it, has to balance both of those things: the opportunities versus the risks.

There are several risks, but probably the most important, and the one most often focused on, is personally identifiable information, or PII: the identification of the individual. That concern is driven partly by the individual's right to privacy, and also by the individual's right to decide, the freedom to decide on his or her own. The other threat we sometimes concern ourselves with is that analytical work on data can actually be predictive: it can determine, or predict, that an event might happen in the future that hasn't happened yet. That could be quite dangerous, so certain protections should be put in place to make sure this predictive technology doesn't get out of hand.

It's interesting, because in the paper we wrote, we actually said that this is more science than fiction. We cited Minority Report, the 2002 movie that Tom Cruise starred in. And if you look at the research that's been done really in the last two or three years, not in the last twelve years but in the last two, it's frightening what analytical work is able to predict and determine.
One of the concerns we always have is that if the predictive technology gets out of hand, then certain of the freedoms we have might get out of hand too. In the paper we actually recommended that there should be a set of balanced and responsible restrictions, ones that recognize that predictions, that analytical work, should be guarded against. Part of the problem you have is that if those controls are too strict and too onerous, you might actually lose the opportunities. And always keep in mind that there are many stakeholders involved in this. The purveyors, if you will, the people who actually collect this information and use it, have a stake in this too, because if they violate that trust, they could wind up losing customers. In the discussions, and in the paper that we wrote for the GSR this year, we actually talked about that, and we talked about the migration away from certain social media toward more protective social media. So the market is actually starting to work.

At the risk of quoting an excellent book that I just finished, released about two or three months ago, it's called The Naked Future by Patrick Tucker, a wonderful treatment of what the future will look like. The author spends about eight or nine chapters leading up to what I will call the final chapter, and there is a marvelous anecdote at the very end where he talks about what it will be like boarding an airplane in ten years. That is essentially what the future is, and what he said, which I thought was fascinating, is that it will be very similar to what it was like 30 years ago: we will get out of our taxicab and walk through the airport.
We will not walk through security; we will walk right onto the aircraft and take off. The big difference is that with big data, everybody will be known, with facial recognition, with intention predictions, and things like this. So the risk you run, and I thought he was quite clever in how he put it, is that you hope you don't get tapped on the shoulder just as you're about to board your plane and hear somebody say, "I'd like you to step aside." What he was predicting is that a lot of the analytical work will happen behind the scenes. The thing is, that work really should be responsible.