Welcome, everybody, to this, our second seminar in the Health Law Institute seminar series for 2019-2020. My name is Sean Herman. I'm a policy analyst and part-time faculty here at the Law School. I teach Health Systems Law and Policy next semester, so if there are any health law students in the crowd who have not yet registered, please consider doing so. For those of you who don't know, the Health Law Institute is a collection of faculty, staff, researchers, and students who are interested in advancing health care and health system improvement in Nova Scotia, Canada, and beyond. For more about the Institute, please see the HLI website. The seminar series is a platform for sharing research and ideas around issues relating to health law and policy, and I have the pleasure today of introducing Professor Kim McGrail. Professor McGrail is a professor at UBC in the School of Population and Public Health and the Centre for Health Services and Policy Research. She is scientific director of Population Data BC, and the list goes on: data director for the BC Academic Health Sciences Network, PI for the SPOR Canadian Data Platform, founding member of the International Population Data Linkage Network, and founding deputy editor of the International Journal of Population Data Science. Her research interests are around quantitative assessment of policy, aging and the use and cost of health care services, learning health systems, and all aspects of population data science. Professor McGrail will speak for about 40 minutes, then we'll open it up to questions and comments, and at about 1:40 we'll thank Professor McGrail, because another class will be coming in. Without further ado, Professor McGrail.

Thank you very much for the overly kind introduction and for the opportunity to be here. We're going to be talking about a topic I absolutely love and could probably speak about for days, hours, weeks; I don't even know.
So I'm going to give you a view of the way I see the world changing and what that means for how we think about the use of data. Disclosures before I get underway: I have no conflicts to declare, but I would absolutely love to acknowledge my colleagues who have really shaped my thinking and a lot of the things you're going to hear about today. There are many more colleagues who could be named, but Mike Burgess and Kieran O'Doherty have been particularly influential in the public engagement piece of what I'm going to be talking about. I also specifically want to thank the deliberation participants we've been working with; you'll hear more about that as the talk goes on. Before I get underway: I really enjoy interaction and questions, so if there's anything that needs to be clarified immediately as I'm talking, please don't hesitate to let me know. I will make sure not to go over 40 minutes so that there's lots of time for questions and discussion afterwards as well.

Here are my main messages, for anybody who just wants the shortcut and then wants to get up and leave. This is the main thrust of the talk, and it also provides the overview for how I'm going to work through the material today. First, society is changing in this data-everywhere world we live in, and research is changing as well, including different ambitions as well as greater possibilities for research we might be able to undertake. There are the known unknowns and the unknown unknowns we have to think about as we delve into this brave new world we're all in together. The implication, I think, is that we can't rely on doing things the same ways we've done them in the past, and I firmly believe that one of the ways forward has to be around public involvement: true involvement in policy setting around data use. So this is about citizen involvement, with the framing of this digitized world.
Okay, so let's get into the content. I'm quoting myself here, but I wanted to put this ethical imperative up as a starting point, because I think it's important for you to know my frame coming in; it will obviously influence everything else I say in this talk. I absolutely recognize that this is, at some level, a value-laden, normative kind of statement, but I really do believe that if we have a society that is offering services, particularly public services, and we're collecting data in the process of providing those services, then we have an obligation to use that information to make things better. That's why I frame this as an ethical imperative: the data exist, and they should be used. Now, they should be used in a way that is privacy-sensitive and consistent with societal norms, and that informs the rest of what I'll be talking about today.

Okay, so this is the "society is changing" part. What this graph shows, by year across the bottom, is the total amount of information available in the world, and the red line is the percent of information that's in digital form, meaning it's in computers and readily accessible, usable, shareable, movable, and so on. You hear funny statistics around this, like "90% of the world's information has been created in the last two years." That would be a pretty hard thing to prove or put precision around, but it's trying to emphasize the pace of change, and the fact that everything we talk about now is readily available in some form to be used by someone with computers. As this graph shows, we're getting to the point where virtually all of the information we have is in computable, digitized form. This is a sea change, and you can see that just from the fact that the uptick has really happened in the last two decades. And then we
live in this world where data follows us everywhere, and there are these messages out there about data as an asset. This is the one that gets me: "data is the new oil." I think we're all sort of sick of the old oil, so I'm not sure why we need a new oil, but this is one of the messages; the implication is that data is a resource available to us that we can maximize, and that data can fuel the global economy. That's the other thing we hear quite often. So this is part of our world now that's changing rapidly, both in the availability of data and in the way we think about data and our interactions with them.

I'm deliberately not going to talk much about "big data," which I think is a terribly overused term, but I thought it was probably important to talk about the aspects of the definition of big data that actually are relevant to a lot of what we're discussing. This is the three Vs: big data is defined by variety, because there are just a lot of different data sources out there; volume, because it's massive in terms of how much storage space it takes up; and velocity, in terms of how fast we're collecting it, how much new data is coming in all the time. In parentheses there are other Vs that people add, like value and veracity, but I feel like those get into the realm of advertising big data as opposed to actually defining what big data are, so I'll stick with the three Vs. The other way people talk about big data is that it's the exhaust of our daily lives. Again, terrible analogy, why do we want exhaust, but it is true: even sitting here, we're generating data. Your phone is on GPS, and somebody's keeping track of that. At UBC we use GPS information to control the lights in buildings: if there's nobody in the building, the lights go off. So there are good uses that can come from this, but the generation is ongoing, constant. There's another way people have started to
think about this changing societal context. What this picture shows is the different industrial revolutions: we had the steam revolution, the electricity revolution, the computing revolution, all things that changed society quite dramatically. Now we're into what people are calling, at least a few people, I'm not sure this is general parlance at this point, cyber-physical systems, which is really about a blending of the physical, digital, and biological worlds. I think the important part of this quote is that "this revolution is certain to alter the way the human race lives, works, and relates to one another." There's a much, much broader social context and structure we have to think about. I'm working mostly in the realm of health data and population health data, wellness and that sort of thing, but the situation we're working in is far broader than that, and we can't think about this only within the health care system, or I think we're missing an important piece of what's going on. Fundamentally, I believe this is going to alter society in ways we don't really understand yet; that's the unknown-unknown part of what I'm talking about.

I'm going to try a little parallel with you; we'll see if it works. I think part of what we're dealing with is what my parents' generation experienced with the introduction of television. Television was a completely foreign concept to my mother: she didn't grow up with it, partly because she was, you know, on a farm. So when I was a child, television was this foreign thing that was scary; we weren't sure what it was going to do, and it was going to melt your brain if you sat too close or watched too much. I think we now understand more about both the positive and negative aspects of TV, and the wonderful things that are being made and are available to us on things like
Netflix. But at that time it was a scary new innovation, we just didn't have experience with it, and nobody could truly predict what was going to happen. I think we're in that same situation now with the digitization of society, maybe even to a greater degree, although we always have a habit of thinking that the changes happening now are somehow more cataclysmic than changes in the past. Retrospect will tell us; but it is a fundamental change.

Now I'm going to switch and talk a bit about the changes in the research realm. What does all of this mean for research and research possibilities? There is this big data thing, and it really is as much about the variety, and that's what I'm going to concentrate on: the variety of different data sources we might be able to bring together for research purposes. Even if we just think about health and health care, what we might want to bring together could be: what do we know about the circumstances of your birth? What was your socio-economic status growing up? Do we know something about your parents? What is your health care experience? What is your health profile? What are your habits now? Are you employed? What's your education? All of those things can inform both your health status and what you might need from the health care system; they can also inform how you're going to respond to treatments offered to you in the health care system, to take the health angle on this. So all of those things, while not strictly speaking health data, can be very, very informative for research, and it means we can actually understand things a lot better. For those of you who are not researchers by training: a lot of what we do is based on regression analysis and predictive models, and in the end we have a huge amount of unknown. We have an error term, and most of what we're dealing with is unexplained variance at the individual level
and so on. The idea is that big data, these new data sources, linking a whole bunch of different things together, can help reduce that error and create more certainty about how you got to where you are and what might help you achieve wellness and the ability to do the things you want to do in your life. That's combined with the computing systems now available to us. It's not just that we have big data; it's the computing techniques, the ability to actually harness the data, put them together, and use them to produce analysis. If we had big data but the computing infrastructure of the 1980s, we would not be able to do the things we can do today. Artificial intelligence, deep learning, neural networks, and all sorts of things take a lot of power. So if you put these things together, you get really, really happy researchers. That's not me in the picture, but the happiness is because it's now possible to investigate things we've always dreamed about but that weren't possible, because the data weren't there, or weren't available, or were locked away, or were on paper and too hard to put together. The possibilities are enormous.

Then, if we're pursuing this and we want to think about research in a new age, we have to think about the balance of rewards and risks. I said the ethical imperative is to do research, but I also hope the emphasis was clear that it has to be done in an appropriate way: there are risks, and we have to be aware of them. I'll start with the rewards. There's, frankly, the kind of nerdy reward of figuring out how to find data that are relevant, how to link them together, and how to do the analysis. That's probably a very academic, pointy-headed way of thinking about it, but there is a reward in there. There's the reward of new discovery: the fact that we as
researchers might be able to discover something that advances our understanding of what makes societies healthy, what makes populations healthy, why we have such differences in health status across our populations. That's really, really compelling and exciting to think about. That discovery can lead to things like understanding what drugs we should be developing, but it can also lead to understanding the implications of pollution, weather patterns, and the communities we live in, and how those affect our health. Those are really important things we could understand, and if we understood them, we could create better public policy and all those other things that help even out the distribution of health. This is Clyde Hertzman, a mentor of mine who really was my inspiration in thinking about that ethical imperative, but also about the importance of understanding. His focus was on early childhood, and what he was really pushing to understand, with the use of data, is how early childhood experiences affect your entire life. This is not just about how you do in kindergarten or second grade; it's about what your chronic disease pattern is as you enter your 50s, 60s, and 70s, what your resilience is, and all these other things. Certain periods of your life, certain circumstances, have huge effects, so understanding that with the use of data can be really powerful.

But okay, then we do have to acknowledge the risks. Again, there's a risk in terms of identifiability, or potential identifiability, of data. When we think about being privacy-sensitive in the data world, it's about ensuring that the data we're using can't re-identify somebody and can't be used to harm an individual person in any way. We have great ways of de-identifying data, but once you start thinking about linking together all the different data sets I described, I would
argue that it becomes just about impossible to really say that something is de-identified. There are an awful lot of examples out there where a data set has been "anonymized," quote-unquote, and put on the internet, and within 48 hours some computer scientist has gone and identified the governor of Massachusetts, or some other individual, based on a little bit of external information and enough digging in the data. So I don't think we can rely on that; I don't think it's safe to say we can just de-identify the data and carry on as we have in the past. It gets even more complicated if we start thinking about linking in things like genomic information, because geneticists who are honest with you will tell you that genomic information on its own is not de-identified: there's just too much richness in that information. It would take some effort to reverse engineer, but this is about the promises we make to society, and I don't think that's a supportable promise. And there's another layer here: once you add in genomic information, and perhaps sometimes even without it, it's not just about the potential re-identification of the individual; it's also about the ability to identify family members, community members, and so on. I don't mean that to sound scary; I think it's just something we need to acknowledge, that we can't rely, as we have in the past, on de-identification being the way we proceed, our crutch for saying what's okay and what's not acceptable.

So this is my summary of that: culture and society and research are changing, and we're not in Kansas anymore. I don't love Kansas, so that's okay; I'm from Michigan, I feel like I can say something about Kansas. And, you know, the new world in the Wizard of Oz is colorful and exciting and has danger, but also friendships. So maybe that's fine,
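The linkage attack just described, an "anonymized" data set re-identified with a little external information, as in the Massachusetts governor example, can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the data sets, names, and quasi-identifier values are all invented for the example.

```python
# Sketch of a linkage (re-identification) attack. All records here are
# synthetic illustrations, not real data.

# A "de-identified" health data set: names removed, but quasi-identifiers
# (ZIP code, date of birth, sex) retained.
health_records = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1972-03-14", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02138", "dob": "1988-11-02", "sex": "F", "diagnosis": "diabetes"},
]

# A public data set (e.g. a voter roll) with names AND the same attributes.
voter_roll = [
    {"name": "W. Weld",  "zip": "02138", "dob": "1945-07-31", "sex": "M"},
    {"name": "J. Smith", "zip": "02139", "dob": "1972-03-14", "sex": "F"},
]

QUASI_IDS = ("zip", "dob", "sex")

def reidentify(health, external):
    """Join the two data sets on quasi-identifiers.

    Any health record whose quasi-identifier combination matches exactly
    one named person in the external data is re-identified, even though
    no name was ever present in the health data."""
    index = {}
    for person in external:
        key = tuple(person[q] for q in QUASI_IDS)
        index.setdefault(key, []).append(person["name"])
    matches = {}
    for rec in health:
        key = tuple(rec[q] for q in QUASI_IDS)
        names = index.get(key, [])
        if len(names) == 1:  # a unique match is a re-identification
            matches[names[0]] = rec["diagnosis"]
    return matches

print(reidentify(health_records, voter_roll))
# {'W. Weld': 'hypertension', 'J. Smith': 'asthma'}
```

The point of the sketch is that no name ever appears in the health data; the re-identification comes entirely from a combination of ordinary attributes that happens to be unique to one person, which is why de-identification alone is a fragile promise once data sets can be linked.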
but I think you can't operate in the new world the same way you did in the old world, and I think we need to take that to heart and do something about it.

So, how many of you have heard the story about Target and the use of data? Okay, just a couple, so I'll tell it very quickly. It relates to risk, but I think it also starts to relate to how we're going to think about this in the future. Target collects a whole bunch of information any time you shop there; like any place these days, really, it collects massive amounts of information. It then uses the information about your shopping patterns, runs predictive models, figures out what it should advertise to you, and sends you advertising. They had a specific interest in advertising to people in the early stages of pregnancy, because the idea is that if you can capture a customer in the early stages of pregnancy, you're going to get all the expenditure for baby stuff when the baby comes, and onward. The story is that a 15-year-old girl received coupons and advertisements for cribs, diapers, all this sort of thing, and the father of this child flipped his lid, as you would, and went to Target and said, who are you to be sending advertising to my daughter, she's 15, and so on. A week later he called back and said: you knew before I did. She's pregnant. It's wild, eh? But we've all experienced this. How many of you have done a Google search and then had something you searched for chase you around until your next Google search? This is how your data are being used now. How many people are happy about that? Sam's happy, okay.

So this kind of prediction algorithm is possible with the data that we have. Let me give you a couple of scenarios. One: we're going to link a whole bunch of data together and use it to predict
how you're going to respond to a specific drug, and we're going to target your drug prescribing accordingly. Now, how many people would be happy to hear about a data use of that sort? Okay, more than the Target example. Now we're going to take the same data set, put it together, and give it to an insurance company, and the insurance company is going to risk-rate your premiums: it's going to start charging people who live in neighbourhoods with a higher proportion of sick people greater premiums, while other neighbourhoods get lower premiums. How many people would be happy about that? Okay, not even Sam. [Audience comment: I think we have a correlation-causation issue there.] Yes, maybe they didn't collect quite the right data. The point of those two examples, and the latter one, the insurance one, is actually a true story that happened in the UK, not in Canada, is that context really matters. We can talk about the same data sets being brought together, but who's using them and for what purpose informs how we feel about it in the end.

The last thing I want to say about how this whole world is changing is that it's a bit of a Pandora's box, and the reason I say that is that in the digital world a mistake can't be undone. Once data has leaked out, and what happens sometimes is that a leak goes to the dark web and things are sold, it's gone forever; the damage is done. You can't put the stuff back in the box. But for those who know the story of Pandora's box, what's left is hope, so I remain an optimist that there's something we can do here. We just need to be really honest about what's going on.

So now I'm going to move into why the future needs to be different from the past. I want to acknowledge, because I'd
be foolhardy not to, here at the Health Law Institute, that we have a really robust legal framework around allowable uses of data, and we also have an ethical framework that researchers and others live by, and together these create the system that constitutes our data use enterprise. What I'm trying to argue is that that legal framework likely needs to be updated and for the moment is probably not enough; in fact, I would say it's certainly not enough. We need to go beyond it in thinking about how we're doing things. This is my way of saying that perfecting, or adding layers to, the way we do things now is not going to be the right answer. I have a whole different talk I could give on the limits of individual consent, and I won't go further than that today, but I don't think what we need is additional pieces on top of what we have; I think we need to think fundamentally differently about how we're doing things.

The other, maybe fundamental, piece is the switch to thinking about the public's involvement in all of this. In the research world, we think of the hat as the empirical piece, the knowledge base, the things we could understand if we could only do some data analysis; and if we could do that and create the right evidence, then we could make a decision that's the right decision. I don't think we're in that world. We're not in a world where analysis of existing data is going to tell us how we should run our data governance and data use systems in the future. So we have what I would call the positivist hat, the empirical hat, and we keep trying to pull a normative rabbit out of it. What I'm saying is that, in the end, there's a values basis to this; there's not a study that could be done to tell us the right way to go. And if that's the case, if we're talking about values and norms, we really, really need to involve the public. This is
the fundamental way I'm thinking about this. There's a lot of debate in civic democracy: we elect representatives, we elect politicians, they're supposed to be our representatives, they make decisions. I think that's true in general, but for what we're all embarking on, there's no way a politician understands this new digitized world any better than the rest of us. So do we want to cede all of that to representative agents, or do we actually want a wider civic discourse and public involvement? You can see where I land on that.

Okay, so now I'm going to talk about how we might think about bringing the public into these conversations and giving true involvement and leverage to public opinion. I was part of an international group that worked on a consensus statement on public involvement and engagement with data-intensive research, and I just want to quickly assure you that there were actual public members involved in this; this was not researchers writing things on behalf of the public, which would be terrible. The key message that came out of it is that the public should not be characterized as a problem to overcome, but as a key part of the solution. Then we talked about how to do this: we raise awareness and enable people in many different ways to participate in research and in data governance, and we want to reduce the distance between data users and the people to whom the data relate. I'm a quantitative health services and policy researcher who does a lot of work with existing data sets, and only recently have I started to do that with members of the public as part of my research team, in part because it seems kind of odd at first blush: I'm not doing interviews, I'm not doing qualitative work, so what is the public going to bring to this kind of dispassionate data analysis? But in the end it's not dispassionate. I'm making decisions about what outcomes I want to measure, I'm
making decisions about how to interpret the findings, and those can all be informed by people who actually have experience of the things I'm studying, either themselves or through family members. There's also bridging the gap between members of the public and the data about them, which is the other side of the same coin. What we want in the end, the ultimate goal, is to have a social licence, or, another way of saying this, public acceptance of what we're doing and how we're doing it.

This framework represents a possible way forward; it's one we're starting to use more and more in the data use and data governance world. It comes from the UK (the "Five Safes"), but it's being adopted in a number of countries internationally. The framework says: here are the things we need to think about when we're thinking about the use of data for research and other analyses. Are the people doing the work safe? Safe as in: do they know what they're doing, are they accredited in some way? Are the projects safe? Do they have scientific peer review? Is there some public value that might come out of them? Are the settings where the research happens safe, so the data will be protected, not just as we're putting them together but as people are actually using them for their work? Are the data safe? Have they been at least de-identified to some degree, without making promises that they're anonymous or fully de-identified? This would be things like taking the names off; we would never do research with data that has names attached. Is the output safe? Could the things we're letting out into the world be used to identify people, even though they're supposedly in aggregate form? That seems like a pretty good framework. I haven't run across anybody who's suggested there's something missing from it, and it does incorporate legal and ethical standards if we think about how to operationalize it. But I think it's also pretty clear that
there's a lot of room for a lot of different kinds of rules. The framework gives you the categories to think about; it doesn't tell you how to do something, and there are going to be edges of uncertainty: new things, new tech, new data that come along that won't fit into the existing rules and buckets we have, where there's not necessarily a legal framework or an ethical framework to fall back on. It's going to be something we have to develop and implement ourselves. So in the end I think the path forward will be not abandoning the legal and ethical pieces we have, but adding this framework to them; and in order to really operationalize the framework, we need to add the public voice as well.

Okay, so I'm living this; I'm trying to do it, trying not to be just a theoretical academic. We have a data system in British Columbia, we're trying to think about how we're going to handle some of these new ideas and ambitions, and we're doing some public engagement work to put a frame around that. The photos here show, on the left, Professor Kieran O'Doherty from the University of Guelph, and on the right, Professor Mike Burgess from the University of British Columbia. These are the two people I'm working with on this particular project; there are lots more involved, but these are the people who developed the methods we're using. As for the deliberation we're doing: I don't think we can ask people a survey question about how they feel about this. It's too complex; you really need to know something about this area, and to get caught up to speed on what's going on, including some of what I've just been talking about, before you can really deliberate on these things. So we think this long-form way is the right way to get public input, at least at this stage of the game. It actually runs over four days, so you get
people to come for four days of the event. We work in small and large groups, we ask people to come with diverse perspectives, and we ask them to come with passion; but when they state their positions, we ask them to provide the reasons behind them. It's not just "I agree" or "I don't agree," but: can you tell me why? Because that's how we're going to understand diversity and, more to the point, how we can use that diversity to find rules that can accommodate all of us living together. So this isn't about building consensus; it's not a meeting where everybody has to agree on the same thing. In the end we actually want to find both the areas of agreement and the areas where agreement starts to fall apart. So it's really not a focus group, it's very different from a focus group, and it's certainly not about making mini-experts out of people. We're not trying to change people into lawyers or privacy experts or researchers; we're trying to give them enough information to be able to deliberate with some background, and not just give an off-the-top-of-your-head sort of response.

The way we run this is with 25 or so demographically stratified participants. What we're really trying to do is not assemble a representative sample; it's really about maximizing diversity. We do this by getting people who live in rural areas and urban areas, who are in different age groups, of different genders, of different socio-economic positions. We're not asking about attitudes and values; we're just trying to pick demographic characteristics that have been either assumed or shown to be related to different attitudes and values. We bring that diverse group of people together, we give them a booklet in advance that lays out the area we're talking about and some of the background information, and we let them know the general area for the deliberation. We have
speakers that come, and then, of course, this is like planning a wedding, honestly. It's a big four-day event with 30 people; that maybe doesn't sound like a lot, but my goodness, it's a lot, down to questions like: do we have enough notepads and pens and paperclips? This booklet is what we developed for our first deliberation, which happened in April of 2018. It's available online, and I'm happy to share it with anybody. At the deliberation itself we had four speakers come; this is the other part of the education component of getting people ready to engage in the deliberation. What we were really conscious of in choosing the speakers is that you need, number one, people who are passionate about their position and what they're doing, but also people who will represent the array of opinions out there, professional and non-professional. At this event we had a data steward come and talk about the challenges of guarding data and making decisions about its use. We had a researcher come and talk about the difficulty of putting together a linked data set on opioids; it took her 14 months, I think, to put that data set together, for something we consider a public health crisis in British Columbia, which seems a bit shameful, and that's part of the reason we had that researcher come. We had a privacy advocate come from the BC Civil Liberties Association to talk about the dangers of a surveillance society and the bad things that can happen with data use. We had a person living with a disability come and talk about the importance of research from that subject orientation. And we had a member of an Indigenous community come and talk about the stigma that can come with assumptions about you based on analyses that other people might have done. So that was a big setup. Then we hold the deliberation, we work with the deliberation panel to create recommendations, and at the end we
have policymakers come who receive the recommendations, so it's very clear that there is a receptor audience. We're not wasting people's time, and we're not asking questions that policymakers aren't interested in, because that is a disrespectful way to engage the public: if they believe in the end that their input is going to go nowhere, especially when we're asking for four days over two different weekends. Here's a quick look at what the deliberation itself looked like: a big U-shaped table, and we record everything and have it transcribed so we can do qualitative analysis later. But really what I want to go through quickly is the recommendations that came out. There were four areas of recommendations from the public, the first about governance of linked data. First of all, they were very supportive of research; this group wants research to happen, and all these different data sets should be able to be linked together and made available, but with the right context and considerations around data use. They did want the access process to be efficient and to allow fast-tracking, but they were not interested in sacrificing quality control for efficiency, so this wasn't speed for speed's sake; it's just that once you know what the rules are, make it as efficient as possible, and this is consistent with the international literature. The involvement of commercial entities is definitely more contentious: we did not get strong agreement from participants on whether that was okay or not okay. And there was some concern about centralizing data, so they don't necessarily want all the data in one place, even though that would be easier and simpler for researchers; nor do they want centralization of authority. If you thought about a big data system, they weren't really interested in boiling that down to having one person able to say yes or no, just because there are so many things to consider in all of this. The security and review process is the second area. Scientific review is
absolutely essential; nobody wanted to give that up, and data protection during the research process was considered really key. They also absolutely identified the risk for populations and communities: even if you have a scientifically sound proposal, there could still be a risk to a sub-population that you study or to a particular community, and that needs to be taken into account. Then, moving on to the responsibilities of researchers and data stewards: researchers retain responsibility for thinking about their research and vulnerable populations, and about what that means when they finish their product. They also see an essential role for data stewards and asked a lot of questions about the training and certification standards for data stewards, which I will just tell you are nonexistent; there's no standard job description or training or anything for data stewards, it's all over the place. They did want formal governance around data, and they thought the researchers needed training too, specifically around privacy and ethical data use. And then we asked them about public involvement: how should we involve the public? One of the things they said is that we just need transparency. Most of the group said, you know, probably the vast majority of the population doesn't care; they just want to trust that this is all happening, and part of earning that trust is being really clear about what you're doing and being transparent about how data are being used. Don't hide behind anything. Our next deliberation, which is happening in October and November, is going to follow up on this: how do we engage the public in an ongoing way? Okay, so this is my last slide, back to the main messages. Society, I hope I've convinced you, is changing. Data are great, I love data, but they have raised new issues for us. Research is absolutely changing; we could have a whole different discussion about that. We do have to be
conscious of both what we know about the unknowns and also be prepared for the fact that we're going to discover things that we didn't expect; there are unintended consequences as well as intended ones. We really can't just keep going as we are, and I really do believe that public involvement is the way to go. That's all, thank you very much, and I welcome questions or comments from the floor. I'm curious about your sample and how the recommendations came to be generated, whether there was a sort of self-selection bias there, and, in general, do you try to steer away from that? It's a great question about who is our sample and whether there is some sort of bias in who comes. There's no getting around the fact that people who are willing to spend four days talking about this are going to be somewhat different than people who aren't. On the other hand, I'm not sure that's such a big detriment, because all of us have interest in certain kinds of things; I would be more inclined to go to a deliberative engagement around this topic than something on mouse genomics, which is not to say that mouse genomics isn't important, it's just not my area of interest. So I'm less worried about that piece of things. As for whether they were diverse: they were diverse, and they did not all agree at the start or at the end. The recommendations I showed you were the ones that had reasonably strong support, but only a few of them were unanimous, and we feed that part of things back. That's part of why we vote: not because I'm trying to turn this into a quantitative exercise, but because it's a way to give policymakers an idea of where the agreement starts to slip apart, because those are probably the areas we want to avoid. If only half of our deliberation participants are okay with X research, then probably we shouldn't do it, right? As a follow-up on the sample: you seem to put a lot of trust in the wisdom of
communities, and I was wondering whether you would be willing to have your research influenced in the same way if you were taking the same sample from, say, Kansas; I'm going to get you for being mean about Kansas. You know, the fact that I don't like Kansas isn't about the people of Kansas, it's about all sorts of other things. I do have a lot of trust in community, and we have to have trust in community, because it's us, right? I think that, given the right circumstances, people can engage in things, and there is wisdom that comes from the fact that you're talking to people who have to live with the consequences of the decisions we're making. I've been part of this deliberation, and I observed a deliberation in Indiana and felt the same thing. So yes, it is a trust exercise, but there's a trust exercise in putting our faith in politicians too. I guess people do vote against their own interests, especially on something important. That's a whole different conversation; I don't necessarily agree with that characterization, but let's follow up afterwards. I have a question about something from before: in Nova Scotia and elsewhere in Canada, people die every year because medical lab results are either lost in the system or not available, and patients don't have access to the information. In Nova Scotia they started a project that makes access to reports very cumbersome; the consequence is that people don't get their results and some people die unnecessarily. How can you help with mediating this conversation, where there's a balance between privacy and safety involved? Yeah, it's a great question, and there are many different strands in there, but I'll just pick up on a couple. One is that data about you in the context of the healthcare system should be available to you; how we do that is obviously up for debate, knowing that the results may not be
available otherwise. And the information may not get through: quote-unquote insecure email; in Nova Scotia they're suggesting that people can't communicate with their doctor electronically other than through the official system, and that system has its own gaps, gaps in privacy. Yeah, there's so much in that, I'm not going to be able to do justice to it; it's a really great question. I will tell you that there have been secure texting processes and rules put in place in British Columbia that nobody follows, because they make it way more complicated to do anything. So there's not a lot of point in putting a whole bunch of structure and rules in place that people are going to find a way around because they're too complex. I think we have to figure out a way, and I actually think this is a case where, why are we just deciding that we should make that decision on behalf of the public? Why not talk to the public about what's acceptable? But in the end there's a huge culture change that needs to happen in the healthcare community to get there, because we've been afraid to give patients access to their lab results; you know, they won't understand them, people will freak out, they're not going to be able to interpret things appropriately. But we're also supposed to be responsible for our own health, so somewhere we need to meet in the middle on this. It's an absolute imperative, but it's quite a different set of scenarios from talking about data for research, and I'm not the expert in what you're asking about, so I'm going to leave it there. Go ahead. I'm interested in your statement back at the beginning that we have an imperative to use data in a privacy-sensitive manner; it's also linked to the other work around individual consent, and given all that you've told us about the way that big data plus computation creates this brave new world, I'm wondering whether even that sort of expectation of privacy is reasonable
anymore. In terms of some of the public engagement that you've done, is that something that's come up in the discussions: do people think that they can, or should, still have privacy, given all of the things that work against it? Yeah, I think this is such a societal, cross-cutting conversation, not specifically about health, and again there are a number of strands in there. There have been people quoted as saying you don't have privacy, just give up, and it's part of the reason that we were really focused, in our first deliberation, on bringing young people into the deliberation, because the other thing you hear is that young people don't care about privacy anymore. Not true; it's not true at all. But all of us, every day, are making a trade, right? Any time you sign up for a new app you're trading privacy, frankly, for the desire for the service that is the app on your phone. And there's been some research done showing that if you wanted to actually read all of those privacy policies, it would take you 76 work days over the course of a year to do it. So guess what, it doesn't happen. I think what we're up against is this idea that the service at hand is desirable, and a vague threat of something that maybe might happen sometime later isn't enough to guard against that. This is why I don't think it's an app-by-app kind of decision; we need a different kind of approach. And frankly, a lot of the bad stuff, if you will, is happening in the private sector; it's not happening in the public sector. So, to go back to the previous question, we have a lot more strictures on us in the public sector, but I'm here to tell you there's not been a data breach in the research world. There have been lots of data breaches, but they're from private companies for the most part; they're not from researchers. So there's an
imbalance of a bunch of things in here. Go ahead. Considering the principles that have crystallized out of these deliberations, how well does the GDPR legislation fulfill those principles? There seems to be a lot of overlap in the core ideas. Yeah, so, GDPR, for those who don't know, is the General Data Protection Regulation in the EU, and it was a long time in coming, and a lot of researchers were really alarmed when the first iteration came out, because it looked like a lot of the research we do wasn't going to be able to be done anymore. Now there are some clear exemptions in there, so it is pretty allowing of the kind of research I'm talking about. The other interesting connection is that our former privacy commissioner in British Columbia is now the privacy commissioner in the UK, so there are a lot of similarities between that regulation and its expectations and what we're doing now. I would say a lot of us are paying attention to it because it's the newest and most formally articulated, but even GDPR, I don't think, goes as far as we're going to need to go. But it is enabling for the kind of research we're talking about. Thanks so much; I'm wondering if you could describe the relevance of the results and how they're going to be actualized, or what lessons you're taking from them. Yeah, so we haven't done much yet with the data steward recommendations, and those are ones that I would really love to see British Columbia, and really all of Canada, pick up and come to some better understanding on. I'll tell you part of the reason for that: in British Columbia we all live under the same legislative authority, ethical principles and so on, so why is there so much variation in the way data stewards interpret the legislation and decide to enact their principles? And as we get to a point where researchers are
requesting data from 14 different data stewards, we go at the speed of the slowest, so one person out of 14 who has a slight disagreement about something can slow everything down. Maybe there's something legitimate in that, maybe not. So that piece I really want to push; we haven't done much there yet, but I think there's some opportunity. Some of the other things, around the expectations for researcher training and the secure environments, are things that we are absolutely taking up, refining the way we're doing things based on them. So there is a clear policy response to this, but it hasn't gone as far as we'd like it to quite yet. I wanted to follow up on the point that the problems are not really in the research field but in the private business field, because, yeah, that's certainly true, right? They are the ones who are collecting our data and making their money from storing and using it. And the problem is that they are not only private businesses; they are also transnational businesses, which are not accountable to any national government and are making their own rules. Look at Facebook: Facebook is regulating itself, right? It's putting up the rules that it supposedly is going to follow. I'm wondering to what extent you think your research outcomes could help us rethink how to make these businesses accountable, and how do you account for this transnational perspective in your research? Because I feel that in science it's relatively easy to make the connection between science systems and local or national government, but when it comes to business, besides this digital revolution we are also seeing a political revolution in some ways, because there is no accountability there; there is no transnational government or rule-making constraining those private agents, which are becoming more and more
powerful. Yeah, so, these questions are so complex, but I'm going to say a couple of things. One is that this is where I think the GDPR, to connect back to that question, is kind of important: not because it necessarily covers all the things we're talking about, but because most countries in the world actually paid attention when GDPR came into force, and we all felt like we needed to be compliant with it as well. So there is a way for national or collective kinds of rules and thinking to spread beyond the nations to which they are explicitly applied. Now, I'm going to push back a little bit on the private sector point, not because I disagree with the spirit of what you're saying, but I think what the private sector would say is that they are absolutely following the law, because the law says that you can collect data with consent, and when you signed up for that app you provided consent, and if you read the privacy policy, they're not doing anything that they didn't say they were going to do. I'll leave Facebook aside, because clearly they have transgressed beyond belief, and I'm sure others have too, but for the most part we have provided consent for this. So I think we have to think about this less in the realm of how do we hold them accountable, and more about whether we can actually create a different set of rules and expectations, either around consent or around subsequent data use, where maybe individual consent for the collection of data isn't quite enough to say that you can then sell it, use it, and so on. And I'll add one more thing here, because it's a little bit of a pet peeve of mine: I think the one thing we could regulate a bit better is some more honesty in what app developers and others are doing, and 23andMe is maybe my favorite example of this, because if you think that 23andMe is selling you the service of genomic profiling, I've got something else to tell you: their business model is collecting your data and selling your data, and the fact that they can actually charge you for the
service of genomic profiling as they're compiling their database is just icing on the cake. So that's the piece I think we could do a bit more to regulate and make clear, and then some of the other things; but, God, this has got to be a big effort, right? It's not just a healthcare system problem. I would just add to the conversation: where that legislation is interesting is that it's based on the storage of EU citizens' data regardless of where the company is doing that, so it's transnational, and also the fine structure is based on global turnover, because in the case where you could otherwise just fold the local national version of the company if it got a large fine, the fine instead attaches to the parent company. So it has a bit of teeth in the way it tries to attack this particular issue. A separate question about the methodology in use, in terms of deliberative engagement, knowing that BC in particular has used that methodology in a number of ways, so there's a set of values that goes along with it: how transportable do you think that methodology is across, you know, Canada or globally, in terms of commitment to that process itself? The methods are absolutely transportable. They have been used by the Canadian Partnership Against Cancer in deliberations that spanned the country; they've been used in Tasmania and California and other parts of Australia; they've been used in a lot of places. The newDemocracy Foundation in Australia is really alert to these methods and uses them. There are two-day versions and four-day versions; the one I'm more suspicious of for this topic area is the three-hour version, which I think is just not going to get where you need to be. But there's a lot of receptivity to these methods, and they can be improved, I'm sure, which is why we're running this as a
research project: because we want to make sure that the methods are evolving. I will say the one thing that stalls people a little bit is that they're not cheap. For us to run one of these four-day events for 30 people is probably about $80,000, and that's because we pay for people to participate, we pay for their time, we pay for transportation; we don't want just people who are in Vancouver, so we pay for people to come from all parts of the province; we pay for food; we have to create the booklet. There are just a lot of costs involved. I would say that in the context of a $25 billion healthcare system in British Columbia, $80,000 should not be considered an overly expensive investment, but that is one limit. The other limit, of course, goes back to the first question, around how you feel about your sample and so on. Before we wrap up, can I just ask two questions? One is, legally, from a legislative point of view, what's the biggest barrier to maximal use? And second, as a data holder, as Population Data BC, what can an organization like that do in the absence of that? I would say that I come down on the side of legislation not really being a barrier at this point; I'll note one tiny exception to that. It really has been the interpretation of the legislation, and the fact that it's wildly variable: same legislation, different people's opinions, and everybody's opinion apparently counts the same. The one thing in the legislation that I would poke at a bit, if we were going to write new legislation, is this identifiability versus non-identifiability, because all the legislation I'm aware of kind of hinges on whether you are using personal information for your research, which means: is it identifiable? Once the data are deemed de-identified, the legislation doesn't even apply, and I don't think that's sufficient framing in this new world, for all the reasons I said; I don't think we can
de-identify. But even in the old world it was awfully hard to draw that bright line between what's personally identifiable and what's de-identified; it just feels way too slippery to be something to rely on in the longer term. Thank you very much for coming and speaking to us.