Our next and final conversation for the day is "Will health tech ever be hack-proof?" We're going to focus on the security, privacy, and vulnerability of mobile health technologies, and leading the conversation is my colleague Peter Singer, a strategist and senior fellow at New America, founder of NeoLuddite, a technology advisory firm, the author of multiple award-winning books, and a contributing editor at Popular Science. So take it away.

Great, thank you. So the title of this, as was said, was "Will health tech ever be hack-proof?" and I'm going to go ahead and say no, because frankly the human body has never been hack-proof, and it's got a much longer lead time compared to medical tech. But the point we're really wrestling with here is that we're seeing an explosion in the possibilities of technology being applied to the field of medicine, particularly where it crosses with IT, but also an accompanying explosion in the potential risks, as we just heard in the presentation beforehand. That's largely driven by the fact that much of this technology, like the rest of the internet of things or the internet of everything, whatever you want to call it, the bits and pieces, the physical parts, weren't designed with security in mind, including those that are medically related. The very real risk of this first came to attention, at least to me, at the 2011 Black Hat cybersecurity conference, where a diabetic man named Jerome Radcliffe showed how someone could hack into wireless insulin pumps. Since then we've seen this play out in lots of different ways, as we heard in the past presentation, both in terms of the devices but also people going after the various players in the healthcare system, from the front line in terms of the hospitals all the way back to the drug companies, where we've seen hacking to try and figure out how their drug trials are going so that you can game the stock market
around it. And so what we're going to do is explore some of the important questions that surround the privacy regulations and security issues in this space. I think what's fantastically interesting about them is that it's a space that connects almost every level of concern. That is, you can think about this topic as a personal concern; there's nothing more important to you at the personal level than your health. You can think about it in terms of a business concern for an organization, whether you're a healthcare company, a hospital, or the VA. And you can also think about it as a national security, global security level concern. So we've put together just an absolutely fantastic panel to help us work through this. You already heard from Kevin Fu; I'm not going to give a further introduction. I think he basically illustrated that he's the man when it comes to these kinds of discussions, but fortunately we've got two other really great people to join him. We've got Alvaro Bedoya, who works at the intersection of privacy law and technology; he's executive director of the Center on Privacy and Technology at Georgetown, and he also comes to this with experience on Capitol Hill, where he was chief counsel to Senator Al Franken and worked on the Senate subcommittee on privacy. And then we also have Lucia Savage, who works at the intersection of health IT law, privacy, and public policy; she's chief privacy officer at the Office of the National Coordinator for Health IT. So rather than asking them each to make further presentations, we'll just kick off the conversation. I want to open it up by asking each of you: what do you see as the key implications on the security side of the introduction of all of this new health technology, particularly when it comes to mobile? The first panel was looking at the tech itself; what do you think are the key implications on the security aspects of it? And why don't we just go down the row again,
so Kevin, you've already played a little bit... Oh, okay. Sure, I'll weigh in. So when I think of what's coming down the line, the word that comes to mind is complexity. The more the devices are interoperating, the more complex these interfaces become, the more complex the failures become, and complexity is basically what a hacker wants, because complexity tends to breed the ability to cause problems. So my concern is about how to tame that complexity so that we can reduce all the security and privacy issues. I think that's the best approach in general: start by reducing the complexity in the first place, so you don't have the problems to begin with.

So I think Kevin has done a great job speaking to the security side. I do want to speak to the privacy side of it, because it's very connected, and I think the key risk we have is that we will create a pool of extremely sensitive health data that is totally unregulated, that is shared broadly without our knowledge, and used in ways we do not know. We tend to talk about mHealth apps and devices as if they're one thing. When it comes to privacy, there are two kinds of mHealth apps and devices: the kind that's protected by privacy law and the kind that's not. The kind that's protected is used by a doctor in a medical setting, which is covered by HIPAA and the HITECH Act, the federal health privacy laws, and the kind that is not protected is everything else. We tend to think that's okay, because the kind of health data you have in a hospital and a medical setting is really sensitive, and your Fitbit's not that sensitive, right? And that's not true with this latest generation of devices. You have devices that track not just how far you run or the number of calories you burn; they also track what your blood glucose level is, your heart rate, your fertility. There are actually wearable fertility monitors right now that will track not only sexual activity but whether or not you are at
the peak fertile time of the month. And unless it is deployed in that medical setting, it's not protected. What you're seeing is that it's being shared very broadly, it's being shared with data brokers and other parties, and it needs to be fixed.

It's interesting to compare, though: in the discourse on cybersecurity overall in politics today, we have created this sense of tension between privacy and security. You can see this in the encryption debate that's going on, where we're saying almost that you have to choose one or the other. You seem to be taking a very different position when you apply it to healthcare.

Yeah, you know, I see a potential tension between privacy and national security in the encryption debate, but here I think it is an unalloyed good to have consumers know more about the data being collected about them and to have some control over where it's being shared. And you see this also... this isn't just a consumer privacy advocate talking; this is the largest, most wealthy company in the world. Apple has said, if you run your health and fitness app through our operating system, you cannot share this data with third parties without your customer's consent. So there's consensus in industry, but the law doesn't recognize it. In order to fix this, you need enforcement by companies, and you need a few key actions by folks like the FDA, for example.

There's a whole other piece to that, just building on what Alvaro was saying, which is that we also have an environment where people increasingly want to make use of their data that's in somebody else's custody, and they can't. And sometimes they can't because security or privacy is used as an excuse. I have a personal story: I'm the chief privacy officer at ONC, and I went to an urgent care center and asked them to email the visit result to my regular doctor, and they said privacy law doesn't let me
do that. I went, oh my god, I have so much work to do. That's a really simple version of it, but to this point: if you have that fertility tracker and you want to send its data to your OB-GYN or whoever your fertility specialist is, and the company that runs that device won't let you, because they own the data or claim custody over it, that impedes the whole project of patient engagement. So it's really both. On the one hand, we need consumers to know and understand better, particularly because there's only so far that the paternalistic protections a big company can supply will take you. On the other hand, if we really want consumers to take action on their own data, not ownership like property, but ownership in the sense of "I own my health and I'm going to take care of my health," we have to give them the ability to make that data move despite somebody else's custody.

So how do you go after that? I mean, you're saying we need consumers to better understand.

It's incremental. I was just saying to somebody yesterday, I feel like one of those ants in the kids' movies, where you're carrying the seed and the rain falls on you and you have to run under the leaf and come back out. It's really incremental. So at ONC, the step we've now taken is a proposal around View, Download, and Transmit. Those of you who are not on your physician's portals, you should be, because we paid a lot of money for those portals. But when this new rule takes effect, you can use that portal to send data directly to an API of a third party you choose. So there has to be that technical capability; we have to put more power into the consumer's hands. But at the same time, consumers need to take responsibility for the power they have. We hopefully are not publishing the PINs for our online banking in red lights on our front doors; we shouldn't do that with the way we keep our own health information
secure.

Do you want to weigh in on the consumer side?

Yeah, you know, I think there are very simple things that could be done to change the status quo, and let me add one other thing to the scare pile that I think Kevin started building. The FTC last year looked at 12 apps, and they weren't just running apps; they were also pregnancy-tracking apps and heart-health apps. Those 12 apps shared all that sensitive data with 76 different third parties. This was May of last year, a year ago. So what do you do to fix it? First of all, companies need to make more promises, and they need to live up to them. Apple has this great rule: don't share with third parties without your customer's consent. Apple had that same rule for geolocation, and has had it, I think, basically since the existence of iOS, since 2011. My old boss, Senator Franken, asked Bud Tribble how many apps Apple had kicked out of the App Store for violating that rule, and the answer was zero. So Apple needs to enforce its promises, and Google should make the same promise; it should empower users of the Android operating system with that control. On FDA, there's one very specific thing FDA can do. FDA has done something that on the whole is probably very positive: in order to let folks innovate without having to go through what is a very onerous process for medical devices, it said, look, there are things that are definitely medical devices. If there is a smartphone app that literally shows you an EKG on the smartphone while you're treating a patient, that's a medical device. There are things that are definitely not medical devices, like accounting software in a doctor's office. And then there's everything else, and for everything else, we are not going to force them through the medical device certification process, because they pose a lower risk to the public. FDA could do something very simple:
it could simply say, if you are sharing your users' sensitive data with third parties, you do not pose a low risk to the public, and you will not benefit from this exemption. That would be well in line with what Apple's doing, and I think it would make a big difference.

Kevin, I want to ask this question in a slightly different way for you. The consumers you talked about dealing with in the prior presentation were essentially medical professionals, as opposed to the consumer field overall. So how do we go after the awareness side among medical professionals?

Right, so medical professionals are not too different from every other person in the country when it comes to cybersecurity hygiene. They're taught to wash their hands between patient encounters, but they're not taught as well this sort of cybersecurity hygiene. I'd say we have a very long way to go. I hesitate to put all my bets on education, because education alone is not going to be enough, but the bar is very low right now. Let me give you a story to illustrate that. I have a colleague who's an electrophysiologist; these are the people who implant pacemakers. The senior physician in his group is widely known to be popular with the malware, shall we say, and he doesn't exactly know what to do to clean this magic malware off his USB drive, so he just plugs it into the junior fellow's computer in the morning, and it magically, somehow, sanitizes it. So I think there's a lack of understanding about how malware spreads and what the effective ways are to keep it out of systems. It's out of sight, out of mind, just like microbes were in the 1840s, and it took 165 years to get to the point where we understand that you wash your hands to get rid of those microbes. We're nowhere near that when it comes to cybersecurity.

So the idea is that the training and awareness is something that needs to be done
within the professional community, a lot like how, for example, with the ABA and the legal profession, we're now starting to see cybersecurity training for lawyers. We need the same for medical professionals; it's just part of your job, part of your business.

Yes, and education will be one part. For instance, maybe you shouldn't be downloading random things from the internet on the same machine you're using to manipulate patient data. I see this: I take my students into the operating room and they watch live surgery, and we see, you know, Gmail being checked, and we know that an imperfect human, as we all are, can easily get infected through these mechanisms. So education will play one part, but also some of the organizational structures are encouraging this; just as in any big organization you have policies fighting policies, and you'll see that in large hospitals as well.

Okay, can I just add a reality check to that? Six years ago, only 20 percent of American physicians used an electronic health record in their office, and now it's up to about 80 percent. So in many ways, those really small businesses that are outside of large institutional settings like the University of Michigan Medical Center, for example, are learning now what those of us who work in industries that computerized earlier had to learn in 1992, about how you store your passwords or what password complexity is. All the time we hear about situations where there's a physician, and there's the spouse who's running the back end of the office, and there's a part-time nurse, and the passwords are literally on a Post-it note on some cabinet door, which is no more secure than leaving your front door open. And I had somebody ask me recently, well, will the government help us pay for security? And I said, we don't subsidize putting locks on the door of your clinic; why would we subsidize all of that too? There's a limit to the taxpayer dollar. But we
just have to recognize that across the healthcare spectrum there are really wide variations in economic capability, as well as capitalization, to address this problem. So education is important, but sometimes people are really just in the nascent stages of learning how to turn on their EHR, let alone handle complex passwords or not using thumb drives.

One more point: there's also no silver bullet or magic pixie dust you can just purchase. Imagine you have infinite money; even then, you wouldn't be able to go out and buy something that solves your security problem today. And I would liken that to the 1840s with hand washing. Running water was kind of scarce in hospitals, latex gloves hadn't yet been invented, so it was kind of a big ask even to ask for hand washing back then. Right now it's still a big ask to say "just make it secure," because the solutions aren't invented yet.

Well, this raises an interesting point, because as you put it, you can't rely on the government to do everything in the field of cybersecurity in general. What we're seeing, though, is both a call for the government to do everything, which is not going to happen, and a second part, which is the slow but very significant creation of a cybersecurity insurance industry. Is that something you think will move over into the healthcare sector and play a key role? When I say healthcare, going back to the prior panel, I'm referring both to traditional healthcare companies and to the technology companies now playing a role in this space. What do you see in terms of the market incentives? You talked about consumers, we've talked about government, but there are other market incentives in play. Where do you see these in this space?

So I think if we break it down by size, really large activities, for example the really large statewide data projects that may or may
not be sponsored by states, those are key candidates, or key customers, for the emerging area of cyber insurance in healthcare, because the data is so voluminous. The project in California carries data on 10 million people for three years; I can't even count how many zeros are on the bytes there, but you can all imagine that it's a lot, versus an individual physician practice. And I think that size is going to drive two parts of this. One is that there's going to be a point in the future, and I don't know when it's going to be, maybe when I have grandkids, when not using the best technology to care for your patient is going to be the thing that people won't insure. Right now there's no malpractice insurance distinction for using computers or not using computers; that hasn't evolved yet. But as that evolves, cyber insurance will go along with it at the small-practice level, and the big institutions are already trying to figure out how to do that, especially if they're in multi-party arrangements like ACOs.

Let's talk Congress. Where are they in this? What can and should they do?

Sure. So there is a sad fact in the field of commercial privacy, which is that nothing's happening and nothing's going to happen. On government privacy, when it comes to the NSA, it's a little bit better, because you can form alliances across the aisle, work with Republicans and Democrats, and come close to passing a privacy bill. Hence you have ECPA reform, you know, being close to getting passed; ECPA is the law that lets police look through your email if it's more than 180 days old without a warrant. The NSA bill came two votes, two votes, shy of getting a full debate. But if you look at commercial privacy, not a single commercial privacy bill was even voted out of committee in the last Congress in the Senate, and I'm pretty sure in the House as well. If you look at privacy more broadly, and you'll see where I'm going with this, take a guess at the number of privacy bills of any kind
that were passed last Congress. Zero, right? And that's not my favorite number. A different question is how many privacy bills were passed in California in that same period, and the number is 17. So I think where this can actually start to get fixed is in the states. You've seen Texas, I believe, pass a law that bans the use of health data in certain ways regardless of the setting it comes from, and you've seen various states innovate, basically being little laboratories of privacy and democracy, by passing their own laws with regard to biometrics and whatnot. So I think we need to look to the states for action here, and we need to look to the FTC and FDA.

The challenge with that, however, is that as we try to build standards for how the healthcare system will operate with technology, if we have rules that vary from state to state, it's just monumentally harder to build a nationwide system, because then Texas is doing something different from California, and there's Arizona and New Mexico in between. So we have to think about that. But I agree, and there's no current activity to open these debates, even though many things are different now than existed in 1996 when HIPAA was passed, or even in the '70s when we had a big round of privacy legislation, both the Privacy Act federally and at the state level. With the Affordable Care Act, we've kind of undone cherry-picking through underwriting, so if we were locking up health data to keep people from being denied insurance, haven't we solved that problem in a different way? We're not asking ourselves those questions, and if we were, we would be able to tackle the security that goes with the way we want to use the data.

If I can add just one thing, just to clarify: I'm actually not advocating at all that states pass their own little...

No, but you're observing that they're doing it.
Perhaps what I'm saying is this. I don't think states are preempted from doing their own; they can actually go above and beyond, I think, the security standards under HITECH, under certain health security laws. But I'm not saying states should modify the privacy and security standards that apply in the medical setting. I think states should innovate and pass their own privacy and security laws with respect to health information that is not covered under HIPAA and HITECH. There what you would see is a race to the top, and you would see companies getting founded anywhere in the United States saying, okay, what rule do I have to follow? Do I want customers in New York? I do, and so I need to follow that standard. And as long as the laws aren't crazy, and some crazy laws come out sometimes, but as long as they're not crazy, it'll create a race to the top, and it'll be better for consumers everywhere.

This has been a very U.S.-centric discussion for technologies that will go global. Where do you see other states, i.e.
nation-states, in this, and are there models that we might learn from?

So in my prior position at UnitedHealthcare, I had a colleague whose job it was to manage privacy in this setting. United has an international division that offers on-site clinics in remote parts of the world for American companies that have workforces there, like the extraction industry, and they actually have the capability to helicopter people to offshore, international-waters-based boats that have full-blown hospitals on them. His job was to manage which privacy rule applied in which part of that transaction between the on-site clinic at the extraction location, the boat, and then maybe a European location or an American location, depending on the illness, because these workers can get really severely injured in an industrial accident. I think that was the most fascinating job ever. And we do have medical tourism; it will grow, particularly for people who have insurance, because you can get LASIK in South Korea for 800 dollars and here it's still in the $2,000 range, so get a vacation and get your eyes done. So thinking 10 years down the road, it'll be about medical tourism, perhaps international telemedicine; we'll have to sort that out.

Let me make one comment about the international side, less about the regulatory and more on the security side, from some stories I've been picking up from my colleagues in other countries, countries that are less fortunate and have less robust healthcare industries. Well, guess where all the depreciated equipment in hospitals goes? It gets sold to other countries. So all the problems we're having right now, guess who's going to have those problems in 10 years? These devices are not going to be destroyed; they're likely going to be reused, and so there's going to be a huge amount of legacy out there.

That actually connects to a project we have here on the
cybersecurity side, looking at whether cybersecurity itself is becoming a rich-poor issue. You can see that in terms of individual victims versus who's better protected, and to your point about companies, larger companies tend to be better at it, but you also see it on a global level, where we're seeing clustering of cybercrime on the developing-world side. So it's an interesting parallel. I want to go back to the fear-factor tour that you took us on and let you both join the conversation. What do you see as the thing that scares you the most in this space? And then, as a follow-up question, what do you think will be most common? These are sometimes different. And why don't we go in reverse order here.

Sure. So I look at some of the more recent, spectacularly sized breaches in healthcare, and I start thinking about things like, well, why were those Social Security numbers in that Anthem database? That's kind of a well-known fact, and couldn't they have segregated them out? And there's a pretty significant dialogue about a breach of that size: what was it the hackers were going after? Was it data that would let them hack into people's financial records and steal financial identities, or data that would let them defraud the healthcare system? Those have two really different economic impacts, and I don't have an opinion about that; I'm sort of waiting to see how it sorts itself out. And there's a third possibility that was floated, which is that it was a foreign government looking for identifying information on U.S. government workers covered by that plan.

I.e., if I want to figure out James Bond's actual identity, his healthcare information is useful to me. And that's why it was alleged to be linked to Chinese state-linked actors.

Yeah, I think that's been in the press, but I don't know that there are definitive findings on it. And then you have to look at all the others. Premera might have some U.S.
government employees; there are certainly a lot of military people up in the Northwest, but it's not going to have legislators. So you've got to think all that through. In my mind, what I'm thinking about is that we have to have an order of risk, so we know where to start; we can't address it all at once, particularly not on the policymaking or rulemaking side. So we have to have an order of magnitude: what is the thing that collectively we're worried about? If it's defrauding the healthcare system, we could probably catch that at the point of fraud. If it's stealing people's financial identities, then we need to keep the taxpayer information separate from the health insurance information, so that those things are harder to connect.

What do you mean by what might be most common?

Well, there's a difference between, to use the past example, the run-of-the-mill breaches, the financial fraud, and a spectacular case that's a hit. Not to speak for you, but the scariest one might be, I don't know, there was a TV show about a presidential... it was the vice president, yeah, the vice president in Homeland. But that's not going to be the thing that happens to everyone; it's probably credit card stuff.

Yeah. So what do you see in these?

So I think, frankly, I'm quite scared about what exists today. Right now, all this data that's coming out of the unregulated market, out of all the entities that Lucia does not have to worry about on a daily basis, all that's being bought up by advertisers, and hey, maybe that's not that bad; some people are going to hate it. But it's also being bought by data brokers, and these are people who literally create lists of people with STDs, with Parkinson's, with Alzheimer's disease, who are obese, who are pregnant. And we don't know what
these folks do with this data, and they exploit various loopholes in the laws that regulate insurance. If I know that someone has one of these conditions and I deny him or her credit because I think he or she is a credit risk, that's one thing. But what if I have a neighborhood where I know diabetes is very high, and then I make a credit decision about that, or an insurance decision, perhaps not a health insurance decision but a life insurance decision or something like that? That may not be covered under existing federal laws. So I'm worried about the growth of this unregulated market of health data and what it might mean for our finances and our health, and about having decisions made about us on a regular basis. I think the doomsday scenario would be the perfection of that market and the trading of this health data even more profusely. Right now it's kind of behind the scenes: in order to access it, you need to know about the companies, you need to present yourself as a company of a certain size, even though hackers and fraudsters have gone and purchased a bunch of data from a data broker before. But right now it's not as lively a market as it might be, and so I'm worried about that really becoming a bazaar of maladies.

All right, well, my biggest concern is what happens if patients begin to not accept medical care because of fears of, say, cybersecurity problems, as opposed to actual, more scientific studies. It's very easy to base your decision on the most sensational case, one of them being pacemaker problems or insulin pump problems, and it's not to say those problems aren't real. But what I'm saying is, for instance, I would not be surprised if you're more likely to die from a hangnail than from a problem with the security of your pacemaker.
But I think it'll be a real tragedy if we are not able to give patients the confidence to accept the recommendations of their physicians and their healthcare team. Patients who are prescribed medical devices are prescribed them because they are predisposed to very particular risks. So if I'm not prescribed a device, I definitely wouldn't take one, because I don't want the chance of infection; but if I am prescribed a device, I'm much better off with it than without. If either of two things happens, if we focus too much on the sensational, or if not enough action happens behind the scenes in manufacturing, eventually the confidence of the patient community will be eroded.

Kevin raises an interesting point. I don't know if anyone in the audience has traveled to Central America in the last couple of years, but you cannot use an ATM anywhere in Central America; they're all compromised. And I think about this when I think about the Affordable Care Act, because we have so many people who now have high-deductible health plans, and how are they paying those deductibles? They want to pay them with a credit card in their small physician's practice office. If that swipe machine is not secure, do they start losing confidence, forgoing not just a device but care itself, or finding the process of accessing care so inconvenient, because they can't use the financial system when they pay for their medical care?

Well, there's a different parallel too, which is the vaccine debate, where it's very much like you're saying: essentially someone ignoring medical advice from their doctor because of something they've heard or read online. You could have the health IT version of this. Let's open it up to conversation here. Please raise your hand if you have a question and a mic will come to you.

Hi, I'm Norman. I work with Tech
change and you know i want to try and sort of connect the two conversations we were having about the rise of sort of you know wearables and you know some remote sensors and being used in health care data and as we've been talking about the security aspect we haven't really talked about that and so one question i have is especially as this is becoming more and more popular and as it's both in the consumer and perhaps even getting with official health care is something that's starting to be used more and more of that data is being generated it will become sort of a larger place for people to start looking at to sort of compromise and how do we try and help secure these systems which don't always necessarily have the processing power and the capabilities to do some of the things that we think of as necessary for um sort of cyber security um so i'll i'll take a stab at that so at onc we've actually been working with m health developers for a while trying to figure out what can be done in the absence of regulation right no regulation is is in the on the horizon line anytime soon and there's actually quite a big difference of opinion there are definitely uh sort of a group of developers who really want to follow some kind of best practices and are willing to make a commitment about that and their other developers who um a they don't want to be bothered or be their their startup and they don't want to spend their startup capital on the design advice they need to make it secure or see they just want to sort of grab their money and runs there's a group who doesn't really want that regulation and that's going to continue to exist because that's the nature of entrepreneurialism but i think sort of where the rubber hits the road at least from my perspective is you know if you're a you get back to the deal that was in that first panel the consumer has to really understand the deal and you should feel free to reject a deal that you don't like and i'll give you kind of an ironic 
story. I have a friend who's pretty sophisticated. She uses Gmail but not Facebook, because she says Facebook are data thieves, and I'm like, "And Gmail's not?" It kind of doesn't compute, but she's done her homework and she's reached a decision that works for her. So we have to do a little bit of homework. We have to understand the difference between the way, say, the Acme product and the Beta product (so I don't name names) comport themselves, and maybe not buy the Beta product. A great example of that is airbags. Remember when airbags were not required and Chrysler had them? They became such a giant sales tool that everyone adopted them right away. To Alvaro's point, if you pick something that consumers want, everyone will mirror you.

I don't know if I have too much more to add to that. I think in the consumer space it's tough to know what to do, because all these companies have a financial incentive to be very secure, and yet a lot of them aren't. So I don't know if I have the right solution there, or good suggestions for how to improve that problem, other than maybe supporting and investing more in the security research community. I think this is a community that has been maligned, that federal law unfortunately deems a lot of what they do illegal, and they survive by the good graces of the federal prosecutors who don't want to prosecute good-faith, or white hat, security researchers. That might be one thing you could do, but I don't know if there's an overarching answer.

Is there the opportunity for the equivalent of a bug bounty program within this space, as opposed to software overall?

I think that would be welcome; I think that'd be a great idea, that'd be fantastic. Another thing that would be great is to have prosecutorial guidance from DOJ that will outlive one administration, that says: hey, you're a security researcher, you're not a bad person; stick within these rules and we're not going to go after you under the CFAA. I think it would allow the growth of white hat security firms that will do a lot of that work, and allow security researchers to do that work independently. I think that would be very valuable, and it's something that DOJ has not done, CCIPS has not done, so far, but that they could do.

We've got another question.

Adrian Gropper, Patient Privacy Rights. One aspect of security is to have a log, to have something that you can review, and we're all used to having that for our financial accounts. Yet in health care particularly, where HIPAA says you have a right to an accounting of disclosures, that is never available, or never available conveniently. As somebody said in the earlier panel, you shouldn't need FOIA in order to get to an interface. What can you say about the need to have this kind of transparency as to how our data is flowing, in particular when you have a lot of data flowing under exclusions like treatment, payment, and operations, or research, and those uses are not disclosed to the patients? How can we develop this culture of knowing whether A versus B is using my data one way or the other?

So I'm actually going to defer, because the agency that handles that is the Office for Civil Rights; that's under the HIPAA regulations, as Adrian I'm sure knows, and we're waiting for them to issue guidance, at least for the health IT space that's regulated, that tells us exactly what that audit log, that accounting-of-disclosures log, has to consist of. Because what is being produced industrially right now is every click. It doesn't have a meaningful consumer user interface, we'll just put it that way, and it's voluminous data. In part, the developers could maybe apply some UI techniques to that and come up with something on their own, but they're waiting for guidance, because they want to
do something that is going to be deemed safe for them.

I'll just add one thing, which is, to continue the fear parade: if you want to see something really scary, you should Google "Latanya Sweeney 2014 FTC health workshop." She's put together these graphs, which I'm sure you're familiar with, of where your HIPAA- and HITECH Act-protected data, in other words the protected pool I've been talking about so generously, actually goes, and it looks like a quilt that's being knitted. It's just all over the place. I think we presume that when it's in this protected space it's being compartmentalized and treated in a very specific way, but Professor Sweeney has shown that in reality that data is shared very broadly. Now, at the end of the day, if someone slips up, or if some true wrong occurs in that space, there are remedies that are better than in the unregulated space, but it's still remarkable.

We've got time till 2 p.m., I just want to confirm that. Okay, so, up there.

Hi, Christina from the National Center for Health Research. I'm interested in EMR security, less so the type of thing where infusion pumps or defibrillators are being hacked into and people are then scared to get them, but more things like STD testing, mental health, and abortion, for example. These are things that people really want compartmentalized, and it would be much easier to get that kind of information. I think that's a much more likely area, or at least a more likely fear that people would have, and I'm just wondering what kind of compartmentalization in the EMRs themselves is being regulated and should be regulated.

So I can speak in general to how the EMRs work. An EMR, if it's certified under the ONC rules and used by a meaningful user, has to meet the security requirements of HIPAA, which are administrative, technical, and physical, and they're sort of outcomes-based. Again, those rules are issued by OCR. People ask all the time, "Well, what should I do to get that outcome?" and of course we don't write that down, because then the bad people would figure it out right away and circumvent it. So it really is just outcomes-based.

But in terms of what's required: HIPAA does not require segmentation, as we call it, of anything except psychiatric notes. And then there are many state laws in some of those areas, but not every state has every state law. So some states may adopt special rules for HIV/AIDS and other states may not, for example; some states may treat HIV/AIDS as a particular kind of condition and STDs separately, and some may conflate those together. Those laws apply to the physician's behavior in disclosing the data, not to the EMR and whether the data is logically segregated within the EMR. So that's how it works: because there's no federal standard that requires segregation except for psychiatric notes, there's no technical standard required for it. There are technologies that enable that type of tagging and segregation (HL7 is working on some, and I'll be talking about them in a couple of weeks), but it's not legally required under federal law.

How would you navigate this? We identified certain things that you thought might be controversial, but each of us could come up with lots of different lists, and there are statuses that might once have been controversial, like HIV, that have evolved culturally.

This is like a whole other panel, you know that, right? Yeah, absolutely. I actually have a little food for thought for you. We can look back (I'm a lawyer, I see things through that lens) at the laws states have enacted, where that's a very open and public process. People get to participate in it; there are
votes, all that kind of stuff, and in that respect it is a type of consensus building in a way that a private policy is not. That's why, if you look at laws, you might actually be able to go, "Oh, it makes sense that this state would have a special HIV law," while this other state, which tends to be much more religiously conservative, for example, might not have an HIV law, because that's how that state's politics worked. So there's that whole piece. And then I have my own personal views about the sensitive nature of health. I get it, the states have made decisions on behalf of their citizens, but I've had this argument many times: is Viagra really sensitive when you can see an ad for it? We're in the middle of March Madness, so you'll see many ads every time you turn on the TV, as with any sporting event. So we have the tools to collectively decide that certain conditions get special protection. We may not be using those tools very effectively, and some of those tools may in fact be out of date. I don't know if others have a different sense about that. Alvaro is sort of going, "I don't know if I agree with her."

No, I mean, I think my default is generally towards protection. It's one thing to see a Viagra ad and another thing to be someone who actually has it prescribed; those are very different positions to be in. So I would in general default to more protection, and perhaps a more collaborative process. But HIV is a really interesting example there. Think about New York: it has a pretty significant gay population, and they actually don't have an HIV protection law; in fact, it's kind of the other way around. And the public health research shows that that population is actually healthier than in other states. I think they've included protections where, if HIV status is known, you can't take discriminatory action, so they've solved for the discriminatory effect that's in the parade of horribles, but
they have sort of created a system where they've de-stigmatized the conversation about it. I call it, pardon the pun, the Harvey Milk approach: if we can de-stigmatize it and address the discrimination, can we learn from the data? I don't know.

Just to respond to that: I guess in my mind, in the same way that it's one thing for the law to de-stigmatize something and another thing for people to de-stigmatize it, not every time you speed do you get caught. So just because the law says you can't discriminate against someone because they're HIV positive doesn't mean people will stop; rather, it means people will probably do it a little bit less, and then one out of five people who does it will actually get called on it somehow, and one out of ten of those will be sued, and one out of a hundred of those will go to court, and maybe one out of two hundred of those will win.

We've got another question, right back there.

Philip Disher, Arnold & Porter LLP. We've talked a lot about the role of the regulators in cybersecurity, and I think there's some consensus up there that cybersecurity is always going to be an issue and devices will never be hack-proof. We've seen a lot of documents, whether it's the cybersecurity guidance document from FDA, the FDASIA health IT report, or the interoperability roadmap, in terms of regulating to address this risk. It seems like you're not going to be able to regulate it away. Are there any recommendations in terms of where regulators should be focusing their time? And maybe, if you are a regulator, you have to pass on this question. I'm trying to think whether it's about setting minimum standards; I think regulators are always going to be a couple of steps behind either the bad guys or maybe the private sector setting voluntary standards. I'm just wondering sort of
how the regulators fit in here, in a best-case scenario.

So I don't write rules, but I am in a regulatory capacity, and I'll say two things. One is that there's stuff being debated about this right now on the Hill; literally yesterday a new bill was introduced, so there's definitely a dialogue going on about this. And I think the President's executive order in January is really clear that one thing we can do from the government's perspective is facilitate people sharing identified threats, so that other people know what threats to look for. I think you were saying, Kevin, that if you can see this pattern, then you know what to look for, and we need to replicate that behavior. So information sharing is definitely a key thing the government can help facilitate.

Small things, most of which I've already mentioned. I think FDA should limit its discretion narrowly, in a small way, so that if I'm a health or fitness app, or a health wearable device, and I'm collecting really sensitive information and telling other people about your stuff without telling you, then I don't benefit from that discretionary category where I don't need to submit approvals. So I think regulators should focus on the undisclosed sharing of user information with third parties. And I think the FTC should hold folks like Apple, who make very good promises, to those promises. If Apple says to their consumers, "When you use HealthKit, the apps that pull your data won't be sharing it with others," Apple should live up to that promise, and the FTC is in a unique position to monitor that. I do think the debates on the Hill play an excellent role in oversight and in suggesting best standards, but those bills are not going to pass. Well, they may pass, but I doubt that a bill that stripped away a bunch of authority from FDA would make it across the President's desk. So I think you're going to see legislative stasis here, and you're going to have regulators be able to move, and states be able to move.

So, a couple of comments there. One is a sort of reality check. If you ask the question "What can government do with regulation to improve the cybersecurity of medical devices?" it's different from "What can FDA do?" With FDA, for instance, there's very specific congressional language: they have the remit for the safety and effectiveness of the devices, for the manufacturing, not for the use. So if you're going to talk about how a hospital is going to use what a manufacturer provides, FDA doesn't have the official remit. And if you want to ask about security, except for a law passed after the 1982 Tylenol cyanide incident in Chicago, about the physical security of over-the-counter drugs, there's very little mention of security in the actual regulatory language.

Now, on the other hand, on the positive side, at least in my experience I've noticed that FDA has been very effective as a convener. In fact, they recently held a cybersecurity workshop, I think it was in October, and I believe over a thousand people signed up, mostly manufacturers. What you'll find is, again, a lot of low-hanging fruit: the knowledge of cybersecurity is a mystery to many medical device manufacturers, and incentivizing them to think harder about it is something FDA can do even without regulatory action. They may take regulatory action if there's some egregious case with cybersecurity; I'm certain they would if there's a death or an injury. But in the meantime they're focusing on convening the parties, assuming good nature. They do have a strong assumption of the good nature of the
companies to do the right thing, and if the companies don't do the right thing, they don't have a lot of choices. But I can say the companies who do show up to cybersecurity events generally tend to be the ones who are thinking about it, and my worry is the ones who don't even know about the existence of the events. You might be wondering about them.

There was a great tweet that showed up in my Twitter feed from South by Southwest which said that developers are blissfully unaware of FDA and HIPAA regulation, and I think that's exactly right. We have to worry about the next great brilliant innovation, which could come from somebody's garage, that isn't thinking about this stuff, where there's a back door that eventually causes harm people don't recognize. Or it's not an FDA-regulated device, nor is it offered through a covered entity under HIPAA, and it's in the kind of growing gray zone.

So I want to close by asking each of you to peer into the future. We can peer into the future in terms of projecting: if there's a prototype right now, it will hit the market in a couple of years. We can do that on the technical side, but I want you to help us wrestle with this space of security, privacy, and policy. What does it look like five years out? Or, to put a finer point on it, how will it look different five years out compared to today, one way that it will look different? And I want a prediction, not a dream, right? I didn't ask for unicorns and the like. What will it look like? What's your projection?

I think that people will be struggling to improve the health care of Americans within a siloed health care system whose baseline rules we can't recalibrate. I think that you will have a good chunk of people, let's say in ten years, using always-on devices to prevent health conditions. I think that's wonderful, but unless security and privacy dramatically
improve, you're also going to have a growing share of the population that, when they see a doctor about depression or an STD or something else, demands that the doctor use pen and paper and not put anything in the computer.

I'll be a bit more of a Cassandra. Let's see, I'll say: I think there's going to be less attention on security problems in individual devices and more attention on security problems in how those devices interact within the greater system.

Great. Well, please join me in thanking this great panel.