Yeah, we're longtime DEF CON listeners, first-time DEF CON talkers. I like that our slides are cut off, because it just says "the internet already knows I'm X" and you can fill in the blank. In this case it's pregnant, but that's historical. I wasn't drinking while pregnant, just so you know. So I'm Kashmir. I'm a journalist; I've been writing about privacy and security for about ten years, and I'm a new mother, which is relevant information at this particular time. And I'm Cooper. I'm a technologist and security researcher at EFF. I'm a privacy activist, and I also care about privacy and security issues for people who have wombs, despite not having one myself. So we paired up for this particular project. I'm an investigative journalist, so I'm good at reading documents and talking to people, and I'm also an immersion journalist: I like doing the things I'm writing about. So when I wrote about Bitcoin, I lived on Bitcoin for a week. It was hard. In the past, DEF CON speakers helped when I hacked a smart home. So yeah, I like to do the things I'm writing about, though I did not get pregnant just to do this talk. And I didn't either. So I'm a hacker, so my skill set apparently involves seeing dogs and blondes in the Matrix. But I'm interested in this because I had previously advised another journalist on an article about anti-abortion groups using geo-targeted advertisements to aim anti-abortion ads at women inside of Planned Parenthoods. So you'd be in a Planned Parenthood planning to get an abortion, you'd fire up Facebook, and you'd get an ad telling you that you're, you know, killing your fetus. And so I met Kashmir, and she pitched this idea to me after she had unsuccessfully looked for a real security researcher for many months, and we settled on me. So yeah. And beyond being pregnant, I got interested in this. A few years ago I, we're on the wrong slide. Next one. Oh, we are missing the slide. Anyway, a few years ago I helped popularize the Target data-mining story.
Is there anyone in here who doesn't know the Target data-mining pregnancy story? Don't be afraid to raise your hands. Okay, I'll tell it. Almost everybody seems to know this story, and I hear it come up at every single security conference and privacy conference I go to, so I'll be the one to raise it this time. It comes from an article by Charles Duhigg, a New York Times journalist, who did a big piece on how companies learn your secrets. One of the anecdotes in that story was that Target did really excellent data mining: it would look at the shopping behavior of women to figure out, early on, who was pregnant. It would be like: they bought unscented lotion, they bought a blue carpet, they probably bought prenatal vitamins, which is the big giveaway. And so there's a story about how Target sent ads for all kinds of baby products to a woman who turned out to be a teenager. Her dad saw the ads and got really mad, went to Target, and yelled at them: are you trying to encourage my daughter to get pregnant? Supposedly this dad called Target back a couple of weeks later and apologized, saying there were things going on in his home that he didn't know about, and that his daughter was indeed pregnant. So some people don't want the internet to know about their pregnancy. In particular, there's a Princeton professor named Janet Vertesi who wanted, while she was pregnant, for the internet not to find out about it. So she went to great lengths to hide her baby bump from the world of big data. She browsed baby product sites using Tor. Her husband went to the drugstore and paid cash for gift cards; when they wanted to buy baby products, they would do it with those gift cards and have the items sent to an Amazon Locker, so nothing was associated with their home address.
And she said she successfully hid it, as far as she could tell, because she never got online ads targeted at her based on her pregnancy. I decided to do the opposite. I just told everybody I was pregnant. While I was trying to get pregnant, I downloaded all of these period and fertility tracking apps, and then once I got pregnant I entered into the apps that I was pregnant. I used an email address that I don't usually use in order to track what happened, because I wanted to find out who was going to sell me out to retailers, who was going to figure out that I was expecting. So this is the list of apps we looked at. We looked at a little over 20 different apps, and a lot of them have similar names, like My Calendar and My Time. And here's the first privacy issue: once you put one of these on your phone and somebody's shoulder surfing, do they know what you're doing with these apps? There's one set of logos that are very discreet, just letters and numbers; these could all be pretty generic applications, ride sharing or something. There's a next tier, though, that involves a lot of pink flowers and silhouettes of women. I've never had so much pink on my smartphone. And then there's yet a third tier of weird images of fetuses. At least the last one doesn't give away the sex of the baby. And then of course the grand winner is this one. You don't necessarily know what this is. It could be, like, a porn app. No, it's called Get Baby. So yeah, a little on the nose. Now, there are probably a few dudes in the audience who maybe haven't used these apps before, and you may wonder why it is that women would want to tell their smartphone when they're having their period. So these are some of the reasons women use these apps. The screenshot on the left, you can't quite see it, but it gives you options for why you're using the app. It can be that you're trying to avoid a pregnancy.
So some women are using this as a contraceptive, to avoid, I guess, the sperm getting to the egg during their fertile period. Some are using it to get pregnant. Or if they're getting very expensive IVF treatments, they can use these apps to track what's going on with their body. Once you get pregnant, you can use these apps to track the human science experiment that's inside you. The main thing the apps tell you is what size fruit the fetus in your body is, which is super weird. One of them told us that the baby was the size of a cheesy mango, and I'm not clear what a cheesy mango is. And it told us it was not our baby. No, no, we should clarify that. And then the other use for these apps is their community forums, where women talk about all kinds of things. Sometimes it's stuff having to do with pregnancy and fertility; sometimes it's completely different. There's a lot of discussion about people's sex lives. In the screenshots I have here, it's about baby bump selfies, those are really big. Oh, and there's somebody asking, why is my period greenish black? She should probably get out of the forums and go to a doctor. And then, in terms of the kind of information you give these apps: you might tell them what your vaginal discharge looks like. Because they're fertility apps, a very important piece of information is how often you're having sex and when you're having sex. Once you're pregnant, you might tell them the physical symptoms you're experiencing, from headaches to backaches to not sleeping a lot. The apps are also helpful for getting you to do things you're supposed to do when you're pregnant, like take prenatal vitamins and do Kegels, which, if you don't know what that is, you can look it up. And they track your sleep and your weight, the usual kind of health-tracking stuff. So I downloaded all these apps. Who figured out I was pregnant?
So one of the first companies that figured out I was pregnant was Twitter. This is from a Twitter account associated with the email address I used with all these different apps. I don't know if it's included in the screenshots, but there's an interests page you can get to in your Twitter settings that tells you what advertisers know about you and how they're targeting you. So I have been successfully associated with baby products and child care products, and very specifically my demographic information says I have one child, which is true. Also, all these people figured out I was pregnant. This is my inbox at six months. Luckily, again, I was not using my usual email address, but I had ads from Pottery Barn Kids; What to Expect, which is one of the apps I signed up for; Huggies; and many, many more. So this is where we get into my specialty of reading documents. I wanted to figure out how these people had gotten this information. There was a technical way to find that out, and then there was the just-read-the-privacy-policy way. I'm only going to give you a few of the privacy policies, but the first one was What to Expect. This privacy policy is 4,000 words long, which is eight pages if you print it out. And 2,600 words in, it very explicitly says: when you sign up, we give your registration information to select partners. And there's a little link there that you can click, and it gives you the list of the partners. When I first did this it was about eight different companies, but when I went back to take screenshots for our talk it had expanded to, I think, 14. It includes Pottery Barn Kids and Huggies, some of the people you saw spamming my inbox. And the privacy policy said: if you don't want us to tell these companies about your pregnancy status, don't use the app. You know, this can be worse than just getting spam in your inbox.
Sometimes they send real mail. Similac is a baby formula maker, and they're one of What to Expect's select partners. They get information from What to Expect and from other companies that they wouldn't disclose to me, and they will send a woman baby formula a couple of weeks before her due date so that she helpfully has formula around to feed her child. This can go very wrong, because not everyone who gets pregnant stays pregnant. In one particular example, a woman named Amy Pittman from Washington got pregnant, was excited about it, and put it into the apps. She didn't like the What to Expect app, so she deleted it, and then she had a miscarriage. About eight months later, she did get baby formula in the mail. So there are serious privacy harms here when you misidentify, or correctly identify, a woman's condition but then don't know what happens later. It seems like a lot of these apps are not keeping in mind people who might be outside the majority use case, the average user story, or the average threat model, so to speak, which is a theme that will come up again. Okay, privacy policy number two: The Bump. This privacy policy was 4,700 words, which is 10 pages printed out. For reference, that's the same length as the entire report I wrote about this problem. So I was reading this privacy policy and was really surprised, about halfway through, to discover baby's first wiretap. I use "wiretap" facetiously; I don't know if a lawyer would approve of calling it that explicitly. But there was a feature. It told you that if you made a call from within the app, say you identified a vendor you wanted to do your baby registry with, it would record the call, record any message you left, and collect the phone number, the location where you were when you made the call, et cetera. Which I just thought was insane.
I've never seen anything like that before with an app. So I reached out to The Bump and said, you know, WTF. And they informed me: oh, we're not recording phone calls; that's legacy language for a contemplated feature for The Knot, which is an app they make for planning weddings. And the press person said, I'll send that to our legal team right away. So this proves that no one reads privacy policies, not even a company's own lawyers. They removed it from the privacy policy a couple of days after I reached out to them. That line, "this bad thing you found was just a test": how common was that? Yeah, I'm sure many people out here who have reached out to companies about privacy or security issues often get the response "it was just a test" or "we were only temporarily doing that." I hear that all the time. That was the response from about half the companies we contacted, I think. Okay, this is the last one: Ovia, a company that makes a fertility app, a pregnancy tracking app, and even a child tracking app. Their terms of use are 6,100 words, which is 14 pages printed out. I don't know if you can see it in the screenshot, but this is an app that gives you a really helpful fertility score: your score is high, do it; your score is low, nothing is going to come of it. When I went through their terms of use, well, a lot of these apps sound like they're giving you the kind of advice a doctor might give, but many of them warn that they're not really giving you medical advice. And this one explicitly said: this app might have errors, it may be inaccurate, just so you know, and we're not responsible for that. So I went back to look at their website, and oh yeah, you can see it well: in the advertisement for their pregnancy app, they said they'll give you real-time alerts when your symptoms are dangerous.
And right above that is the little medical symbol. So, I don't know, it sounds like medical advice to me. They sure seem to be trying to imply that it's medical advice. So these are some of the issues we ran into. When it comes to inaccuracy, the warning they give is warranted. I found out that a year ago, three doctors looked at a bunch of apps, 33 of the most popular ones on Android and iPhone, and examined their predictions for the fertile window. Of those 33 apps, only three correctly predicted the fertile window. We've got their results up there. No one was completely off, but they would be off by a few days. And interestingly, the month that I got pregnant, most of the apps told me I'd missed my fertile window. So my husband and I were excited when we found out that it had in fact worked. So if you're using these to not get pregnant, it might not be the best method. Please don't use these apps as a contraceptive. So far I've talked a lot about one person involved in this; in terms of privacy, there are two people involved. That was definitely on my mind as I was doing this project: basically, I was tracking my now-in-the-world daughter online since she was negative eight months old. So I just want to make a public apology to Alev. So that's a lot of what I was able to learn just from tapping into the skills I have as a journalist: reviewing privacy policies, reaching out to companies. I also used a tool called ReCon, from Northeastern University, that monitored the kinds of connections my phone, and these apps in particular, were trying to make. It told me, essentially, that a lot of advertisers were getting information from the apps. But I felt like I needed more help in really digging into the technical side of this, and that's where Cooper came in. Yeah.
So I wanted to give a hacker's eye to this and see what we could find out about the network traffic: what sort of API calls were being made, whether encryption was being used, whether the APIs were insecure, what other companies were getting the data, and what sort of data exactly they were getting. Cooper's really good at finding memes, so you'll appreciate the next few slides. One of the first fun things I discovered about these apps is that they give you some pretty specific advice. This one told me that I had a 2.6% risk of pregnancy, which is large considering, again, I don't have a womb. There is no option in these apps to say "I'm a dude." Yeah. So I used pretty typical reverse engineering methods: static analysis, dynamic analysis, and some other tools. For static analysis I used a tool called JADX, which is a decompiler for APK files; it produces something close to the original Java source code, and it also extracts the resource files. Then I loaded that up in Android Studio, where I was able to do things similar to what you can do in IDA: rename function calls that were obscured, track flows through functions, and see where permissions were called and why. This gave me a lot of good insight into the apps that I wasn't able to get just from network traffic. And this is how I got network traffic. For dynamic analysis I used a tool called mitmproxy, the man-in-the-middle proxy. What mitmproxy does is intercept SSL traffic: you install a special root certificate on your device, connect to the proxy, and then you can see content, headers, and everything else for HTTPS traffic. It also allows you to replay requests, and to edit requests and replay them. So it was really good for looking at the APIs and figuring out who was being contacted and what they were being sent.
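To give a rough feel for the kind of triage this interception enables, here is a toy Python sketch, not our actual tooling: it takes a list of captured flows and maps which hosts received which personal fields. The hostnames and field names are invented for illustration.

```python
# Toy triage of intercepted app traffic: which hosts received which
# personal fields? Hosts and fields here are invented examples.
from collections import defaultdict

SENSITIVE_FIELDS = {"email", "due_date", "lat", "lon", "pregnancy_status"}

def triage(flows):
    """flows: list of dicts with 'host', 'scheme', and 'params'.
    Returns {host: sorted list of sensitive fields it received}."""
    leaks = defaultdict(set)
    for flow in flows:
        for field in flow["params"]:
            if field in SENSITIVE_FIELDS:
                leaks[flow["host"]].add(field)
    return {host: sorted(fields) for host, fields in leaks.items()}

captured = [
    {"host": "api.example-app.com", "scheme": "https",
     "params": {"email": "a@b.c", "due_date": "2017-07-01"}},
    {"host": "ads.example-tracker.net", "scheme": "http",
     "params": {"lat": "36.1", "lon": "-115.2", "device_id": "abc"}},
]
print(triage(captured))
```

With a proxy in place, the real work is mostly this kind of bookkeeping: grouping hundreds of captured requests by destination and seeing which third parties keep showing up.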
And then the other tool I used is a proprietary tool called Kryptowire; they donated their services to us for this project. Kryptowire does a combination of static and dynamic analysis, and it allowed me to quickly triage about 40 different apps in a couple of hours and get a high-level overview of which ones might be worth looking into further. You can see here the high-level analysis screen for one of the apps, where it told me the app was leaking personal information. So one of the main things I found in most of these apps was a simple lack of HTTPS, meaning that important personal content was sent over plaintext HTTP. And you might wonder: okay, we're talking about the privacy and security of fertility apps; who's going to attack these things? Who wants to hack these? Well, think about what kind of information is going into these apps: women talking about their sex lives, the things they're writing on the community forums about issues they're having with their pregnancy, medical information. Strangely, a lot of women talking about their experiences of being sexually assaulted; that was a very common topic of conversation. So all of this is being sent in the clear, and it might be intercepted by somebody who shares a network with you, which could be your partner, or your restrictive father. And thanks to Congress deciding not to move forward with privacy rules for ISPs, your internet service provider could get this information too, and it would be more information used to target you with ads. The other thing, of course, is that somebody in a man-in-the-middle position could inject and execute JavaScript in a lot of these apps, which use the WebKit framework to render pages.
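"In the clear" means exactly this: anyone on the same network can read the request body straight out of a packet capture. Here is a toy sketch; the endpoint and form fields are hypothetical, not from any specific app.

```python
# What a passive observer on a shared network sees when an app posts
# data over plain HTTP. Endpoint and fields are hypothetical.
from urllib.parse import parse_qs

raw_request = (
    "POST /log_symptom HTTP/1.1\r\n"
    "Host: app.example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "symptom=cramping&week=22&user=jane%40example.com"
)

# No decryption needed: the body is right there after the blank line.
body = raw_request.split("\r\n\r\n", 1)[1]
fields = {k: v[0] for k, v in parse_qs(body).items()}
print(fields)
```

With HTTPS, the observer would see only the hostname and an opaque encrypted stream; with HTTP, they get the whole form.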
Related to that, we found a number of account hijacking issues, à la Firesheep. Four of these apps would have been Firesheep-able, which is to say that they were sending authentication cookies over plaintext: Pink Pad, WebMD Baby, My Calendar, and The Bump. The top three have not fixed this. So if you're using these apps and you're on the DEF CON Wi-Fi, don't do those things at the same time. Probably just don't use the Wi-Fi, even if you're not using these apps. We also found a lot of personal information leaks. Pink Pad is made by a company called Alt12; we tested two of their apps, and both of them send your exact GPS coordinates to the Alt12 server every time you start the app. And why the hell does a period tracking app need to know your location? It's in their privacy policy: so that they can provide you with location-based information and ads. Location-based ads. We also found a number of other things being sent, like email, name, gender, and pregnancy status. And I don't think Pink Pad was the only one sending location; a bunch of these apps requested the location permission. But what we found through static analysis was that a lot of them were encrypting the data being sent to advertisers and to other parties; Pink Pad was just the most obvious about it, and if we had had more time, maybe we would have found others. So again, thinking about the threat model here, I want you to put on your William Gibson hats and think about what could be done with information about women who are planning to be pregnant, or thinking about pregnancy, and who are giving up their location details all the time. I think at the beginning of this talk we made the point that advertisers aren't just creepy because they get a lot of information about you and try to get you to buy something.
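A Firesheep-style hijack is embarrassingly simple: pluck the authentication cookie out of sniffed plaintext headers and replay it in your own requests. A minimal sketch, with invented cookie names:

```python
# Firesheep in miniature: pull auth cookies out of sniffed plaintext
# HTTP headers. Cookie names here are invented for illustration.
def session_cookies(raw_headers):
    """Return {name: value} for every cookie in captured HTTP headers."""
    jar = {}
    for line in raw_headers.splitlines():
        if line.lower().startswith("cookie:"):
            for pair in line.split(":", 1)[1].split(";"):
                name, _, value = pair.strip().partition("=")
                jar[name] = value
    return jar

sniffed = (
    "GET /forum HTTP/1.1\n"
    "Host: app.example.com\n"
    "Cookie: session_token=deadbeef1234; uid=42\n"
)
# Replaying session_token in your own requests is all a hijack takes.
print(session_cookies(sniffed))
```

Sending that cookie only over HTTPS (and marking it `Secure`) is the standard fix; these four apps did neither.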
In the case of knowing a woman's pregnancy status, it can be very malicious: we had an anti-abortion group targeting women sitting in a Planned Parenthood waiting room. So just think about some kind of policy group that wants to target a bunch of women who live in a certain neighborhood and tell them that they shouldn't get pregnant, everything that's bad about it, or try to encourage them to get pregnant. And I don't know if Cory Doctorow is in here, but if you write this story, we want royalties. There were other information leaks too, like this text file that one of the apps dropped on the SD card; it contains a log of the entries into the app every day. January 18th was a really good day. This file being on the SD card means that any other app can read it, and furthermore, anybody who gets hold of your SD card, or the data partition on your phone, can read it as well. So it's a pretty big privacy leak. And then, of course, third-party tracking is a super common problem with all of these apps. I think all but one of the apps we tested contacted several different advertising servers, and it's mostly the same stuff you see online: the majority is Google, Facebook, Amazon, and Adobe's various publishing networks, and then there's a long tail of random advertisers and data brokers that show up in one or two of them. So I think it's safe to say that Google, Facebook, Amazon, and Adobe know more about who in this country is pregnant than Target does. One feature a lot of these apps have in common is a PIN lock screen, which is kind of interesting: presumably it's there to keep somebody from just picking up your phone and looking at the app. But they're not implemented very well. They almost all have a four-digit limit, and at the time we looked at this, none of them had any sort of protection against brute forcing.
So you could guess as many times as you wanted without any sort of slowdown. When we notified the companies, one of them, P-Tracker, did actually decide to fix that issue and implemented a back-off on the number of times you could enter the PIN code. But the other thing about these locks is that they provide no protection for the data at rest. They don't encrypt the data in any way; they don't do anything to actually protect it on the drive. They're just an intent that fires before the app starts, so all you have to do is get around that somehow. And one way to get around it is to click the link that says "I forgot my code." For at least one app, The Bump, when you click this link it sends you an email with a temporary PIN code that will unlock the app. And where does email tend to go? To your phone. So if you have the phone and can't get into the app because there's a PIN on it, just send a reset code and check the email, because you already have the phone, and there you go. When I was using these apps I was not using PIN codes; I wasn't particularly concerned about somebody getting into the apps. But somebody who does feel the need to use the PIN may have a very legitimate reason. They may be in an abusive relationship, or in a restrictive religious household or society. They may not want somebody who has access to their phone to have access to their sex lives. So these PINs should be stronger. Yeah, and if you're relying on this PIN code for security, I don't recommend it; you should take other, better steps. Another issue we found was files not actually being deleted. Again, we only found this in The Bump, but that's largely because we ran out of time for this research, and it's probably an issue in other apps.
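To put the brute-force point in concrete terms: a four-digit PIN is at most 10,000 guesses, so without rate limiting it falls almost immediately. A back-off like the one P-Tracker shipped can be sketched in a few lines; the thresholds and delays below are our assumptions for illustration, not their actual values.

```python
import time

class PinLock:
    """Four-digit PIN gate with exponential back-off after repeated
    failures. A sketch of the mitigation, with assumed parameters."""
    def __init__(self, pin, clock=time.monotonic):
        self._pin = pin
        self._clock = clock          # injectable clock, for testing
        self._failures = 0
        self._locked_until = 0.0

    def try_unlock(self, guess):
        now = self._clock()
        if now < self._locked_until:
            return False             # locked out: don't even check the guess
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= 3:      # after 3 misses: wait 2s, 4s, 8s, ...
            self._locked_until = now + 2 ** (self._failures - 2)
        return False

t = [0.0]                            # fake clock so the sketch is testable
lock = PinLock("1234", clock=lambda: t[0])
print([lock.try_unlock(g) for g in ("0000", "0001", "0002", "1234")])
t[0] = 10.0                          # wait out the lockout
print(lock.try_unlock("1234"))
```

Note that even the correct PIN is rejected during the lockout window; once three failures trigger the back-off, an attacker trying all 10,000 codes faces exponentially growing delays instead of a few seconds of guessing.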
So what happened in this case is that The Bump encourages you to upload photos of your pregnancy progress: a photo of your belly, a photo of your ultrasound. Some apps explicitly discourage this and say not to post any personal information in the community forums, but I guess The Bump didn't have that prohibition. A lot of women like to post their ultrasound and share it so you can see the development of the baby, but ultrasound pictures usually have the mother's full name, the hospital where the ultrasound was taken, and the date: a lot of sensitive information. So a woman might post that to a community forum, then realize all that she'd shared and delete it. But when you delete it, it turns out the app simply unlinks the photo from your account without actually deleting it from the CDN server the photo was uploaded to. So if you have the original URL, that URL still works to see the photo for the rest of time. We thought this might be a caching issue at first, but the photo was still up a week after I had deleted it from my account, so it's definitely not a caching issue. They just never considered that a user might want things actually deleted when they say delete. And then the other thing we found was a crazy number of permissions being requested. It seems like the Android development philosophy is that it's better to ask for all the permissions than for any forgiveness. No, that was bad. All right. So 10 different apps, half of the apps we tested, requested the location permission. And again, everybody wants to know where you are when you're pregnant. It's for advertising, right? But this harks back to the story I was working on earlier, about women being targeted in Planned Parenthoods through location-based advertising. This is incredibly personal information, and there's no reason any of these apps should have it.
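Going back to the photo deletion for a second, the unlink-versus-delete distinction fits in a toy sketch. The class and method names are ours, invented to show the bug, not anything from The Bump's code.

```python
# Toy photo store showing unlink-vs-delete. "Delete" must remove the
# blob itself, not just the account's pointer to it.
class PhotoStore:
    def __init__(self):
        self._blobs = {}       # stands in for the CDN: url -> bytes
        self._albums = {}      # user -> set of photo URLs

    def upload(self, user, url, data):
        self._blobs[url] = data
        self._albums.setdefault(user, set()).add(url)

    def unlink_only(self, user, url):
        """The buggy behavior we observed: the photo vanishes from the
        account, but the blob stays fetchable by anyone with the URL."""
        self._albums[user].discard(url)

    def delete(self, user, url):
        """What 'delete' should do: remove the pointer AND the blob."""
        self._albums[user].discard(url)
        self._blobs.pop(url, None)

    def fetch(self, url):
        return self._blobs.get(url)    # anyone holding the URL can call this
```

After `unlink_only`, `fetch` on the old URL still returns the photo; only `delete` makes it actually go away, which is what a user reasonably expects the delete button to mean.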
Also, a quarter of the apps we tested requested your contact list. Just in case I want to inform people about the pregnancy. Yeah, they might want to text everybody you know that your period is coming up. Five or six of the apps requested your device ID, which is like a cookie for your phone: it lets advertisers link your profile across different apps. The phone permission lets them do the same thing using your IMEI, which is like a hardware serial number for your phone. And Pregnancy+ requested the SMS permission, and I have no idea why, but it also has the contacts permission, so maybe that thing I said earlier about texting everyone. Anyway, one interesting security feature we found was that four of these apps implement certificate pinning: Glow, Nurture, and Eve, which are all made by the same company, plus Clue. Certificate pinning is where you hard-code the hash of the SSL certificate you want to use for your HTTPS connection into the application, and it's pretty cool: it prevents somebody from doing a man-in-the-middle attack on HTTPS, like what I did with mitmproxy. And it's a nice feature to have; my bank doesn't even do this, and it would be great if it did. But it's kind of extra. We're not security shaming. I'm not security shaming; I'm glad they did it. But a TLS man-in-the-middle doesn't really seem to be in the threat model for the users of these apps, and maybe a better use of their time would have been to implement something like two-factor authentication, which none of these apps did, or to secure that PIN code thing. So after finding all this, we reached out to the vendors separately: Kashmir reached out about the privacy issues and the things in the terms of service, and I reached out about the security issues. This is how I felt about their response. And this dog will always be relevant, forever. So I contacted nine different vendors: all these guys, and also The Bump.
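Certificate pinning in miniature: on Android the real mechanism is usually a network security config or a library feature like OkHttp's CertificatePinner, but the core idea is just a hash comparison. The certificate bytes below are stand-ins, not real certificates.

```python
import hashlib

# The app ships with the hash of the cert it expects. Anything else,
# including a proxy's forged cert, is rejected. Cert bytes are fake.
PINNED_SHA256 = hashlib.sha256(b"real-server-cert-der").hexdigest()

def pin_ok(server_cert_der, pinned=PINNED_SHA256):
    """Accept the TLS connection only if the presented cert matches."""
    return hashlib.sha256(server_cert_der).hexdigest() == pinned

print(pin_ok(b"real-server-cert-der"))        # the real server: accepted
print(pin_ok(b"mitmproxy-forged-cert-der"))   # an interception proxy: rejected
```

This is why the pinned apps were opaque to mitmproxy even with its root certificate installed: the forged certificate chains to a trusted root, but its hash doesn't match the pin.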
And we received responses back from P-Tracker, Glow, and The Bump, which isn't up there because we heard back from them just after we finished these slides. P-Tracker and Glow fixed the issues we found; The Bump promised that they would. And everyone else just completely ignored us. One company sent us a form letter saying they appreciate that we like their application. I had a different experience, and this is where it's different being a journalist versus a security technologist. I heard back from everybody I reached out to except Everyday Health, which makes the What to Expect app, and Alt12. People acknowledged the problems with the privacy policy and changed it. I got responses, and companies definitely seem to pay attention to journalists. I think it helps that they have press people set up specifically to receive our inquiries, and I also think they better understand what journalists are asking them; sometimes they just have no idea what a security technologist is sending their way. So maybe the lesson here is that if you want companies to take your security issues seriously, either work with a journalist or tell them that you're a journalist. I definitely endorse this pairing: if you're a technologist, pair up with a journalist, and vice versa. Yeah, so on that line: what can hackers do? What can you all do to improve this situation? I'm going to do these in kind of reverse order. One of the best things you can do is pair up with a journalist. The combination of a hacker and an investigative journalist is a really powerful combination: we can find these problems, and then we can tell the world. And by shaming these companies and getting this publicity out there, we can convince them to take these security and privacy issues more seriously.
Hackers are also really good at threat modeling, and we can think about threat models outside of one standard deviation from the average user: threat models for people in an abusive household, or people who have a stalker. Right? These should be pretty common threat models for somebody writing one of these apps, but apparently they're not. One thing that was obvious to us at the end of this is that these companies hadn't necessarily thought about the abusive edge cases. And actually, a year ago Glow had a really big security issue that Consumer Reports discovered. Glow had a feature where you could invite your partner to monitor your fertility or pregnancy with you, and the way they had set it up, the woman would send the invitation to somebody, but after she did that, anybody who knew her email address could monitor what she was doing within the app. So it was just this huge security hole, and it was only discovered because Consumer Reports decided to look really closely at the app. But Glow did take that issue seriously and fixed it, and I think that's why they were so responsive to the security issues we found: after that shaming, or after that publicity. Publicizing, not shaming. Not shaming. After that publicity, they decided to take these issues seriously. So this is a great illustration of how publicizing these problems is an effective tactic for change. So, Kashmir, what was your takeaway from all this? Yeah, my first-hand opinion on all this: I hate to admit it, but I really enjoyed using these apps while I was pregnant. It's super weird being pregnant. It's unlike anything that's ever happened to your body before, and you just feel like a science experiment for nine months, so I appreciated the information I was getting from the apps.
But if I decide to have more children, I don't think I would use the apps again, because now I kind of know my way around it. So any future children I have, the only privacy invasion they'd be subjected to in utero would be the ultrasound, which I like to call baby's first privacy invasion. It's always hardest on the first child, isn't it, Kashmir? Always hardest. We're both first children; it's the hardest. Anyway, that's all we have. We want to say thanks to Kryptowire for donating their analysis services to us; we really appreciated it and it helped us out. Thanks to Dave and Jingjing at Northeastern for their help with ReCon. Thanks to Gizmodo and EFF for continuing to sign paychecks for us. Thanks to DEF CON, and thanks to Alev for inspiring our research.