Hi, I'm Simon from New Zealand. All the Kiwis in the audience, put your horns up. Okay, I'm from Mintz Research. I was looking through the DEF CON programme and I saw all these people like "founder of XYZ company", so I thought, I can be a founder of a company too. So I'm the founder of Mintz Research.

Okay, so the original concept for the contest came about when I was drinking beer at DeepSec with a dude called Rich. We were talking about the PDF mailto vulnerability which had come out. We got the sample and ran it through VirusTotal: all detected. Then we modified it a little bit, ran it through VirusTotal again, and you got like two or three, maybe four engines detecting it. So we thought, hmmm, this isn't very good; the signatures for this kind of stuff are a bit shit. So we released the idea for the contest, that we were going to be running a Race to Zero at DEF CON, and the media got a hold of it and went mad. Some good, some bad, and a lot of sensationalist reporting. The guys from Offensive Computing said, let's say, it's a golden opportunity for the anti-virus industry. Then Trend Micro came along: it will do more harm than good. And Kaspersky, my favourite: what about a live bank robbing contest to test bank security systems? Maybe a drug distribution trial at schools to test our colleagues in the police? I think that's a good idea, Eugene. Then Sophos rocked up, like, do you want to be a victim of this game? Cheers to the vaudev guys there, you might have seen these t-shirts around.

So, motivations for the contest: we wanted to highlight the shortcomings in blacklisting technologies.
We wanted to see which vendors were doing a good job, observe the real-world difficulty of avoiding detection by the different AV products, and look at the cost, the time spent on each of the samples to get them through the different engines, and how much skill was required. Also, I was a Perl scripter from way back, and everyone's like, Perl sucks, Perl sucks, you've got to get into Python. So I thought, all right then, I'll learn Python, and the whole engine and interface stuff was written in Python.

So what's the main problem with anti-virus? We all know that signature or pattern based blacklisting is dead, but things are starting to get better in the AV industry, particularly in the home versions of the software: they're starting to incorporate behavioural based technologies. Things are slowly getting better, but there are still problems, like your browser; your browser is almost a little mini operating system nowadays, and there are still all the file based exploits as well. However, AV in the enterprise is still a big problem, and it's still lagging behind the desktop versions of the software. The desktop versions have got the behavioural based technologies in there, and that's sort of the sounding board before it goes into the enterprise.

F-Secure have got lots of honeypots all around the world. They look at boxes that are sending out malware, log the IP address, load it into KML, the Keyhole Markup Language, and plot it on Google Maps, and you can zoom around and see all the infected hosts on the net. Here we see there's a navy.mil box that's infected. It's probably running antivirus; the antivirus might be out of date, or it probably just hasn't got any behavioural based detection in there.

Just before I came over to Vegas I was involved in an incident at a corporate in New Zealand, 3000 seats, which is big in New Zealand; it's nothing over here, I suppose. They had
got a virus past the antivirus on the desktop. The users had gone to their webmail via HTTPS and downloaded postcard.exe, that Hallmark virus that got released recently, and clicked on it, and then it just went all through the organisation. It was copying itself to shares, and other people were clicking on it, and it was just a mess. And they wondered, well, we've got antivirus, why isn't this picking it up? It's because there's no signature for it yet.

Okay, so the real problem is, and this is a quote from a mate of mine: they've finally realised, they meaning the antivirus industry, that you have to assume your users have the mental capacity of a two-year-old; they're more likely to smash it on the keyboard than actually use the product correctly. So even if they've got this behavioural based technology, we have to make it as simple to use as possible for them, because they're just going to be clicking next, next, next. Postcard.exe wants to connect to the internet? Must be downloading some more postcards for me, next, next, next. Even in products that have got the behavioural technologies, normally when you install them it'll be an extra option, like the advanced feature set, or the expert level, or in McAfee's case here, maximum protection. But users installing a product are clicking next, next, next, next; they don't see the maximum protection option there, they just go with the standard settings.

So if there's one positive message that comes out of this contest after all the negative shit, it's that mum (or mom) and dad at home need to enable their behaviour based detection, and if their product doesn't have it, or it's out of date, or it's an average product, go out and buy the upgraded version and then turn it on. There are also the whitelisting technologies like CoreTrace, which are sort of the alternative to AV, but then you still need to include AV with those products
to detect exploits for unpatched software, as mum and dad don't patch their boxes, so they need detection for things like the JPEG exploit, ANI, WMF.

Okay, so we'll get on to the engine. I developed this on Xen, so each of the VMs down the bottom there, where it says AV, is a copy of Windows XP running in a Xen VM, and all 10 of them were running on this laptop, so it was pretty tight with 256 MB of RAM each. At the top, CherryPy would accept connections; it would auth the user and then take the uploaded sample. The sample gets archived and dropped on the CIFS share, and then the scan scheduler would connect to each of the AV instances and say, hey, go and check the CIFS share, I've got a new file for you to scan. The results would then get stored in the database so I could look at them and analyse them after the contest. When we ran it in production out in the contest area, there were two ESXi boxes and they each had five VMs on them.

Okay, so the engines I used: ClamAV, NOD32, F-Prot, VirusBuster, all the common ones. Symantec weren't in there because of the way I integrated the engines, which was via each vendor's command line scanner. I'd wrap that up with all the different parameters, like /heuristic and /archive and whatever else was available for each engine. Some of them were quite minimalist, and that probably wasn't a good indication of how well that antivirus product may perform in the wild; most of them were pretty good.

This is an example of what happens after someone has uploaded a file: it scans it through all the engines, and this is the Stoned virus, a very old DOS one, showing all the detections, so a 10 out of 10 detection rate. The samples I gave the contestants came in 10 levels. The first one was the old DOS Stoned virus, and then we went through a few Trojans: NetSky, Bagle, Sasser, all the common ones. Then level eight was the MS Word vulnerability, I think it was MS07-014.
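The scan pipeline described above can be sketched roughly as follows. This is a minimal reconstruction, not the contest's actual code: the engine names, command lines, and database schema are all made up for illustration, but the shape matches the talk's description (sample dropped on a share, each engine's CLI scanner wrapped with its flags, a 60-second task-kill, results stored in a database).

```python
import shutil
import sqlite3
import subprocess
from pathlib import Path

# Hypothetical per-engine command lines, standing in for the real CLI
# scanners wrapped with flags like /heuristic and /archive.
ENGINES = {
    "engine_a": ["scan_a.exe", "/heuristic", "/archive"],
    "engine_b": ["scan_b.exe", "--heur"],
}

def scan_sample(sample: Path, share: Path, db: sqlite3.Connection) -> dict:
    """Drop the sample on the share, run every engine over it, store results."""
    dropped = share / sample.name
    shutil.copy(sample, dropped)  # "dropped on the CIFS share"
    results = {}
    for name, cmd in ENGINES.items():
        try:
            proc = subprocess.run(cmd + [str(dropped)],
                                  capture_output=True, timeout=60)
            # Most CLI scanners signal a detection via a nonzero exit code.
            detected = proc.returncode != 0
        except subprocess.TimeoutExpired:
            detected = False  # engine hung: task-killed, counted as clean
        results[name] = detected
        db.execute(
            "INSERT INTO scans (sample, engine, detected) VALUES (?, ?, ?)",
            (sample.name, name, detected))
    db.commit()
    return results
```

Note the `TimeoutExpired` branch: as described later in the talk, an engine that never returns is treated as a clean result, which is exactly the behaviour one of the teams exploited.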
Then the ANI exploit, and finally level 10 was the Slammer exploit, the SQL Server one. I think this is probably what my girlfriend thinks about me when I'm doing this kind of shit; you've probably all seen this on xkcd.

Okay, so was malware or were exploits harder to get past the engines? With the blacklisting technologies, the malware was easier once the teams had worked out the first malware sample, or rather after Stoned. Stoned was probably quite hard and I shouldn't have included it as level one, but it sort of weeded out the real players. Once they got past that and got to the Trojans, all they needed to do was work out their packer and get it working, and then they'd fly through all the virus samples. Then they'd get onto the exploits, and that's where people slowed down again and took a bit more time. I enabled heuristics where available, as I said before, and the exploit signatures for some of the engines were really tight; that's where those products were winning.

Okay, so techniques the teams used. Team 4919 used a custom Trojan dropper to get through all the virus samples; it protected the executable and decrypted and executed it at runtime. MGM used a counting loop, just spinning the CPU for a long time so the AV engines would eventually time out. With the way I was doing the scanning, if an engine didn't return a result within 60 seconds it was task-killed and would just return a clean; I didn't want it locking up, but then you can get past it with tricky techniques like this. Team Retim also did a custom Trojan dropper, and they included a Mandelbrot fractal generator inside it, so when you executed the code the fractal popped up. It was pretty cool. But the problem with the droppers was that the file they dropped was exactly the same as the sample I'd given them, so when the user sees the EXE on the desktop and goes to run it, the antivirus is going to pick it up anyway.
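The idea behind MGM's CPU-spinning trick can be sketched in a few lines. This is only an illustration of the principle, not their code: the payload stalls longer than the scanner's task-kill window, so a harness like the one above records a clean result and moves on before anything interesting happens.

```python
import time

def stall(seconds: float) -> None:
    """Busy-wait until the deadline passes, burning CPU the whole time.

    A scan harness that task-kills an engine after 60 seconds and counts
    the sample as clean never gets to see what runs after the stall."""
    deadline = time.monotonic() + seconds
    spins = 0
    while time.monotonic() < deadline:
        spins += 1  # pure arithmetic: nothing here for a signature to match
```

In the contest this raced the harness's 60-second limit; a real scanner's emulator typically gives up even sooner, which is why the trick works at all.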
But for the concept of the contest they got through the engines all right. I probably should have stipulated more clearly that they should actually be modifying the samples, not just repacking them, and that if they did pack them and drop them, they should then modify them so they're not the same.

Okay, so some techniques for the exploit stuff. Within the original Slammer payload there's a loop for reserving space on the stack, and Team 4919 replaced it with three pushad instructions, saving three bytes, and they did some other obfuscation there too. They also replaced the RNG loop with a single rdtsc instruction.

Only four teams managed to complete the contest out of the total of 10, and the teams that didn't complete it pretty much couldn't get past level one. This graph here shows the number of detections for each engine, and we can see that McAfee got about 90; I think there were over 200 samples, so 90 out of 200 is just under 50 percent. Here's level one with the Stoned virus. You can see F-Prot there; I remember when I used to run DOS, F-Prot was the virus scanner, so it's not surprising that they can detect multiple variants of Stoned. Even when people obfuscated them and re-uploaded them, F-Prot was still kicking ass. NetSky.P: again, samples would be uploaded and the engines would scan them; this is the result for level two, NetSky.P, and Trend Micro did pretty well there.

Now, it gets more interesting when you get into the exploit samples, because if the AV company have written a really tight detection for that sample, it's going to be bloody hard to get it through all the engines. They target the actual exploit condition: if a line in this text field is longer than 255 characters, it's malicious, because we know it's never going to be any longer than that in a legitimate file.
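A "tight" exploit signature of the kind just described can be sketched like this. The field name and the 255-byte limit are stand-ins for whatever the real format specifies; the point is that the check keys on a property every working exploit must have, not on the payload bytes themselves.

```python
MAX_FIELD_LEN = 255  # hypothetical limit taken from the file format's spec

def looks_malicious(field: bytes) -> bool:
    """Flag any field longer than the format ever allows.

    A legitimate file can never exceed MAX_FIELD_LEN here, so an over-long
    field can only be an overflow attempt, regardless of its contents."""
    return len(field) > MAX_FIELD_LEN
```

Because any working exploit has to exceed the limit to trigger the overflow, repacking or re-encoding the payload doesn't help; that's why the teams slowed right down once they hit engines with detections written this way.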
So you can see there Kaspersky did really well on the MS07-014 exploit. Well, one team did get past it, but the problem was Bitdefender: the Trojan included an executable, and the Bitdefender engine was looking for the MZ header within the Word doc, so Team 4919 couldn't get past that, while the Kaspersky engine had actual targeted code for the exploit. Then we looked at the Chicago Street Sweepers. They XOR-encoded the executable within the Word document, so they bypassed the Bitdefender check that Team 4919 was stuck on, but then they hit the roadblock of Kaspersky, and they couldn't get past that until they thought, well, maybe Kaspersky is checking the version of the Word doc and matching that against the length check, the one that was exploitable. So they modified the version number and got the sample through, but I haven't tested it fully to see if it would still work, because I didn't have a vulnerable version of Office, and a lot of the teams didn't have a vulnerable version of Office to test on either, which made it quite hard.

Now, the Slammer exploit: this was the hardest, level 10, and a lot of teams took quite a while to get this one sorted. You can see McAfee was pretty good at detecting all the variants that were uploaded through the portal.

Okay, can we have David Scott Lewis from WarGames up here please? Okay, so the results. Only four teams managed to complete all the levels. Honourable mention for Reteam, in two hours and 20 minutes, and he didn't use IDA to do any of the reverse engineering or anything. So Reteam, up here please. They also get the dirtiest hack, because with the Word exploit they just packed it up in an EXE and then called Office when it ran.
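The Street Sweepers' trick of hiding the embedded executable can be illustrated with a single-byte XOR. The key and the payload here are made up for illustration; the point is just that the literal `MZ` magic the scanner was grepping for no longer appears in the container, while the dropper can trivially recover the original at runtime.

```python
def xor_bytes(data: bytes, key: int = 0x5A) -> bytes:
    """Single-byte XOR; applying it twice restores the original."""
    return bytes(b ^ key for b in data)

# Stand-in for an embedded PE file: only the 'MZ' magic matters here.
embedded_exe = b"MZ\x90\x00" + b"\x00" * 12
encoded = xor_bytes(embedded_exe)

# A scanner looking for a literal MZ header inside the Word document,
# as Bitdefender reportedly was, no longer finds one.
assert b"MZ" not in encoded

# The dropper recovers the executable at runtime with the same XOR.
assert xor_bytes(encoded) == embedded_exe
```

This is the classic weakness of content-based checks on a container format: any reversible encoding of the payload defeats them, which is why Kaspersky's check on the exploit trigger itself held out longer.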
So one of the things I just want to mention is why I was interested in being involved. For those of you here last night, you know that I was the model for David Lightman in the movie WarGames, and for those of you who subscribe to Wired, it's also in this month's issue. That was really my interest in looking at this, because I think we all know signature based detection methods really suck and we have to go to heuristics and behavioural based modelling. We even talked about it last night at the WarGames Q&A, about The Adolescence of P-1, if you've never read that: using agent based systems, genetic programming, learning systems, even before John Koza really formalised this at Stanford. So the AV companies really have to move to the next generation and can't rely on any of these signature based detection systems. That's really what excited me about all this, and the Mercury, sorry, VentureBeat's going to be having a little story about this as well.

Okay, the team most deserving of a beer was a team that included a loop in their dropper looking for the string "beer". So can that team please come forward? I can't remember who you guys are, but yep, choice. Cool. And the winner, and the winner gets a handcuff key and an electronic lock picking kit, because you're probably going to jail: the Chicago Street Sweepers.