Really, the point of this talk is: how do you evaluate privacy in cryptocurrencies? The point is not to shill Zcash. I am one of the co-founders of Zcash, and I am on the board of directors of the foundation, but that's not why I'm here today. The point is that we as a community have done a very poor job evaluating privacy approaches for cryptocurrencies, and we need to get better at it. We need to get better at it now. Because at least one of these (Ethereum, Bitcoin, Zcash, something none of us have ever heard of, maybe Dogecoin, I don't know) is going to be the future of payments. And it's going to take a while. But at some point when this happens, we're going to realize that this is a major privacy problem. And if we wait until then to actually address it, it will be far too late.

So I've been working on privacy in cryptocurrencies since 2011 for precisely that reason. It wasn't clear to me that Bitcoin was going to succeed, but it was clear to me that if it did, it was going to have a pretty big privacy problem. And at the time, everyone basically believed it was private. This was sort of the best logic that everybody had. And of course, we know now how laughable that is. Not only is there a bunch of academic research that identifies privacy flaws in Bitcoin and other cryptocurrencies, there are actually commercial companies in the business of doing analytics and figuring out what's going on. So this is laughable. Don't believe it. Far from being anonymous, cryptocurrency is in fact Twitter for your bank account. These currencies expose all of your spending habits to everybody. And by everybody, I mean everybody: your creepy ex-girlfriend or ex-boyfriend, your business competitors. If you're trying to compete with someone, they can figure out what your supply chain is, how much you're paying your employees, use that to poach them, et cetera. Everybody.
If you're a government and you think you don't want privacy for cryptocurrencies, think again, because there are other governments out there that don't like your government. I don't care if you're the Chinese, the Russians, or the Americans, somewhere in that triangle there's someone out to get you. And they also would love to have economic information about how to cripple you. So nobody wants this information to be public to everybody. And it really is today; that's the nature of a blockchain.

As a result of this, there have been a lot of proposals on how to add some amount of privacy to cryptocurrencies. And these range from very mundane systems that have no chance of actually working all the way up to very sophisticated ones. Zerocash on there was actually my doctoral thesis; it took me about five years to finish all the details of it. The question is, how well do these things work? Because no matter who you are, if you're designing a cryptocurrency, at some point you're going to need to add privacy to it. And you're going to need to use one of these techniques, or something else that's come along, and you need to know if it's actually private.

And this is hard. Evaluating privacy for cryptocurrencies is not easy. To do it empirically would be akin to trying to evaluate privacy for the internet in 1992, when everything was just getting started and the only things running were at CERN. We can't measure it empirically. We don't really have real-world usage that isn't just speculation. We don't have day-to-day usage. We don't have this rich tapestry of data where I buy a coffee, I go to the subway, I check into the metro, then I buy stuff at lunch, which has some structure. None of that exists. So there is no way to do this empirically. And moreover, as a researcher, even what little data there is out there, I don't have access to. I have budgetary limitations. I have ethics limitations.
I can't buy the third-party data sets that Chainalysis and other people use. So we really have no way to do this empirically. And as a result, what we have to do is resort to thought experiments. This is somewhat unsatisfactory because it's not empirical evidence, and it's very easy to accuse people of being biased. At some point Twitter will probably start going on about that, as people start tweeting slides about me right now. But it's the only thing we have, right? And to do that, you need to understand what attacks really look like. And we need to look to other places where you've had privacy problems as inspiration, because we're looking 10, 20 years down the line for this, and we can't look at cryptocurrency today to find out what the problems are going to be in 10 years.

So what are real-world threats? Well, the easiest thing to look at is what advertisers do with payment data. We've seen advertisers such as Google collecting all kinds of data about payments you make, not just online; they're actually trying to get the real-world payments you make when you walk into a Walmart or something and make a purchase. And Google's gotten a lot of headlines about this because, hey, "Google is evil" is a very good headline. But the reality of the matter is everybody's doing this, and in fact Google is probably one of the more scrupulous actors. The moment they're doing it, you can be assured that there are people doing far sketchier things with this data, right?

And what can you do with it? Well, one thing, and this is actually a pretty old story: you can use payment and shopping data to figure out, for example, when someone is pregnant. In this case, Target was building up profiles, based on customer loyalty cards, of who bought what. And then they would identify, for example, pregnant women and send them ads for baby products. And in this particular case, they did this to a teenager and sent the ads to her house.
Her father saw them. And from there, well, the father found out that his daughter was having sex and was pregnant. Kind of a problem. And so this is the kind of thing that we're looking at.

In fact, we've seen stuff closer to home with payment systems. Venmo, for those of you who don't know, is a way that you can pay your friends, for example for dinner; it's common in the States since we don't have very effective payment systems like European banking. But anyway, Venmo, a US startup, had a social feed, because everything is social these days, of payments you made: from who, to who, and why. And this was public by default. And so people found out about this and started writing things like, here, here's a nice guide to how to stalk your ex-boyfriend. And of course, this is kind of cute; it's got a fun, laughy picture. But if I flip the genders on this, or really gave it any kind of appropriate treatment, you'd realize this is completely creepy. And this is really someone actually doing this, not tongue-in-cheek. And sooner or later, if we keep doing this, you end up with the same thing, but it's how to stalk your ex using Ethereum or Bitcoin or whatever. And this is the problem.

So when I said cryptocurrency was Twitter for your bank account, it's an accurate statement, but it doesn't entirely tell you how things go wrong. One of the things you should think of is that it's a tracking cookie in real life, right? We're used to these notions that we are tracked on the internet, that everything we do on a day-to-day basis is cataloged by Google, by DoubleClick, by Facebook. Well, cryptocurrency takes that and makes it like Web 3.0, but in reality. And this is clearly incredibly problematic. So we really need to get this to work, to stop this. And so how do we do that? What are the defenses?
Well, the first thing, if you take nothing else from this talk: in a world of AI, social credit systems (you've probably seen that video going around on Twitter from the train in China), and targeted advertising, plausible deniability is not a plausible defense, right? The original model for privacy in Bitcoin in particular was this idea that if someone shows up and says, hey, did you buy weed at a corner store or whatever, you can point to it and go, oh, well, you can't prove that transaction was mine. And this is sort of the kind of mentality that someone who lives in their parents' basement and has been mining cryptocurrency on what they used to use to play Counter-Strike would come up with. But it doesn't work, right? It doesn't work in reality now, and it didn't work in reality then. Because look, if the police are gonna show up at your door and ask you what you were doing, plausible deniability isn't gonna get you out of a search warrant, and it isn't gonna get you out of what they find when they search your stuff. This is not a good way to do OPSEC in any way, shape, or form, but it was what people were trying to do. So don't do that; do something at least slightly better.

And the problem with "what is better" is that blockchain privacy is in fact not intuitive, right? Cryptocurrencies broadcast all of your transactions to everybody, and this is the thing that people sort of recoil at: you know what, we gotta fix that. And it actually turns out to be easy enough to obscure the data on a blockchain from passive observers who have nothing to do with you and know nothing about you. It's not easy to do perfectly, but you can do it. But that's not who you need privacy from. You need privacy from people who send you payments and people you send payments to, right? Merchants: you don't want them to be able to track you. You don't want people to identify you based on payment information you put up online.
And once you get to that, hiding transaction values (confidential transactions or whatever) or hiding your IP address isn't enough. You need to hide the transaction graph. And this turns out to be what a lot of the more promising things in here focus on, but even then they don't do that good of a job, right?

So there are sort of three sets of approaches to privacy you can think of. There's Bitcoin, Ethereum, et cetera, where you just explicitly identify the origin of your payment, either the UTXO or the account. This leaks the transaction graph completely. There is no privacy, right? Then there are the so-called decoy-transaction-based systems, which try to obscure the transaction graph. These look like something where you say, look, this is the real origin for my payment, but here are five other things which it could have been, and no one knows which one's which; it's just sort of a mess. This looks like CoinJoin, Mimblewimble, et cetera, or CryptoNote, RingCT, and Monero. So that's one set of approaches. And then finally there are the approaches of Zerocoin and Zerocash, which are deployed in a couple of different cryptocurrencies, where you don't identify the origin at all and there is no transaction graph to recover, at least if you use the privacy-preserving transactions, right?

So what does this look like visually? Well, in Bitcoin, if you're paying a merchant, you have to identify the exact origin. In decoy-based systems such as Monero, same thing, you have to identify the origin, but it hides amongst a couple of different locations, right? That's what you're doing: you're adding noise. It's chaff. The real transaction is somewhere in there, but hopefully, if you don't have very good vision, you can't figure out where it is. And then there are things like Zerocash and Zcash, where this data just is not there, right? And so we know that Bitcoin isn't private.
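To make those three classes concrete, here is a toy sketch of what a single payment reveals on-chain in each model. Everything here is illustrative: the field names and values are invented for this example, and no real protocol serializes transactions this way.

```python
# Illustrative only: what a transaction reveals on-chain in each model.
# Field names are invented for this sketch; no real protocol looks like this.

# 1. Transparent (Bitcoin, Ethereum): the exact origin is named.
transparent_tx = {"inputs": ["utxo_42"], "outputs": ["addr_m"], "amount": 5}

# 2. Decoy-based (CoinJoin, Mimblewimble, CryptoNote/RingCT/Monero):
#    the real origin hides in a ring of plausible origins.
decoy_tx = {"ring": ["utxo_7", "utxo_42", "utxo_99"],  # one is real
            "outputs": ["addr_m"]}                     # amount hidden

# 3. Zero-knowledge (Zerocoin, Zerocash / Zcash shielded):
#    no origin is referenced at all; a proof replaces it.
shielded_tx = {"proof": "<zk proof of a valid unspent note>",
               "nullifier": "<prevents double spends, links to nothing>"}

# The transaction-graph leakage is the size of the candidate origin set:
print(len(transparent_tx["inputs"]))  # 1 -> origin fully identified
print(len(decoy_tx["ring"]))          # 3 -> origin narrowed to a small set
print("inputs" in shielded_tx)        # False -> no origin set to analyze
```

The rest of the talk is essentially about why the middle case, a small candidate set instead of no candidate set, is much weaker than it looks.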
There's been a lot of analysis on Zerocash, both academically and empirically. We know that privacy holds up if you use the shielded features and use them well. The question is, do these decoy-based approaches work? Are they private? And spoilers, the answer is no.

The reason they aren't is because there's some interesting structure you can get out of this, right? When you make a payment to a merchant, what they see is not just those three decoys; they see those decoys and their possible origins as well. Each payment has a set of possible sources, each one of those has three or five or eleven possible sources, or whatever it is, and it sort of fans out backwards in this tree of possible things that you interacted with. Think of it as a taint tree. And you can also use this going forwards, right? If I pay a merchant, I see where the money goes, and then I see everyone who tries to spend that dollar. Some of those people are gonna be spending it as a decoy; it won't be legitimately theirs. But one of them will be the merchant. And again, this fans out: there are three possible things it was spent in, each one of those has three possible things it was spent in, and we just keep going forward. So it's a cone going backwards and a cone going forwards.

And so this has a couple of immediate problems that should make it clear that things are not what you think; things are not completely private, right? The most obvious one is what I'm gonna call an overseer attack. Suppose you're a merchant, say you're Starbucks, and you have someone coming in and buying coffee every day. And they're buying it with a cryptocurrency that's private. Should Starbucks be able to build a profile of how much caffeine you're drinking in a week, maybe sell it to your health insurance company or something? Ideally, no, right? Privacy should mean that's not possible. They learn nothing.
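The backward fan-out just described can be sketched as a toy computation. This is purely illustrative: the ledger format, the ring size of three, and all the names are invented for the example, not any real chain's data model.

```python
# Hypothetical sketch: how the backward "taint tree" of a decoy-based
# payment fans out. Toy ledger, invented names, ring size 3.

def taint_tree_leaves(tx, ledger, depth):
    """Collect every possible origin of `tx` up to `depth` hops back.

    `ledger` maps a transaction id to the list of ring members
    (one real spend plus decoys) that it referenced.
    """
    frontier = {tx}
    seen = set()
    for _ in range(depth):
        nxt = set()
        for t in frontier:
            for member in ledger.get(t, []):
                if member not in seen:
                    seen.add(member)
                    nxt.add(member)
        frontier = nxt
    return seen

# Toy ledger: each transaction references a ring of 3 possible sources.
ledger = {
    "pay_merchant": ["a", "b", "c"],
    "a": ["a1", "a2", "a3"],
    "b": ["b1", "b2", "b3"],
    "c": ["c1", "c2", "c3"],
}
candidates = taint_tree_leaves("pay_merchant", ledger, depth=2)
print(len(candidates))  # 12 = 3 ring members + 3*3 one hop further back
```

With ring size k, the candidate set grows roughly like k per hop, so a single payment quickly implicates a large cone of past transactions. The forward direction works the same way, following everyone who later references your output.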
It should be as private as cash. Except that's not true, right? Because if you make one payment to Starbucks, and then you make another, and you make a third, they're gonna see these different taint trees for each one of these transactions. And there's no immediate linkage between them, so you might think it's private. But of course, if you look through this entire graph, you're gonna see intersections in it, because the payments are gonna have common origins, right? You're gonna have loaded all the funds off of some transaction on Coinbase or ShapeShift or whatever at one point, or your change transactions link every use. So this is a basic intersection attack, right? So there's that.

Here's another one. Let's suppose you're a dissident in Saudi Arabia, and you have a website on the dark web that you use to take donations, right? Tor, right? Put an address up; you're anonymous. You should be completely safe; they're never gonna figure out who you are. And so, because you think it's private, you take that money and you immediately deposit it in a cryptocurrency exchange in the Middle East that actually has all your KYC data. They know who you are. Because the thing is private, no one should be able to link the address online that's receiving payments with your deposits. That would be another thing you'd expect from a private cryptocurrency, yeah? Well, you don't get it here. The reason is this: say Bob is this dissident, and he's taking funds, and he's going to deposit all of them. Again, if you pay Bob two or three or four times, and he deposits these things, and you look at the taint trees: if you are the Saudis and you compromise this exchange, you subpoena it, you hack into it, whatever it is, you're gonna get, for every customer, the taint trees of every possible payment they got. And if you see someone who got one of these tracking payments you made, say the green one, there are gonna be a couple of people who got those, right?
Because everyone picks decoys at random, someone might've just picked that by accident, right? And at least right now, things aren't bad enough that you're willing to just go shoot anyone based on the, like, 5% chance that they were actually your target. However, you didn't send one payment; you sent a bunch. And so what are the odds that some person has all of those payments in their taint trees, that all of these things were deposited in their account? They're approximately zero. And so now you have, with high probability, identified the person. And this is really bad. This is gonna get you arrested, or worse. Again, you thought it was private; it's not, right? So it turns out that repeated interactions with a sender or recipient are dangerous, and this is where most privacy coins break, right? And this is just because no one is thinking about these things in the context in which they're all gonna be used.

But that's not even the worst thing you can do, right? Remember how I said there's a taint tree: looking backwards, you can see where the coins came from. Well, also, looking forwards when you pay someone, as I mentioned, you can see where payments may have gone. And again, it's uncertain; it's some kind of weird quantum foam, but there's a little bit of information in there. And what can happen is, let's suppose you wanna find out if a friend of yours is making repeated payments to a particular merchant. So you send the merchant one transaction and you send your friend one transaction. And these can be dust, right? Little tiny amounts. And you're gonna see these two things fan out over time, of where the money goes. And then you don't learn anything, until maybe some of the money you paid your friend ends up in the same address as something you paid the merchant. And this doesn't tell you much. It could be that someone just picked a decoy that happened to overlap.
It could be that your friend paid the merchant and was moving their money off of their hot wallet into cold storage or something. You don't know; there's not enough information here to really tell, but it suggests something might be going on. Okay, but what happens if this, again, happens multiple times? Now you know that something is up. Now you actually have information, because again, the odds of this happening by happenstance, because it's repeated interactions, are approximately zero, right? And then you can watch this happen three, four, five, 10, 20 times, and now you really know for sure. And so, for example, if the merchant was Pornhub, you could use this to confirm that your friend was in fact subscribing to Pornhub. And that would be somewhat embarrassing, right? A, he's looking at porn. B, he's paying for porn. And I guess worse, actually, he's probably paying for porn with Verge, which really, you need better taste in friends. So again, I think this fundamentally violates what you'd expect from a privacy coin, right? It shouldn't be the case that random people, by sending you tainted dust to a bunch of different addresses, can recover how things are being spent. But in fact, you can do that, right?

And so, are decoy-based systems private? The answer is no, right? With a little bit of thought. And I emphasize that this is a little bit of thought: we've come up with three plausible attacks that can break these things. And I'm not saying you can execute these in practice in every case. Again, we don't have enough empirical data to evaluate any of these things. But at least as hypotheticals, these things don't pass the smell test. And so we shouldn't be using them. So why is it that these things are used? Why is it that these are regarded by credible people as viable approaches to privacy, right? Like, I basically just told you that Satoshi really has no clothes.
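The "approximately zero" odds behind both the exchange attack and the dust attack come from simple multiplication: if one observation implicates a given innocent account by chance with probability p, then n independent observations implicate it with probability p^n. A back-of-the-envelope sketch, where the 5% figure and the account count are purely assumed numbers, not measurements of any real chain:

```python
# Back-of-the-envelope: why repeated interactions are fatal in decoy systems.
# Every number here is an illustrative assumption, not a measurement.

def chance_all_coincidental(p_single, n_payments):
    """Probability that one innocent account lands in the taint trees of
    all n tracked payments purely through random decoy selection."""
    return p_single ** n_payments

p = 0.05              # assumed chance a random account is in one taint tree
accounts = 1_000_000  # assumed number of customer accounts at the exchange

for n in (1, 2, 4, 8):
    false_positives = accounts * chance_all_coincidental(p, n)
    print(f"{n} payments tracked -> ~{false_positives:.4g} innocent matches")
```

Under these toy numbers, one tracked payment leaves tens of thousands of plausible candidates, but by eight tracked payments the expected number of innocent accounts matching all of them is far below one, so any account that does match them all is almost certainly the target.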
So, oh God, that's a horrible turn of phrase to overlap with that slide. But why are we still using them? And the answer is, in part, because privacy is hard. It's hard to understand. The people who are building these things are trying really hard. They're doing very impressive things and building up impressive communities. But if you don't understand privacy, it's a little tricky to get right. So that's just "to err is human." But part of it is the way that self-promotion works in cryptocurrencies, right? We have this you've-got-to-be-better-than-the-other-one kind of thing, and this ends up with sort of privacy theater.

One of the typical things that happens when you interact with someone who's an advocate for a privacy coin, particularly on Twitter, which I guess is not a good way to observe normal behavior, but whatever, is that they'll give you this list of features that the coin has, right? It has stealth addresses. It has ring signatures. It has cut-through. It has zk-SNARKs. It uses VPNs or Tor. And this whole list of features doesn't tell you anything about what actual privacy you get. But it sounds really impressive. And this is the thing that we just need to stop doing, right? This applies to everything from techniques that really don't work at all to ones like zk-SNARKs that are quite promising. The technique isn't the solution, right? It's a technique. The whole protocol is what you actually have to build. And when you build a whole protocol, you need to evaluate it not based on the features it has, but based on what it actually does. And so you should really ask yourself: here are all these features; what are they hiding, and do they actually accomplish it? And the answer is typically that they don't hide much, and they don't actually hide it very well. Does it hide everything else? The answer there is almost certainly not. And then, really, how is this thing being used?
And the answer there is going to be: in a way that probably isn't safe, right? What this comes down to is that cryptocurrency is Twitter for your bank account. It's a total privacy void. It's a vacuum. Everything is broadcast to everybody, right? This is akin to standing naked in Times Square. And what most of these privacy coins want to do is hand you a Band-Aid, and given the magnitude of exposure, that's not really going to cover everything. That doesn't work, right? It's like trying to build a space suit out of duct tape and overalls. That's not how you go about protecting yourself in a very hostile and very exposed environment. You need to actually do this from the ground up.

And we do actually have strong approaches to do that. I said the point wasn't to shill Zcash, but I am going to for just a second, because you can use these techniques in your cryptocurrency too, right? We've gotten strong privacy that's actually quick: less than two seconds to create a transaction, less than 40 megabytes of memory. This went live sometime in the last two days; I haven't gotten much sleep, so I'm not sure what day it is. And we also have strong privacy techniques for other things. These aren't deployed yet, but there's Zerocash for layer one; again, use this in whatever cryptocurrency you want. We have Bolt, Blind Off-chain Lightweight Transactions, which is strongly private layer two, and a thing called Bolt Labs that we're in the process of setting up. And then lastly, and this is academic work so far, a thing called ZEXE, which enables decentralized private computation. Both of these are papers; those QR codes should go to them. The point is, there really isn't an excuse these days for using weak privacy techniques. We have strong ones. We have strong ones that use zk-SNARKs. You could do these with Bulletproofs. You could build these with other crypto. And you should do them.
You shouldn't be using half measures, and you shouldn't be telling people your things are private when they're not.

I'd like to briefly touch on a different problem. I said I wasn't gonna shill cryptocurrencies, and I'm not gonna call any of them out by name, but usability is a problem. And it's actually one that we at Zcash have had issues with ourselves, right? It turns out that if you have strong privacy, that isn't enough. You actually need good UX. You also need to discourage people from doing meaningless behavior, like, say, ShapeShifting $3.52 into, say, Monero, Zcash, Dash, whatever, and then immediately ShapeShifting it back out, because that gets you no privacy, right? You sit there and you look and you can see what happened. And yet people do this; we've seen it happen across a number of cryptocurrencies. And that isn't their fault, but you need to educate users, right? You need to have full privacy on mobile, right? That's a protocol issue; you can't require people to be syncing the whole blockchain just to get private payments. And this is really important. This is probably actually the hardest thing we have left in building privacy: getting good usability and getting good answers on mobile. The core protocols, in my admittedly somewhat biased opinion, are mostly done. You can do tweaks to make them more performant. You can develop novel new cryptographic underpinnings to make them faster or give them better trust assumptions. But we know basically how to build these things. We know the basic parts you want to assemble, right?

So, in conclusion: we need to critically evaluate privacy vulnerabilities in cryptocurrencies. Most privacy-preserving cryptocurrencies have terrible privacy. Merchants can trace you when you make payments. Governments can identify who you are if you put a single address up online. I mean, that would really surprise a bunch of people, given how these things are getting used, right?
Your friends can find out if you're making repeated payments to Pornhub. That's kind of embarrassing. These decoy-based approaches just don't work, right? And in general, we need to get better at evaluating these things. We need to avoid having people do privacy theater. A lot of these cryptocurrencies are effectively like the TSA, right? They have a lot of things that look really shiny. They maybe make you feel a little safer. But at the end of the day, they don't actually fix the problem. They don't actually give you protections, and that's just bad. And lastly, usability is important, and that really is probably the thing this space needs work on more than anything else. And so with that, I'll take questions, comments, rotten tomatoes, whatever. Thank you.

Q: I'm curious about Grin specifically, and Mimblewimble. As far as I know, the decoy-based approach there uses different blinding factors for each transaction, right? I'm curious if that changes anything, if it's an improvement, even setting aside the CoinJoin aspect of the transactions. Just with the blinding factors, does that change the privacy assumptions at all?

A: Not really. The problem with Mimblewimble and Grin so far, which are really promising techniques for compression, is that in a Mimblewimble transaction you identify the UTXO you're spending and you generate a new output. You hide the values, you hide the addresses, but the transaction graph is still there. The way they get privacy is that, in theory, you can non-interactively aggregate all of those transactions together and get a bigger mix. The problem is that those transactions are broadcast on an open peer-to-peer network, and anyone who connects can just download and see every individual transaction and recover the transaction graph. If you're a researcher, this is easy. If you're an intelligence service, this is really easy. That's just not a privacy answer yet.
They need an answer for that, and they don't have one so far.
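That last point can be sketched in a few lines. The assumption here, per the answer above, is that transactions are gossiped individually before miners aggregate them; the field names and the toy ledger are invented for the example.

```python
# Illustrative sketch: why Mimblewimble aggregation doesn't hide the graph
# from a peer that watched the broadcast. Toy data, invented field names.

# Individual transactions as gossiped on the open peer-to-peer network:
mempool = [
    {"spends": "utxo_A", "creates": "utxo_X"},
    {"spends": "utxo_B", "creates": "utxo_Y"},
    {"spends": "utxo_X", "creates": "utxo_Z"},
]

# Aggregation with cut-through merges them into one big transaction, and
# the intermediate output utxo_X disappears from what lands on chain:
block = {
    "spends": ["utxo_A", "utxo_B"],
    "creates": ["utxo_Y", "utxo_Z"],
}

# An observer reading only the block cannot tell which input funded which
# output. But an observer who logged the mempool already has the links:
graph = {tx["spends"]: tx["creates"] for tx in mempool}
print(graph["utxo_A"])  # prints utxo_X, which in turn funded utxo_Z
```

The on-chain compression is real, but the privacy only holds against someone who never saw the pre-aggregation broadcast, which is exactly the observer you don't need privacy from.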