Well, I think that people didn't necessarily have the surveillance economy in mind when first thinking about the internet. They were thinking about communicating and sharing privately, and exploring whatever you want with whomever you choose. And that is the basis of what Tor believes the internet should be like today. We believe everyone should have private access to an uncensored, open internet. And that is certainly not what is happening now when you're online. Going online, you're constantly exploited, everything is tracked. You can also be easily surveilled by your ISP, and that information is sold. We've seen how that can be used to influence elections. And your activity can make you a target, maybe for nation-state actors who are not happy about how you're talking about them, or maybe a stalker; there are tons of specific use cases for why you might want privacy. But fundamentally, privacy is a human right. It's protected by the Universal Declaration of Human Rights, and in the United States it's also protected by the Constitution through the Fourth Amendment. So privacy should always be a choice and always an option, and if you do want to share something, that should be your choice. That's what we think the internet should be like, and what Tor can help you do.

Okay, great. So this is on to Adrian. From a very general technological point of view, where are we currently in terms of regulation? Where are regulators in various first-world countries, and how are they looking at the different privacy technologies that exist today, things like encryption via PGP on email, or messaging applications like Signal and Wire? What does this look like from the powers that be?
Yeah, so I think it's definitely a good point to bring back up the fact that privacy is considered a human right. Europe has had much more regulation around this for a while, much longer than the United States has, primarily stemming from the 1940s and a lot of the really horrible things that happened based on the ability to learn details about a person's life. It's much newer in the US. Obviously, the really big change in recent years was the GDPR, which went live last year: a very sweeping regulation that touched countries and companies all over the world. Basically, anyone that is marketing to a user located in the EU, or that has operations in the EU, falls under the scope of this regulation. It's sweeping in terms of data protection, how you're supposed to encrypt the data of the individuals you hold data on behalf of. It's also very comprehensive in terms of the visibility you're supposed to have over your data: in theory, you're supposed to track every single data processing activity that you have, as well as map out all the data flows. Next year, California will pass a very similar bill. And there are actually, I'm going to botch the number off the top of my head, but many states across the United States right now with a patchwork of varying privacy laws. The idea is that in this arena where people don't really understand how ad networks work or how their data is sold online, there's an opportunity for anyone around the country here in the US, and obviously globally as well, to be able to control that data better. So what you see from a lot of the regulatory agencies is, first of all, trying to put the accountability on the private and public companies that can actually make a huge difference in what they do with their users' data, as well as trying to educate the users themselves.
And what that means is opening up more rights for users: the right to see your data, the data that any company has about you (everyone can go to Facebook right now and ask for all of the data they hold on you), the right to delete that data, et cetera. Obviously, with these new rights coming into play broadly across all jurisdictions, you'll start to see a lot of that competing with some of the fundamental applications of blockchain.

Okay, great. Sarang, I have a question for you, and this conversation is going to be difficult, primarily because it's a very nuanced one. For those of you who don't know, privacy is actually very difficult to achieve, in terms of how much metadata leaks all over the place, and the internet is a prime example of that. So I just want to ask for your kind of state of the union on general technology and the internet. We all see these big news articles about this company losing data or having data stolen. In terms of security, when we're looking at a spectrum, where are we now? And for the foreseeable future, how do we take steps to become more secure as a whole?

I would say that in general, a lot of it needs to be separated between data that is held about you by companies that you know and trust and have a relationship with, which happens all the time and which is of course your right, versus data held by entities with whom you don't have a relationship. The big shining example of this is the Equifax breach, where it's like, oh, I'm apparently some kind of entity that has no relationship with Equifax, and yet there's all this data they hold on me and therefore can lose about me. So there's that side of it.
And then there are other things involving transport layers and network layers. When I am communicating with other people, which is essential and very important and always evolving, with the proliferation of mobile devices and mobile networks and other ways to interface with people over the internet, I think that's something completely separate that needs to be considered. And that has been evolving over time. There's a reason that people, for example, often recommend not bringing mobile devices to DEF CON, or taking extreme measures with them: your phone has many radios, and your phone puts a lot of implicit trust in something claiming to be a cell tower, for example. The internet, of course, was basically built on an assumption of trust, and that's still evolving. Protocols for secure network transport are still evolving: TLS 1.0, 1.1, going all the way up through where we are today, advancing toward TLS 1.3, is still trying to solve problems with these very, very complicated protocols. Getting this stuff right at a base mathematical, technical, provable level is exceptionally difficult. So people ask: how on earth can we have this many iterations and advances of protocols intended to keep us safe, and still be finding issues? I think that underlines the fact that this is all very difficult, even from a technical perspective, and not for lack of trying or lack of experience. So from that perspective, we're still definitely evolving. And in terms of how people and entities are holding data securely, we're not even on the board yet, not even close. There's a lack of control by individuals over their data.
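As a small concrete aside to the point about protocol evolution: modern TLS libraries let an application refuse legacy protocol versions entirely, rather than negotiating down to them. A minimal sketch using only Python's standard `ssl` module:

```python
import ssl

# Create a client context; create_default_context() enables
# certificate verification and hostname checking by default.
ctx = ssl.create_default_context()

# Refuse anything older than TLS 1.2; handshakes that can only
# offer TLS 1.0/1.1 (or SSLv3) will simply fail.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version == ssl.TLSVersion.TLSv1_2)  # True
```

A context configured this way can then be passed to any socket or HTTP client that accepts an `ssl.SSLContext`.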
And there's also, I think, both a lack of caring and, frankly, just loads of incompetence in how a lot of entities are holding other people's data. I can't speak to the regulatory side, because I have no knowledge in it, but at least in the United States, it seems that not much is being done to encourage that to get better.

Okay, so one of the things I really want to touch on, without going too big into buzzwords: the big buzzwords now are artificial intelligence, and we've got things like the internet of things coming out. I've heard it said that because the internet was started without privacy in mind, and privacy was bolted on after the fact, we started with a really transparent layer at the bottom and we're scrambling to add privacy as time goes on. But the advancement of things like artificial intelligence and the internet of things is going so fast that it's being built on the current system, which is very transparent. Entering dreamland for a second: what would it look like to just reinvent everything with privacy first? Because it would be a very different conversation around AI and IoT if we knew that everything was private by default, if that's just the way it worked.

I actually don't think that that's necessarily the case. The best thing I've ever read about this is someone saying that the S in IoT stands for security, which is totally true: there's no S in IoT, and there's no security in IoT. And frankly, it's not that folks are building small IoT-style embedded devices with no security in mind. It's that it's very, very difficult to get right. And you have to assume you're not going to get it right; history has shown us that it's very difficult.
And if you assume that you're not going to get it right right away, what do you do from there? My phone can be updated, whether or not the company is going to update it. Does my IoT-enabled light bulb get updated? Probably not. And if it does, is that being done securely? So there are all these different layers to it. I think the idea of starting with an assumption of insecurity, despite trying to improve products and protocols over time, requires a fundamental shift: we need to start from the assumption, like we do to some extent with computing devices, that they are going to need to be improved over time. Until we get there, you are going to have this S-in-IoT problem, the internet of other people's things, once they get exploited.

All right, I'll get to you really soon, but I have one more question for Stephanie. Based on basically everything that Sarang just said, do you see Tor as being a part of that, more as a stopgap until we find something better? Or do you think we're not really moving towards anything better, so Tor needs to become something in and of itself as a solution?

I think that a lot of what we're talking about, and what is insecure, can be improved by routing things over the Tor network. IoT devices can be routed over Tor, and then no one can see what the device is or what it's doing. So we think that onion services, using the Tor network in this way, can be a basis for other solutions to be built on top of. We're kind of just diving into this now and figuring out what all can be done, but it seems like there is quite a lot of potential. And it's free and open source software, so we're hoping that more people try out different things with it and see what can be routed over it. It's definitely a more secure internet when you're using onion services in this way. Okay.
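For reference, and as an assumption on my part rather than something spelled out on the panel: exposing a local service (say, a device's web interface on localhost port 8080) as a Tor onion service takes only two directives in the standard torrc configuration file:

```
# torrc (sketch): publish a local service as an onion service.
# The directory path is an example; Tor writes the generated
# .onion hostname and keys into it on startup.
HiddenServiceDir /var/lib/tor/my_device/
HiddenServicePort 80 127.0.0.1:8080
```

Clients then reach the service only through the Tor network at the generated .onion address, without the device ever exposing a public IP.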
So we're going to start transitioning to blockchain so I can get to Shamik, because he's on his phone and bored. Before we do that, are there any questions about general privacy concerns in terms of technology? If not... okay, yes.

Because privacy is a human right, and I don't think that it's fair to blame tools for the actions of people. No matter what, if you get rid of Tor, if you get rid of the internet, people are still going to do bad things. People do bad things with phones; people do bad things with cars. So it's about the intention and vision and what the majority of people are actually using the internet for, and using Tor for. And that is for good: for communicating, for sharing, for expressing themselves, and often for accessing very critical resources. So lots of good there.

All right, so we're going to go ahead and segue. My first question in regards to blockchain, and I'm going to put this to Shamik, follows my analogy of the internet being built absolutely transparent with privacy bolted on. Bitcoin and most of its derivatives, which is almost everything, are very, very transparent, in order to verify the sanctity of the blockchain. And a lot of blockchains are trying to bolt on different kinds of privacy; there are various levels of privacy among different blockchains. Do you see this as an issue? Do you see us moving down a road where we're going to be scrambling to put privacy on blockchains the same way we're doing with the internet today? Or do you think this is similar to the internet, where transparency allowed the proliferation of the internet, and good regulation around it, to get to the point where it becomes unstoppable? Do you see that as more the path that blockchain is going down?
Yeah, I think your original analogy around the core internet trunk is apt: it is primarily an open system, mostly because development and innovation on an open system are a lot faster and easier, and that's kind of the expectation of this model. Where we see people experimenting with interesting security and privacy models, like Zcash and Monero, they're changing the fundamental interactions; honestly, it's a bit more of an experiment. Part of this is: if we think about where cryptocurrency is overall as a technology, we're still very early. And when we're very early, we're still in the process of figuring out the fundamental interactions that we want to enable on that technology. Right now we have some very simple ones, send and receive, which are common across all blockchains; the fundamental way of interacting with a chain is: can you send, can you receive. Some of the additional functionality is still being proven. Think about ERC-20s and smart contracts in general: let's try something a little bit different, with another interaction layer, and we're still trying to figure out which of those interactions really make sense for people, which ones they enjoy using blockchains for. We're far away, on the technical side, from saying that the use case directly matches up with the privacy expectation. Now, one of the things that's nice is, sure, it's not an anonymous network, but it is a pseudonymous network. An address can exist from anywhere and can belong to anyone. There are a lot of points in Bitcoin, for example, where you can figure out who an address belongs to, or who a wallet belongs to, but it's not trivial in the average case. It's like, hey, big exchanges: you can probably say that's a big exchange, given the volume.
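To make the linking point concrete: one reason figuring out who an address belongs to is possible at all is the common-input-ownership heuristic, which assumes that addresses spent together as inputs of one transaction share an owner. A toy sketch (all addresses and transactions here are invented; real chain-analysis tooling is far more sophisticated):

```python
# Toy common-input-ownership clustering via union-find.
# transactions: list of transactions, each a list of input addresses.
def cluster_addresses(transactions):
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for inputs in transactions:
        for addr in inputs:
            find(addr)                 # register every address
        for addr in inputs[1:]:
            union(inputs[0], addr)     # co-spent inputs share an owner

    clusters = {}
    for addr in parent:
        clusters.setdefault(find(addr), set()).add(addr)
    return list(clusters.values())

txs = [
    ["addr_A", "addr_B"],  # A and B spent together -> same wallet
    ["addr_B", "addr_C"],  # B and C spent together -> links C too
    ["addr_D"],            # D never co-spent -> separate cluster
]
print(cluster_addresses(txs))
```

Running this groups A, B, and C into one cluster and leaves D alone, which is exactly why reusing or co-spending addresses erodes pseudonymity.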
But on the peer-to-peer side, you may not know. And that pseudonymity is really useful when figuring out some of these initial use cases: it doesn't matter who or what you are. In historic money systems, you send money to people, or maybe corporations, which are still people by US law here. Sometimes I'm sending money directly to a machine, and it is anonymous to me that I'm sending money directly to a machine and interacting with it. So part of that interaction model says we get to try new things until we figure out what matters. We need to scale from where we have privacy concerns, where we want to wait on figuring out how to do this well, to getting that MVP figured out.

Okay, so this last question was focused on cryptocurrency as the use of blockchains, but you'll often hear the mantra of "blockchain, not Bitcoin," this idea that we can use blockchains for other data besides just ledger information. Now, since a blockchain is really just a database that everybody has on their computer, assuming you're running a full node of whatever blockchain you're using, all of a sudden there's potential for more information to be leaked that isn't just ledger information, isn't just numbers moving to and fro. Say people ask: how do we build identification services on top of the blockchain? What would it look like to run a full node then? So I guess this is a question for anyone. Financial information is already pretty sensitive, but what do we do, from a privacy perspective, about even more sensitive information if people start trying to put it on a blockchain, which everyone can have and own and look through?
Yeah, so on the technical side, one of the things that I care a lot about is ownership of data: who actually owns it, and what are their expectations when they're interacting with these systems? When you start laying out some of these threat and trust models, you say, okay, this is the type of financial data, or user data, or whatever data we're sticking somewhere. If it's Walmart's supply chain for some random animal product, they have a very different expectation of who can access it and where. In fact, this is where you start to see blockchains that are not public, and not privacy-based blockchains either, but private blockchains, which are very similar to a private database: you throw something in your DB, you have access control over it, you know who's going to have access to it and what they can see and do with it. And so this starts to break some of the decentralized models that we care about and find interesting. When you start comparing use cases, some of the fundamentals of a decentralized, open system don't necessarily meet the user expectation or the interaction model that you want, where this is a one-on-one and private interaction. Whatever you do, you end up bolting on top of it. Just like libpurple and AIM way back in the day: you could use AOL Instant Messenger and have an encryption layer on top of it. It was a one-on-one interaction, the core model was still there, still an open system, XMPP or whatever the hell they were using at the time. And AOL could look at all your traffic, build that social graph, and do whatever they wanted with it. But you throw on this interaction layer so that they don't actually know what you are saying, just that you might be saying something to someone.
But wouldn't one of the issues be here: in a typical database, you can encrypt the data, and only the person with the decryption keys or the password would be able to decrypt it. Whereas even with a private blockchain, if it's not privacy-focused, if any version of that blockchain leaks at any point in time, you still have this whole huge graph that goes all the way back. And you have to assume that at some point it's going to leak, because humans mess up, right? We all mess up. So even if there's a private blockchain that is, let's say, within Equifax, if it leaks at any point, it's more than just certain records being compromised, records that might have been encrypted, because it's not a privacy-focused blockchain.

Yeah, so that comes back to a traditional application security problem. You encrypt the disk because you don't want somebody to just walk off with the disk and be able to see the data on it. You encrypt the data in the column because you want protection at the logical access layer. But you may still have your application spewing out whatever core data it holds if you hit a particular endpoint. It doesn't matter that you encrypted everything up to there: when that endpoint says, yeah, here's all the user data, there's all the user data. And it's the same with a blockchain. Even if you get the fundamental encryption models right, and the data itself is completely anonymous and you can't tell what's going on, anything built on top of it likely needs access to it in order to be useful. So maybe there's an archival blockchain, where data goes in and gets written down, but nobody can really tell what it is unless they were directly part of that interaction.
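The layering point here can be sketched in a few lines. The "cipher" below is a toy XOR, purely for illustration and never for real use: even with the stored column encrypted at rest, a naive endpoint that decrypts on request hands out the plaintext anyway.

```python
import secrets

KEY = secrets.token_bytes(16)

def toy_encrypt(data: bytes, key: bytes = KEY) -> bytes:
    # Toy XOR "cipher" for illustration only; use a vetted
    # cryptography library for anything real.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR with the same key is its own inverse

# The "database" stores only ciphertext: someone who walks off with
# the disk sees nothing useful.
db = {"user_42": toy_encrypt(b"ssn=123-45-6789")}

# But a naive application endpoint happily decrypts for any caller,
# defeating the at-rest protection at the logical access layer.
def get_user_record(user_id: str) -> bytes:
    return toy_decrypt(db[user_id])

assert db["user_42"] != b"ssn=123-45-6789"               # encrypted at rest
assert get_user_record("user_42") == b"ssn=123-45-6789"  # leaked at the API
```

The same asymmetry applies to a leaked private chain: encryption at one layer says nothing about what the layers above it will reveal.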
But in the average case, it's: I need to know who you are, so that I know what my relationship is with you, so that I can have the right interaction.

So for Stephanie, then: this idea that, as Shamik was just saying, at some point somebody's got to communicate something to somebody. Otherwise, these are just little stationary things that are self-contained and don't really do all that much. These things have to be communicated, and this might be where Tor or other mixnet-style solutions like I2P come in, because maybe HTTPS is not strong enough, or it relies on certificate authorities, right? It relies on this whole third-party system, and if those third parties fail, and they have, then all of a sudden you're going to have great amounts of leakage of this very sensitive data. So is Tor in any way, shape, or form looking specifically at how to aid blockchains with this proliferation of data, or are you expecting blockchain projects to implement that themselves? Is there any sort of optimization for this kind of data being communicated to so many different parties? Because typically with Tor, as with most websites, I request a website from one party; it's one-to-one. Whereas with a blockchain, whenever I want to update the database, whenever I send something, I have to send information to tons of different nodes so that we can all sync with each other, right?

I think it is being explored by people in the cryptocurrency and blockchain space, but it's not something that we have people dedicated to right now, since we're a small nonprofit organization. But people are able to test it themselves. I have heard of Lightning nodes being used with Tor, so the entire transaction can happen over Tor.
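As one concrete example of this pattern: Bitcoin Core can already route its peer-to-peer traffic through a local Tor SOCKS proxy via its configuration file. A hedged sketch of a bitcoin.conf, where the port assumes a default Tor install listening on 9050:

```
# bitcoin.conf (sketch): send node traffic through Tor's SOCKS5 proxy
proxy=127.0.0.1:9050
listen=1
# Optionally refuse clearnet peers entirely and use only onion peers:
onlynet=onion
```

With a setup like this, the node's gossip to other peers travels over Tor rather than revealing the operator's IP address.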
I'm not sure how that scales exactly, or how it interacts with the other things you've been talking about, but it seems like there is a lot of potential there, so we do encourage people to explore that. And please keep in mind that the Tor network is a commons. We want you to do what you would like with it, but please also give back to the network and run relays; that will help you too, by helping the bandwidth. And if you can't do that, you can make a donation; we accept multiple cryptocurrencies, and we'd appreciate your giving back.

Great. So Adrian, cycling back just a bit, and we're going to move on from this section pretty soon, but cycling back to the open nature of blockchain and Bitcoin. Recently the GDPR was passed in the EU. Or is it GPDR? One of those two. Yes? Yeah, okay, you guys know what I'm talking about. One of its stipulations is that a person has to be able to request that their information be taken off, and whatever company holds it has to comply. With this whole idea that blockchains are immutable and most blockchains are absolutely transparent, this doesn't work with that in any way, right? Because you can't just get rid of that data, even if it's your data and you want it gone. Do you have any ideas about how those play together?

Yeah, I guess the easy answer is that I don't think any of the regulators really anticipated that use case when they were finalizing the GDPR, the General Data Protection Regulation. There are a bunch of different data protection authorities in Europe right now that are actively working on this. The French CNIL, their privacy regulator, has issued a paper about how the GDPR would work with blockchains. The right to delete your data is definitely at odds with this idea.
I think right now, for the most part, the best place we can be is trying to educate the regulator on when and why this will fail. The other part of it is really user-focused, individual-focused communication and education. Right now there is a lot of regulatory ambiguity, especially for companies, and especially for public blockchains, not private ones as much, where you do have a central governing authority that is much more culpable, in some ways, than the players in a decentralized, permissionless, or I should say public, blockchain. There is an onus on that entity to help its individual users understand that if you decide to interact or transact in this way, you are irreversibly putting your data on this chain. And there are going to be varying levels of visibility, and varying ways to re-identify someone, once they put even pseudonymous data on a certain blockchain. I'll use the case of cryptocurrencies specifically, because it's obviously what we work in, but each chain that we support does have differing technical characteristics that make it easier or harder to identify someone based on their transaction history. We do feel that we, or any exchange really, have the responsibility to help educate customers: hey, by transacting in this asset, you are in turn revealing a little bit more of yourself; here are some of the risk measures you can take to decrease that visibility if you care about it, and if not, it's your choice. So really it's a mix of user choice, education, and then ultimately regulator education, so that there can be some future clarity on this. Because right now, regulators just don't really understand the depth of the conflict here.
So Sarang: you work for the Monero Research Lab, and obviously Monero, as a whole and as a community, takes a very different approach to privacy than we've been discussing so far. They say: we don't want any of this to be available to anybody except the people who initiated and are receiving the transaction. That maybe plays nicer with the GDPR, even though it still doesn't play nice with your ability to revoke things, because they're still inside the chain, but they're only accessible by you. Why does Monero take this really staunch approach to privacy, basically saying that base-layer transparency with bolt-on stuff isn't going to fly, even if it's harder to digest for regulators and the user base at large?

Of course, I can't speak for anyone else and what they might think about this. But as has been said before, for a lot of people who are probably attending this conference and who use distributed digital assets, I'm assuming that one of the reasons they do so is because they believe that privacy is a fundamental human right. It's been said that transparent assets like Bitcoin are effectively Twitter for your bank account, and there's a lot of danger in that. You're often told not to walk around displaying large amounts of cash; why would you want to do that, for potentially even larger amounts of cash, effectively on the internet? So projects like Monero, and Zcash, another privacy-focused project, take the approach that that danger can be mitigated through the use of good technology and good mathematics, and they take very different approaches to it. And of course, it's not possible to remove all metadata from such transactions. Monero can't do it; Zcash can't do it.
There are all sorts of different levels: the network layer, whether or not you're using something like Tor or I2P, metadata involved in the way you're interacting with the network, metadata involved in the way transaction patterns might happen. So it's a very, very complicated model to set up, both at the transaction layer and at the network layer. And it's a one-way street: you can leak data; you can't unleak data. So I would say a lot of people believe the safest move is to assume that data should be kept as private as possible, with, of course, the possibility that someone sending or receiving in a transaction can share information with other people, or with auditors, exchanges, or regulators, as they see fit. Privacy doesn't mean that you are not allowed to share anything about yourself. It means that you should have the right to share things about yourself with the entities that you see fit. None of this is an excuse to say that you can or should break the law; I don't think that's the goal of any of these projects. The goal is to ensure that you remain as safe as possible and that you have the choice to share things about yourself as you see fit.

So Monero is on one end of the spectrum, with privacy by default on the base layer. And Bitcoin's kind of on another, where everything is transparent and we're going to try to make privacy happen through second-layer solutions, which hopefully people will be using, right? And then we've got all these in-betweens, with projects where privacy is the default but you can opt out, or isn't the default but you can opt in on the base layer, without needing a second layer.
So with all these in-betweens going on, to my understanding there isn't really a whole lot of formal analysis of how effective these things are, and it gets really subtle in terms of which particular adversaries you're putting in the threat model, and all that kind of thing. As we march ever towards the future, are we ever going to reach a point where we can look at all these things mathematically, in a formalized way, and say: okay, this works under these circumstances and this doesn't? Because right now it's kind of like throwing darts in the dark, right? You're just seeing what sticks, and hoping that you're not the person who pays the price if something fails and becomes an example for the rest of the world. So I guess I would ask this from the Coinbase perspective: you're evaluating these cryptocurrencies and deciding what level of privacy is acceptable and what level isn't, right? You have listed Zcash and you haven't listed Monero, and we're not going to take questions about what other assets they're going to add. But in terms of that evaluation, how do you decide what is secure and private enough for people and what is not?

On the privacy side, we actually look at it from a different angle. One of the things that an exchange has to worry about is counterfeit bills. It's just like a bank or an ATM: if you're handing it counterfeit bills, it doesn't want your counterfeit cash. So when you think about privacy interactions, where there's less traceability with regard to where an asset came from, it becomes really hard for us with a currency that hasn't done a very robust, formal analysis of how its protocol and interaction layer work.
To say that any given asset coming in the door is something we can actually trust. Because you think about a distributed, trustless network and say, I don't have to trust it, except you're ignoring the fact that you are trusting the protocol and the nodes and the whole mining operation, trusting that everything is working to spec. But hey, we're in tech; we know that's all bullshit. So the likelihood that everything is working fine is super low for some of these brand-new assets, right? Even a couple of years after launch, we may be finding really significant issues. So there's one side, which is whether the user's needs are being met by the privacy this asset offers. Then there's the other side, which is whether our user base's needs, en masse, are being met: when we get a coin and sell it to somebody else, or there's an exchange happening that we're accounting for, can we say it's an actual valid, non-counterfeit coin? So it makes life even harder, completely orthogonally, because it changes the interaction pattern of that relationship. Great, thanks. Oh, yeah. Yeah, I think just one more note on the environment in which this analysis happens: as an exchange operating in New York, we are subject to the New York BitLicense, which is a world of fun. It's basically required for any virtual currency operation that serves New York customers or operates in New York. And one of the really important things, for anyone who has ever read the Panama Papers or follows the anti-money-laundering laws, is that we also have regulatory obligations to help our regulators understand that, no, this isn't actually money laundering that's happening. Kind of to the earlier question about how you tell the good from the bad.
I honestly think that the more data we find about how much good versus bad is happening via these private transactions, the more helpful it is, because maybe it's a non-issue, or maybe it is actually a significant problem. Either way, we unfortunately aren't in a place where we can sacrifice our need to comply with the New York customer and anti-money-laundering laws that are pretty traditional for banks. That's a commitment we've made. So to Shamak's point, anything that raises that flag or makes that due diligence harder for us means we're immediately at odds with our core regulator. That's one of the hard lines for us. I think another one, picking up on the point earlier, is that there are so many shades of gray along the spectrum of the types of personal information that different chains ask you for in order to participate. So one of the areas we try to understand is: to what extent does this differ from Bitcoin and some of the Bitcoin forks? How is personal data handled differently for this new asset versus an asset we've worked with in the past? And for our customer, how do they know the difference? How do they know if, all of a sudden, all of their transactions lead back to one single source of truth? For someone who's a hobbyist, maybe that doesn't make a huge difference, but for someone transacting with multiple millions of dollars, traceability becomes a lot more important. And that doesn't only matter if you have large amounts in your wallet or on chain. So again, what is our role in educating the customer?
What is our role in working with the regulator to make sure we can toe the line while also, to Shamak's point, making sure that we're not listing counterfeit coins? Great. So we are running short on time, so we're going to move on to the next portion, which will focus on regulation. I'm actually going to ask the first question to Stephanie, because blockchain is still quite a young technology, and I don't think anybody would disagree that we would like more privacy on the blockchain. We may disagree about how much more privacy, or how to go about it, but I think everyone would be comfortable with a certain amount more. But we in blockchain are not the first ones to fight these privacy battles, and so we can learn from a lot of the people who have come before us, people like Tor, who have been around a lot longer than we have and who have been in this struggle and in the fray for quite some time. So, from a regulatory perspective, the people who work on Tor may face certain backlash after certain things are revealed about what is sometimes on different onion sites. How do you handle that? How have you taken this in stride? And how do you plan to continue handling it as the regulatory landscape evolves? It depends a lot on where you are in the world and what it might mean if you're using Tor. For the most part, it's not something you get in trouble for, using Tor. We do have to sometimes remind people that it is free and open source software, a tool, and anyone can use it.
And if we did start having the ability to take down sites that route over the Tor network, then we're putting everyone's privacy and security at risk, and we're not going to become someone who decides what's okay and what's not okay, because then we'd be in a position of being asked by different governments, or whoever, to target activists, or anyone else they just don't like. Another point that's important to consider is that when governments or ISPs don't like Tor, they block Tor, because all of the IP addresses of our relays are publicly listed. And so we've developed ways to get around that censorship with bridges and pluggable transports. It sounds like that's going to become very important to this community as well, thinking about getting around the firewall if China decides to start blocking mining, for instance. So I think that's probably something very critical to keep in mind that Tor can help with too. Yeah, one of the topics you'll see coming up over and over again in this conversation is that a lot of the privacy laws are based on where you are physically located. What becomes strange is that, sure, mining infrastructure does physically live someplace, and it can't not live someplace, but most of the interaction models, most of the interactions we're doing, are completely independent of any physical locality. My phone here in Nevada has different rights than it does in California. And that's really strange; that's a short flight away. And some of the core concerns we have to deal with don't care about the boundaries of any given nation or nation-state.
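For reference, the censorship-circumvention setup described above (bridges plus pluggable transports) is configured on the client side in Tor's `torrc` file. A minimal sketch, assuming the obfs4 transport binary is installed at the path shown; the bridge line below is a placeholder with elided fingerprint and certificate, since real bridge addresses are distributed privately (e.g. via bridges.torproject.org) precisely so censors cannot list and block them:

```
# torrc: connect through an unlisted bridge using the obfs4 pluggable transport
UseBridges 1
ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy
# Placeholder bridge line; obtain real bridge lines from the Tor Project
Bridge obfs4 192.0.2.1:443 FINGERPRINT cert=... iat-mode=0
```

Because bridges are not in the public relay directory, an ISP that blocks every listed relay IP still cannot enumerate them, and the obfs4 transport obfuscates the traffic so it does not look like Tor on the wire.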
And so as we start to see almost a divergence in the expectations of privacy, in what different countries, different nations, are enforcing, that also adds a layer of complexity, both in complying with the minimum required and in dealing with the other half of it. Because half of this privacy law is "let's increase privacy," and the other half is "but the government still wants to understand what's going on." And we keep making this problem iteratively more complex because of these nation-state boundaries, while the crypto doesn't care where it's going. It just wants to get there, and it doesn't care where you were. I can entirely end up in a position where I am in a place where it is illegal for me to receive a shielded transaction and for somebody to send it to me. Now, have I broken the law by having a shielded transaction associated with my account, where I had no agency in preventing it? It starts to make things really dicey. Okay, so Adrian, I guess we have time for this one last question, and we've got to keep it a little brief because we're at 1:43 and have to end at 1:45. So, looking toward the future, because that's where all panels like to end, right? Looking toward the future with blockchain: blockchain moving forward at one pace and privacy moving forward at another. Sometimes they're moving together; sometimes they're moving in different directions or at different paces. How hard is it going to be to talk with regulators and educate them? Because maybe Tor or I2P comes up in the news and it's something unfavorable, while blockchain comes up in the news and it's favorable.
So if we're trying to march them side by side, it's a hard sell, because on one half you have something that's not quite so palatable, and on the other half you have something that is, and you theoretically want them moving in tandem, right? So that as blockchain progresses, we can keep the privacy that we all currently enjoy, or wish that we had. So how difficult is it going to be to talk with regulators about not just blockchain but privacy at the same time, since they're kind of two different fields being smushed together with varying levels of success? Yeah, it's a big question. I don't know if I can answer that. Yeah, 30 seconds. That's a little bit, yeah. Yeah, I mean, I think there are definitely some key themes emerging across privacy in general. One is the importance of having visibility over what happens to your data. The other is really being able to control that data as much as possible, including any downstream sharing or use of it. I think every single regulator getting into the privacy space fundamentally wants to protect the rights and interests of their constituents. Maybe not every one, but the vast majority. So I think it's a lot about having the blockchain companies, those that leverage blockchain, the financial side, the exchanges, the custodians, et cetera, have very open conversations with regulators, both about how blockchain writ large, as well as the cryptocurrency applications, help support those same rights and freedoms, and about how they're thinking about privacy. And again, it comes down to minimizing risk, right? The most important thing is the risk posed to an individual based on how their information is being shared.
As applications are designed, apply privacy by design: think about the potential risks that could befall your customers based on how their information is being shared, and then use that to share your thoughts and your path forward with the regulator, as part of a group or individually if you have those relationships. But I think it comes down to how you're protecting your customers. Okay, great, thank you so much. Let's give a round of applause for our panelists.