OK, so I'm Rio, and I've been working with Elette Boyle and Vinod Vaikuntanathan on this notion of robust property-preserving hashes. A property-preserving hash basically means we have some property that we evaluate on some inputs, but we want to still be able to determine that property from compressed, or hashed, versions of those inputs. And the robust part is that we want to be able to do this in the presence of an adversary. If you don't know who that is, that is Loki, the Norse god of mischief, who is very clever and very adversarial, especially in the Avengers franchise.

So consider the following scenario. Someone is seeding the Avengers, which means they're putting it out on the internet and just letting people download this movie, which is very illegal, and Marvel Studios doesn't want this to happen. So Marvel Studios has to check lots and lots of movies that people are posting online to see whether they're actually posting the Avengers or doing something legal. Movies are big, and they don't want to have to look at the whole file every time. So instead, they'll want to just compare hashed values of the movie: they might want to test for equality, or they might want to test Hamming distance or edit distance.

Their first thought might be: why not just use a collision-resistant hash function? Well, Loki is very clever, so he's just going to slightly alter a couple of bits in the movie. It will still play fine, but it will render the hash function useless for this comparison. Note that x and x' still have to be close together; otherwise, the movie is no longer the Avengers. So why not use a locality-sensitive hash? Well, locality-sensitive hashing is not robust at all, and our adversary is very clever, so he can find the right bits to flip such that the hash of x' will look far, far away from the hash of x. So this is also not an option, because it is not robust.
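To make the non-robustness concrete, here is a toy Python sketch (not from the talk) of classic bit-sampling LSH for Hamming distance: the hash just projects the input onto a random subset of coordinates. An adversary who learns which coordinates were sampled can flip exactly those bits, producing an x' that is very close to x in Hamming distance but whose hash is maximally far from the hash of x. The sizes `n` and `k` are illustrative choices, not parameters from the paper.

```python
import random

def bit_sample_lsh(x, indices):
    """Bit-sampling LSH for Hamming distance: project x onto a fixed
    random subset of coordinates."""
    return [x[i] for i in indices]

def hamming(a, b):
    """Hamming distance between two equal-length bit vectors."""
    return sum(u != v for u, v in zip(a, b))

random.seed(0)
n, k = 1024, 16  # input length and sketch length (illustrative sizes)
indices = sorted(random.sample(range(n), k))
x = [random.randint(0, 1) for _ in range(n)]

# Loki knows the sampled coordinates, so he flips exactly those bits.
x_adv = list(x)
for i in indices:
    x_adv[i] ^= 1

# x_adv is still close to x: only k = 16 bits out of 1024 changed...
print(hamming(x, x_adv))  # 16
# ...but every bit of the sketch differs, so the hashes look maximally far apart.
print(hamming(bit_sample_lsh(x, indices), bit_sample_lsh(x_adv, indices)))  # 16
```

For an honest (oblivious) perturbation of a few random bits, the sketch distance would instead concentrate around the true distance scaled by k/n; it is only the adversarial, hash-aware choice of bits that breaks the guarantee.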
So in other words, we want to do for more general properties what was done to universal hashing: universal hashing was taken into the robust setting by the notion of collision-resistant hashing, and we want to take locality-sensitive hashing, as well as other types of properties, and make them robust. To conclude, our results are as follows. We provide definitions for this new hashing notion, we make many connections to communication complexity, and we find lots of lower bounds. And despite all of these lower bounds, we give a construction for the gap-Hamming predicate. Thank you.
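For reference, the gap-Hamming predicate mentioned in the results is a promise problem on pairs of strings. A minimal sketch of one standard threshold-and-gap parameterization (the exact parameters in the paper may differ) looks like this:

```python
def gap_hamming(x, y, t, eps):
    """Gap-Hamming promise predicate: output 1 if the Hamming distance is
    at most (1 - eps) * t, output 0 if it is at least (1 + eps) * t, and
    return None on inputs that fall inside the gap (outside the promise).
    The parameterization here is illustrative, not the paper's."""
    d = sum(a != b for a, b in zip(x, y))
    if d <= (1 - eps) * t:
        return 1
    if d >= (1 + eps) * t:
        return 0
    return None  # inside the gap: any answer is acceptable

# Usage: length-100 strings, threshold 50, gap fraction 0.2.
print(gap_hamming([0] * 100, [0] * 100, 50, 0.2))  # 1 (distance 0, well below)
print(gap_hamming([0] * 100, [1] * 100, 50, 0.2))  # 0 (distance 100, well above)
```

A robust property-preserving hash for this predicate must let the evaluator decide it correctly from the two hashes alone, even when the inputs are chosen adversarially after seeing the hash function.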