Noam is going to tell us about channels with small log-ratio leakage and distributed differentially private computation. Thanks. In this talk I want you to think about channels as pairs of jointly distributed random variables, and we are going to ask what Alice and Bob can do given access to specific channels. So think about Alice and Bob: they want to use the channel, so they can initiate it, and by that Alice gets a sample from X and Bob gets a sample from Y. In order to perform a certain task, they can initiate the channel several times and communicate between them, and in the end they retrieve an output. We want to understand the power of such a scheme: what can Alice and Bob do using access to the channel that they cannot do trivially? In this work we are going to focus on channels with designated outputs. So we will assume that X contains some part that is the output of Alice, and Y contains a part which is the output of Bob. We usually think about these as one or two bits. In order to execute the protocol, Alice only needs to know her output from the channel. But if Alice is curious, she may want to learn as much information as she can about the output of Bob, and she can use all of X in order to do so. Cool, so let's see examples. A binary symmetric channel is a channel in which X and Y are one bit each, and X is equal to Y with probability 1 minus p, where p is a parameter of the channel. A perfect channel is a channel in which X is always equal to Y. And a channel in which X and Y are uncorrelated is also an example. More generally, we can view a protocol as a channel if we look at the views of Alice and Bob: given a protocol, we can think about the view of Alice, her randomness together with the transcript, as X, and the view of Bob as Y, and this defines a channel. We say that a channel is trivial if there exists a protocol that realizes it.
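As a concrete illustration (a minimal sketch of my own, not from the talk), one invocation of a binary symmetric channel with flip probability p could look like this; the function name and interface are hypothetical:

```python
import random

def binary_symmetric_channel(p: float) -> tuple[int, int]:
    """One invocation of a BSC: returns Alice's bit X and Bob's bit Y,
    where Y == X with probability 1 - p."""
    x = random.randint(0, 1)
    y = x if random.random() >= p else 1 - x
    return x, y

# p = 0 gives the perfect channel; p = 1/2 gives uncorrelated outputs.
```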
For example, if X and Y are uncorrelated, this is a trivial channel, because Alice can sample X alone and Bob can sample Y alone. And if X is always equal to Y, then Alice can sample X and send it to Bob. But the binary symmetric channel is non-trivial if p is larger than zero and less than a half: there is no protocol that realizes this channel. Another example of a channel, maybe the most important one, is oblivious transfer. I'm going to talk about randomized one-out-of-two OT, but this is equivalent to the definition you may have in mind. In such a channel Alice gets two uniform independent bits X0 and X1, and Bob gets a third bit B together with one of Alice's bits, XB. We want that Alice will not know what B is, and that Bob will not learn anything about the bit he didn't get from the channel. More generally, Alice can get more information from the channel as long as B is still uniform given her view, and the same for Bob with respect to the bit he didn't get from the channel. This is the most important channel because it is complete for secure computation: given access to such a channel, Alice and Bob can compute any function of their inputs securely. So we really want to understand which channels can be amplified into oblivious transfer. What do I mean by that? If Alice and Bob can communicate, initiate the channel, communicate some more, and then output bits that induce OT, then we say that the channel can be amplified into oblivious transfer. Cool. Information-theoretically, we know that any non-trivial channel, any channel that cannot be implemented by a trivial protocol, can be amplified into OT. But this transformation is inefficient and not black-box: Alice and Bob need to use all of X and all of Y in order to construct the oblivious transfer. We want black-box amplification, so we need to work harder. As for the motivation for this problem, we have two reasons.
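To make the randomized one-out-of-two OT channel concrete, here is a sketch of a single invocation (names are illustrative; sampling it centrally like this obviously gives no security, it only shows what each party receives):

```python
import random

def randomized_ot_channel():
    """One invocation of randomized 1-out-of-2 OT:
    Alice receives two uniform independent bits (x0, x1);
    Bob receives a uniform choice bit b and the chosen bit x_b."""
    x0, x1 = random.randint(0, 1), random.randint(0, 1)
    b = random.randint(0, 1)
    alice_view = (x0, x1)
    bob_view = (b, (x0, x1)[b])
    return alice_view, bob_view
```

The security requirements are about what each party can infer: B must stay uniform given Alice's view, and the unchosen bit must stay uniform given Bob's view.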
One reason is to construct efficient oblivious transfer from channels. For example, we know how to construct OT from physical channels, from quantum channels, from binary symmetric channels, and so on. The other reason is to understand which tasks require OT. If we can show, for example, that private information retrieval can be amplified into oblivious transfer, then we cannot hope to construct private information retrieval without assuming OT. But still, we do not understand well enough which channels can be amplified into OT and which cannot. There are still channels for which we don't know whether they are trivial or can be amplified efficiently into OT. And we really want to find a zero-one law: some properties of a channel such that if the channel has them, then it can be amplified into OT, and if it doesn't have them, then it is trivial. Natural such properties are agreement and leakage. Agreement is just the probability that Alice and Bob agree, that they have the same output. We say that a channel has alpha agreement if the probability of agreement is a half plus alpha, when we think about the outputs as one bit each. The leakage is the amount of information each party learns about the output of the other from its view. The usual way to measure leakage is by statistical distance. So we say that a channel has low statistical leakage if Alice cannot tell from her view whether the output bits of Alice and Bob agree or not. That is, we want the distribution of the view of Alice when the outputs are the same to be close to the distribution of the view of Alice when the outputs are different, and we measure the distance between them with statistical distance. We say that the channel has beta statistical leakage if this distance is at most beta, and the same for Bob with respect to Y.
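The agreement parameter alpha can be read off a channel's joint output distribution; a small illustrative helper (the helper name and dict encoding are mine, not from the talk):

```python
def agreement(joint: dict) -> float:
    """Given a dict mapping (alice_bit, bob_bit) -> probability,
    return alpha such that Pr[alice_bit == bob_bit] = 1/2 + alpha."""
    return sum(p for (a, b), p in joint.items() if a == b) - 0.5
```

For instance, a binary symmetric channel with flip probability 0.1 has joint distribution {(0,0): 0.45, (1,1): 0.45, (0,1): 0.05, (1,0): 0.05} and hence agreement alpha = 0.4, while uncorrelated outputs give alpha = 0.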
Wullschleger showed that if the leakage is much smaller than the agreement, if beta is less than alpha squared, then the channel can be amplified into OT. But we still don't know what happens when the leakage is larger than alpha squared, or when the leakage is larger than the agreement. And this is not because we don't have strong enough theorems; it's because the statistical distance does not capture some properties of the channel. For example, here is a trivial channel in which the agreement and the leakage are of the same order. In this protocol, with probability 1 - p, Alice and Bob output two independent bits; otherwise, Alice samples a bit, sends it to Bob, and Alice and Bob output this common bit. You can calculate and see that the agreement of this protocol is p over 2, and the leakage is 2p, so the agreement and leakage are of the same order. But this doesn't mean that every protocol or channel with the same agreement and the same leakage is trivial; maybe there is another channel with the same parameters that we can amplify into OT. So we want to find a more fine-grained leakage measure, a leakage measure that captures this difference. In our work we propose the log-ratio distance, or the log-ratio leakage. Two distributions D0 and D1 are epsilon-close in log-ratio distance if for every event E, the probability of the event under D0 is between e to the minus epsilon and e to the plus epsilon times the probability of the event under D1. Another way to say it is that the log of the ratio between the probabilities is between minus epsilon and plus epsilon. And again, we say that a channel has epsilon log-ratio leakage if the distribution of the view of Alice when the output bits are the same and the distribution of the view of Alice when the output bits are different are close, but this time in log-ratio distance.
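For finite distributions, the bound over all events holds if and only if it holds on singletons, so the log-ratio distance can be computed pointwise. A sketch (helper names are my own), shown next to statistical distance for contrast:

```python
import math

def log_ratio_distance(d0: dict, d1: dict) -> float:
    """Log-ratio distance between two finite distributions given as
    dicts mapping outcomes to probabilities. For finite supports it
    suffices to check singleton events."""
    eps = 0.0
    for x in set(d0) | set(d1):
        p0, p1 = d0.get(x, 0.0), d1.get(x, 0.0)
        if p0 == 0.0 and p1 == 0.0:
            continue
        if p0 == 0.0 or p1 == 0.0:
            return math.inf  # one-sided support: infinitely far
        eps = max(eps, abs(math.log(p0 / p1)))
    return eps

def statistical_distance(d0: dict, d1: dict) -> float:
    return 0.5 * sum(abs(d0.get(x, 0.0) - d1.get(x, 0.0))
                     for x in set(d0) | set(d1))
```

For example, a point mass versus the uniform bit is only 1/2-far in statistical distance but infinitely far in log-ratio distance, which is exactly the separation the talk relies on.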
Actually, it means that the bit that says whether the outputs are the same or different is epsilon-differentially private given X, or given Y. To get a sense of this distance measure: if D0 and D1 are epsilon-close in log-ratio distance, then they are 2-epsilon-close in statistical distance, and this is tight. But if they are epsilon-close in statistical distance, it doesn't mean that they are close in log-ratio distance for any finite value; they can be infinitely far. So let's go back to our trivial example and try to measure the distance between the views with the log-ratio distance. Think about the event in which Alice sends the message, the bit b. Notice that this message is sent only if Alice and Bob agree on the output. So the probability of this event when Alice and Bob do not agree is zero, and when they agree it is some constant. The log-ratio between these two probabilities is infinity, which means that the log-ratio leakage of this protocol is infinity. And this is good, because we want to say that if a channel has low log-ratio leakage then it can be amplified into OT; this is a trivial protocol that cannot be amplified into OT, so we want its log-ratio leakage to be large. Indeed it is large, so this measure captures a property of the channel that we want to capture. Cool. Our main theorem for channels states that any channel with epsilon log-ratio leakage and agreement larger than epsilon squared can be amplified into OT. Let's compare it to Wullschleger: Wullschleger needed the leakage to be much smaller than the agreement, and here we allow the agreement to be much smaller than the leakage, which is good. This is tight in the sense that there exists a trivial protocol with c times epsilon squared agreement and epsilon log-ratio leakage, so it is tight up to a constant. And we also have a similar theorem for a computational variant of channels.
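The computation for the trivial protocol above can be done exactly; here is a sketch (a hypothetical helper of mine) that returns the agreement probability and Alice's view distribution conditioned on the outputs agreeing or disagreeing:

```python
def trivial_protocol_views(p: float):
    """Trivial protocol: w.p. 1-p Alice and Bob output independent bits
    (Alice's view: 'silent'); w.p. p Alice sends her bit and both output it
    (Alice's view: 'sent').
    Returns (Pr[agree], Alice's view | agree, Alice's view | disagree)."""
    pr_agree = (1 - p) / 2 + p  # = 1/2 + p/2, i.e. agreement alpha = p/2
    view_agree = {'sent': p / pr_agree,
                  'silent': ((1 - p) / 2) / pr_agree}
    view_disagree = {'sent': 0.0, 'silent': 1.0}  # never sends on disagreement
    return pr_agree, view_agree, view_disagree
```

The statistical distance between the two conditional views is p / Pr[agree], about 2p for small p, while the log-ratio distance is infinite, since the event 'sent' has probability zero conditioned on disagreement.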
An immediate application of this theorem, and maybe our only application of it, is a variant of differentially private XOR. In a differentially private XOR protocol, Alice and Bob want to compute the XOR function privately; you can think about it as a relaxation of secure computation. But we want to think about it as a channel. So Alice gets two bits, O_A and Z, and Bob gets another two bits, O_B and the same Z, where Z is supposed to be the XOR of O_A and O_B. We say that the channel has alpha accuracy if the probability that Z is indeed the XOR of the output bits is a half plus alpha. For privacy, we want the view of Alice to be differentially private with respect to the output bit of Bob: we want X given that the output bit of Bob is zero to be close, in log-ratio distance, to X given that the output bit of Bob is one. This is the usual definition of differential privacy, written in our notation. And we ask the same for Y with respect to the output bit of Alice. Again, this is a relaxation of secure computation, so we can ask whether we still need to assume OT in order to implement such a protocol. It is known that if the accuracy alpha is less than epsilon squared, then we can achieve such a protocol trivially, without any assumption, and this is the best we can do without assuming anything. We also know that if alpha is greater than epsilon, then it's impossible, because the privacy requirement contradicts the accuracy requirement. And Goyal et al. showed recently that if the accuracy is the best possible for a given epsilon, for a given privacy level, then the protocol implies OT. But there is still a large gap between what we know how to implement trivially and what we know implies OT. Using our theorem, by measuring the leakage of the channel with log-ratio leakage instead of statistical distance, we close this gap.
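The trivial regime, accuracy on the order of epsilon squared with no assumptions, can be illustrated with plain randomized response. This is a standard sketch of one possible trivial protocol, not the construction from the talk:

```python
import math
import random

def randomized_response(bit: int, eps: float) -> int:
    """Flip the bit with probability 1/(1 + e^eps); this is eps-DP."""
    q = 1.0 / (1.0 + math.exp(eps))
    return bit if random.random() >= q else 1 - bit

def dp_xor_round(eps: float):
    """Each party samples its output bit, publishes a noisy copy, and Z is
    the XOR of the two published bits. Z = O_A xor O_B exactly when the two
    flips cancel, which happens w.p. 1/2 + tanh(eps/2)**2 / 2, i.e. the
    accuracy is alpha ~ eps**2 / 8 for small eps."""
    o_a, o_b = random.randint(0, 1), random.randint(0, 1)
    z = randomized_response(o_a, eps) ^ randomized_response(o_b, eps)
    return o_a, o_b, z
```

Privacy holds because the only information Alice sees about O_B is its randomized-response copy, which is eps-DP by construction; the price is that the common bit Z is correct only with probability about 1/2 + eps squared over 8.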
Up to a constant, we show that any channel that computes differentially private XOR with epsilon differential privacy and alpha accuracy, where alpha is larger than epsilon squared, can be amplified into oblivious transfer. Cool. So to summarize, we showed a zero-one law for log-ratio leakage and an application to private computation. For open questions, we may want to find more zero-one laws for other characterizations of channels, and to find an OT dichotomy. Haitner, Nissim, Omri, Shaltiel and Silbak showed that any protocol is either trivial or implies key agreement, where their definition of trivial is that the protocol can be simulated without interaction, only using public randomness. We really want to say similar things for OT, so it would be great to answer this question. Thanks.

Any questions for Noam? I have a question. So you have an application to differentially private XOR and how it relates to OT. Do you have any other functionalities in mind? Great. So we generalize our result to monotone functions: for some definition of monotone function, we show that any non-monotone function implies OT. But for monotone functions, for example the AND function, this is still an open question.