Next up is oblivious transfer and secure computation from noisy channels. The first result is on oblivious transfer from noisy channels; this is work by Yuval Ishai, Eyal Kushilevitz, Rafail Ostrovsky, Manoj Prabhakaran, Amit Sahai, and Jürg Wullschleger.

The idea of using noisy channels for cryptography goes back to Wyner in 1975, who showed that a noisy channel can be used for secret communication. And the point of Crépeau and Kilian's '88 paper was to show that it's not just communication: you can even do secure computation using noisy channels, without any other computational assumptions or setups.

So what is oblivious transfer? Many of you would be familiar with it. It's a two-party primitive, a two-party secure computation primitive, in which ideally Alice has two inputs X0 and X1, Bob has a bit B, they feed these into a black box, and as output Bob gets X_B. So Bob can pick up one of the bits that Alice has as input, and Alice won't find out which bit he picked up; that's what's "oblivious" about it. And Bob gets only one bit; he won't get the other one. And we want to build this from a noisy channel. For our purposes, a noisy channel is just a binary symmetric channel: Alice feeds in a bit and it gets flipped with a certain probability, say a small constant probability.

So this is what they built from a noisy channel: they could build an oblivious transfer from it. And why that's so interesting is because of a result, first by Kilian, showing that oblivious transfer is complete for secure computation. That is to say, once you build oblivious transfer, you can do any secure computation protocol, and by now we know how to do that quite efficiently too.

So, as I said, Crépeau and Kilian already built it, and what's new in this work is to do it at constant rate. So what is constant rate, and why is it interesting? The analogy is with what Shannon did in the late 40s for communication: not secret communication, just reliable communication. Shannon's channel coding theorem shows that if you have a noisy channel, firstly, it's pretty easy to reliably communicate over it.
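To make the two objects in play concrete, here is a minimal Python sketch of the ideal OT black box and the binary symmetric channel just described. The function names and signatures are mine, for illustration only, not from the talk.

```python
import random

def ideal_ot(x0: int, x1: int, b: int) -> int:
    """Ideal 1-out-of-2 bit OT: Bob learns x_b and nothing about the
    other bit; Alice learns nothing about Bob's choice bit b."""
    return (x0, x1)[b]

def bsc(bit: int, p: float, rng: random.Random) -> int:
    """Binary symmetric channel: the transmitted bit is flipped with
    probability p."""
    return bit ^ (1 if rng.random() < p else 0)
```

The whole point of the talk is that the second object, used enough times, can emulate the first one securely.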
If you just want to send one bit, you instead send many copies of it and the receiver takes a majority. So there's a very easy way to use a noisy channel for reliable communication. What Shannon's celebrated theorem showed is that you can actually do it at constant rate: to send a large number of bits, you only need to transmit a constant times that many bits, and you can still get a very small error probability.

So the analogous question for us is: at what rate can you do secure computation over a noisy channel? Or, more precisely, how many bits do you need to send over the noisy channel per instance of OT that you build? If you look at the original construction by Crépeau and Kilian, it was not so great: they need to send something like K^11 bits per instance of OT to drive the error down to 2^{-K}. K here is a security parameter. Crépeau later improved this to K^3, and Crépeau, Morozov, and Wolf improved it to about K^2. More recently, Harnik, Ishai, Kushilevitz, and Nielsen actually did manage to get constant rate, but they had to restrict the adversary to behave in a semi-honest fashion, that is to say, to follow the protocol honestly.

So our goal is to replicate this result, to get a constant-rate construction without making any restrictions on the adversary. The adversary can be malicious, and constant rate is the best you can do. That's the goal, and that's what we do in this paper. And I won't talk about it, but the noisy channel could be more general than a BSC, and we can handle all those too with simple extensions to the protocol here.

Okay, a very brief overview of the techniques. We use what's called the IPS construction. This was from a few years back, joint work with Amit Sahai and Yuval Ishai. I won't go into the details, but this is a construction that has let us build very efficient protocols for several things; it turns out to be very versatile. It needs a few components.
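The "easy but wasteful" repetition-and-majority scheme mentioned above can be made quantitative: the failure probability of n-fold repetition over a BSC(p) drops exponentially in n, but the rate 1/n goes to zero, which is exactly what Shannon's constant-rate codes avoid. A small sketch (function name is mine):

```python
from math import comb

def majority_error(n: int, p: float) -> float:
    """Probability that majority decoding of an n-fold repetition fails,
    i.e. more than half of the n transmitted copies get flipped by BSC(p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))
```

With p = 0.1 the error drops from 0.1 (n = 1) to 0.028 (n = 3) to under 0.01 (n = 5), but each extra factor of reliability costs more channel uses per bit.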
Once you instantiate the components, the IPS compiler will put them together for you. So what are the components we need for the IPS compiler? It needs an inner protocol, which needs to be only weakly secure, namely secure against a semi-honest adversary. It needs another protocol, called the outer protocol, which also has a weak kind of security guarantee: it's a protocol with a large number of parties, and it needs to be secure only when a majority of the parties are honest. And it needs a few string OTs. What is a string OT? Just like the OT we saw earlier, but instead of Alice having two bits as input, she has two strings, and Bob can pick up one of those strings.

Okay. So firstly, to put these things together, it turns out we need to modify the compiler slightly, because in the original IPS protocol the inner protocol would run on top of an OT channel; now it's going to run on top of a noisy channel, and we need to be able to handle that. That requires an inner protocol which can tolerate some errors. Since I don't have much time, I won't go over the details; suffice it to say that whatever properties we need for this, including the new properties, we can get fairly easily from the literature. In particular, for the inner protocol we use the Harnik-Ishai-Kushilevitz-Nielsen OT protocol for semi-honest adversaries.

What I'll focus on for the rest of the talk is how to build a string OT, and we need to do this efficiently. So I'm calling it a constant-rate construction of string OT. What exactly do I mean by that? String OT is an object which is more general than bit OT, but constant-rate string OT is an easier question, in that I'm just trying to instantiate one string OT which handles a long string.
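To illustrate the string OT object just defined, here is the naive way to get it from bit OTs: one bit-OT per position, with Bob using the same choice bit in each. This reduction is my illustration, not the talk's construction, and it is only honest-Bob secure; as the talk discusses later, a malicious Bob who varies his choice bit across positions could learn a mix of both strings.

```python
def ideal_bit_ot(x0: int, x1: int, b: int) -> int:
    """Ideal bit OT as a black box."""
    return (x0, x1)[b]

def naive_string_ot(s0: list, s1: list, b: int) -> list:
    """String OT from per-position bit OTs, Bob reusing the SAME b.
    Illustration only: nothing here forces a malicious Bob to reuse b."""
    return [ideal_bit_ot(x0, x1, b) for x0, x1 in zip(s0, s1)]
```

The real construction in the talk has to enforce consistency of Bob's choices, which is where the extractor and encoding machinery comes in.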
Constant rate means that to handle one instance of string OT on t-bit strings, I should use about O(t) bits of the noisy channel. This is a new construction. Previously, a similar result was known, not over BSC-like noisy channels, but over more OT-like noisy channels, in particular erasure channels; that's work by Brassard, Crépeau, and Wolf and by Imai, Morozov, and Nascimento.

Okay, so we're going to look at this construction: how to build a constant-rate string OT from noisy channels. Let's start with an observation. Suppose you take, say, the Crépeau-Kilian construction, which gets error 2^{-K} using a number of executions polynomial in K, and suppose you set that K to be a constant. Then it becomes a constant-rate protocol, but its security is not so great: it has a constant security error instead of 2^{-K}. I'll call that kind of protocol "fuzzy". Our challenge is then to go from this constant error to negligible error. That's what we're going to use the IPS compiler for, and what we're left with is to build a string OT, a constant-rate string OT, from these fuzzy OT objects.

In fact, I won't work with fuzzy OT; I'll work with something fairly similar, fuzzy OLE. OLE stands for Oblivious Linear function Evaluation. It's pretty similar: Alice has two field elements from some finite field (it'll be a constant-sized field for us), two field elements A and C; Bob has an element B; and Bob gets as output A·B + C. So now we're given this fuzzy OLE, and we want to build a string OT. That's the goal.

Okay, we do this in two steps. First, we'll reinterpret this fuzzy OLE as what I'll call a "shaky" OLE, and I'll come back to what that is in a second. And it's a perfect shaky OLE: it's not fuzzy, there's no error about it. It's perfect, but it's shaky.
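The ideal OLE functionality just defined is a one-liner over a finite field. A sketch over a small prime field of my choosing (the talk only requires the field to be constant-sized):

```python
Q = 5  # some constant-size prime field GF(Q); the exact field is my choice

def ideal_ole(a: int, c: int, b: int) -> int:
    """Ideal OLE: Alice inputs field elements a, c; Bob inputs b and
    learns a*b + c.  A single evaluation of a random line hides a and c
    individually, and Alice learns nothing about b."""
    return (a * b + c) % Q
```

Note that OLE generalizes bit OT: over GF(2), setting a = x1 - x0 and c = x0 makes Bob's output at b exactly x_b.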
Then we use this shaky OLE to build string OT. So I need to tell you what these things are, this fuzzy and shaky business. To recap, a fuzzy protocol is a protocol which realizes a functionality F (for us, OLE) with some constant security error. Security error means, essentially, a gap between the real protocol execution and what it ideally should have been. A shaky functionality, call it F_sigma, first flips a coin with bias sigma, independent of everything else. If the coin comes up heads, it works as F. Otherwise, with probability sigma, if it comes up tails, it completely yields to the adversary's control: the adversary can see what goes into the functionality and can control what comes out of it.

And the theorem is that an epsilon-fuzzy protocol for a deterministic functionality F is a perfectly secure protocol for a shaky version of that functionality, where sigma depends on epsilon. All these things will be constants for us, so never mind the exact dependence.

So why is this interesting? It's interesting as a composition theorem. Imagine you run many instances of this fuzzy protocol; what kind of guarantee do you have? If you run n instances, the best you could say, maybe, is that the error is now n times epsilon. But if epsilon is a constant, then n times epsilon is greater than 1, and a security error greater than 1 means you're guaranteed nothing. What this theorem lets you get is always something, and something quite useful. It says: don't think of this as just a fuzzy protocol; think of it as a perfectly secure protocol for the shaky functionality. Then when you run n instances of this shaky functionality, about a sigma fraction of them will yield to the adversary's control, but the remaining fraction are good copies of F. That's why it's very useful for us.
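The composition guarantee above can be pictured with a toy simulation (my illustration): each of n shaky instances independently fails with probability sigma, so only about a sigma fraction are adversarially controlled, rather than the vacuous n-times-epsilon bound.

```python
import random

def run_shaky_instances(n: int, sigma: float, rng: random.Random) -> list:
    """Model n independent copies of F_sigma: each instance yields to the
    adversary with probability sigma.  Returns indices of the bad copies."""
    return [i for i in range(n) if rng.random() < sigma]

rng = random.Random(1)
bad = run_shaky_instances(10_000, 0.1, rng)
# Roughly a sigma = 0.1 fraction of the copies are bad; the rest are
# perfect copies of F, which downstream machinery can tolerate.
```

This is exactly the kind of error model that honest-majority-style techniques (and the IPS watch list) are built to handle.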
I don't have much time, so very quickly, let me try to give you a little example, a flavor of how to prove that a fuzzy protocol is a perfect protocol for a shaky functionality. We'll take a very degenerate functionality F: it produces no output, it just takes a bit from Bob. It's a very easy functionality to securely realize, but we'll instead look at this fuzzy protocol for it. In the protocol, with probability half, Bob actually sends his input. There was no need for him to send his input, but it's a fuzzy protocol; he does that. Otherwise he sends nothing, which is the reasonable thing. So if Bob's input is zero, then with probability half he sends ⊥, and with probability half he sends his input, that is, zero. If his input is one (it's an input coming from the environment outside), he sends ⊥ with probability half and one with probability half.

Let me convince you that this is a fuzzy protocol, so there is a way to simulate it. For those of you who are familiar with the definition, let me give you a simulator. The environment again sends either zero or one; the simulator doesn't see it. The simulator now has to simulate the message from Bob to Alice, and it doesn't know what Y is. So what does it do? With probability half, it can send ⊥, no problem. But otherwise it's supposed to send the actual input bit, which it cannot. So instead it sends a random bit, each value with probability one-fourth. And it does the same thing irrespective of whether Y equals zero or Y equals one; it doesn't know. If you compare the two distributions, whichever input the environment chooses, the simulation error is going to be one-fourth. So this is a fuzzy protocol with one-fourth error.
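The one-fourth error claimed above can be checked exactly by writing down both distributions and computing their statistical (total variation) distance; this sketch uses exact rationals (the encoding of the distributions as dictionaries is mine).

```python
from fractions import Fraction

BOT = "⊥"
half = Fraction(1, 2)

def real_dist(y: int) -> dict:
    """Real protocol: Bob sends ⊥ with probability 1/2, else his input y."""
    return {BOT: half, str(y): half}

def sim_dist() -> dict:
    """Simulator (doesn't know y): ⊥ with probability 1/2, else a
    uniformly random bit."""
    return {BOT: half, "0": Fraction(1, 4), "1": Fraction(1, 4)}

def tv(p: dict, q: dict) -> Fraction:
    """Statistical distance between two distributions."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys) / 2
```

For either input y the distance works out to exactly 1/4, matching the talk.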
Now I'll show you that this is a perfectly secure protocol for the shaky functionality with sigma equal to half. Basically, I need to build a new simulator; let me actually show you it. The simulator doesn't have any choice over how F behaves: F behaves as it wants. With probability half, F doesn't fail. In that case the simulator doesn't know Bob's input, so it just sends ⊥ all the time. I write a half there because this case happens with only half probability, and these numbers are such that they're dominated by the real protocol's probabilities; intuitively, I can subtract this from the real distribution, and what remains I write down for the other case. That other case is when the functionality fails. If you look at it, to behave like this the simulator needs to know whether Y equals 0 or Y equals 1, because the behavior is different in the two settings. But this is okay, because this is exactly when the functionality has failed and the simulator can see what the input to the functionality is. And if you put the two cases together, you get exactly the real protocol's behavior, so it's a perfect simulation.

Of course, things get much more complicated when it's not a degenerate functionality, and you can look at the paper for the details. But we have this theorem, and it holds for any deterministic functionality; probably something very useful you could use elsewhere. So I'll skip the fine print and so forth.

Okay, now you have shaky OLE. How do you build string OT from it? If you have many of these, let's think of them as bit OLEs. Alice picks two random strings X0 and X1, and for each position she sends A and C to the OLE: she sends the two inputs X1 minus X0 and X0. Bob sends B, so Bob gets the linear evaluation (X1 minus X0) times B, plus X0, which is exactly X_B. Fine.
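The OLE-based string OT step just described can be sketched directly over GF(2), one bit-OLE per position: Alice feeds a = x1_i - x0_i and c = x0_i into each OLE, Bob feeds his choice bit b, and each output is a*b + c = x_b at that position. (The list encoding and function name are mine; the honest-Bob case is shown, before the consistency machinery.)

```python
def string_ot_from_ole(x0: list, x1: list, b: int) -> list:
    """String OT on bit-lists x0, x1 built from one bit-OLE per position,
    with an honest Bob reusing the same choice bit b throughout."""
    out = []
    for x0i, x1i in zip(x0, x1):
        a, c = (x1i - x0i) % 2, x0i
        out.append((a * b + c) % 2)  # one ideal bit-OLE evaluation
    return out
```

With b = 0 Bob recovers x0, with b = 1 he recovers x1; since x0 and x1 are random strings, they can then be used to mask Alice's real inputs.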
Now, if Bob need not send the same B in all the instances, he could pick up a few bits of X0 and a few bits of X1. To handle that, Alice will use a strong extractor and so forth; but beyond that, because of the shaky instances, we can't use these OLE instances directly like this. How can we use them? By using an encoding of the inputs. Very briefly: this is a homomorphic encoding. There are two encodings, call them E and E². They are linear and randomized, and they have a homomorphic property: think of them as Reed-Solomon-code-based secret sharing of degree D and degree 2D. Then it's not hard to see that E(a) ⋆ E(b) is an E² encoding of a·b, where ⋆ represents coordinate-by-coordinate multiplication. The encodings also have error-correcting and secret-sharing properties, and we need them to be sufficiently randomizing. This can all be instantiated using MPC-friendly codes; look at the next talk for more details, since I'm out of time. And the bottom line is that if you use these encodings in the construction I showed a little earlier, then even with shaky OLEs, you get security.

So, just to summarize. We have this constant-rate string OT, and we put everything together in the IPS compiler. What are the things that go into it? An outer protocol, which instantiates many instances of OT among a large number of parties. An inner protocol, which is semi-honest secure. The IPS compiler needs something called the watch list, which is where we need these string OTs, and we use the string OTs I just described. The string OTs use this homomorphic arithmetic encoding scheme, and we crucially rely on this fuzzy-to-shaky security. That's all, thanks. We have no time for questions, so let's move on to the next talk.
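As a footnote to the encoding discussion: the degree-D / degree-2D homomorphic property can be seen in a toy Shamir/Reed-Solomon-style sketch over a small prime field of my choosing. Multiplying two degree-D encodings coordinate by coordinate gives evaluations of a degree-2D polynomial whose value at 0 is the product of the secrets. (Parameters and helper names are mine; the talk's actual instantiation uses MPC-friendly codes over a constant-size field.)

```python
import random

P = 97  # small prime field for illustration

def share(secret: int, deg: int, xs: list, rng: random.Random) -> list:
    """Encode: a random degree-`deg` polynomial f with f(0) = secret,
    evaluated at the points xs."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(deg)]
    return [sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
            for x in xs]

def reconstruct(shares: list, xs: list) -> int:
    """Decode by Lagrange interpolation at 0 (needs len(xs) > degree)."""
    total = 0
    for xj, yj in zip(xs, shares):
        num, den = 1, 1
        for xm in xs:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P
    return total

rng = random.Random(7)
D = 2
xs = list(range(1, 2 * D + 2))        # 2D+1 evaluation points
ea = share(3, D, xs, rng)             # degree-D encoding E(a)
eb = share(5, D, xs, rng)             # degree-D encoding E(b)
prod = [(u * v) % P for u, v in zip(ea, eb)]  # E(a) ⋆ E(b), coordinatewise
# prod is a valid degree-2D encoding of a*b: the homomorphic property.
```

Reconstructing `prod` at 0 recovers 3·5 = 15, regardless of the randomness used in the sharings.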