Aarhus MPC Workshop ’14, Wednesday

This week a large part of the MPC community is meeting in Aarhus for a workshop on Theory and Practice of Secure Computation. The local Crypto Group is going to blog live(ish) about all the talks at the workshop. As we are blogging live, our posts might contain inaccuracies and errors. Please write us if you find any!

The abstracts of the talks, as well as some of the slides, are available here.


The participants.

5 Years of FHE

Zvika Brakerski

Fully Homomorphic Encryption (FHE) allows one to compute arbitrary functions over encrypted data without the decryption key. The problem was first raised by Rivest, Adleman, and Dertouzos in 1978, but for a long time it was conjectured to be impossible to achieve. Thirty years after the introduction of the problem, Gentry gave evidence that such an encryption scheme exists, by proposing a candidate construction that is secure under relatively reasonable computational assumptions. Follow-up works kept Gentry's blueprint but improved the construction along a number of dimensions; most importantly, the computational assumptions were weakened to standard ones.

Unsurprisingly, FHE is "the Swiss army knife of cryptography", as Zvika put it, enabling several cryptographic applications such as private information retrieval, secure multiparty computation, functional encryption, and indistinguishability obfuscation, to name just a few. The talk focused on the properties we would like an FHE scheme to satisfy.

First of all, simplicity and the underlying security assumptions are important. Regarding efficiency, we care about the overhead of the evaluation function, the size of keys and ciphertexts, and the computation model.

Next, the talk moved to the "second generation" of FHE schemes, based on standard assumptions. The work of [BV11a] and its key-switching trick were presented.

Finally, the recent approximate eigenvector method of [GSW13] was described, which eliminates the need for key-switching material in the evaluation key. The [GSW13] scheme is simpler and asymptotically faster, since homomorphic addition and multiplication are just matrix addition and multiplication.
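To make the approximate eigenvector intuition concrete, here is a deliberately noiseless (and therefore completely insecure) Python toy. In real GSW a ciphertext satisfies C·s ≈ μ·s only up to LWE noise, and a gadget/bit-decomposition step is needed to control noise growth under multiplication; both are omitted here, so this only illustrates why matrix addition and multiplication add and multiply the plaintexts.

```python
# Toy, *insecure* sketch of the GSW "approximate eigenvector" idea:
# a ciphertext is a matrix C whose secret eigenvector is the key s and
# whose eigenvalue is the plaintext mu.  All noise is omitted.
import random

P = 2**31 - 1          # prime modulus (toy choice)
N = 4                  # matrix dimension (toy choice)

def keygen():
    # secret key: a vector of nonzero (hence invertible) entries mod P
    return [random.randrange(1, P) for _ in range(N)]

def encrypt(mu, s):
    # C = mu*I + R, where every row of R is orthogonal to s (mod P),
    # so that C @ s = mu * s exactly.
    inv_last = pow(s[-1], -1, P)
    C = [[0] * N for _ in range(N)]
    for i in range(N):
        row = [random.randrange(P) for _ in range(N - 1)]
        row.append(-sum(a * b for a, b in zip(row, s[:-1])) * inv_last % P)
        for j in range(N):
            C[i][j] = (row[j] + (mu if i == j else 0)) % P
    return C

def decrypt(C, s):
    # read the eigenvalue off the first coordinate of C @ s
    return sum(C[0][j] * s[j] for j in range(N)) * pow(s[0], -1, P) % P

def add(C1, C2):   # homomorphic addition = matrix addition
    return [[(a + b) % P for a, b in zip(r1, r2)] for r1, r2 in zip(C1, C2)]

def mul(C1, C2):   # homomorphic multiplication = matrix multiplication
    return [[sum(C1[i][k] * C2[k][j] for k in range(N)) % P
             for j in range(N)] for i in range(N)]

s = keygen()
C5, C7 = encrypt(5, s), encrypt(7, s)
assert decrypt(add(C5, C7), s) == 12   # eigenvalues add
assert decrypt(mul(C5, C7), s) == 35   # eigenvalues multiply
```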

To sum up, lots of exciting work has been done during the last 5 years on “improving” FHE.

Outsourced Pattern Matching

Carmit Hazay

The topic of Carmit Hazay's talk was a line of work on outsourced pattern matching. Pattern matching is the problem where we have a text and a "pattern" (a shorter string) and want to determine all positions in the text where the pattern appears. In cryptography we study the setting where the text and the pattern are held by different players (the "sender" and the "receiver", respectively): the receiver should learn only the positions where the pattern appears (and nothing else about the rest of the text), while the sender should not get any information about the pattern. Carmit remarked that there are several known results using tools such as oblivious PRFs, oblivious automaton evaluation, or garbling. However, all of these works have the drawback that their complexity depends linearly on the size of the text.
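For reference, the functionality being computed (ignoring privacy) is just the following; the secure protocols deliver exactly this output to the receiver and nothing more.

```python
def match_positions(text: str, pattern: str) -> list:
    """Return all indices at which pattern occurs in text (overlaps allowed)."""
    m = len(pattern)
    return [i for i in range(len(text) - m + 1) if text[i:i + m] == pattern]

# the receiver should learn exactly this list, and nothing else about text
assert match_positions("abababa", "aba") == [0, 2, 4]
```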

That motivates the main topic of the talk: introducing a third party (a server) that executes most of the computation, which is especially interesting given the recent advent of cloud computing services. Protocols in this setting have a preprocessing phase, where the sender sends a message to the server, and a query phase, where the receiver can interact with both the sender and the server. We aim to prevent the server from learning anything beyond the number of pattern matches, and we want to minimize as much as possible the communication between the receiver and the other parties during the query phase. Carmit discussed three of her recent works on this topic (the first was joint work with Sebastian Faust and Daniele Venturi, while the second was together with Hila Zarosim).

  • In her work with Sebastian and Daniele, optimal communication (linear in the length of the pattern and in the number of matches) is achieved, but at the cost of working in the random oracle model. Their solution has the server solve easy subset-sum instances.
  • In her work with Zarosim, she proved that this reliance on the random oracle model is inherent: optimal protocols are impossible otherwise whenever the receiver may collude with the server, or may see the preprocessed message from the sender.
  • Finally, Carmit described recent results that bypass the impossibility result just described by assuming that the server cannot collude with the other parties.

Carmit finished by describing a couple of possible future research directions. First, an interesting question is whether better solutions can be achieved if we allow a higher round complexity. Second, variations of the problem are also worth studying, an example being approximate pattern matching.

On the Intrinsic Complexity of Broadcast

Martin Hirt

Martin’s energetic talk.

The talk by Martin Hirt was about the intrinsic complexity of broadcast. In the broadcast problem we assume that a dealer wants to broadcast a message among a set of n parties (counting the dealer as a party), where each pair of parties is connected by a bidirectional secure channel and t of the parties are corrupted. Our goals are consistency (the honest parties all output the same message at the end of the protocol) and validity (if the dealer is honest, that message is the one sent by the dealer). It is well known that, without extra assumptions, broadcast is impossible if (and only if) t >= n/3. Therefore, in order to achieve broadcast beyond this bound one needs some extra assumption: for example, there are solutions if a public key infrastructure has been set up by a trusted third party.

But a different point of view is to assume that we already have access to a broadcast primitive, restricted in the sense that it can only broadcast "small" messages, and to ask whether we can "amplify" it: that is, whether this primitive, together with additional communication between the players, lets us broadcast larger messages. Martin not only showed that this is possible, but also quantified how much of this resource is necessary and sufficient, introducing a function that measures the intrinsic complexity of broadcast among n players. Perhaps surprisingly, it turns out that the bounds on this complexity do not depend on the size of the message being broadcast.

For example, he showed that in the case n = 3, if the primitive allows us to broadcast a message from a domain of size just 3, then we can use it to broadcast a message from an arbitrarily large domain; moreover, in this case 3 is also a lower bound. In the protocol, the message is first circulated among the players in two different ways, and then the broadcast primitive is used to broadcast a "hint" (much smaller than the message) which allows the honest players to eliminate possible discrepancies in the information received before.

In the case n >= 4, Martin showed that broadcasting a total of 8n log n bits through the primitive is enough to broadcast a message from an arbitrarily large domain. On the other hand, we know that broadcasting only n-3 bits through the primitive is not sufficient. The protocols in this case use the notion of graded broadcast, which was explained in the talk, where each player outputs a message together with a grade quantifying that player's perception of the dealer's honesty and of the other players' perceptions. Martin showed how to amplify graded broadcast, and how to use this in turn to amplify broadcast.

A potentially interesting question is whether cryptographic (rather than perfect, as in the talk) security would help bring the complexity down.

Faster Private Set Intersection based on OT Extension

Benny Pinkas

The last talk of the third day of the MPC workshop in Aarhus was titled "Faster Private Set Intersection based on OT Extension" and was presented by Benny Pinkas. The talk surveyed existing solutions to the private set intersection (PSI) problem and presented a new solution based on OT extension. The work also includes actual implementations of all the surveyed protocols, meant to enable a meaningful comparison.

Benny started off by introducing the PSI problem, which in a nutshell involves a client and a server, each holding a set of items; the goal is that the client learns the intersection of these sets while the server learns nothing. Next, a range of applications was presented, including human genome testing and vehicle proximity testing, but clearly only your imagination sets the limits of the applications of PSI.

Before jumping to the current protocols for PSI, the model was covered: all protocols are phrased in the semi-honest model, and some also employ random oracles for efficiency benefits. Benny also explained a naive approach to the problem: have the server hash each entry of its set and send the hashes to the client; the client then hashes its own set and checks for collisions. At first glance this approach might seem to work, but a lot of the server's privacy is lost, since the client can simply hash all kinds of items and check whether the hashes match.
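A minimal Python sketch of this naive approach, and of why it fails (the item sets are made up for illustration):

```python
import hashlib

def h(item: str) -> bytes:
    return hashlib.sha256(item.encode()).digest()

server_set = {"alice@a.com", "bob@b.com", "carol@c.com"}   # made-up sets
client_set = {"bob@b.com", "dave@d.com"}

server_hashes = {h(x) for x in server_set}                 # server -> client
intersection = {y for y in client_set if h(y) in server_hashes}
assert intersection == {"bob@b.com"}

# The flaw: hashing is deterministic and unkeyed, so the client can test
# membership of *any* guessable item offline -- the server's privacy is gone.
assert h("carol@c.com") in server_hashes   # offline dictionary "attack"
```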

The survey covered four approaches, some of which had additional optimizations and variants.

  1. The first approach, dating back to '86, is based on the DDH assumption plus random oracles, and is indeed very simple and elegant in terms of both computation and communication. The work performed is also trivially parallelizable (see the sketch after this list).
  2. The next is based on blind RSA, but in this setting one party has to do all the computation, whereas in the DDH protocol the work is shared between client and server. It was clear that Benny prefers the DDH approach, also because it can be implemented using elliptic curves.
  3. Next up was a generic circuit approach, which employs a sorting network: first take the union of the two sets, sort it, compare adjacent elements (matching items end up adjacent), and finally shuffle the elements to hide which positions matched. The circuit is evaluated GMW-style, which makes heavy use of OT extension, and recent optimizations for random OTs have a significant impact here. Although this approach turns out to be the least efficient in the end, its generic nature means it can be used in any setting.
  4. The fourth approach is based on Bloom filters and also relies on OT. There was no time to go into the details of the scheme, but it was clear that the parties need to run an OT for each entry in the Bloom filter. Since the transferred entries can be random strings, the random-OT optimizations apply here as well.
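To make the first (DDH-based) approach concrete, here is a hedged Python sketch: each party blinds its hashed items with a secret exponent, and doubly blinded values match exactly when the underlying items do. The group (exponentiation modulo the Curve25519 field prime, standing in for a proper prime-order group), the SHA-256 hash-to-group map, and the item sets are all illustrative simplifications, not a vetted instantiation.

```python
import hashlib, secrets

P = 2**255 - 19                          # a prime; toy stand-in for a real group

def H(item: str) -> int:
    """Hash an item into [2, P-1] (modelling the random oracle)."""
    d = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(d, "big") % (P - 2) + 2

client_items = ["bob@b.com", "dave@d.com"]                  # made-up sets
server_items = ["alice@a.com", "bob@b.com", "carol@c.com"]

a = secrets.randbelow(P - 3) + 2         # client's secret exponent
b = secrets.randbelow(P - 3) + 2         # server's secret exponent

# client -> server: the client's items, blinded once
blinded_client = [pow(H(x), a, P) for x in client_items]

# server -> client: the client's items blinded a second time (keeping the
# order is fine in the semi-honest model), plus the server's items blinded once
double_client = [pow(v, b, P) for v in blinded_client]
blinded_server = [pow(H(y), b, P) for y in server_items]

# client finishes: blind the server's values with a, then intersect --
# H(x)^(ab) == H(y)^(ab) exactly when x == y
double_server = {pow(v, a, P) for v in blinded_server}
result = {x for x, v in zip(client_items, double_client) if v in double_server}
assert result == {"bob@b.com"}
```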

Finally, Benny presented their new protocol for solving PSI, which also relies heavily on OT extension. The idea is to implement a private equality test using the OTs and to run it for every pair of elements from the two sets in parallel, giving communication O(n^2 * λ). This can be further improved using hashing techniques, and it turns out that Cuckoo hashing achieves the best results.
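As an illustration of the kind of OT-based private equality test such a protocol can build on, here is a hedged Python sketch: one random OT per bit of the items. The random OTs are simulated by a local function (in the real protocol they would come from OT extension), and the parameter names and choices are made up for illustration.

```python
import secrets

KAPPA = 16  # mask length in bytes (toy choice)

def random_ot():
    """Ideal random OT: the sender obtains (r0, r1); the receiver, holding a
    choice bit c, would learn only r_c."""
    return secrets.token_bytes(KAPPA), secrets.token_bytes(KAPPA)

def xor(x: bytes, y: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(x, y))

def equality_test(x: int, y: int, bits: int = 32) -> bool:
    """Receiver holds x, sender holds y; receiver learns only whether x == y."""
    mask_r = bytes(KAPPA)   # receiver's running XOR, selected by the bits of x
    mask_s = bytes(KAPPA)   # sender's running XOR, selected by the bits of y
    for i in range(bits):
        r0, r1 = random_ot()                          # one random OT per bit
        mask_r = xor(mask_r, (r0, r1)[(x >> i) & 1])  # receiver sees only r_{x_i}
        mask_s = xor(mask_s, (r0, r1)[(y >> i) & 1])
    # the sender sends mask_s; the receiver compares it to mask_r, and the
    # two match if and only if x == y (up to negligible probability)
    return mask_r == mask_s

assert equality_test(42, 42)
assert not equality_test(42, 43)
```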

As a conclusion, they showed tables comparing the implementations of all the above protocols, and it was evident that the new protocol has the fastest running time of them all. However, the DDH approach, because of its simplicity, has better communication complexity.

The presentation was based on joint work with Thomas Schneider and Michael Zohner.
