WAMPC, Day 2, Morning

A delegation of the Aarhus Crypto Group (Bernardo David, Tore Frederiksen, Claudio Orlandi, Tomas Toft and Roberto Trifiletti) is attending the Workshop on Applied Multi-Party Computation and will attempt to cover this event on the MPCLounge.

We decided that timeliness is more important than perfection, and we therefore apologize for any mistakes in these posts!

This is the third post in a 4-post series.


First Session: Invited Talk

Secure Computation in 2029: Boom, Bust, or Bonanza

David Evans

David decided to use his invited talk not to present his recent results, but instead to try to make predictions about the future of MPC adoption. David started by justifying the use of the year 2029 in his title: apparently, heuristics research in astrophysics shows that scientific development doubles every 15 years. David played a similar game with MPC using Google Scholar, showing that around 700 papers were ever published on MPC, while only 160 were published up to 1999. One can go further back and find that 7 papers were published up to 1984 (while the count is 0 up to 1969, except for a few hilariously misreported entries, according to which Payman Mohassel was already working in the field in 1908!).

Anyway, the take-home message is that, according to David, MPC will evolve dramatically over the next 15 years.

Jumping forward to the 2029 predictions: the US has invested around $100M in MPC research, which is roughly what is spent on the arts every year, or what it cost to clear Virginia of snow last week. Of course some return on this investment is expected, but what counts as an acceptable result? For researchers, getting papers published and making sure students are hired after graduating is important, but the public expects industrial impact. During the talk David made a number of claims; they are listed here as they appeared in the presentation.

Claim #1
The MPC industry will be bigger than the anti-malware industry in 2029 (though it is unclear whether this will be due to the MPC industry exploding or the anti-malware industry shrinking).

Claim #2
High cost is no longer the main impediment to widespread MPC use.

Next, David moved on to explain the genetic dating problem, which is the traditional dating problem, except that instead of keeping your preferences private you want to keep your genetic information private. The goal is still to see whether there is a match, so the two are in no way exclusive; in fact, they complement each other perfectly. There is a (non-MPC) mobile app for the genetic dating problem in Iceland, and it is clearly needed in such a small community:

"If I would have had this app last year I probably wound't have gone home with my cousin"

“If I would have had this app last year I probably wound’t have gone home with my cousin”
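To make the task concrete, here is a minimal plaintext sketch of the functionality the two parties would jointly evaluate under MPC. The SNP-vector encoding and the relatedness threshold are our own illustrative assumptions, not the actual test used by the app or in David's examples.

```python
# Plaintext sketch of the "genetic dating" functionality two parties might
# jointly evaluate under MPC. The SNP-vector encoding and the relatedness
# threshold are illustrative assumptions, not the app's actual test.

def too_related(snps_alice: list[int], snps_bob: list[int],
                threshold: float = 0.9) -> bool:
    """Return True if the fraction of matching SNP markers exceeds the
    threshold. Run under MPC, both inputs stay private and only this
    single output bit is revealed to the two parties."""
    assert len(snps_alice) == len(snps_bob)
    matches = sum(a == b for a, b in zip(snps_alice, snps_bob))
    return matches / len(snps_alice) > threshold
```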

Continuing on this topic, David noted that the cost of sequencing a human genome dropped drastically between 2008 and 2013, from $10M to $0.01M (until then the decrease was steadier, following more or less Moore's law). Rephrasing this in MPC terms: in 2004 the computation could be done with FairPlay for a price of $100M, while in 2013 it can be done with JustGarble for only $0.001 (a tenth of a cent). Moving on to costs that still matter: for 3 or more parties, costs are still way too high. On a different note, MPC actually requires around 10,000x as much energy as computation in the clear. Other important factors that need to be addressed are making MPC meaningful to the end-user and developing methods to reason about how much parties learn from the output of the computation.
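A quick back-of-the-envelope on those numbers:

$$\frac{\$100\text{M}}{\$0.001} = 10^{11}, \qquad \log_2\!\left(10^{11}\right) \approx 36.5,$$

i.e., over the nine years from FairPlay (2004) to JustGarble (2013), the cost of this computation halved roughly every $9 \times 365 / 36.5 \approx 90$ days.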

In the final part of the talk, David talked about two of his own projects: the MPC mobile apps from the "Might be Evil" project, as well as a quite different project.

Second Session: Databases

Practical Private Database Querying

Vlad Kolesnikov and Tal Malkin

Vlad and Tal talked about Blind Seer, an implementation for performing database queries on large datasets. The core goal is extreme efficiency: at most an order of magnitude slower than running MySQL on known data. To achieve this, they allow limited leakage, which makes sublinear solutions possible. The amount of leakage is hard to quantify, but the parties learn no information about the plaintexts beyond access patterns, so it will be acceptable in many scenarios.

The construction is based on Bloom filters. The server holds a search tree in encrypted form. Starting at the root, the client and server use a Yao garbled circuit to determine which sub-trees need to be queried. This is revealed, and the parties continue recursively until they either reach the leaves or find that no sub-trees contain matches. Naturally, this leaks information; e.g., an AND-query may be pruned, while this is never the case for an OR-query. Efficiency is very good: theoretically it can be proportional to the number of matches for the best terms. In practice, most queries are at most a factor of two slower than MySQL.
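To make the traversal concrete, here is a plaintext sketch of a Bloom-filter search tree (our own illustration; in Blind Seer the per-node check is evaluated inside a Yao garbled circuit, so the server learns only which sub-trees get explored, not the keyword itself):

```python
import hashlib

class Node:
    """Search-tree node. Every node stores a Bloom filter covering all
    keywords in its subtree; leaves additionally hold a database record."""

    def __init__(self, children=None, record=None, m=1024, k=3):
        self.children = children or []
        self.record = record
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, keyword):
        # k hash positions derived from SHA-256 (an arbitrary choice here).
        return [int(hashlib.sha256(f"{i}:{keyword}".encode()).hexdigest(), 16)
                % self.m for i in range(self.k)]

    def insert(self, keyword):
        for p in self._positions(keyword):
            self.bits[p] = True

    def maybe_contains(self, keyword):
        # Bloom filters give false positives but never false negatives.
        return all(self.bits[p] for p in self._positions(keyword))

def search(node, keyword, results):
    """Prune sub-trees whose filter rules out the keyword. In Blind Seer the
    maybe_contains test happens inside a garbled circuit; only the prune/
    descend decision is revealed."""
    if not node.maybe_contains(keyword):
        return
    if node.record is not None:
        results.append(node.record)
    for child in node.children:
        search(child, keyword, results)
```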

Some queries require additional hiding, such as whether any elements were found (0-1 set-size indistinguishability), e.g., whether an airline passenger list contains a terrorist. This cannot be concealed in the cryptographic sense; however, when the probability of "bad events" is low, dummy paths can be followed to ensure that the server only learns that the probability of a bad event has increased.

Practical linking of databases using secure multiparty computation

Riivo Talviste

Riivo talked about the scenario where a state is interested in data-driven decision making. The goal is to combine different sensitive data sources while avoiding the risks of doing so. The concrete example is an analysis of income in the public sector. This has been implemented, and a demo is available online.

A second, practical application considers the negative impact of being employed during one's studies. By combining income data (from the tax office) with data on education, one can compute, e.g., the probability of a student dropping out if she is employed from her second year.

Traditionally, this requires approval from data protection agencies; however, even when permission is given, it may be restricted to anonymized data, which can lead to data loss: an actual loss of 76-98% of the data was reported for this scenario. When using MPC this is not an issue, since the agency allows the use of non-anonymized data in this case.

A large set of statistical functions has been implemented. The background is interviews with various domain experts: when asked "what functions are interesting?", the typical reply was "Statistics!"
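As an illustration of the kind of query involved, here is a plaintext sketch of the employment/dropout analysis above. The schema and the use of a national ID as the linking key are our own assumptions; the deployed system evaluates the same kind of query on secret-shared data instead.

```python
# Plaintext sketch of linking tax-office income records with education
# records and computing a statistic. Field names are illustrative only.

def dropout_rates(income, education):
    """income: {national_id: employed_in_second_year (bool)}
       education: {national_id: dropped_out (bool)}
       Returns (dropout rate if employed, dropout rate if not employed)."""
    employed = [education[i] for i, e in income.items() if e and i in education]
    unemployed = [education[i] for i, e in income.items() if not e and i in education]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(employed), rate(unemployed)

# Example with toy data:
income = {1: True, 2: False, 3: True}
education = {1: True, 2: False, 3: False}
print(dropout_rates(income, education))  # (0.5, 0.0)
```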

Third Session: Server-Aided MPC

PartialGC: a system for saving and reusing intermediate garbled circuit values

Benjamin Mood

This work is about finding ways to reuse intermediate values in garbled circuits; in particular, output wires of a first circuit are transformed into input wires of later circuits that use those values. A scheme was proposed for doing this transformation, together with a way of checking that it was indeed done correctly, so that no cheating occurs.
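As a rough illustration of such a wire transformation (a generic sketch of the idea, not necessarily PartialGC's exact construction): the generator can publish a small translation table keyed by a hash of each saved output label, letting the evaluator convert whichever label it holds into the corresponding fresh input label without learning the other one.

```python
import hashlib, secrets

def H(label: bytes) -> bytes:
    return hashlib.sha256(label).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))  # truncates to shorter input

def make_translation(old0, old1, new0, new1):
    """Generator: table converting an output-label pair of circuit 1 into a
    fresh input-label pair of circuit 2. Without knowing old_b, a table row
    reveals nothing, so the evaluator learns only its own new label."""
    return (xor(H(old0), new0), xor(H(old1), new1))

def translate(old_label, pointer, table):
    """Evaluator: holds exactly one old label; a pointer bit (as in
    point-and-permute) selects which table row to decrypt."""
    return xor(H(old_label), table[pointer])

# Toy example with 16-byte labels; the evaluator happens to hold old1.
old0, old1 = secrets.token_bytes(16), secrets.token_bytes(16)
new0, new1 = secrets.token_bytes(16), secrets.token_bytes(16)
table = make_translation(old0, old1, new0, new1)
assert translate(old1, 1, table) == new1
```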
Next, the actual evaluation is outsourced to a high-performance cloud server using the same techniques as in "Secure Outsourced Garbled Circuit Evaluation for Mobile Devices" by Henry Carter, Benjamin Mood, Patrick Traynor, and Kevin Butler from USENIX Security '13.
Finally, comparisons with previous schemes were presented: the new scheme achieves speedups of 8.7x, 15x, and 23x for 64 circuits, and of 11x, 19x, and 32x for 128 circuits, at key sizes of 64, 128, and 256 respectively.

Whitewash: Outsourcing Garbled Circuit Generation for Mobile Devices

Henry Carter

Mobile devices are present at every moment of our lives: accessing email, instant messaging, social networks, etc. Hence, it is interesting to consider running secure computation applications on such devices. However, the computational resources of mobile devices are seriously constrained, making it impractical to have them run current protocols for secure computation, such as garbled-circuit-based protocols.

The goal of Whitewash is to enable two smartphone users to securely compute functions using garbled circuits with the help of a remote "cloud". To achieve this goal, the smartphones outsource circuit generation to a cloud provider in such a way that no information about the inputs or outputs is leaked. The scheme also outsources the oblivious transfer steps that might otherwise overload a smartphone's resources.

Basically, the mobile device exchanges random seeds with the cloud, which then uses them to generate the garbled circuits. The mobile device, the cloud, and the second party involved in the computation then commit to their inputs and to information used later on to verify the correctness of the final computation results. Throughout the computation, the data is protected by "blinding keys", which are used at the end to retrieve the output.
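As a sketch of the seed trick (function names and parameters are ours, not Whitewash's actual API): since label generation can be made deterministic from a short seed, the phone can hand the seed to the cloud and later re-derive any wire label locally, without ever materializing the circuit itself.

```python
import hashlib

def wire_label(seed: bytes, index: int, length: int = 16) -> bytes:
    """Deterministic label derivation: both the phone (which chose the seed)
    and the cloud (which received it) compute identical wire labels."""
    return hashlib.sha256(seed + index.to_bytes(8, "big")).digest()[:length]

def derive_labels(seed: bytes, num_wires: int):
    # One (bit-0 label, bit-1 label) pair per wire, all from a single seed.
    return [(wire_label(seed, 2 * i), wire_label(seed, 2 * i + 1))
            for i in range(num_wires)]

# The cloud garbles the circuit from these labels; the phone, storing only
# the short seed, can locally recompute the labels for its own input wires.
seed = b"\x00" * 16  # in practice sampled at random by the phone
labels = derive_labels(seed, num_wires=4)
```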

As for efficiency improvements, the highlights are: neither OT nor consistency checks are performed by the mobile device, and the scheme is strongly resistant to collusion between the mobile device and the cloud, as well as between the other party and the cloud. These improvements are verified through comparisons with previous protocols, evaluating circuits for Hamming distance, matrix multiplication, and RSA on standard smartphones. The results show exciting improvements in runtime; moreover, they show that OT accounts for most of the computational effort involved.


