Multi-Party Computation vs Trusted Third Parties

Multi-party computation (MPC) is not as hyped as blockchain, but that is only because it does not offer something fundamentally new like the distributed ledger. The functionalities MPC offers are not inherently new: those tasks were previously solved with the help of trusted third parties. However, with rising awareness about privacy among the general public and legislators, relying on a trusted third party is frowned upon and in some cases even illegal. MPC technology removes the need for a trusted third party and makes new data-driven services feasible even when personal data is involved.

At Cybernetica, we have been doing research on MPC for over a decade. This research has culminated in the Sharemind MPC platform - an easy-to-use, enterprise-ready commercial MPC solution. Sharemind MPC applications are written in SecreC, a C-like language that distinguishes private and public data at the type-system level.
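To give a flavour of this, the sketch below shows how a SecreC program separates public and private values. It assumes the `shared3p` three-party protection domain and the `argument`/`publish`/`declassify` primitives from Sharemind's public SecreC documentation; it is an illustrative minimal example, not production code.

```
import stdlib;
import shared3p;

// A protection domain: values of this domain are secret-shared
// among the three Sharemind computation servers.
domain pd_shared3p shared3p;

void main() {
    // Public value: visible to every computing party.
    uint64 bonus = 100;

    // Private input: arrives secret-shared, no single server sees it.
    pd_shared3p uint64 salary = argument("salary");

    // Mixing private and public data yields a private result,
    // so intermediate values can never leak by accident.
    pd_shared3p uint64 total = salary + bonus;

    // Data only becomes public through an explicit declassify,
    // which the type system forces the developer to write out.
    publish("total", declassify(total));
}
```

The key point is that `declassify` is the single, auditable gate between private and public data: the compiler rejects any attempt to use a `pd_shared3p` value in a public context without it.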

In the SafeCloud project we are building an SQL module for our Sharemind MPC platform. Many developers already know how to write SQL queries, because relational database management systems remain very popular and SQL is their primary interface. By providing this familiar SQL interface, we make it easier for developers to create new privacy-preserving services with the Sharemind MPC platform.

In SafeCloud terms, our solution is called the Secure Multi-cloud Application Server, and it supports a limited subset of SQL. It was originally envisioned as a stand-alone client application for Sharemind MPC. However, during the project we found that the output privacy problem cannot easily be solved for SQL in the general case. This realisation led us to rethink our architecture: we instead created a Sharemind MPC module that translates SQL statements into MPC operations. This architectural change lets developers use SQL directly from the SecreC language. Because the queries are fixed in SecreC code, developers can ensure that the final service does not allow queries that break output privacy, i.e. that try to infer private values by combining the outputs of different queries.
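The idea can be sketched as follows. Note that the module and function names (`sql`, `sqlScalar`) are hypothetical placeholders for illustration; the actual interface of the Sharemind SQL module may differ. What matters is the pattern: the query text is baked into the SecreC program by the developer, so end users can only trigger this one vetted aggregate, not arbitrary queries.

```
import shared3p;
import sql; // hypothetical SQL-to-MPC translation module

domain pd_shared3p shared3p;

void main() {
    // The query is fixed at development time. It is translated into
    // MPC operations, so the per-row salaries never leave the
    // secret-shared form.
    pd_shared3p uint64 avgSalary =
        sqlScalar("SELECT AVG(salary) FROM employees");

    // Only the vetted aggregate is declassified and published.
    publish("avgSalary", declassify(avgSalary));
}
```

Contrast this with an open query interface, where a user could run `SELECT AVG(salary) FROM employees` and then the same query with `WHERE name <> 'Alice'`, and subtract the results to learn Alice's salary. Fixing the allowed queries in SecreC closes off exactly this kind of output-combination attack.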

Karl Tarbe, Cybernetica