In 2018, Microsoft teamed up with Harvard University's Institute for Quantitative Social Science to create a differential privacy solution. Working under the OpenDP Initiative, Microsoft sought to develop an open platform for maintaining data privacy for individual users.
Microsoft has now confirmed the platform is available. Users can access the project on Microsoft's GitHub, where they can support and contribute to its development.
While the platform is focused on maintaining user privacy, it also allows researchers to safely extract insights from the data. Microsoft has not named the platform, which is based on Harvard's OpenDP.
Redmond was eager to confirm the platform is completely open. It has a global royalty-free license from Microsoft, which means anyone from anywhere can use the platform to create datasets. Julie Brill, CVP, Deputy General Counsel, and Chief Privacy Officer at Microsoft, highlighted how important privacy has become.
“We need privacy enhancing technologies to earn and maintain trust as we use data. Creating an open source platform for differential privacy, with contributions from developers and researchers from organizations around the world, will be essential in maturing this important technology and enabling its widespread use.”
How it Works
So how does differential privacy under OpenDP work? Microsoft and Harvard add statistical noise to the data, which essentially hides and protects each individual's contribution, keeping it private. Researchers, however, can still extract the insights they need.
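To illustrate the general idea, here is a minimal Python sketch of the Laplace mechanism, one common way of adding calibrated statistical noise. The function and parameter names are illustrative assumptions, not the platform's actual API.

```python
import numpy as np

def private_count(records, predicate, epsilon):
    """Differentially private count of records matching a predicate.

    A count query has sensitivity 1 (adding or removing one person changes
    the result by at most 1), so Laplace noise with scale 1/epsilon is
    enough to mask any single individual's contribution.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: a noisy answer to "how many users are over 40?"
users = [{"age": 34}, {"age": 52}, {"age": 47}, {"age": 29}]
print(private_count(users, lambda r: r["age"] > 40, epsilon=0.5))
```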
If a researcher's queries come too close to exposing personal information, the platform stops answering additional queries.
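That limit is commonly enforced through a privacy budget: every answered query spends part of a total epsilon allowance, and once the allowance runs out, no further queries are answered. The sketch below, again an illustrative assumption rather than the platform's real interface, shows one way such a budget could work.

```python
import numpy as np

class PrivacyBudget:
    """Track cumulative privacy loss and refuse queries once the budget is spent."""

    def __init__(self, total_epsilon):
        self.total_epsilon = total_epsilon
        self.spent = 0.0

    def noisy_sum(self, values, sensitivity, epsilon):
        # Each answered query consumes part of the overall budget.
        if self.spent + epsilon > self.total_epsilon:
            raise RuntimeError("Privacy budget exhausted: query refused")
        self.spent += epsilon
        return sum(values) + np.random.laplace(0.0, sensitivity / epsilon)

budget = PrivacyBudget(total_epsilon=1.0)
ages = [34, 52, 47, 29]
print(budget.noisy_sum(ages, sensitivity=100, epsilon=0.5))  # answered
print(budget.noisy_sum(ages, sensitivity=100, epsilon=0.5))  # answered, budget now exhausted
# A third query would raise an error, mirroring how the platform stops further queries.
```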
“Through these mechanisms, differential privacy protects personally identifiable information by preventing it from appearing in data analysis altogether. It further masks the contribution of an individual, essentially rendering it impossible to infer any information specific to any particular person, including whether the dataset utilized that individual's information at all.”