
Microsoft Uses Harvard’s OpenDP for New Differential Privacy Platform

Microsoft has announced a differential privacy platform, created with Harvard University through the OpenDP Initiative, which is now available on GitHub.


In 2018, Microsoft teamed with Harvard University's Institute for Quantitative Social Science to create a differential privacy solution. Through the resulting OpenDP Initiative, Microsoft sought to develop an open platform for maintaining the data privacy of individual users.

Microsoft has now confirmed the platform is available. Users can tap into the project on Microsoft's GitHub, opening it up to community support and development.

While the platform is focused on maintaining user privacy, it also allows researchers to safely tap into insights. Microsoft has not named the platform, which is based on Harvard's OpenDP.

Redmond was eager to confirm the platform is completely open. It has a global royalty-free license from Microsoft, which means anyone from anywhere can use the platform to create datasets. Julie Brill, CVP, Deputy General Counsel, and Chief Privacy Officer at Microsoft, highlighted how important privacy has become.

“We need privacy enhancing technologies to earn and maintain trust as we use data. Creating an open platform for differential privacy, with contributions from developers and researchers from organizations around the world, will be essential in maturing this important technology and enabling its widespread use.”

How it Works

So how does differential privacy under OpenDP work? Microsoft and Harvard add statistical noise to datasets. The noise masks each individual's contribution, keeping their data private, while researchers can still extract accurate aggregate insights when they need to.

If a data query from a researcher gets close to accessing personal information, the platform stops additional data queries.
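The two mechanisms described above, noise added to query results and a cap on how much querying is allowed, can be sketched in a few lines of Python. This is an illustrative toy, not OpenDP's actual API: the `PrivateCounter` class, its parameter names, and the per-query epsilon values are all assumptions made for the example.

```python
import random


class PrivateCounter:
    """Toy sketch of a differentially private count query with a
    privacy budget. Illustrative only; not the OpenDP API."""

    def __init__(self, data, epsilon_per_query=0.5, total_budget=2.0):
        self.data = data
        self.eps = epsilon_per_query   # privacy cost charged per query
        self.budget = total_budget     # total privacy budget available

    def noisy_count(self, predicate):
        # Refuse further queries once the budget is spent, mirroring how
        # the platform stops queries that get too close to personal data.
        if self.budget < self.eps:
            raise RuntimeError("privacy budget exhausted")
        self.budget -= self.eps

        true_count = sum(1 for row in self.data if predicate(row))

        # Laplace noise calibrated to sensitivity 1 (adding or removing
        # one person changes a count by at most 1). The difference of two
        # exponentials with rate eps is a Laplace(0, 1/eps) sample.
        noise = random.expovariate(self.eps) - random.expovariate(self.eps)
        return true_count + noise


# Example: a researcher asks how many people in a dataset are over 30.
ages = [23, 35, 41, 29, 52, 61, 19, 33]
counter = PrivateCounter(ages, epsilon_per_query=0.5, total_budget=1.0)
print(counter.noisy_count(lambda age: age > 30))  # noisy answer near 5
print(counter.noisy_count(lambda age: age > 30))  # budget now exhausted
```

A third call to `noisy_count` here would raise `RuntimeError`, which is the budget-enforcement behavior the article describes: once enough has been released, the platform refuses additional queries rather than risk exposing an individual.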

“Through these mechanisms, differential privacy protects personally identifiable information by preventing it from appearing in data analysis altogether. It further masks the contribution of an individual, essentially rendering it impossible to infer any information specific to any particular person, including whether the dataset utilized that individual's information at all.”

Luke Jones
Luke has been writing about all things tech for more than five years. He is following Microsoft closely to bring you the latest news about Windows, Office, Azure, Skype, HoloLens and all the rest of their products.
