NIST Offers Draft Guidance on Evaluating a Privacy Protection Technique for the AI Era


[Figure: A pyramid of considerations for evaluating differential privacy, with "Data Collection Exposure" as its bottom block and the privacy parameter epsilon as its top block.]

Here’s a tricky situation: A business that sells fitness trackers to consumers has amassed a large database of health data about its customers. Researchers would like access to this information to improve medical diagnostics. While the business is concerned about sharing such sensitive, private information, it also would like to support this important research. So how do the researchers obtain useful and accurate information that could benefit society while also keeping individual privacy intact?

Helping data-centric organizations strike this balance between privacy and accuracy is the goal of a new publication from the National Institute of Standards and Technology (NIST) that offers guidance on using a type of mathematical algorithm called differential privacy. Applying differential privacy allows the data to be publicly released without revealing information about the individuals within the dataset.
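To make the idea concrete, here is a minimal sketch of one of the simplest mechanisms that satisfies differential privacy: adding calibrated Laplace noise to a count query. This example is illustrative only and is not taken from the NIST publication; the function name and the fitness-tracker numbers are hypothetical. The parameter epsilon, the block at the top of the pyramid in the figure above, controls the trade-off: a smaller epsilon means more noise, which gives stronger privacy but less accurate answers.

```python
import numpy as np

def noisy_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1: adding or removing any one person's
    record changes the count by at most 1. Noise drawn from a Laplace
    distribution with scale 1/epsilon therefore suffices for the guarantee.
    """
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical query against the fitness-tracker database:
# how many customers have a resting heart rate above 100 bpm?
true_count = 42
print(noisy_count(true_count, epsilon=0.5))   # e.g., 44.7 - useful, but deniable
print(noisy_count(true_count, epsilon=0.05))  # much noisier, stronger privacy
```

Because the released value is randomized, no single person's presence or absence in the database can be confidently inferred from the output, yet researchers still get an answer close enough to the truth to be useful.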
