The statistical science of learning about the group while protecting the information of individuals in the group is called differential privacy.
An interesting technical digression not included in the final version of the article: Differential privacy ensures that any conclusion drawn from an analysis remains almost the same whether or not any particular individual's data is included in that analysis. In other words, the results of the analysis reveal almost nothing specific to any one individual. This is accomplished by adding calibrated uncertainty, or noise, to the information being disclosed, through an algorithmic framework, though doing so comes at a cost: a tradeoff between accuracy and privacy.
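To make the idea concrete, here is a minimal sketch of one standard way this is done, the Laplace mechanism applied to a hypothetical mean-income query. The function names and parameters are illustrative, not taken from any particular library; the key points are that the noise is scaled to how much one person's data can move the answer (the query's sensitivity), and that a smaller privacy parameter epsilon means more noise, i.e., stronger privacy but less accuracy.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF
    sampling, using only the standard library."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, lower, upper, epsilon):
    """Release an approximate mean of `values` with epsilon-differential privacy.

    Each value is clamped to [lower, upper] so that adding or removing one
    person changes the mean by at most (upper - lower) / n -- the sensitivity.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    # Smaller epsilon -> larger noise scale -> stronger privacy, lower accuracy.
    return true_mean + laplace_noise(sensitivity / epsilon)

# Hypothetical usage: incomes (in thousands) clamped to a plausible range.
incomes = [23.0, 45.0, 12.0, 67.0, 89.0, 34.0]
print(private_mean(incomes, 0.0, 100.0, epsilon=0.5))
```

Running the release twice yields different answers because of the noise; that randomness is precisely what prevents the output from pinning down any single person's value.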
That tradeoff represents a crucial challenge going forward. So much data is now produced per person that managing and protecting it has become a priority. The work of researchers like Lee is enlightening on many levels, and the more we understand about the implications of the information-flooded world we have constructed, the better we will be able to navigate it. In any case, Lee is a great young faculty member whom we are happy to have as part of UGA and the Franklin College.