21st century statistical disclosure limitation: motivations and challenges
Handbook of Sharing Confidential Data: Differential Privacy, Secure Multiparty Computation, and Synthetic Data, Drechsler, J., Kifer, D., Reiter, J., and Slavković, A. (Eds.), CRC Press, 2024
Over the coming decade, national statistical offices will likely undertake a re-engineering of their data confidentiality programs comparable in magnitude to the transformation of statistical disclosure limitation (SDL) that began in the 1970s. Fellegi and Dalenius ushered in a principled and scientific approach to SDL that fundamentally reshaped how statistical agencies assessed and controlled disclosure risk in their public data releases. Over the subsequent decades, agencies continued to improve and strengthen their implementations of SDL, but these changes have largely been incremental adjustments and extensions to approaches pioneered in the 1970s and 1990s. Today, advances in computing power, the development of powerful optimization algorithms, and the proliferation of rich, third-party data have contributed to a data protection landscape that renders the widely used SDL methods of the last several decades increasingly vulnerable. Modernizing SDL for the 21st century will be neither easy nor uncontroversial. Not only will it require statistical agencies to rethink their entire approach to SDL and how it fits within the broader data life cycle, but it will also require agencies and data users alike to make difficult decisions about the content and form of official statistics and how data users can access them.
Recommended citation: Abowd, J. M., & Hawes, M. B. (2024). 21st Century Statistical Disclosure Limitation: Motivations and Challenges. In Handbook of Sharing Confidential Data (pp. 24-36). Chapman and Hall/CRC.