Mireya Jurado
Florida International University Knight Foundation School of Computing and Information Sciences
Lecture Information:
- June 29, 2022
- 12:00 PM
- CASE 349 & Zoom: https://fiu.zoom.us/j/92766265481 (Meeting ID: 927 6626 5481, Passcode: QIF2022)

Speaker Bio
Mireya Jurado is a PhD candidate in the Cognac Laboratory (Cognition, Narrative, and Culture) under the supervision of Dr. Mark Finlayson. Her research interests include security, privacy, and information theory. She has been a fellow at FIU’s DHS Center for Advancing Education and Studies on Critical Infrastructures Resilience (CAESCIR) since 2020 and was an FIU School of Computing and Information Sciences Director’s Fellow from 2016 to 2019. Mireya holds an M.Sc. in Computer Science from FIU and a B.A. in International Relations and Anthropology from the University of Miami.
Description
Across our digital lives, two powerful forces, data utility and data privacy, push and pull against each other. In response, technologies have been designed to help navigate this tension by revealing, or leaking, information that could be useful to an adversary in exchange for some added utility. For many of these technologies, it is unclear how harmful these leaks are. How much security and privacy are we sacrificing? Under what circumstances are these technologies safe to use?
To clarify these trade-offs between security and utility, this dissertation applies the information-theoretic framework of quantitative information flow (QIF), which models systems as channels correlating secret inputs with observable outputs. The QIF framework can determine an optimal adversary’s expected probability of success, and thereby isolate the effect these technologies have on security and privacy.
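As a concrete illustration of these QIF quantities, the sketch below computes an optimal adversary's success probability before and after observing a channel's output; the prior and channel matrix are invented toy values, not figures from the dissertation.

```python
# Minimal sketch of the core QIF calculation (Bayes vulnerability),
# following the standard definitions from the QIF literature.
# The prior and channel below are illustrative toy values.
import numpy as np

# Prior distribution pi over the secret inputs x.
pi = np.array([0.5, 0.25, 0.25])

# Channel matrix C: C[x, y] = P(observable output y | secret input x).
C = np.array([
    [0.8, 0.2],
    [0.3, 0.7],
    [0.5, 0.5],
])

# Prior vulnerability: an optimal adversary's chance of guessing the
# secret in one try before observing any output.
prior_v = pi.max()

# Posterior vulnerability: the adversary's expected chance of guessing
# correctly after observing the output,
# V(pi, C) = sum over y of max over x of pi[x] * C[x, y].
joint = pi[:, None] * C
posterior_v = joint.max(axis=0).sum()

print(prior_v, posterior_v, posterior_v / prior_v)
```

The ratio of posterior to prior vulnerability is one standard QIF measure of how much observing the channel's output boosts the adversary.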
In this work, we apply QIF to three applications. The first two are forms of property-revealing encryption: equality-revealing and order-revealing encryption. These techniques allow database management systems to answer queries and sort records much faster without direct access to the sensitive values, but their leakage implications are still unclear. The third application is local differential privacy with the shuffle model. Differential privacy aims to protect the privacy of individuals participating in statistical databases but must balance utility against privacy. The shuffle model can improve this balance, but exactly how much security it provides is uncertain.
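For intuition about the third application, here is a minimal sketch of randomized response, the textbook local-differential-privacy mechanism, followed by a shuffle step; it is a toy illustration of the setting the talk analyzes, not the dissertation's own construction.

```python
# Toy illustration of local differential privacy plus the shuffle model.
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1)."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

# Each user locally randomizes their secret bit before sending it...
secrets = [random.randint(0, 1) for _ in range(1000)]
reports = [randomized_response(b, epsilon=1.0) for b in secrets]

# ...and the shuffle model then hides *which* user sent which report,
# so the analyst sees only the multiset of randomized values.
random.shuffle(reports)
print(sum(reports))  # utility: a noisy count the analyst can debias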
Through novel combinatorics and exact calculations, we precisely quantify the leakage of these applications under different parameters and relevant adversaries. For each application, we provide clarity about how much information is leaked and under what circumstances an application could be safe to deploy.