
Reflections on Grace Hopper 2018

My experiences at Grace Hopper helped me recognize the power with which we, as technologists, can wield data to promote equity and safety while protecting the individual privacy and boundaries of those we are hoping to serve. 

Sara Hartse

Oct 24, 2018

The Grace Hopper Celebration is the world's largest gathering of women technologists, and thanks to Delphix, I was able to attend this year's event in Houston. The conference has a huge variety of sessions, both technical and career-focused. I chose to attend sessions primarily in the security and privacy track, and interestingly, I noticed several themes core to Delphix's mission emerge: the powerful insights that data analysis can unlock across different systems, and the importance of data privacy for the users of those systems.

Pictured (left to right): Jas Sumal, senior information architect for Customer Education & Experience; Rachael Naphtal, senior member of technical staff 2 with Development Engineering; Sara Hartse, senior member of technical staff 1 with Development Engineering; Anna Li, staff program manager for Development Engineering

One of my favorite sessions was Lucy Qin's "Privacy-Preserving Data Analysis for Social Impact," where she highlighted the personal bargain we as individuals have grown used to making over the last decade: giving up our personal data in exchange for data-driven insights. Qin challenged the ubiquity of this model and described systems where multiple parties construct insights through computations performed on masked versions of the data, without any single organization ever having access to the raw form.

To illustrate this point, she described a project underway by the city of Boston focused on reducing the wage gap between men and women. Companies were extremely hesitant to share their salary data with any third party, so Qin’s research group facilitated a secure multi-party computation scheme, where multiple institutions calculate averages on masked datasets and retrieve the final outcome (the mean salary of men vs. women) by combining the masked averages.
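The masking idea Qin described can be illustrated with additive secret sharing, one common building block of secure multi-party computation. In this toy sketch (the salary figures, party counts, and function names are hypothetical, and the Boston project's actual protocol is more sophisticated), each company splits its salary total into random shares, so any single aggregator sees only uniformly random numbers, yet the shares combine to the true overall total:

```python
import secrets

PRIME = 2**61 - 1  # field modulus for additive secret sharing

def share(value, n_parties):
    """Split `value` into n random additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Hypothetical salary totals from three companies (never revealed directly).
company_totals = [5_400_000, 3_200_000, 7_100_000]
n = len(company_totals)

# Each company sends one share to each aggregator; any aggregator on its
# own sees only random field elements, never a company's real total.
all_shares = [share(total, n) for total in company_totals]
aggregator_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# Only combining the aggregators' partial sums reveals the overall total.
overall = sum(aggregator_sums) % PRIME
print(overall)  # equals sum(company_totals)
```

Dividing `overall` by the number of employees (shared the same way) would yield the mean salary, which is the only value the protocol ever exposes.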

Qin outlined a number of other data sources, with highly sensitive personal information but rich and beneficial insights, that could leverage these privacy-preserving algorithms: medical patient data for understanding the factors behind different outcomes, data on the use of surveillance court orders, and UN data about refugee resettlement outcomes.

Another example of this model in action is Callisto, a new tool leveraging cryptography to allow victims of sexual violence to report their experiences on their own terms in a safe and secure way. Anjana Rajan, CTO of Callisto, gave a fascinating talk framing the problem of underreported sexual assault as a game-theory and privacy problem. Many survivors face strong disincentives to be the first to report, fearing they won't be believed and the stigma of victim blaming, but after the first report, things get much easier. Callisto allows survivors to record their experience and the identity of their attacker and save it in encrypted form, sealed unless they choose to report it later.

The software cross-checks the hashed identifying information about the attacker and alerts reporters if another allegation has already been made against the same person, connecting them with resources if they decide to take further action. Rajan raised some fascinating points about Callisto's core design tenets. One is an emphasis on user control and autonomy, since the process of reporting these crimes can itself be traumatizing. The other is the prioritization of security and privacy: the information being disclosed is so sensitive that not even Callisto has access to the unencrypted reports.
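The cross-checking step can be sketched as a simple matching escrow. This is a deliberately simplified illustration (the function names and identifier format are invented, and the real Callisto system layers encryption and stronger cryptographic protections on top of plain hashing): reports are filed under a hash of a normalized attacker identifier, and a match is surfaced only when a second, independent report lands on the same hash.

```python
import hashlib
from collections import defaultdict

# Toy escrow: sealed (already-encrypted) reports are stored under a hash
# of the attacker's identifier; the escrow never sees report contents.
escrow = defaultdict(list)

def submit_report(attacker_id, sealed_report):
    """File a sealed report; return True if another allegation already
    exists against the same (normalized, hashed) identifier."""
    key = hashlib.sha256(attacker_id.strip().lower().encode()).hexdigest()
    escrow[key].append(sealed_report)
    return len(escrow[key]) >= 2

first = submit_report("facebook.com/some-profile", b"<encrypted report A>")
second = submit_report("Facebook.com/Some-Profile", b"<encrypted report B>")
print(first, second)  # False True -- the second report triggers the match
```

Normalizing the identifier before hashing means superficially different spellings of the same profile still match, while the escrow itself holds only opaque hashes and ciphertext.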

Lastly, this recurring theme of privacy and control was reinforced by Jennifer Iudice, a design researcher at TEAGUE. In her session, "Travel with Trust: Designing for Women's Safety in Autonomous Rideshares," she talked in depth about how her team works to make rideshares safer for vulnerable passengers. For example, even the basic premise of a rideshare taking you to your door is risky for many people, since it reveals where you live.

Iudice posits that privacy and safety should be an integral part of ridesharing software design, not an afterthought. This resonated with me with regard to the requirements of our Delphix customers and the guarantees they need about what is protected throughout their data pipelines.

The discussions of privacy-oriented design illuminated a concept I hadn't fully considered before: our software must not only provide a wide variety of protections, it must also convey to users the privacy guarantees they have and how to control them. My experiences at Grace Hopper deepened my understanding of our mission at Delphix to supply data to creators in a safe and accessible way. The speakers explored the deep nuances in these areas, and the power with which we, as technologists, can wield data to promote equity and safety while protecting the individual privacy and boundaries of those we hope to serve.

Hearing about the innovative and empathetic work in data analysis, data privacy, and design led me to reflect on my own work as a Delphix engineer, calling greater attention to our company mission of ensuring that our platform lets customers make informed decisions about the privacy of their data.

Sara Hartse is an engineer on the Systems Platform team at Delphix. She works on the ZFS filesystem and other OS features.