Rising Stars 2020

Deeksha Dangwal

PhD Candidate

University of California, Santa Barbara


Areas of Interest

  • Computer Architecture and Engineering
  • Privacy

Poster

Privacy of Program Traces

Abstract

When working toward application-tuned systems, developers often find themselves caught between the need to share information (so that partners can make intelligent design choices) and the need to hide information (to protect proprietary methods and sensitive data). One place where this problem comes to a head is in the release of program traces. A trace taken from a production server might expose details about the users, the system, or even the computation itself (e.g., through a side channel). My research on trace wringing mitigates this risk by sharing the structure of the program trace without leaking the actual addresses. Trace wringing quantifies information leakage with a simple metric: the number of bits shared. Our pipeline leverages computer vision techniques to condense repetitive program patterns into succinct, lossily compressed “packets”. The size of a packet in bits provides an upper bound on information leakage. Our argument is simple: if we share only n bits about the trace, then we cannot leak more than n bits about that trace. The question then becomes: how do we minimize leakage without sacrificing the utility of the trace?
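To make the bit-counting argument concrete, here is a minimal Python sketch (hypothetical, not the actual pipeline): it wrings a toy address trace into fixed-width stride “packets”, and the bits needed to encode the shared packets upper-bound the leakage. The Packet format and the greedy run detection are illustrative stand-ins for the vision-based pattern extraction described above.

from dataclasses import dataclass
from typing import List

@dataclass
class Packet:
    base: int    # first address of the run
    stride: int  # constant stride between consecutive accesses
    count: int   # number of accesses in the run

    def size_in_bits(self) -> int:
        # Fixed-width encoding: 64-bit base, 16-bit stride, 16-bit count.
        return 64 + 16 + 16

def wring(trace: List[int]) -> List[Packet]:
    """Greedily cover the trace with constant-stride runs."""
    packets, i, n = [], 0, len(trace)
    while i < n:
        if i + 1 == n:  # lone trailing access
            packets.append(Packet(trace[i], 0, 1))
            break
        stride = trace[i + 1] - trace[i]
        j = i + 1
        while j + 1 < n and trace[j + 1] - trace[j] == stride:
            j += 1
        packets.append(Packet(trace[i], stride, j - i + 1))
        i = j + 1
    return packets

# 100 streaming accesses compress to a single 96-bit packet: whatever
# a recipient learns about those accesses, it is at most 96 bits.
trace = [0x1000 + 8 * k for k in range(100)]
print(sum(p.size_in_bits() for p in wring(trace)), "bits shared")  # 96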

Additionally, trace privacy can be linked to the well-studied problem of information flow analysis: by learning which addresses in a trace are sensitive to the private input, we can target those addresses with more aggressive leakage mitigation strategies. When sensitive information can be identified at the program level, its impact can be traced into the resulting address and instruction traces. We present an ensemble of scrubbing techniques that, at the extreme end, simply delete or redact sensitive addresses from a trace, or replace sensitive data with behaviorally similar stand-in addresses. We also demonstrate that trace wringing can be improved significantly, even without information sensitivity analysis, through careful tuning of public (i.e., non-sensitive) hyperparameters, and we examine the previously unexplored tradeoff space between the privacy and utility of memory access traces that these hyperparameters control. Together, these techniques improve both utility and bit leakage by an order of magnitude over naive wringing.
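The following Python sketch illustrates two points on the scrubbing spectrum under stated assumptions: redact deletes tainted accesses outright, while substitute replaces each with a stand-in on the same 4 KiB page, roughly preserving page-level locality while hiding the exact location touched. The taint set is assumed to come from an upstream information flow analysis; the page-granularity substitution is an illustrative choice, not the paper's exact mechanism.

import random
from typing import List, Set

def redact(trace: List[int], tainted: Set[int]) -> List[int]:
    """Extreme end of the spectrum: delete every sensitive access."""
    return [addr for addr in trace if addr not in tainted]

def substitute(trace: List[int], tainted: Set[int],
               page_bits: int = 12, seed: int = 0) -> List[int]:
    """Replace sensitive addresses with behaviorally similar stand-ins:
    keep the page (high bits), randomize the offset (low bits)."""
    rng = random.Random(seed)
    mask = ~((1 << page_bits) - 1)
    return [addr if addr not in tainted
            else (addr & mask) | rng.getrandbits(page_bits)
            for addr in trace]

trace = [0x7F0000, 0x7F0040, 0xDEAD40, 0x7F0080]
secret = {0xDEAD40}  # hypothetical taint set
print([hex(a) for a in redact(trace, secret)])      # access removed
print([hex(a) for a in substitute(trace, secret)])  # same page, new offset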

Both wringing and scrubbing give system designers working with sensitive data a knob, and our results show that turning this knob traverses the tradeoff space between privacy and utility effectively. This, in turn, offers a way to translate user privacy expectations into concrete system settings.
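As a toy illustration of turning that knob, the snippet below (reusing the Packet and wring helpers from the first sketch) shares only the k largest packets: each increment of k buys more trace coverage at the cost of more bits leaked. Coverage here is only a stand-in for utility; a real evaluation would replay the shared trace through a cache simulator.

# Privacy/utility sweep over a hypothetical knob k (packets shared).
trace = ([0x1000 + 8 * i for i in range(64)]      # dominant streaming run
         + [0x8000 + 64 * i for i in range(16)]   # smaller strided run
         + [0x2000, 0x9F00, 0x3300])              # irregular tail

packets = sorted(wring(trace), key=lambda p: p.count, reverse=True)
total = sum(p.count for p in packets)
for k in range(1, len(packets) + 1):
    shared = packets[:k]
    leakage = sum(p.size_in_bits() for p in shared)
    coverage = sum(p.count for p in shared) / total
    print(f"k={k}: leakage <= {leakage} bits, coverage = {coverage:.2f}")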

Bio

I work in computer architecture, and broadly, I am interested in the design of private computer systems. Currently, I am exploring the privacy of program traces; the intent is to minimize information leakage when sharing program behavior for co-optimization. The key tradeoff is balancing the number of bits leaked against the utility of the shared traces.

Previously, I have worked on PyRTL, a Python-based RTL specification language, and built OpenTPU on top of it. I have also worked on Charm, a high-level architecture modeling language. During my internship at Facebook Reality Labs, I worked on building privacy into AR/VR pipelines, and at Microsoft Research, I implemented parameterizable, architecture-aware neural network primitives for custom hardware instructions.

Personal home page