![](dangwal.jpg)
Additionally, the problem of trace privacy can be linked to the well-studied problem of information flow analysis: by learning which addresses in a trace are sensitive to the private input, we can target them specifically with more aggressive leakage mitigation strategies. When sensitive information is identified at the program level, its impact can be traced through to the resulting address and instruction traces. We present an ensemble of scrubbing techniques that, at one extreme, simply delete or redact sensitive addresses from a trace, and at the other, replace sensitive data with stand-in addresses that are behaviorally similar. We also demonstrate that trace wringing can be improved significantly, even without information sensitivity analysis, through careful tuning of public (i.e., non-sensitive) hyperparameters, and we examine the previously unexplored tradeoff space between the privacy and utility of memory access traces that these hyperparameters control. Together, these techniques improve utility and reduce bit-leakage by an order of magnitude over naive wringing.
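To make the two scrubbing extremes concrete, here is a minimal Python sketch. The function names, the flat list-of-addresses trace format, and the reserved stand-in region are illustrative assumptions, not the implementation from the work above.

```python
# A minimal sketch of the two scrubbing extremes described above.
# Everything here (the function names, the flat list-of-addresses
# trace format, the reserved stand-in region) is a hypothetical
# illustration, not the actual implementation.

def scrub_redact(trace, sensitive):
    """Extreme 1: delete every sensitive address from the trace."""
    return [addr for addr in trace if addr not in sensitive]

def scrub_substitute(trace, sensitive, standin_base=0xDEAD0000):
    """Extreme 2: replace sensitive addresses with stand-ins.

    A fixed per-address mapping keeps the result behaviorally
    similar: repeated accesses to the same sensitive address map
    to the same stand-in, preserving reuse distances while the
    original address bits never appear in the shared trace.
    (standin_base is assumed to be a region the program never
    touches, so stand-ins cannot collide with real addresses.)
    """
    mapping = {}
    scrubbed = []
    for addr in trace:
        if addr in sensitive:
            if addr not in mapping:
                mapping[addr] = standin_base + 8 * len(mapping)
            scrubbed.append(mapping[addr])
        else:
            scrubbed.append(addr)
    return scrubbed

# Toy trace: 0x1000 and 0x1008 are tainted by the private input.
trace = [0x2000, 0x1000, 0x2008, 0x1000, 0x1008]
sensitive = {0x1000, 0x1008}
print([hex(a) for a in scrub_redact(trace, sensitive)])
# -> ['0x2000', '0x2008']
print([hex(a) for a in scrub_substitute(trace, sensitive)])
# -> ['0x2000', '0xdead0000', '0x2008', '0xdead0000', '0xdead0008']
```

The deterministic per-address mapping is the simplest way to keep a substituted trace "behaviorally similar": a cache simulation driven by the scrubbed trace sees the same reuse pattern for the stand-ins as it would have for the original sensitive addresses.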
Both wringing and scrubbing give system designers working with sensitive data a knob, and our results show that turning this knob traverses the tradeoff space between privacy and utility effectively. This, in turn, can be used to translate user privacy expectations into concrete system settings.

I work in computer architecture, and broadly, I am interested in the design of private computer systems. Currently, I am exploring the privacy of program traces; the intent is to minimize information leakage when sharing program behavior for co-optimization. The key tradeoff is balancing the number of bits leaked against the utility of the shared traces.
Previously, I have worked on PyRTL, a Python-based RTL specification language, and built the OpenTPU on it. I have also worked on Charm, a high-level architecture modeling language. During my internship at Facebook Reality Labs, I worked on building privacy into AR/VR pipelines, and at Microsoft Research, I implemented parameterizable, architecture-aware neural network primitives for custom hardware instructions.