Privatar: Scalable Privacy-preserving Multi-user VR via Secure Offloading
Jianming Tong ⋅ Hanshen Xiao ⋅ Hao Kang ⋅ Ashish Sirasao ⋅ Ziqi Zhang ⋅ G. Edward Suh ⋅ Tushar Krishna
Abstract
Multi-user virtual reality (VR) applications such as shared football and concert experiences rely on real-time avatar reconstruction to enable immersive interaction. However, rendering avatars for numerous participants on each headset incurs prohibitive computational overhead, fundamentally limiting scalability. This work introduces Privatar, a framework that offloads avatar reconstruction from the headset to untrusted devices within the same local network while safeguarding sensitive facial features against adversaries capable of intercepting the offloaded data. Privatar builds on two insights. (1) **System level.** We observe that identity-bearing information in facial inputs is highly skewed across frequencies, and propose **Horizontal Partitioning (HP)**, which keeps the most identifying frequency components on-device and offloads only low-identifiability components. HP offloads local computation while preserving privacy against expression identification attacks. (2) **Privacy accounting level.** For **individually** offloaded, **multi-dimensional** signals without aggregation, worst-case local Differential Privacy requires prohibitive noise, ruining utility. We observe that each user's expression distribution is **stable over time**, and hence propose **Distribution-Aware Minimal Perturbation (DAMP)**. DAMP minimizes noise based on each user's expression distribution, significantly reducing its impact on utility and accuracy while retaining a formal privacy guarantee. On a Meta Quest Pro, Privatar supports up to 2.37$\times$ more concurrent users at 5.7--6.5\% higher reconstruction loss and ~9\% energy overhead, yielding a better throughput-loss Pareto frontier than SotA quantization, sparsity, and local-reconstruction baselines. Privatar further provides a provable privacy guarantee and remains robust against both an empirical attack and an NN-based Expression Identification Attack, demonstrating its resilience in practice. Our code is open-sourced at https://anonymous.4open.science/r/Privatar-372A.
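The Horizontal Partitioning idea can be sketched as a frequency-domain split: keep one part of the spectrum on-device and offload the complement, with the two partitions recombining losslessly. This is a minimal illustration assuming an FFT-based transform and a simple low-frequency cutoff; the paper's actual transform and identifiability-based selection rule are not specified here.

```python
import numpy as np

def horizontal_partition(face, keep_ratio=0.25):
    """Split a face signal into an on-device partition (assumed to hold the
    most identifying frequency components) and an offloaded partition.

    The FFT transform and the rectangular low-frequency mask are
    illustrative assumptions, not Privatar's exact scheme."""
    F = np.fft.fft2(face)
    h, w = F.shape
    mask = np.zeros((h, w), dtype=bool)
    kh, kw = int(h * keep_ratio), int(w * keep_ratio)
    mask[:kh, :kw] = True                 # low-frequency corner (np.fft layout)
    on_device = np.where(mask, F, 0)      # stays on the headset
    offloaded = np.where(mask, 0, F)      # sent to the untrusted device
    return on_device, offloaded

face = np.random.default_rng(0).random((64, 64))
local, remote = horizontal_partition(face)
# Because the masks are complementary, the partitions sum to the full
# spectrum and the original signal is recovered exactly.
recon = np.fft.ifft2(local + remote).real
assert np.allclose(recon, face)
```

The complementary-mask structure is what lets the headset run only the cheap on-device branch while still reconstructing the full avatar from both partitions.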
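DAMP's core observation can likewise be sketched: if a user's expression distribution is stable and concentrated, the Laplace mechanism's noise scale can be calibrated to the distribution's effective spread rather than the worst-case signal range. The quantile-based sensitivity estimate below is an illustrative assumption, not the paper's exact calibration rule.

```python
import numpy as np

def damp_noise_scale(samples, epsilon, q=0.99):
    """Per-dimension Laplace scale b = sensitivity / epsilon, where the
    sensitivity is estimated from the user's observed expression samples
    (an assumed quantile-range heuristic, not Privatar's exact rule)."""
    lo = np.quantile(samples, 1 - q, axis=0)
    hi = np.quantile(samples, q, axis=0)
    return (hi - lo) / epsilon

def damp_perturb(x, scale, rng):
    """Add per-dimension Laplace noise before offloading the signal."""
    return x + rng.laplace(0.0, scale, size=x.shape)

rng = np.random.default_rng(0)
# Expressions concentrate in a narrow band of a nominal [-1, 1] range.
samples = rng.normal(0.0, 0.05, size=(10_000, 16))
scale = damp_noise_scale(samples, epsilon=1.0)
worst_case = 2.0 / 1.0                  # scale under the full [-1, 1] range
assert np.all(scale < worst_case)       # far less noise at the same epsilon
noisy = damp_perturb(samples[0], scale, rng)
```

The point of the sketch is the ordering, not the exact numbers: for a concentrated distribution the distribution-aware scale is much smaller than the worst-case scale, which is how DAMP preserves utility at the same privacy budget.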