SIGMA: Secure GPT Inference with Function Secret Sharing
- Kanav Gupta,
- Neha Jawalkar,
- Ananta Mukherjee,
- Nishanth Chandran,
- Divya Gupta,
- Ashish Panwar,
- Rahul Sharma
Secure 2-party computation (2PC) enables secure inference that protects both proprietary machine learning (ML) models and the sensitive inputs to them. However, existing secure inference solutions suffer from high latency and communication overheads, particularly for transformers. Function secret sharing (FSS) is a recent paradigm for obtaining efficient 2PC protocols with a preprocessing phase. We provide SIGMA, the first end-to-end system for secure transformer inference based on FSS. By constructing new FSS-based protocols for complex machine learning functionalities, such as Softmax and GeLU, and accelerating their computation on GPUs, SIGMA improves the latency of secure transformer inference over the state-of-the-art that uses preprocessing and GPUs. We present the first secure inference of generative pre-trained transformer (GPT) models. In particular, SIGMA executes GPT-Neo with 1.3 billion parameters in 7.4s and HuggingFace’s GPT2 in 1.6s.
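For context on the two functionalities named above, the following is a minimal cleartext sketch in plain NumPy (not from the paper): the standard tanh approximation of GeLU and a numerically stable Softmax. SIGMA's contribution is evaluating these nonlinearities under 2PC with FSS-based protocols, not the plaintext math shown here.

```python
import numpy as np

def gelu(x: np.ndarray) -> np.ndarray:
    """Cleartext GeLU via the standard tanh approximation
    (Hendrycks & Gimpel). In secure inference, this nonlinearity
    is what dedicated 2PC protocols must compute on secret shares."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    """Numerically stable cleartext Softmax (subtract the row max).
    The max, exponentiation, and division each pose distinct
    challenges for secure computation."""
    z = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)
```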