OneAdapt: Fast Adaptation for Deep Learning Applications via Backpropagation
- Kuntai Du,
- Yihan Liu,
- Qizheng Zhang,
- Haodong Wang,
- Yuyang Huang,
- Ganesh Ananthanarayanan,
- Junchen Jiang
ACM Symposium on Cloud Computing (SoCC)
As deep-learning applications become ubiquitous, they are often bottlenecked by network bandwidth (when gathering high-fidelity data) and GPU cycles (when running deep neural nets, or DNNs). Previous studies have shown that high accuracy can be achieved by optimally adapting key configuration knobs, such as video resolution, frame-discarding threshold, and the granularity of data compression. However, no existing technique simultaneously meets the three requirements of an ideal configuration-adaptation system: (i) adapting promptly when the optimal configuration changes, (ii) incurring minimal extra GPU computation, and (iii) doing so across a range of configuration knobs.
To meet these requirements, we present OneAdapt, the first system that leverages the differentiability of DNNs to quickly adapt the configurations of DNN-based analytics systems, such as video object detection and segmentation. The key idea behind OneAdapt is a method to efficiently estimate the gradient of the DNN's inference accuracy with respect to a small change in each configuration knob, which we refer to as AccGrad. Thanks to the DNN's differentiability, AccGrad can be quickly estimated for multiple configuration knobs with a single DNN backpropagation. With AccGrad estimates updated frequently, OneAdapt runs a gradient-ascent strategy to tune configurations such that each adaptation gradually improves the accuracy-resource tradeoff (higher accuracy, lower resource usage, or both). We evaluate OneAdapt over a range of applications: five types of configurations, four computer-vision tasks, and five types of input data. Compared to state-of-the-art adaptation schemes, OneAdapt reduces bandwidth usage and GPU usage by 15-59% while achieving the same accuracy, or increases accuracy by 1-5% with the same or less resource usage.
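The gradient-ascent loop described above can be illustrated with a minimal, self-contained sketch. Everything here is a hypothetical stand-in: the `accuracy` and `resource_cost` functions are toy smooth proxies (not the paper's DNN pipeline), and the AccGrad estimate uses finite differences in place of the single-backpropagation method the paper proposes. The sketch only shows the control-loop structure: estimate per-knob accuracy gradients, then nudge each knob toward a better accuracy-resource tradeoff.

```python
# Toy sketch of gradient-ascent configuration tuning in the style of OneAdapt.
# All functions, knob names, and constants are hypothetical illustrations.

def accuracy(knobs):
    # Hypothetical smooth proxy: accuracy rises with resolution and
    # compression quality, saturating toward 1.0.
    res, qual = knobs["resolution"], knobs["quality"]
    return 1.0 - 0.5 / (1.0 + 4.0 * res) - 0.5 / (1.0 + 4.0 * qual)

COST_GRAD = {"resolution": 0.3, "quality": 0.2}  # assumed linear cost slopes

def resource_cost(knobs):
    # Hypothetical cost model: bandwidth/GPU usage grows with both knobs.
    return sum(COST_GRAD[k] * v for k, v in knobs.items())

def acc_grad(knobs, eps=1e-4):
    # Stand-in for AccGrad: the paper estimates these partial derivatives
    # with a single DNN backpropagation; here we use finite differences.
    grads = {}
    for k in knobs:
        hi = dict(knobs); hi[k] += eps
        lo = dict(knobs); lo[k] -= eps
        grads[k] = (accuracy(hi) - accuracy(lo)) / (2 * eps)
    return grads

def adapt(knobs, lam=0.5, lr=0.05, steps=50):
    # Gradient ascent on (accuracy - lam * resource_cost): each step
    # gradually improves the accuracy-resource tradeoff, with knobs
    # clamped to a normalized [0, 1] range.
    for _ in range(steps):
        g = acc_grad(knobs)
        for k in knobs:
            step = lr * (g[k] - lam * COST_GRAD[k])
            knobs[k] = min(1.0, max(0.0, knobs[k] + step))
    return knobs

knobs = {"resolution": 0.2, "quality": 0.2}
before = accuracy(knobs) - 0.5 * resource_cost(knobs)
tuned = adapt(knobs)
after = accuracy(tuned) - 0.5 * resource_cost(tuned)
```

Because the objective here is smooth and the step size is small, each adaptation step moves the knobs toward a better tradeoff point, mirroring the "gradually improves accuracy-resource tradeoff" behavior the abstract describes.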