Enabling secure inference of large-scale CNNs under Homomorphic Encryption (HE) requires a preliminary step that adapts unencrypted pre-trained models to use only polynomial operations. Prior art advocates high-degree polynomials for accurate approximation, which comes at the price of extensive computation. We demonstrate that low-degree polynomials can suffice for precise approximation even in large-scale DNNs. To that end, we introduce a dedicated fine-tuning process on unencrypted data that narrows the input range of the activation functions. The resulting models reach competitive accuracy, degrading by at most 3.5% from the original non-polynomial models, and outperform prior art on tasks such as ImageNet classification with ResNet and ConvNeXt. Once adapted, these models can process HE-encrypted samples and are ready for secure inference. Building on them, we provide optimization insights for activation functions and skip connections that further improve HE evaluation efficiency.
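To illustrate the adaptation step, the sketch below (in PyTorch, which the paper does not necessarily use) replaces ReLU with a hypothetical degree-2 polynomial activation and adds a range penalty that discourages pre-activations from leaving the approximation interval during fine-tuning; the coefficients, degree, and bound are illustrative assumptions, not the paper's exact choices.

```python
import torch
import torch.nn as nn

class PolyAct(nn.Module):
    """Degree-2 polynomial stand-in for ReLU (coefficients are illustrative)."""
    def __init__(self, coeffs=(0.47, 0.50, 0.09)):
        super().__init__()
        # c0 + c1*x + c2*x^2 -- additions and multiplications only, hence HE-friendly.
        self.register_buffer("coeffs", torch.tensor(coeffs))

    def forward(self, x):
        c0, c1, c2 = self.coeffs
        return c0 + c1 * x + c2 * x * x

def range_penalty(pre_act, bound=5.0):
    """Penalize pre-activations outside [-bound, bound], the interval on which
    the polynomial was fitted; added to the task loss during fine-tuning."""
    excess = (pre_act.abs() - bound).clamp(min=0.0)
    return (excess ** 2).mean()

def polynomialize(module):
    """Swap every ReLU in a pre-trained model for the polynomial activation."""
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, PolyAct())
        else:
            polynomialize(child)
```

Fine-tuning on unencrypted data would then minimize the task loss plus a weighted sum of range penalties on the inputs to each PolyAct, which is what keeps the low-degree fit accurate.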
We evaluated ResNet-50 through ResNet-152 on encrypted ImageNet samples, a scale not previously reached by polynomial networks, in just 3:13–7:12 min on commodity hardware under the CKKS scheme with 128-bit security. Compared with prior high-degree polynomial solutions, our low-degree polynomials improve evaluation latency, for example by for ResNet-50 and CIFAR-10. We further demonstrate the versatility of our approach by adapting the CLIP model for secure zero-shot predictions, highlighting new potential in combining HE with transfer learning.
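To make the encrypted-evaluation side concrete, here is a minimal CKKS sketch using the TenSEAL library (an assumption; the paper may rely on a different HE stack), applying a low-degree polynomial activation directly to an encrypted vector. The encryption parameters and coefficients are placeholders, not the configuration behind the reported latencies or security level.

```python
import tenseal as ts

# CKKS context; parameter choices here are illustrative placeholders.
ctx = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=16384,
    coeff_mod_bit_sizes=[60, 40, 40, 40, 60],
)
ctx.global_scale = 2 ** 40

# Encrypt a small batch of pre-activation values.
enc = ts.ckks_vector(ctx, [-1.2, 0.3, 2.7])

# Evaluate the degree-2 activation c0 + c1*x + c2*x^2 under encryption;
# polyval takes coefficients in increasing-degree order.
enc_act = enc.polyval([0.47, 0.50, 0.09])

print(enc_act.decrypt())  # approximate activations, up to CKKS noise
```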