VGG-11 FHE Implementation
This page walks through the VGG-11 inference workflow in FHEON. It highlights the optimized convolution layers, bootstrapping, and fully connected layers used for encrypted CIFAR-10 inference.
Overview
The workflow consists of:
Context and Key Generation
Initializes the FHEController with parameters such as ring degree, slot count, multiplicative depth, and the modulus chain.
Generates rotation keys for convolution, average pooling, and fully connected layers.
Keys are merged, deduplicated, and serialized for reuse.
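The FHEController API itself is specific to FHEON, but the setup step typically looks like the following CKKS sketch. This is a minimal sketch assuming an OpenFHE backend; the parameter values, rotation indices, and file names are illustrative assumptions, not FHEON's actual defaults.

```cpp
#include "openfhe.h"
// Serialization support
#include "ciphertext-ser.h"
#include "cryptocontext-ser.h"
#include "key/key-ser.h"
#include "scheme/ckksrns/ckksrns-ser.h"

using namespace lbcrypto;

int main() {
    // Assumed CKKS parameters; FHEON's actual ring degree, depth, and
    // modulus chain may differ.
    std::vector<uint32_t> levelBudget = {4, 4};
    uint32_t levelsAfterBootstrap     = 10;
    uint32_t depth = levelsAfterBootstrap +
                     FHECKKSRNS::GetBootstrapDepth(levelBudget, UNIFORM_TERNARY);

    CCParams<CryptoContextCKKSRNS> params;
    params.SetSecretKeyDist(UNIFORM_TERNARY);
    params.SetSecurityLevel(HEStd_128_classic);
    params.SetRingDim(1 << 16);
    params.SetMultiplicativeDepth(depth);
    params.SetScalingModSize(59);
    params.SetBatchSize(1 << 15);            // number of CKKS slots

    CryptoContext<DCRTPoly> cc = GenCryptoContext(params);
    cc->Enable(PKE);
    cc->Enable(KEYSWITCH);
    cc->Enable(LEVELEDSHE);
    cc->Enable(ADVANCEDSHE);
    cc->Enable(FHE);                         // required for bootstrapping

    auto keys = cc->KeyGen();
    cc->EvalMultKeyGen(keys.secretKey);

    // Rotation indices gathered from the conv, pooling, and FC layers
    // (illustrative values), deduplicated before key generation.
    std::vector<int32_t> rotations = {1, 2, 4, 8, 16, 32, -1, -2, -32};
    cc->EvalRotateKeyGen(keys.secretKey, rotations);

    // Bootstrapping precomputations and keys.
    cc->EvalBootstrapSetup(levelBudget);
    cc->EvalBootstrapKeyGen(keys.secretKey, 1 << 15);

    // Serialize the context and public key so later runs can reuse them.
    Serial::SerializeToFile("context.bin", cc, SerType::BINARY);
    Serial::SerializeToFile("key-public.bin", keys.publicKey, SerType::BINARY);
    return 0;
}
```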
Data Preparation
Loads CIFAR-10 test images from the binary dataset files.
Normalizes pixel values and encrypts them into ciphertext vectors.
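A hedged sketch of this step, assuming the standard CIFAR-10 binary record layout (one label byte followed by 3072 pixel bytes) and simple division by 255 for normalization; FHEON's packing layout and normalization constants may differ.

```cpp
#include "openfhe.h"
#include <fstream>
#include <string>
#include <vector>
using namespace lbcrypto;

// Read one CIFAR-10 record, normalize it to [0, 1], and encrypt it into a
// single packed CKKS ciphertext (assumed packing: all 3072 values in one
// ciphertext; FHEON may instead pack per channel).
Ciphertext<DCRTPoly> load_and_encrypt(const std::string& path, size_t index,
                                      CryptoContext<DCRTPoly> cc,
                                      PublicKey<DCRTPoly> pk, uint8_t& label) {
    const size_t record = 1 + 3 * 32 * 32;           // label byte + 3072 pixels
    std::ifstream in(path, std::ios::binary);
    in.seekg(index * record);
    std::vector<unsigned char> raw(record);
    in.read(reinterpret_cast<char*>(raw.data()), record);

    label = raw[0];
    std::vector<double> pixels(record - 1);
    for (size_t i = 1; i < record; ++i)
        pixels[i - 1] = raw[i] / 255.0;              // assumed normalization

    Plaintext pt = cc->MakeCKKSPackedPlaintext(pixels);
    return cc->Encrypt(pk, pt);
}
```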
VGG-11 Layers
Implements convolution + ReLU blocks with optional bootstrapping.
Average pooling reduces spatial dimensions between stages.
Fully connected layers finalize classification.
Bootstrapping is applied at critical points to refresh ciphertext precision.
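As an illustration of what such a block computes (not FHEON's actual convolution_relu_block), here is a minimal single-channel 3x3 sketch. It assumes row-major packing of an image with row width w, pre-masked kernel plaintexts, an assumed activation bound, and a Chebyshev polynomial approximation of ReLU.

```cpp
#include "openfhe.h"
#include <vector>
using namespace lbcrypto;

// Minimal single-channel 3x3 convolution + ReLU sketch. Assumes the image is
// packed row-major into one ciphertext of row width `w`, that rotation keys
// exist for the nine offsets, and that kernelMasks[i] are plaintexts already
// masked so border wrap-around is ignored (edge handling omitted for brevity).
Ciphertext<DCRTPoly> conv3x3_relu(CryptoContext<DCRTPoly> cc,
                                  Ciphertext<DCRTPoly> ct,
                                  const std::vector<Plaintext>& kernelMasks,
                                  Plaintext bias, int32_t w, bool bootstrap) {
    // Line each tap of the 3x3 window up with the center pixel, multiply by
    // the matching kernel plaintext, and accumulate.
    std::vector<int32_t> offsets = {-w - 1, -w, -w + 1, -1, 0, 1, w - 1, w, w + 1};
    Ciphertext<DCRTPoly> acc;
    for (size_t i = 0; i < offsets.size(); ++i) {
        auto tap  = cc->EvalRotate(ct, offsets[i]);
        auto term = cc->EvalMult(tap, kernelMasks[i]);
        acc = (i == 0) ? term : cc->EvalAdd(acc, term);
    }
    acc = cc->EvalAdd(acc, bias);

    // Polynomial ReLU approximation over an assumed activation range [-B, B].
    double B = 8.0;
    acc = cc->EvalChebyshevFunction([](double x) { return x > 0 ? x : 0.0; },
                                    acc, -B, B, 59);

    // Refresh the ciphertext when the remaining depth gets too low.
    if (bootstrap)
        acc = cc->EvalBootstrap(acc);
    return acc;
}
```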
Inference Loop
Iterates over encrypted CIFAR-10 images.
Sequentially applies convolution blocks, pooling, bootstrapping, and fully connected layers.
Decrypts outputs and writes predicted labels to file.
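A sketch of the outer loop under the same assumptions as above. vgg11_forward is a hypothetical stand-in for the chain of convolution, pooling, bootstrapping, and fully connected calls, and load_and_encrypt is the hypothetical helper from the data preparation sketch; neither is part of FHEON's documented API.

```cpp
#include "openfhe.h"
#include <algorithm>
#include <fstream>
#include <string>
#include <vector>
using namespace lbcrypto;

// Hypothetical helpers: load_and_encrypt is the data-preparation sketch above,
// and vgg11_forward stands in for the sequence of conv blocks, pooling,
// bootstrapping, and FC layers.
Ciphertext<DCRTPoly> load_and_encrypt(const std::string& path, size_t index,
                                      CryptoContext<DCRTPoly> cc,
                                      PublicKey<DCRTPoly> pk, uint8_t& label);
Ciphertext<DCRTPoly> vgg11_forward(CryptoContext<DCRTPoly> cc,
                                   Ciphertext<DCRTPoly> ct);

void run_inference(CryptoContext<DCRTPoly> cc, KeyPair<DCRTPoly> keys,
                   const std::string& dataset, size_t numImages) {
    std::ofstream out("predictions.txt");
    for (size_t i = 0; i < numImages; ++i) {
        uint8_t label = 0;
        auto ct = load_and_encrypt(dataset, i, cc, keys.publicKey, label);

        // Conv blocks, pooling, bootstrapping, and FC layers run here.
        auto logits = vgg11_forward(cc, ct);

        // Decrypt the 10 class scores and take the arg max as the prediction.
        Plaintext pt;
        cc->Decrypt(keys.secretKey, logits, &pt);
        pt->SetLength(10);
        std::vector<double> scores = pt->GetRealPackedValue();
        auto pred = std::max_element(scores.begin(), scores.end()) - scores.begin();
        out << pred << " " << static_cast<int>(label) << "\n";
    }
}
```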
Key Functions
convolution_relu_block() – Convolution + bias addition + ReLU with optional bootstrapping.
FClayer_relu_block() – Fully connected layer with bias and optional ReLU.
secure_optimzed_avgPool_multi_channels() – Homomorphic average pooling across channels.
secure_globalAvgPool() – Global pooling before the final fully connected layer.
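For intuition on the pooling helpers, here is the standard rotate-and-sum pattern behind a homomorphic global average over n packed values (n assumed to be a power of two). This is a minimal sketch, not FHEON's secure_globalAvgPool; the multi-channel variants add masking and channel strides on top of this idea.

```cpp
#include "openfhe.h"
using namespace lbcrypto;

// Average n consecutive packed slots with log2(n) rotations and additions,
// then scale by 1/n. Requires rotation keys for the power-of-two indices used.
Ciphertext<DCRTPoly> global_avg(CryptoContext<DCRTPoly> cc,
                                Ciphertext<DCRTPoly> ct, uint32_t n) {
    for (uint32_t step = 1; step < n; step <<= 1)
        ct = cc->EvalAdd(ct, cc->EvalRotate(ct, static_cast<int32_t>(step)));
    return cc->EvalMult(ct, 1.0 / n);  // slot 0 now holds the average
}
```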
Optimizations
Rotation Key Deduplication: All layer-specific rotation positions are merged into one set.
Bootstrapping Strategy: Applied selectively after certain convolutions and FC layers.
Dynamic Scaling: the ReLU scaling factor is adjusted to the expected range of ciphertext values to maximize accuracy.
Memory Efficiency: Kernel/bias plaintexts are cleared after use to reduce memory overhead.
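A sketch of the rotation key deduplication idea, with illustrative per-layer index lists (the real rotation sets come from FHEON's layer implementations):

```cpp
#include "openfhe.h"
#include <set>
#include <vector>
using namespace lbcrypto;

// Merge the rotation indices requested by each layer into one deduplicated
// set, so every index is key-generated (and stored) exactly once.
void gen_merged_rotation_keys(CryptoContext<DCRTPoly> cc,
                              PrivateKey<DCRTPoly> sk) {
    std::vector<int32_t> convRots = {1, -1, 32, -32, 33, -33};   // illustrative
    std::vector<int32_t> poolRots = {1, 2, 32, 64};              // illustrative
    std::vector<int32_t> fcRots   = {1, 2, 4, 8, 16, 32, 64};    // illustrative

    std::set<int32_t> merged;
    for (const auto* rots : {&convRots, &poolRots, &fcRots})
        merged.insert(rots->begin(), rots->end());

    cc->EvalRotateKeyGen(sk, std::vector<int32_t>(merged.begin(), merged.end()));
}
```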
Full Example Source
You can view the full optimized VGG-11 implementation here: