TResNet-20 Batch Implementations
This page demonstrates the TResNet-20 batch inference workflow using FHEON. It provides examples of how to set up the FHE context, configure the BatchANNController, and perform encrypted-domain batch inference on CIFAR-10 images using a ResNet-20 architecture with various batch sizes and configurations.
Overview
The workflow includes:
Context and Key Generation
Initializes the FHEONHEController with parameters such as ring degree, number of slots, circuit depth, and scaling factors.
Generates rotation keys optimized for batch convolutional operations.
Serializes keys for efficient batch FHE computations.
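The rotation-key step above depends on which cyclic shifts a convolution actually uses. As a minimal sketch (plain C++, not FHEON's API), the offsets a k×k kernel needs when an image channel is packed row-major into ciphertext slots can be enumerated like this:

```cpp
#include <vector>

// Rotation offsets a k x k convolution needs when an image channel of the
// given width is packed row-major into ciphertext slots: one cyclic shift
// per kernel tap, except the center tap, which needs no rotation.
std::vector<int> conv_rotation_offsets(int width, int k) {
    std::vector<int> offsets;
    int r = k / 2;
    for (int dy = -r; dy <= r; ++dy)
        for (int dx = -r; dx <= r; ++dx)
            if (dy != 0 || dx != 0)       // offset 0 needs no rotation key
                offsets.push_back(dy * width + dx);
    return offsets;
}
```

For a 32×32 CIFAR-10 channel and a 3×3 kernel this yields the eight shifts {-33, -32, -31, -1, 1, 31, 32, 33}; generating keys only for such offsets, rather than for every power of two, is what keeps the evaluation-key material small for batch convolutions.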
Batch Data Preparation
Reads and preprocesses multiple CIFAR-10 images in batch.
Encrypts batch image data for inference.
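One common way to prepare a batch for SIMD inference is pixel-interleaved packing, so a single ciphertext operation acts on the same pixel of all N images at once. Whether FHEON uses exactly this layout is an assumption; the sketch below only illustrates the idea:

```cpp
#include <vector>
#include <cstddef>

// Pixel-interleaved batch packing: slot p*N + b holds pixel p of image b,
// so one slot-wise ciphertext operation applies to all N images at once.
// (Illustrative layout; FHEON's actual packing may differ.)
std::vector<double> pack_batch(const std::vector<std::vector<double>>& images) {
    const std::size_t N = images.size();
    const std::size_t pixels = images[0].size();
    std::vector<double> slots(N * pixels);
    for (std::size_t p = 0; p < pixels; ++p)
        for (std::size_t b = 0; b < N; ++b)
            slots[p * N + b] = images[b][p];
    return slots;
}
```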
TResNet-20 Blocks
Implements convolutional blocks for batch processing.
Handles shortcut connections and residual blocks.
Applies batch ReLU and pooling operations on encrypted data using the BatchANNController.
Implements fully connected layers for batch classification.
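Because FHE schemes only evaluate additions and multiplications, the batch ReLU mentioned above must be a polynomial surrogate rather than an exact max(0, x). A minimal sketch, using a degree-2 least-squares fit of ReLU on [-1, 1] derived here for illustration (the coefficients are not taken from FHEON):

```cpp
#include <cmath>

// Degree-2 least-squares approximation of ReLU on [-1, 1], derived for
// illustration: relu(x) ~= 0.46875*x^2 + 0.5*x + 0.09375. In the encrypted
// domain each multiplication consumes one level of the circuit depth.
double poly_relu(double x) {
    return 0.46875 * x * x + 0.5 * x + 0.09375;
}
```

Higher-degree polynomials reduce the approximation error but consume more multiplicative depth, which is why the circuit depth chosen at context setup constrains the activation used.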
Batch Inference Loop
Processes multiple input images simultaneously.
Applies the ResNet-20 layers sequentially to the batch.
Performs encrypted global average pooling and fully connected classification.
Decrypts the batch results and outputs predicted labels.
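The encrypted global average pooling step above is typically realized as a logarithmic rotate-and-add reduction. The sketch below runs on a plain vector standing in for ciphertext slots (the rotation plays the role of an FHE slot rotation); it assumes the pooled length is a power of two:

```cpp
#include <vector>
#include <cstddef>

// Cyclic left-rotation of a slot vector (plaintext stand-in for an
// encrypted slot rotation).
std::vector<double> rotate(const std::vector<double>& v, std::size_t r) {
    std::size_t n = v.size();
    std::vector<double> out(n);
    for (std::size_t i = 0; i < n; ++i) out[i] = v[(i + r) % n];
    return out;
}

// Global average pooling over `len` slots (len a power of two) via
// log2(len) rotate-and-add steps; slot 0 ends holding the sum, which is
// then scaled by 1/len.
double global_avg_pool(std::vector<double> slots, std::size_t len) {
    for (std::size_t step = len / 2; step >= 1; step /= 2) {
        std::vector<double> rot = rotate(slots, step);
        for (std::size_t i = 0; i < slots.size(); ++i) slots[i] += rot[i];
    }
    return slots[0] / static_cast<double>(len);
}
```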
Batch Configurations
Multiple variants are available with different batch sizes (the N parameter):
TResNet20N16.cpp – Configured for batch processing with N=16
TResNet20N32.cpp – Configured for batch processing with N=32
TResNet20N64.cpp – Configured for batch processing with N=64
TResNet20N128.cpp – Configured for batch processing with N=128
TResNet20N256.cpp – Configured for batch processing with N=256
TResNet20N512.cpp – Configured for batch processing with N=512
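The batch size N interacts with the slot capacity fixed at context setup: a CKKS ciphertext of ring degree d offers d/2 complex slots. The sketch below shows this arithmetic under a simplifying assumption (each channel of every image occupies its own slot range, which may not be FHEON's documented layout):

```cpp
#include <cstddef>

// Complex slots available in a CKKS ciphertext of the given ring degree.
constexpr std::size_t slots(std::size_t ring_degree) {
    return ring_degree / 2;
}

// Ciphertexts needed to hold one feature-map channel for a batch of
// n_batch images, assuming each image's channel gets its own slot range
// (a simplifying assumption, not FHEON's documented layout).
constexpr std::size_t ciphertexts_needed(std::size_t n_batch,
                                         std::size_t channel_size,
                                         std::size_t ring_degree) {
    std::size_t total = n_batch * channel_size;
    return (total + slots(ring_degree) - 1) / slots(ring_degree);
}
```

For example, with ring degree 2^16 (32768 slots) and 32×32 channels (1024 values each), a batch of N=32 packs one channel of the entire batch into a single ciphertext, while N=512 spreads it over 16.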
Key Functions
The main functions used in the examples include:
convolution_block() – Performs a convolutional layer on batch encrypted data.
shortcut_convolution_block() – Handles the shortcut connections in batch processing.
double_shortcut_convolution_block() – Implements convolution with shortcut for downsampled layers in batches.
resnet_block() – Combines convolution, ReLU, and shortcut operations for batch processing.
fc_layer_block() – Implements the final fully connected layer for batch classification.
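As a plaintext reference for what fc_layer_block() computes, the final classification is a matrix-vector product over the pooled features (for ResNet-20 on CIFAR-10, 10 logits from 64 features). In the encrypted version each inner product becomes slot-wise multiplications followed by rotate-and-add reductions; this sketch only fixes the math:

```cpp
#include <vector>
#include <cstddef>

// Plaintext reference for the fully connected layer:
// logits[c] = bias[c] + sum_i W[c][i] * features[i].
std::vector<double> fc_layer(const std::vector<std::vector<double>>& W,
                             const std::vector<double>& bias,
                             const std::vector<double>& features) {
    std::vector<double> logits(W.size());
    for (std::size_t c = 0; c < W.size(); ++c) {
        double acc = bias[c];
        for (std::size_t i = 0; i < features.size(); ++i)
            acc += W[c][i] * features[i];
        logits[c] = acc;
    }
    return logits;
}
```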
Full Example Source
You can view and download the full source code of these examples: