CVPR 2023 – Integral Neural Networks

In this episode, we discuss the CVPR 2023 award candidate paper Integral Neural Networks by Kirill Solodskikh, Azim Kurbanov, Ruslan Aydarkhanov, Irina Zhelavskaya, Yury Parfenov, Dehua Song, and Stamatios Lefkimmiatis.

The paper introduces Integral Neural Networks (INNs), a new class of deep networks that depart from the traditional representation of layers as N-dimensional weight tensors. Instead, INNs represent layers continuously along the filter and channel dimensions: the weights are continuous functions defined on N-dimensional hypercubes, and the discrete transformations applied to layer inputs are replaced by continuous integration operations. At inference time, a continuous layer is converted back to a conventional tensor using numerical integration quadratures, which allows the network to be discretized at an arbitrary resolution, with different discretization intervals for each integral kernel.

Because the discretization step is decoupled from training, INNs can be structurally pruned directly on edge devices without fine-tuning. Experiments across multiple tasks and architectures show that INNs match the performance of their discrete counterparts and retain approximately the same accuracy at structural pruning rates of up to 30% without fine-tuning, whereas conventional pruning methods under the same conditions suffer a 65% accuracy loss. The code for implementing INNs is available on Gitee.
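To make the quadrature idea concrete, here is a minimal sketch, not the authors' implementation: `weight_fn`, `input_fn`, and `discretize` are hypothetical names, and the continuous kernel is an arbitrary smooth function standing in for a learned one. It shows how sampling the same continuous weight function on grids of different sizes, with trapezoidal quadrature weights folded in, yields discrete layers of different widths that approximate the same integral operator.

```python
import torch

# Hypothetical continuous weight function W(x_out, x_in) on the unit square,
# standing in for a trained INN kernel along the filter/channel dimensions.
def weight_fn(x_out, x_in):
    # Any smooth function works for illustration; a real INN learns this.
    return torch.sin(3.0 * x_out[:, None]) * torch.cos(2.0 * x_in[None, :])

# A smooth input signal, also viewed as a function on [0, 1].
def input_fn(x_in):
    return torch.exp(-x_in) * torch.sin(5.0 * x_in)

def discretize(n_out, n_in):
    """Sample the continuous kernel on an n_out x n_in grid and fold in
    trapezoidal quadrature weights along the integrated (input) axis, so
    a plain matrix-vector product approximates the integral over x_in."""
    x_out = torch.linspace(0.0, 1.0, n_out)
    x_in = torch.linspace(0.0, 1.0, n_in)
    W = weight_fn(x_out, x_in)
    q = torch.full((n_in,), 1.0 / (n_in - 1))  # trapezoidal rule weights
    q[0] = q[-1] = 0.5 / (n_in - 1)            # half weight at the endpoints
    return W * q[None, :], x_in

# The same continuous layer at two resolutions: the coarser grid plays the
# role of a structurally pruned layer, obtained with no fine-tuning.
W_full, grid_full = discretize(16, 64)
W_small, grid_small = discretize(16, 24)       # far fewer input channels

y_full = W_full @ input_fn(grid_full)
y_small = W_small @ input_fn(grid_small)

print((y_full - y_small).abs().max())          # small discretization gap
```

In this toy setting, "pruning" is just re-sampling the continuous kernel on a coarser grid; the quadrature weights keep the output close to the densely sampled version, which mirrors why the paper's INNs can be pruned at varying rates without retraining.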

