CVPR 2023 – Polynomial Implicit Neural Representations For Large Diverse Datasets


In this episode we discuss "Polynomial Implicit Neural Representations For Large Diverse Datasets" by Rajhans Singh, Ankita Shukla, and Pavan Turaga. The paper proposes a new approach to implicit neural representations (INRs), which are widely used to represent signals and images across a variety of tasks. Current INR architectures rely on sinusoidal positional encoding, which limits their representational power. The proposed Poly-INR model eliminates the need for positional encoding by representing an image as a polynomial function of its coordinates, built up through element-wise multiplications between features and affine-transformed coordinate locations. The model performs comparably to state-of-the-art generative models without convolution, normalization, or self-attention layers, and with fewer trainable parameters.
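
To make the coordinate-multiplication idea concrete, here is a minimal PyTorch sketch of a Poly-INR-style generator. The layer sizes, module names, and latent-conditioning details are illustrative assumptions rather than the authors' exact architecture; the point is how repeated element-wise products with affine-transformed coordinates raise the polynomial degree without any sinusoidal positional encoding.

```python
import torch
import torch.nn as nn

class PolyINRSketch(nn.Module):
    """Sketch of a Poly-INR-style generator block (hypothetical sizes).

    Each level applies an affine map to the pixel coordinates (conditioned on
    a latent code) and multiplies the result element-wise with the running
    feature map, so the features become a progressively higher-degree
    polynomial in the coordinates.
    """

    def __init__(self, latent_dim=64, feat_dim=128, num_levels=4, out_channels=3):
        super().__init__()
        # Affine maps from (coordinates, latent code) to feature space.
        self.coord_affines = nn.ModuleList(
            nn.Linear(2 + latent_dim, feat_dim) for _ in range(num_levels)
        )
        # Linear mixing layers between levels.
        self.mixers = nn.ModuleList(
            nn.Linear(feat_dim, feat_dim) for _ in range(num_levels)
        )
        self.to_rgb = nn.Linear(feat_dim, out_channels)
        self.act = nn.LeakyReLU(0.2)

    def forward(self, coords, z):
        # coords: (N, 2) pixel locations in [-1, 1]; z: (N, latent_dim).
        inp = torch.cat([coords, z], dim=-1)
        feat = torch.ones(coords.shape[0], self.mixers[0].in_features,
                          device=coords.device)
        for affine, mixer in zip(self.coord_affines, self.mixers):
            # Element-wise product with affine-transformed coordinates raises
            # the polynomial degree of `feat` in x and y by one.
            feat = feat * affine(inp)
            feat = self.act(mixer(feat))
        return self.to_rgb(feat)

# Usage: evaluate the representation on a 32x32 grid of coordinates.
ys, xs = torch.meshgrid(torch.linspace(-1, 1, 32),
                        torch.linspace(-1, 1, 32), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
z = torch.randn(1, 64).expand(coords.shape[0], 64)
rgb = PolyINRSketch()(coords, z)  # (1024, 3) RGB values, one per pixel
```

Note that the sketch uses only linear layers and element-wise products, mirroring the paper's claim that no convolution, normalization, or self-attention layers are required.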

