Abstract:
Conventional Neural Networks can approximate simple arithmetic operations, but
fail to generalize beyond the range of numbers that were seen during training.
Neural Arithmetic Units aim to overcome this difficulty, but current arithmetic
units are either limited to operating on positive numbers or can only represent a
subset of arithmetic operations. We introduce the Neural Power Unit (NPU) that
operates on the full domain of real numbers and is capable of learning arbitrary
power functions in a single layer. The NPU thus fixes the shortcomings of existing
arithmetic units and extends their expressivity. We achieve this by using complex
arithmetic without requiring a conversion of the network to complex numbers. A
simplification of the unit to the RealNPU yields a highly transparent model. We
show that the NPUs outperform their competitors in terms of accuracy and sparsity
on artificial arithmetic datasets, and that the RealNPU can discover the governing
equations of a dynamical system purely from data.
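
As a rough illustration of the mechanism described above (a power function evaluated through complex arithmetic while all weights and activations stay real-valued), the sketch below computes the real part of a product of inputs raised to complex exponents, writing negative inputs as |x|·e^{iπ}. This is a minimal reading of the abstract, not the paper's reference implementation; the name `npu_forward`, the weight matrices `W_re`/`W_im`, and the small-epsilon clipping are our own assumptions, and the actual NPU may include additional gating or regularization.

```python
import numpy as np

def npu_forward(x, W_re, W_im):
    """Sketch of an NPU-style layer: y_j = Re( prod_i x_i ** (W_re[j,i] + 1j*W_im[j,i]) ).

    Negative inputs are handled by writing x = |x| * exp(i*pi*k) with k = 1
    where x < 0, so the whole computation stays in real arithmetic.
    """
    eps = 1e-12
    log_abs = np.log(np.abs(x) + eps)      # log|x|, kept away from -inf at x = 0
    k = (x < 0).astype(x.dtype)            # phase indicator: 1 for negative inputs

    # Real and imaginary parts of the complex exponent sum W * (log|x| + i*pi*k)
    re = W_re @ log_abs - np.pi * (W_im @ k)
    im = W_im @ log_abs + np.pi * (W_re @ k)

    # Real part of exp(re + i*im)
    return np.exp(re) * np.cos(im)

# Example: with x = [2, 3] and exponents [2, -1], the unit should return
# 2**2 * 3**(-1) = 4/3.
x = np.array([2.0, 3.0])
W_re = np.array([[2.0, -1.0]])
W_im = np.zeros_like(W_re)   # W_im = 0: a purely real-weight variant, in the spirit of the RealNPU
print(npu_forward(x, W_re, W_im))   # ≈ [1.3333]
```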