Artificial Neural Networks (ANNs) are loosely based on biological neural pathways. In an ANN, each node computes its activation by applying a non-linearity to a weighted sum of its inputs. While this formulation has been wildly successful across a variety of tasks, it remains a far cry from its biological counterpart, largely because ANNs lack phase information during computation. In this paper, we adapt ANNs to operate on complex values, which naturally allows phase information to be included in the forward pass. We demonstrate that our complex-valued architecture generally outperforms real-valued and other complex-valued networks under similar conditions. Additionally, we couple our model with a biologically inspired form of dimensionality reduction and present our findings on the MNIST and MusicNet data sets.
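To make the node computation concrete, the following is a minimal sketch (not taken from the paper) of a single complex-valued layer: a complex weighted sum, which carries both magnitude and phase, followed by a split-style non-linearity applied separately to the real and imaginary parts. The function name, layer sizes, and choice of activation are illustrative assumptions; the abstract does not specify the paper's exact activation or architecture.

    import numpy as np

    def complex_dense_forward(x, W, b):
        """Forward pass of one complex-valued layer (illustrative sketch).

        x: complex input vector, shape (n_in,)
        W: complex weight matrix, shape (n_out, n_in)
        b: complex bias vector, shape (n_out,)
        Returns complex activations, shape (n_out,).
        """
        z = W @ x + b  # complex weighted sum; preserves phase information
        # Assumed split-style non-linearity: ReLU applied to real and imaginary
        # parts independently (one common choice for complex-valued networks).
        return np.maximum(z.real, 0.0) + 1j * np.maximum(z.imag, 0.0)

    # Toy usage: a layer with 3 complex inputs and 2 complex outputs.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    W = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
    b = np.zeros(2, dtype=complex)
    print(complex_dense_forward(x, W, b))

A real-valued layer is recovered by dropping the imaginary components, which is what discards the phase information the paper aims to retain.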