Neural Networks Are Polynomial Regression

Norman Matloff announces a new paper:

A summary of the paper:

  • We present a very simple, informal mathematical argument that neural networks (NNs) are in essence polynomial regression (PR). We refer to this as NNAEPR.

  • NNAEPR implies that we can use our knowledge of the “old-fashioned” method of PR to gain insight into how NNs — widely viewed somewhat warily as a “black box” — work inside.

  • One such insight is that the outputs of an NN layer will be prone to multicollinearity, with the problem becoming worse with each successive layer. This in turn may explain why convergence issues often develop in NNs. It also suggests that NN users tend to use overly large networks.

  • NNAEPR suggests that one may abandon using NNs altogether, and simply use PR instead.

  • We investigated this on a wide variety of datasets, and found that in every case PR did as well as, and often better than, NNs.

  • We have developed a feature-rich R package, polyreg, to facilitate using PR in multivariate settings.

The paper and presentation slides are ungated, so check them out. H/T R-bloggers
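
The multicollinearity claim is easy to poke at empirically. Below is a rough R sketch (not from the paper; the layer width, depth, weight scale, and tanh activation are all arbitrary assumptions) that pushes random inputs through a few randomly weighted layers and prints the condition number of each layer's outputs, a standard indicator of how close the columns are to being collinear.

    # Generate random inputs, run them through a few randomly weighted tanh
    # layers, and print the condition number of each layer's output matrix
    # (larger values mean the columns are closer to collinear).
    set.seed(1)
    x <- matrix(rnorm(1000 * 10), nrow = 1000, ncol = 10)

    out <- x
    for (layer in 1:4) {
      w   <- matrix(rnorm(10 * 10), nrow = 10, ncol = 10)
      out <- tanh(out %*% w)
      cat("layer", layer, "condition number:", signif(kappa(out), 3), "\n")
    }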
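
And for a concrete sense of what "just use PR" means in practice, here is a minimal sketch in base R: a degree-2 polynomial regression (all squares and pairwise interactions) fit with plain lm() on a built-in dataset. This is only an illustration of ordinary polynomial regression, not the authors' polyreg package, and the dataset, predictors, and degree are arbitrary choices.

    # Degree-2 polynomial regression with base R: lm() on a raw polynomial
    # basis of two predictors (wt, hp), predicting mpg from mtcars.
    fit <- lm(mpg ~ poly(wt, hp, degree = 2, raw = TRUE), data = mtcars)
    summary(fit)

    # Prediction works like any other lm() model.
    predict(fit, newdata = mtcars[1:3, c("wt", "hp")])

The polyreg package mentioned in the summary automates this kind of multivariate polynomial fitting at higher degrees; see its documentation for the exact interface.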

Related Posts

Building TensorFlow Neural Networks On Spark With Keras

Jules Damji has an example of using the PyCharm IDE and Keras to build TensorFlow neural network models with the Databricks MLflow library: Our example in the video is a simple Keras network, modified from Keras Model Examples, that creates a simple multi-layer binary classification model with a couple of hidden and dropout layers and […]

When Image Classifiers Look At Unknown Objects

Pete Warden explains that image classifiers aren’t magic: As people, we’re used to being able to classify anything we see in the world around us, and we naturally expect machines to have the same ability. Most models are only trained to recognize a very limited set of objects though, such as the 1,000 categories of the […]
