A neural network, learning live

Two inputs, eight hidden units, one output. Press train and watch the decision boundary deform every frame as SGD adjusts the weights. This is what every neural network does — just with a few more dimensions.
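The animation is doing nothing exotic. Here is a minimal numpy sketch of the same loop, with two caveats: it assumes a sigmoid output trained with binary cross-entropy, and it uses full-batch gradient descent rather than the demo's per-sample SGD (both are assumptions about the demo's internals). The moons-style dataset is a hypothetical stand-in for whatever the demo generates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the demo's "moons" dataset: two interleaving arcs.
n = 200
t = rng.uniform(0.0, np.pi, n)
X = np.vstack([np.c_[np.cos(t), np.sin(t)],
               np.c_[1.0 - np.cos(t), 0.5 - np.sin(t)]])
y = np.concatenate([np.zeros(n), np.ones(n)]).reshape(-1, 1)

# 2 inputs -> 8 tanh hidden units -> 1 sigmoid output.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # output probability
    return h, p

lr = 0.1
for epoch in range(2000):
    h, p = forward(X)
    # Backprop for binary cross-entropy: d(loss)/d(logit) = p - y.
    d = (p - y) / len(X)
    dh = (d @ W2.T) * (1.0 - h ** 2)          # tanh'(z) = 1 - tanh(z)^2
    W2 -= lr * (h.T @ d);  b2 -= lr * d.sum(0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(0)

_, p = forward(X)
print(f"train accuracy: {((p > 0.5) == y).mean():.2f}")
```

Redrawing the boundary each frame amounts to running `forward` on a grid of points after every update and thresholding `p` at 0.5; that is the deformation you watch.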

Things to try

  1. Moons with 4 hidden units, tanh, lr = 0.1. Press train. Boundary settles into the right shape in ~50 epochs.
  2. Drop hidden units to 2. Now the model can only draw straight-ish lines through the data — high bias. Bump back to 8 and the curve bends.
  3. Switch to circles. With 8 hidden units, the model learns to wrap around. With 2, it can't.
  4. XOR with 2 hidden units, tanh. Watch it struggle: two hidden units are the bare minimum that can represent XOR, so training often stalls in a bad minimum before finding it. XOR is the problem that famously sank the single-layer perceptron in 1969.
  5. Try a really big learning rate (0.5) on moons. The boundary thrashes. Try 0.005 — it crawls. The middle is where life is.
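The XOR struggle in experiment 4 is easy to reproduce outside the browser. Below is a sketch that trains the same kind of tiny tanh network on the four XOR points from 20 random initializations and counts how many restarts actually solve it; as above, the sigmoid output, cross-entropy loss, and full-batch gradient descent are assumptions, not the demo's confirmed internals. With only 2 hidden units, some seeds converge and some get stuck.

```python
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])  # XOR truth table

def train_xor(seed, hidden=2, lr=0.1, epochs=10000):
    """Train a tanh MLP on XOR; return True if it classifies all 4 points."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        d = (p - y) / len(X)                  # cross-entropy gradient at logits
        dh = (d @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
        W2 -= lr * (h.T @ d);  b2 -= lr * d.sum(0)
        W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(0)
    p = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1 + b1) @ W2 + b2)))
    return bool(((p > 0.5) == y).all())

solved = sum(train_xor(s) for s in range(20))
print(f"solved XOR on {solved}/20 random restarts")
```

Rerun with `hidden=8` and nearly every restart should succeed, which is the capacity story of experiments 2 and 3 in one number.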

Anchored to 03-neural-networks/perceptrons-and-mlps.