Neural Network Playground

Design your own Multi-Layer Perceptron (MLP) to solve non-linear classification problems.

Controls show the current learning rate (default 0.03) along with live epoch, loss, and parameter-count readouts as you train.

Deep Learning Intuition

  • Hidden Layers: Allow the network to fold and warp the input space to separate complex classes.
  • Activation (Tanh/ReLU): The source of non-linearity. Without it, deep networks are just linear models!
  • Overfitting: Too many neurons + too few points = a jittery boundary that memorizes noise.
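The points above can be seen in a minimal sketch (an assumption for illustration, not the playground's actual code): a tiny NumPy MLP with one tanh hidden layer learns XOR, a problem no purely linear model can separate.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic non-linearly-separable toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: 2 inputs -> 8 hidden units (tanh) -> 1 output (sigmoid)
W1 = rng.normal(0.0, 1.0, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))
b2 = np.zeros(1)

lr = 0.5
for epoch in range(5000):
    # Forward pass; tanh is the non-linearity -- remove it and the
    # two layers collapse into a single linear map
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

    # Backward pass (gradient of binary cross-entropy w.r.t. logits)
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = (dlogits @ W2.T) * (1.0 - h**2)      # tanh derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (p > 0.5).astype(int).ravel()
print(preds)
```

With the hidden layer in place the network bends the input space until the two XOR classes become separable; with too many hidden units and only four points, the same setup would happily memorize noise in a larger, noisier dataset.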