
Whenever I tune my neural network, I usually take the common approach of defining some layers with some neurons.

  • If it overfits, I reduce the number of layers or neurons, add dropout, or apply regularisation.

  • If it underfits, I do the opposite.

But doing all this sometimes feels ad hoc. So, is there a more principled way of tuning a neural network (i.e. finding the optimal number of layers, neurons, etc., in a principled and mathematically sound way) when it overfits or underfits?
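To make the question concrete, here is a minimal sketch of one common "more principled" alternative to manual adjustment: random search over the architecture hyperparameters. The search space, the `sample_config` helper, and the toy `evaluate` function are all hypothetical illustrations; in practice `evaluate` would train the network with the given configuration and return a validation score.

```python
import random

# Hypothetical search space for illustration only.
SEARCH_SPACE = {
    "n_layers": [1, 2, 3, 4],
    "n_neurons": [16, 32, 64, 128],
    "dropout": [0.0, 0.2, 0.5],
}

def sample_config(rng):
    """Draw one random configuration from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def evaluate(config):
    """Placeholder: in practice, train the model with `config` and return
    its validation score. This toy score merely rewards a mid-sized model,
    purely so the example runs end to end."""
    return -abs(config["n_layers"] - 2) - abs(config["n_neurons"] - 64) / 64

def random_search(n_trials=20, seed=0):
    """Evaluate n_trials random configurations; return the best (score, config)."""
    rng = random.Random(seed)
    trials = [(evaluate(c), c) for c in (sample_config(rng) for _ in range(n_trials))]
    return max(trials, key=lambda t: t[0])

best_score, best_config = random_search()
print(best_score, best_config)
```

Random search treats overfitting and underfitting symmetrically: instead of nudging capacity up or down by hand, it lets the validation score decide among sampled architectures.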

nbro
Fasty

0 Answers