Dropout

I’ve discovered what Dropout layers are used for. Facing the problem of overfitting, a bit of research (Google) revealed that Dropout layers are good for reducing it, so I tried them out. Indeed, with a configuration that had been giving me considerable overfitting (training loss small, test loss large), a couple of Dropout layers brought the two back into line. However, the overall result was no better than a Linear Regression model.
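For the record, here's a minimal NumPy sketch of what a Dropout layer actually does during training (the "inverted dropout" form used by modern frameworks); the function name and signature are my own, not any particular library's API:

```python
import numpy as np

def dropout(x, rate, training=True, rng=None):
    """Inverted dropout: during training, zero a fraction `rate` of the
    activations and rescale the survivors by 1/(1 - rate) so the expected
    value of the output matches the input. At inference time it's a no-op."""
    if not training or rate == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # keep each unit with prob. 1 - rate
    return x * mask / (1.0 - rate)

x = np.ones((4, 8))
out = dropout(x, rate=0.5, rng=np.random.default_rng(0))
# roughly half the entries are zeroed; the rest are scaled up to 2.0
```

The random masking is what fights overfitting: each unit can't rely on specific other units being present, so the network is pushed toward more robust features.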

So what to do? I could try more variations of parameters, but I’ve covered a decent range already without being exhaustive. And the truth is, people like Ernie Chan say that ML, including NNs, can’t provide reliable trading signals, which is essentially what my output amounts to. I think this little exercise has actually served its purpose as a familiarization exercise, and in my ongoing study I’ll have a much firmer basis to build new knowledge on. Anyway, I need to get back to finalizing my tax. Fortunately I’m making some progress on that and not letting myself be completely distracted by this ML stuff.