
nopainnogain5 OP t1_jccd7vm wrote

I was wondering if this has something to do with a lack of experience. And from what I've heard, the more you experiment with the models, the better you understand what helps when, at least to some extent.

The thing is, a neural network still remains a black box: the number of parameters is too large to fully understand what is happening inside. It is mostly an empirical process. You choose your architecture, test, change hyperparameters, test, change the architecture, test, change some other parameters, test, and so on. You can't be sure your model will work properly right away, and it may take many iterations. With larger models that take a long time to train, this can get overwhelming.
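To make that loop concrete, here's a minimal sketch (assuming a Python/scikit-learn setup, which the thread doesn't specify; the grid values are just illustrative) of trying a few architectures and learning rates on a toy dataset and keeping whichever gets the best validation score:

    # Minimal sketch of the "change hyperparameters, test, repeat" loop.
    # Toy dataset and grid values are placeholders, not recommendations.
    from itertools import product

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Small toy data so each "experiment" finishes in seconds.
    X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.3, random_state=0)

    best_score, best_config = 0.0, None
    # The outer loop is the empirical part: pick a configuration,
    # train, evaluate on held-out data, repeat.
    for hidden, lr in product([(16,), (32, 32), (64, 64)], [1e-2, 1e-3]):
        model = MLPClassifier(hidden_layer_sizes=hidden,
                              learning_rate_init=lr,
                              max_iter=500,
                              random_state=0)
        model.fit(X_train, y_train)
        score = model.score(X_val, y_val)  # validation accuracy
        print(f"hidden={hidden}, lr={lr}: {score:.3f}")
        if score > best_score:
            best_score, best_config = score, (hidden, lr)

    print("best:", best_config, best_score)

Real projects usually automate this with grid/random search or tools like Optuna, but the underlying cycle is the same; it just gets slower and more expensive as the models grow.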

Of course, it might be different in your case. You can start with some toy examples, and if you still like it, I'd recommend moving on to larger networks.
