Evaluate the accuracy and the loss of the model

Hello guys !!
I trained my DeepSpeech model on the same dataset with different values of two parameters (dropout and learning rate), and then tested the model. Then I plotted the train loss and test loss to find the best parameter values to use.
My approach was as follows:
1 - I fixed the dropout value at 0.3, then trained my model with different learning rates chosen randomly from the interval [0.00009, 0.1]. Then I took the value that gave the minimal test loss, which was 0.0002.
2 - I fixed the learning rate at 0.0002, then trained my model with different dropout values chosen randomly from the interval [0.1, 0.9]. Then I took the value that gave the minimal test loss, which was 0.4.

Finally, here are the results I got:
WER = 0.53
CER = 0.31
Test Loss = 54.07
Train Loss = 26.66

I have some questions :
First, I am wondering whether this approach is good or not.
Second, I can't clearly figure out how the loss is calculated.
And finally, does anyone have some guidelines I can follow to increase the accuracy of the model?

Don’t use the test set for choosing hyperparameters; that will lead to fitting your model to the test set itself. It’s basically cheating :stuck_out_tongue:. Use the dev/validation set for choosing the hyperparameters, and only at the very end use the test set to see how the model does.
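The selection loop that advice implies can be sketched like this. Note that `train_model` and `evaluate` below are toy stand-ins (assumptions for illustration, not DeepSpeech APIs); in a real run they would train the network and score it on the held-out dev set. The sampling ranges match the ones from the original post.

```python
import math
import random

def train_model(train_set, learning_rate, dropout):
    # Stand-in "model": just remembers its hyperparameters.
    # A real implementation would train DeepSpeech here.
    return {"learning_rate": learning_rate, "dropout": dropout}

def evaluate(model, data_set):
    # Stand-in dev loss: pretends lr=2e-4, dropout=0.4 is the optimum.
    return (abs(model["learning_rate"] - 2e-4) * 100
            + abs(model["dropout"] - 0.4))

def random_search(train_set, dev_set, n_trials=20, seed=0):
    rng = random.Random(seed)
    best_loss, best_params = float("inf"), None
    for _ in range(n_trials):
        params = {
            # Sample the learning rate log-uniformly over [9e-5, 0.1],
            # since it varies over orders of magnitude.
            "learning_rate": 10 ** rng.uniform(math.log10(9e-5), -1),
            "dropout": rng.uniform(0.1, 0.9),
        }
        model = train_model(train_set, **params)
        dev_loss = evaluate(model, dev_set)  # select on the DEV set only
        if dev_loss < best_loss:
            best_loss, best_params = dev_loss, params
    # Only after this loop, with best_params frozen, would you touch
    # the test set once to report final WER/CER.
    return best_params, best_loss
```

The key point is that the test set never appears inside the loop; it is scored exactly once, after the hyperparameters are fixed.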


Thanks for replying.
Yeah, you're right :stuck_out_tongue:

Do you know how the CTC loss function is calculated? I can't figure out the significance of having 54.07 as a test loss (is it a percentage, or what?).

You can read the CTC paper for an in-depth explanation of how the loss is computed: ftp://ftp.idsia.ch/pub/juergen/icml2006.pdf

I don’t know of any good way to evaluate it absolutely; usually we use it to compare training runs from different experiments. In general, if you have done a test epoch, you’ll probably just look at the WER or CER, which can be evaluated absolutely and have a much more intuitive meaning.
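To answer the "is it a percentage" part directly: the CTC loss is a negative log-likelihood, -log P(transcript | audio), summed or averaged over the batch, so it is unbounded above and not a percentage. A minimal pure-Python sketch of the forward algorithm from the paper linked above (written for intuition, not efficiency):

```python
import math

def ctc_neg_log_likelihood(log_probs, labels, blank=0):
    """CTC forward algorithm: returns -log P(labels | input).

    log_probs: list of T frames, each a list of per-class log-probabilities.
    labels: target label sequence, without blanks.
    """
    # Interleave blanks: [a, b] -> [blank, a, blank, b, blank]
    ext = [blank]
    for label in labels:
        ext += [label, blank]
    T, S = len(log_probs), len(ext)
    NEG_INF = float("-inf")

    def logsumexp(*xs):
        m = max(xs)
        if m == NEG_INF:
            return NEG_INF
        return m + math.log(sum(math.exp(x - m) for x in xs))

    # alpha[s] = log prob of all alignments ending at ext[s] after frame t
    alpha = [NEG_INF] * S
    alpha[0] = log_probs[0][ext[0]]
    if S > 1:
        alpha[1] = log_probs[0][ext[1]]
    for t in range(1, T):
        new = [NEG_INF] * S
        for s in range(S):
            a = alpha[s]
            if s > 0:
                a = logsumexp(a, alpha[s - 1])
            # Skipping a blank is allowed between two different labels.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a = logsumexp(a, alpha[s - 2])
            new[s] = a + log_probs[t][ext[s]]
        alpha = new
    # Valid alignments end on the last label or the trailing blank.
    total = alpha[S - 1] if S == 1 else logsumexp(alpha[S - 1], alpha[S - 2])
    return -total

# Toy example: 2 frames, 2 classes (blank=0, label=1), uniform predictions.
# Valid alignments for target [1] are "1·", "·1", "11", each with prob 0.25,
# so P = 0.75 and the loss is -log(0.75) ≈ 0.288.
half = math.log(0.5)
loss = ctc_neg_log_likelihood([[half, half], [half, half]], [1])
```

So a test loss of 54.07 just means the model assigns probability e^-54.07 to the reference transcripts (per whatever batch/length normalization your training script uses), which is why comparing runs, or looking at WER/CER, is more meaningful than reading the number on its own.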