DeepSpeech forgetting previously trained data when trained on newer data

I am constantly training new data on DeepSpeech, but when I train on new data the model no longer gives good accuracy on the data it was trained on previously.
What may be the reason behind this? Is it because the model gives higher weight to the most recently trained data?
I want to keep training on new data and still get good predictions on the data I trained on first.

That’s most likely a case of catastrophic forgetting. The new data influences the network too much.

Have you tried to lower the learning rate?

My learning rate is 0.0001.
Is there something wrong with it?

Try lowering it more. I was hitting something similar, and going down to 1e-6 seems to behave much better.


Thanks…
I will try that.

@lissyx, @yv001
Hi guys…
Well, I tried 1e-6 as the learning rate with around 600 samples, 8 epochs, and a batch size of 1, but the model hardly learned anything…

You can try going to 1e-5 with the learning rate for the same number of epochs.
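For reference, here is a minimal sketch of launching such a fine-tuning run, assuming a recent DeepSpeech release where the training flags are named as below; the CSV paths and checkpoint directory are placeholders:

```python
import subprocess

# A minimal sketch, assuming a recent DeepSpeech release where the training
# flags are named as below (--learning_rate, --epochs, --train_batch_size).
# File paths are placeholders; pointing --checkpoint_dir at your existing
# checkpoints continues training from them instead of starting fresh.
subprocess.run([
    "python3", "DeepSpeech.py",
    "--train_files", "new_train.csv",    # hypothetical new-data CSV
    "--dev_files", "dev.csv",            # hypothetical dev CSV
    "--checkpoint_dir", "checkpoints/",  # existing checkpoints to continue from
    "--learning_rate", "1e-5",
    "--epochs", "8",
    "--train_batch_size", "1",
], check=True)
```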

If you see forgetting of the original data, you can take a sample of the original data (e.g. 10%, depending on the size of the original dataset) and mix it with the new data, as in the sketch below.
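A minimal sketch of that mixing ("rehearsal") step, assuming your training CSVs use the usual DeepSpeech columns (wav_filename, wav_filesize, transcript); the file names are placeholders:

```python
import pandas as pd

# Rehearsal sketch: mix ~10% of the original training set into the new data
# so fine-tuning keeps seeing old examples. File names are placeholders.
original = pd.read_csv("original_train.csv")
new = pd.read_csv("new_train.csv")

replay = original.sample(frac=0.10, random_state=42)  # ~10% of the old data
mixed = pd.concat([new, replay], ignore_index=True)
mixed = mixed.sample(frac=1.0, random_state=42)       # shuffle the rows
mixed.to_csv("mixed_train.csv", index=False)
```

Then point --train_files at mixed_train.csv for the next run.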
