Inference on Android using custom language model and trie

Hello,
I created my own lm.binary and trie (covering just the digits one to nine) using the native binaries, and I am trying to use them in the basic Android app.
I placed them in the same location as output_graph.tflite and used this code:

// Load the TFLite acoustic model.
this._m = new DeepSpeechModel(tfliteModel, N_CEP, N_CONTEXT, alphabet, BEAM_WIDTH);
// Attach the external KenLM language model and trie to the decoder
// (the method name is spelled without the "t" in the 0.5.x Java bindings).
this._m.enableDecoderWihLM(alphabetFile, lmFile, trieFile, LM_ALPHA, LM_BETA);
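
For reference, the full setup looks roughly like this. This is a minimal sketch only: the directory layout and path names are placeholders, and the constant values are the ones suggested in the 0.5.x example clients rather than anything I tuned:

import org.mozilla.deepspeech.libdeepspeech.DeepSpeechModel;

public class DigitRecognizer {
    // Hyperparameters from the 0.5.x example clients (illustrative, not tuned).
    private static final int N_CEP = 26;
    private static final int N_CONTEXT = 9;
    private static final int BEAM_WIDTH = 500;
    private static final float LM_ALPHA = 0.75f;
    private static final float LM_BETA = 1.85f;

    private DeepSpeechModel _m;

    void setup(String modelDir) {
        // Placeholder paths: model, alphabet, LM and trie all sit in one directory.
        String tfliteModel = modelDir + "/output_graph.tflite";
        String alphabetFile = modelDir + "/alphabet.txt";
        String lmFile = modelDir + "/lm.binary";
        String trieFile = modelDir + "/trie";

        this._m = new DeepSpeechModel(tfliteModel, N_CEP, N_CONTEXT, alphabetFile, BEAM_WIDTH);
        this._m.enableDecoderWihLM(alphabetFile, lmFile, trieFile, LM_ALPHA, LM_BETA);
    }

    String infer(short[] audioBuffer) {
        // In the 0.5.x Java bindings, stt() takes the buffer, its length,
        // and the sample rate (16 kHz for the released models).
        return this._m.stt(audioBuffer, audioBuffer.length, 16000);
    }
}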

But when I run inference, the output maps to words outside my generated LM and trie: instead of producing only the numbers one to nine, it prints other words as well. There are no errors in the logs either. Is there something I am missing? Is there a way to check which LM file is actually being picked up?

If you are passing the proper file paths, and there's no error or crash, there's little doubt yours are being used.
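
One cheap sanity check is to verify, before enabling the decoder, that the files actually exist at the paths you pass. A sketch using plain java.io (the helper, tag, and label strings are arbitrary):

import java.io.File;
import android.util.Log;

// Fail fast if a path we are about to hand to enableDecoderWihLM()
// does not point at a readable file, and log where it resolved to.
static void checkModelFile(String label, String path) {
    File f = new File(path);
    if (!f.isFile() || !f.canRead()) {
        throw new IllegalStateException(label + " missing or unreadable: " + path);
    }
    Log.d("DeepSpeech", label + " -> " + f.getAbsolutePath() + " (" + f.length() + " bytes)");
}

// Usage, before calling enableDecoderWihLM():
checkModelFile("lm", lmFile);
checkModelFile("trie", trieFile);

That rules out a wrong or inaccessible path; it won't catch a format or version mismatch inside the file itself.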

Is it possible it's just that your acoustic model is not producing numbers?

I found the solution to my problem: it was a version mismatch. The LM file was not actually being picked up, yet no error was thrown. I pinned the DeepSpeech codebase to version 0.5.1, rebuilt the LM from scratch, and got the model working with my LM.


Thanks for the feedback, that does make sense. It would be great if you could identify where this can be improved and where we should have thrown an error.