Hello,
I created my own lm.binary and trie (for just the digits one to nine) using the native binaries, and I am trying to use them in the basic Android app.
I placed them in the same location as the output_graph.tflite and used this code:
this._m = new DeepSpeechModel(tfliteModel, N_CEP, N_CONTEXT, alphabet, BEAM_WIDTH);
this._m.enableDecoderWihLM(alphabetFile, lmFile, trieFile, LM_ALPHA, LM_BETA);
However, when I run inference, the output is mapped to words that are not present in my generated LM and trie; that is, instead of recognizing only the numbers one to nine, it prints other words as well. There are no errors in the logs either. Is there something I am missing? Is there a way to check which lm file is actually being picked up?
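For reference, this is the kind of sanity check I could add just before the enableDecoderWihLM call to at least confirm which files are being passed in. The DecoderFileCheck helper and the log tag below are my own hypothetical debugging code, not part of the DeepSpeech API:

```java
import android.util.Log;

import java.io.File;

// Hypothetical debugging helper (not part of the DeepSpeech API): logs the
// absolute path and size of every file handed to the decoder, so I can confirm
// the custom digits-only lm.binary and trie are the files actually being read.
final class DecoderFileCheck {
    static void log(String... paths) {
        for (String path : paths) {
            File f = new File(path);
            Log.d("DeepSpeech", f.getAbsolutePath()
                    + " exists=" + f.exists()
                    + " size=" + f.length() + " bytes");
        }
    }
}

// Usage, just before enableDecoderWihLM(...):
// DecoderFileCheck.log(alphabetFile, lmFile, trieFile);
```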