Hello,
Does native_client.py support inference on more than one audio file at the same time? I am looking to use my GPUs for inference and to optimize utilization by batching the requests from multiple audio files.
No, it does not.
However, such batching is a good idea, and it would also be useful for server-based STT systems: all requests arriving within some time window could be batched onto the GPU at the same time.
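Just to illustrate the idea (this is not part of native_client.py, only a minimal sketch): a server could collect incoming requests from a queue until either a batch-size limit or a time-window deadline is hit, then run the whole batch through the model in one GPU call. The function and parameter names here are made up for the example.

```python
import queue
import time


def collect_batch(request_queue, max_batch_size, window_seconds):
    """Gather requests that arrive within a time window (or until the
    batch is full) so they can be fed to the model together.

    Blocks until at least one request is available, then keeps pulling
    from the queue until either max_batch_size items are collected or
    window_seconds have elapsed since the first item arrived.
    """
    batch = [request_queue.get()]  # wait for the first request
    deadline = time.monotonic() + window_seconds
    while len(batch) < max_batch_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            batch.append(request_queue.get(timeout=remaining))
        except queue.Empty:
            break  # window expired with no further requests
    return batch
```

The batch returned here would then be padded to a common length and run through the acoustic model in a single forward pass; the trade-off is that a longer window improves GPU utilization but adds latency to every request in the batch.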