
How do I prepare the input and the output for TensorFlow Lite using the deepspeech.tflite model? #3755

Himly1 opened this issue Nov 29, 2022 · 0 comments


I am new to TensorFlow, so I want to know: is there any documentation or demo that describes how to use deepspeech-0.9.3-models.tflite?

I know how to load the tflite model with TensorFlow, but I have no idea how to prepare the input and the output for the model.
Here is the Java code to load the model:

public void loadModel(Context context) throws Exception {
    AssetFileDescriptor fileDescriptor = context.getAssets().openFd("deepspeech-0.9.3-models.tflite");
    // Memory-map the model region of the asset file rather than copying it into the heap;
    // try-with-resources closes the stream once the mapping is created.
    try (FileInputStream is = new FileInputStream(fileDescriptor.getFileDescriptor())) {
        FileChannel channel = is.getChannel();
        long startOffset = fileDescriptor.getStartOffset();
        long declaredLength = fileDescriptor.getDeclaredLength();
        MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
        tflite = new Interpreter(buffer, new Interpreter.Options());
    }
}
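Once the interpreter is created, the first step toward preparing its input is getting the raw audio into the sample format the model consumes. DeepSpeech models are trained on 16 kHz, 16-bit, mono PCM audio, so a minimal first step is unpacking little-endian PCM bytes into one sample per `short`. This is a sketch under that assumption about the audio source; the class and method names are mine, not part of any library:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmDecoder {
    /** Unpacks 16-bit little-endian mono PCM bytes into one short per sample. */
    public static short[] toSamples(byte[] pcm) {
        ByteBuffer bb = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN);
        short[] samples = new short[pcm.length / 2];
        for (int i = 0; i < samples.length; i++) {
            samples[i] = bb.getShort();
        }
        return samples;
    }

    public static void main(String[] args) {
        // Two samples: 0x01 0x00 -> 1, 0xFF 0x7F -> 32767 (little-endian).
        byte[] pcm = {0x01, 0x00, (byte) 0xFF, 0x7F};
        short[] s = toSamples(pcm);
        System.out.println(s[0] + " " + s[1]); // prints "1 32767"
    }
}
```

If your recorder already delivers `short[]` (e.g. `AudioRecord.read(short[], …)` on Android), this step is unnecessary.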

But I have no idea how to prepare the input and the output for the function tflite.run().
Here is the definition of the function:

[screenshot: the Interpreter.run(Object input, Object output) signature]

Any ideas? Thanks.
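One thing worth noting: the DeepSpeech tflite export is a multi-input, multi-output graph (acoustic features in, logits plus recurrent LSTM state out), so `Interpreter.runForMultipleInputsOutputs(Object[], Map<Integer, Object>)` fits better than the single-tensor `run()`. The sketch below shows the shape of that call. All tensor shapes and output indices here are assumptions about the 0.9.3 export, not verified facts; query `interpreter.getInputTensor(i).shape()` and `getOutputTensor(i).shape()` at runtime instead of trusting them. The interpreter call itself is commented out so the sketch compiles and runs stand-alone:

```java
import java.util.HashMap;
import java.util.Map;

public class DeepSpeechIo {
    // Assumed dimensions for deepspeech-0.9.3-models.tflite -- verify at runtime
    // with interpreter.getInputTensor(i).shape() / getOutputTensor(i).shape().
    static final int TIMESTEPS = 16;   // frames fed per run (assumption)
    static final int CONTEXT = 19;     // context window width (assumption)
    static final int FEATURES = 26;    // MFCC coefficients per frame (assumption)
    static final int STATE = 2048;     // LSTM state width (assumption)
    static final int ALPHABET = 29;    // a-z, space, apostrophe, CTC blank (assumption)

    /** Allocates output buffers and shows where the interpreter call would go. */
    public static Map<Integer, Object> prepare(float[][][][] inputNode,
                                               float[][] stateC, float[][] stateH) {
        // Outputs are pre-allocated arrays the interpreter writes into;
        // the index-to-tensor mapping below is an assumption.
        Map<Integer, Object> outputs = new HashMap<>();
        outputs.put(0, new float[1][TIMESTEPS][ALPHABET]); // logits per frame
        outputs.put(1, new float[1][STATE]);               // new_state_c
        outputs.put(2, new float[1][STATE]);               // new_state_h
        // Object[] inputs = {inputNode, stateC, stateH};
        // tflite.runForMultipleInputsOutputs(inputs, outputs);
        return outputs;
    }

    public static void main(String[] args) {
        float[][][][] inputNode = new float[1][TIMESTEPS][CONTEXT][FEATURES];
        float[][] stateC = new float[1][STATE]; // zeros before the first chunk
        float[][] stateH = new float[1][STATE];
        Map<Integer, Object> out = prepare(inputNode, stateC, stateH);
        float[][][] logits = (float[][][]) out.get(0);
        System.out.println(logits[0].length + " " + logits[0][0].length); // prints "16 29"
    }
}
```

Between runs, feed `new_state_c`/`new_state_h` back in as the next chunk's state inputs, and decode the per-frame logits with a CTC decoder to get text. Note the stock `libdeepspeech` Android bindings do the feature extraction and decoding for you, which may be simpler than driving the raw tflite file through the TensorFlow Lite `Interpreter` yourself.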
