@Thunderzen hello! Currently, we don't provide a direct C++ implementation for YOLOv8 using TensorFlow Lite. However, you can perform inference with your .tflite model in C++ by using the TensorFlow Lite C++ API. Here's a basic outline of the steps you'd typically follow:
Load the TensorFlow Lite model and build an interpreter:

std::unique_ptr<tflite::FlatBufferModel> model =
    tflite::FlatBufferModel::BuildFromFile("your_model.tflite");

tflite::ops::builtin::BuiltinOpResolver resolver;
std::unique_ptr<tflite::Interpreter> interpreter;
tflite::InterpreterBuilder(*model, resolver)(&interpreter);

Allocate tensors, fill the input, and run inference:

interpreter->AllocateTensors();

// Set input data
float* input = interpreter->typed_input_tensor<float>(0);
// Fill 'input' with your preprocessed input data

interpreter->Invoke();

// Get output data
float* output = interpreter->typed_output_tensor<float>(0);
Hi, is there documentation on the post-processing after calling interpreter->Invoke() in C++? For example, code to extract information such as bounding boxes, confidences, and class scores?
Hello! Currently, we don't have specific documentation for post-processing YOLOv8 outputs in C++ after using TensorFlow Lite's interpreter->Invoke(). However, typically, you'll need to access the output tensor, which contains the detection results, and then apply the appropriate logic to extract bounding boxes, confidence scores, and class IDs.
Here's a brief example of how you might start accessing the output tensor:
float* output = interpreter->typed_output_tensor<float>(0);
// Output processing code here
The exact details depend on the output format of your model. You might need to reshape the tensor and apply non-max suppression to filter overlapping boxes. For a more detailed guide, you might find TensorFlow's C++ API documentation helpful, or consider exploring community forums for specific examples related to YOLOv8. Hope this helps!
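To make the decoding step concrete, here is a hedged sketch. It assumes the common YOLOv8 TFLite export layout of shape [1, 4 + num_classes, num_boxes] (4 box values cx, cy, w, h followed by per-class scores, stored row-major) — verify this against your own model's output shape before relying on it. The `Detection` struct and `decode_output` function are illustrative names, and non-max suppression would still need to be applied afterwards:

```cpp
#include <vector>

struct Detection {
    float x, y, w, h;   // center-x, center-y, width, height
    float score;        // best class confidence
    int class_id;
};

// Decode a YOLOv8-style output tensor of shape [1, 4 + num_classes, num_boxes]
// into detections above a confidence threshold. Apply NMS on the result.
std::vector<Detection> decode_output(const float* output, int num_classes,
                                     int num_boxes, float conf_threshold) {
    std::vector<Detection> dets;
    for (int i = 0; i < num_boxes; ++i) {
        // Find the highest-scoring class for this candidate box.
        int best_class = 0;
        float best_score = 0.0f;
        for (int c = 0; c < num_classes; ++c) {
            float s = output[(4 + c) * num_boxes + i];
            if (s > best_score) { best_score = s; best_class = c; }
        }
        if (best_score < conf_threshold) continue;
        Detection d;
        d.x = output[0 * num_boxes + i];
        d.y = output[1 * num_boxes + i];
        d.w = output[2 * num_boxes + i];
        d.h = output[3 * num_boxes + i];
        d.score = best_score;
        d.class_id = best_class;
        dets.push_back(d);
    }
    return dets;
}
```

You would call this on the `output` pointer from typed_output_tensor, then filter overlapping boxes with IoU-based non-max suppression.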
Question
I have the .tflite and the relevant libraries in cpp but unsure on how to perform inference.