
Insect Detect - DIY camera trap for automated insect monitoring


This repository contains Python scripts and YOLOv5, YOLOv6, YOLOv7 and YOLOv8 object detection models (.blob format) for testing and deploying the Insect Detect DIY camera trap for automated insect monitoring.

The camera trap system combines low-cost, off-the-shelf hardware components (Raspberry Pi Zero 2 W, Luxonis OAK-1, PiJuice Zero pHAT) with open-source software and can be easily assembled and set up with the provided instructions.


Installation

Please make sure that you have followed all steps to set up your Raspberry Pi before using the OAK-1 camera.

Install the required dependencies for Raspberry Pi + OAK by running:

sudo curl -fL https://docs.luxonis.com/install_dependencies.sh | bash

Install the libopenblas-dev package (required for the latest numpy version):

sudo apt install libopenblas-dev

Install the required packages by running:

python3 -m pip install -r insect-detect/requirements.txt
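
After installing the packages, you can optionally check that the OAK-1 is detected. This is a minimal sketch, not part of the official setup steps; it only assumes the depthai package from requirements.txt is installed:

```python
# Optional check: list all connected OAK devices to verify the installation.
import depthai as dai

for device_info in dai.Device.getAllAvailableDevices():
    print(f"Found OAK device: {device_info.getMxId()} ({device_info.state.name})")
```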

Check out the Programming section for more details about the scripts and tips on possible software modifications.


Detection models

| Model       | size (pixels) | mAP val 50-95 | mAP val 50 | Precision val | Recall val | Speed OAK (fps) | params (M) |
|-------------|---------------|---------------|------------|---------------|------------|-----------------|------------|
| YOLOv5n     | 320           | 53.8          | 96.9       | 95.5          | 96.1       | 49              | 1.76       |
| YOLOv6n     | 320           | 50.3          | 95.1       | 96.9          | 89.8       | 60              | 4.63       |
| YOLOv7-tiny | 320           | 53.2          | 95.7       | 94.7          | 94.2       | 52              | 6.01       |
| YOLOv8n     | 320           | 55.4          | 94.4       | 92.2          | 89.9       | 39              | 3.01       |

Table Notes

  • All models were trained to 300 epochs with batch size 32 and default hyperparameters. Reproduce the model training with the provided Google Colab notebooks.

  • Trained on the Insect_Detect_detection dataset (version 7), downscaled to 320x320 pixels, with only one class ("insect").

  • Model metrics (mAP, Precision, Recall) are shown for the original PyTorch (.pt) model before conversion to ONNX -> OpenVINO -> .blob format. Reproduce metrics by using the respective model validation method.

  • Speed (fps) is shown for the converted models (.blob, 4 shaves) running on the OAK-1 connected to a RPi Zero 2 W (~2 fps slower with the object tracker). Set cam_rgb.setFps() to the respective fps shown for each model to reproduce the speed measurements (see the sketch after these notes).

  • While connected via SSH (X11 forwarding of the frames), print fps to the console and comment out cv2.imshow(), as forwarding the frames will slow down the received message output and thereby lower the measured fps. If you are using a Raspberry Pi 4 B connected to a screen, fps will be shown correctly in the livestream (see gif).
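
The following minimal sketch shows how the camera fps can be set and how fps can be printed to the console instead of displaying frames with cv2.imshow(). It is not one of the repository's scripts; the stream name ("frame") and the preview size are placeholders:

```python
import time

import cv2
import depthai as dai

pipeline = dai.Pipeline()

cam_rgb = pipeline.create(dai.node.ColorCamera)
cam_rgb.setPreviewSize(320, 320)  # LQ frames used for inference
cam_rgb.setInterleaved(False)
cam_rgb.setFps(49)                # e.g. 49 fps for YOLOv5n (see table above)

xout_rgb = pipeline.create(dai.node.XLinkOut)
xout_rgb.setStreamName("frame")   # stream name is a placeholder
cam_rgb.preview.link(xout_rgb.input)

with dai.Device(pipeline) as device:
    q_frame = device.getOutputQueue(name="frame", maxSize=4, blocking=False)
    start_time = time.monotonic()
    frame_count = 0
    while True:
        frame = q_frame.get().getCvFrame()
        frame_count += 1
        elapsed = time.monotonic() - start_time
        if elapsed >= 1:
            print(f"fps: {frame_count / elapsed:.1f}")
            frame_count, start_time = 0, time.monotonic()
        # cv2.imshow("frame", frame)  # comment out over SSH, X11 forwarding slows down fps
        # if cv2.waitKey(1) == ord("q"):
        #     break
```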


Processing pipeline

More information about the processing pipeline can be found in the Insect Detect Docs 📑.

Processing pipeline for the yolo_tracker_save_hqsync.py script that can be used for automated insect monitoring:

  • The object tracker output (+ passthrough detections) from inference on LQ frames (e.g. 320x320 px) is synchronized with HQ frames (1920x1080 px) on-device (OAK) using the respective message timestamps.
  • Detections (the bounding box area) are cropped from the synced HQ frames and saved as .jpg files.
  • All relevant metadata from the detection model and tracker output (timestamp, label, confidence score, tracking ID, relative bbox coordinates, .jpg file path) is saved to a metadata .csv file for each cropped detection (see the sketch below).
  • Using the default 1080p resolution for the HQ frames results in an inference and pipeline speed of ~13 fps, which is fast enough to track moving insects. If 4K resolution is used instead, the pipeline speed decreases to ~3 fps, which reduces tracking accuracy for fast-moving insects.
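
As a rough illustration of the saving step (not the actual yolo_tracker_save_hqsync.py implementation), the sketch below crops one detection from a synced HQ frame using its relative bbox coordinates and appends the corresponding metadata row to a .csv file; all variable names and output paths are illustrative:

```python
import csv
from datetime import datetime
from pathlib import Path

import cv2


def save_crop_with_metadata(hq_frame, bbox_rel, label, confidence, track_id,
                            save_dir=Path("data")):
    """Crop the bbox area from the HQ frame (relative coords) and log its metadata."""
    height, width = hq_frame.shape[:2]
    xmin, ymin, xmax, ymax = bbox_rel  # relative bbox coordinates (0-1)
    x1, y1 = max(0, int(xmin * width)), max(0, int(ymin * height))
    x2, y2 = min(width, int(xmax * width)), min(height, int(ymax * height))

    # Save the cropped detection as .jpg
    timestamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S.%f")
    crop_path = save_dir / "crop" / f"{timestamp}_ID{track_id}.jpg"
    crop_path.parent.mkdir(parents=True, exist_ok=True)
    cv2.imwrite(str(crop_path), hq_frame[y1:y2, x1:x2])

    # Append one metadata row per cropped detection
    csv_path = save_dir / "metadata.csv"
    write_header = not csv_path.exists()
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["timestamp", "label", "confidence", "track_ID",
                             "x_min", "y_min", "x_max", "y_max", "file_path"])
        writer.writerow([timestamp, label, round(confidence, 2), track_id,
                         round(xmin, 4), round(ymin, 4), round(xmax, 4), round(ymax, 4),
                         str(crop_path)])
```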

Check out the classification instructions and the insect-detect-ml GitHub repo for information on how to classify the cropped detections with the provided classification model and script.

Take a look at the post-processing instructions for information on how to post-process the metadata with classification results.


License

This repository is licensed under the terms of the GNU General Public License v3.0 (GNU GPLv3).

Citation

If you use resources from this repository, please cite our paper:

Sittinger M, Uhler J, Pink M, Herz A (2024) Insect detect: An open-source DIY camera trap for automated insect monitoring. PLOS ONE 19(4): e0295474. https://doi.org/10.1371/journal.pone.0295474