
Bad performance for int8 inference on XuanTie 906 (RISC-V) #5447

Open

vassilijnadarajah opened this issue May 6, 2024 · 1 comment
vassilijnadarajah commented May 6, 2024

detail | detailed description

I was wondering if anyone knows, or has an idea, why inference on the XuanTie 906 (RISC-V) processor is so slow for int8 models.
The issue can easily be reproduced by running the ncnn benchmark on the Allwinner D1 (1x XuanTie 906). The numbers shown below are runtimes in milliseconds.

[Screenshots (2024-05-06): ncnn benchmark result tables, runtimes in milliseconds]
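For reference, a minimal single-inference timing sketch along these lines can reproduce the latency outside of benchncnn. The model files, blob names ("data"/"prob"), and input shape below are placeholders for whatever int8 model is being measured, not a specific setup from this report:

```cpp
// Minimal ncnn latency check (sketch; model and blob names are placeholders)
#include <chrono>
#include <cstdio>
#include "net.h"

int main()
{
    ncnn::Net net;
    net.opt.num_threads = 1;            // the Allwinner D1 has a single C906 core
    net.opt.use_int8_inference = true;  // exercise the int8 path

    // placeholder int8 model files produced by ncnn's quantization tools
    net.load_param("squeezenet_int8.param");
    net.load_model("squeezenet_int8.bin");

    ncnn::Mat in(227, 227, 3);
    in.fill(0.5f);

    ncnn::Mat out;
    auto t0 = std::chrono::steady_clock::now();
    {
        ncnn::Extractor ex = net.create_extractor();
        ex.input("data", in);
        ex.extract("prob", out);
    }
    auto t1 = std::chrono::steady_clock::now();
    printf("latency: %.2f ms\n",
           std::chrono::duration<double, std::milli>(t1 - t0).count());
    return 0;
}
```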

Thanks!

nihui (Member) commented May 6, 2024

Currently ncnn lacks good optimization for int8 models on RISC-V with RVV.

You may be interested in contributing here:
plctlab#2
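For anyone picking this up: the heart of such a contribution is replacing scalar int8 inner loops with RVV kernels. Below is only a rough illustration of the building block (an int8×int8→int32 dot product) written with the ratified RVV 1.0 C intrinsics, not ncnn's actual kernel code; the C906 reportedly implements the older RVV 0.7.1 draft, so targeting that core would need T-Head's toolchain or adjusted intrinsics, and a real kernel would also fuse requantization back to int8.

```cpp
// Sketch of an RVV int8 dot product (RVV 1.0 intrinsics, illustration only).
#include <riscv_vector.h>
#include <stddef.h>
#include <stdint.h>

// Widening int8 x int8 -> int32 dot product, the inner loop of an
// int8 convolution / GEMM microkernel.
static int32_t dot_s8(const int8_t* a, const int8_t* b, size_t n)
{
    vint32m1_t vacc = __riscv_vmv_s_x_i32m1(0, 1); // scalar accumulator in a vreg
    while (n > 0) {
        size_t vl = __riscv_vsetvl_e8m1(n);
        vint8m1_t va = __riscv_vle8_v_i8m1(a, vl);
        vint8m1_t vb = __riscv_vle8_v_i8m1(b, vl);
        vint16m2_t vw = __riscv_vwmul_vv_i16m2(va, vb, vl);   // widen to int16
        vacc = __riscv_vwredsum_vs_i16m2_i32m1(vw, vacc, vl); // reduce into int32
        a += vl;
        b += vl;
        n -= vl;
    }
    return __riscv_vmv_x_s_i32m1_i32(vacc);
}
```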
