Issues: lm-sys/FastChat

Issues list

- LLM + knowledge Graph (#3403, opened Jun 12, 2024 by plyu3)
- prompt will always be truncated (#3399, opened Jun 12, 2024 by jiaoyangkuohai)
- Fine-tuning Vicuna-7B (#3396, opened Jun 11, 2024 by dhruvpes)
- Implement GLM-4 models (#3395, opened Jun 11, 2024 by plyu3)
- Add AutoCoder LLM (#3393, opened Jun 9, 2024 by upintheairsheep)
- Add xAI Grok-1 (#3391, opened Jun 9, 2024 by upintheairsheep)
- The accuracy issue of MT bench (#3386, opened Jun 7, 2024 by Luoqiu76)
- FastChat Code base help (#3381, opened Jun 6, 2024 by Saikumarcloudangles)
- generate Vicuna error (#3380, opened Jun 6, 2024 by wen020)
- how to submit with enter (#3379, opened Jun 5, 2024 by crischeng)
- how to customize free-style LLM? (#3374, opened Jun 2, 2024 by jxgu1016)
- Does it support Qwen1.5? (#3369, opened May 29, 2024 by balala-s)
- gguf (#3368, opened May 29, 2024 by Jeffhop)