Issues: llm-tools/embedJs
#67 Add extra parameter for loader source titles, return it with the result [enhancement] (opened May 29, 2024 by dr460nf1r3)
#58 Switching Loaders / Vectors [enhancement] (opened May 18, 2024 by converseKarl)
#57 Better way to manage dependencies? [dependencies, question] (opened May 15, 2024 by adhityan)
#56 Support for multi model inputs [enhancement] (opened May 15, 2024 by adhityan)
#42 Add post Context/Custom Meta to Loaders / RAG information to targeted RAG(id) [enhancement] (opened Apr 30, 2024 by converseKarl)
#39 I'm unsure how to run a model that needs inputs [question] (opened Apr 26, 2024 by JonahElbaz)
#30 Question: can we get the input and output token counts from the query transaction? [question] (opened Apr 23, 2024 by converseKarl)
#29 Add an optional fine-tuned model id when performing a query, once it passes to the LLM [enhancement] (opened Apr 23, 2024 by converseKarl)
#26 How to use streaming for OpenAI models? [enhancement] (opened Apr 23, 2024 by benfiratkaya)