Created an integration to use LLMs built for Apple's MLX library #13231
Conversation
Thanks @dwight-foster! Taking a first pass on it. Looks like we need to tailor the pants BUILD files as well.
Could you run `pants tailor ::` in the root of this project and commit/push the changes? If you don't have `pants` installed, you can alternatively give me access to your fork and I can commit the changes to your fork's main branch.
Please write a detailed README for this integration. It will appear on llamahub.ai, where many of our users discover our integration packages.
license = "MIT"
name = "llama-index-llms-mlx"
readme = "README.md"
version = "0.1.11"
nit: should we start this at 0.1.0?
Please add base test classes, similar to the one found here: https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-anthropic/tests/test_llms_anthropic.py
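The linked Anthropic test file illustrates the pattern being requested: a minimal test asserting that the new integration class inherits from the core LLM base class. A hedged sketch of that pattern follows; `MockBaseLLM` and `MockMLXLLM` are stand-ins for `llama_index.core.base.llms.base.BaseLLM` and the new MLX class, used here so the sketch runs without the packages installed.

```python
# Sketch of the base-class inheritance test used across LLM integrations.
# MockBaseLLM / MockMLXLLM are illustrative stand-ins, not real package names.
class MockBaseLLM:
    """Stand-in for llama_index.core.base.llms.base.BaseLLM."""


class MockMLXLLM(MockBaseLLM):
    """Stand-in for the new MLX integration class."""


def test_llm_class():
    # The real test checks that the integration class is a BaseLLM subclass
    # by walking its method resolution order.
    names_of_base_classes = [b.__name__ for b in MockMLXLLM.__mro__]
    assert MockBaseLLM.__name__ in names_of_base_classes
```

In the real test file, the only change needed is swapping the stand-ins for the actual imports from `llama_index.core` and the new package.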
Thank you for the help @nerdai. I am not really sure why it is still failing; the error log does not give a clear reason. Can you infer anything from it? This is my first time contributing a large integration to this repo, so sorry if any of this is obvious.
docs/BUILD (Outdated)
@@ -1 +1,9 @@
python_sources()
poetry_requirements(
Hi @dwight-foster,
Could you share some insights into why this change is required?
It was not required. I was trying to figure out why the build failed the first time, so I tried several approaches at once. This is what other BUILD files in the LLM integrations looked like, so I changed it to match.
Hi @dwight-foster, I left two comments.
import logging

from typing import Any, Callable, Optional, Union
This import can be extended with `Sequence`. `Sequence` was imported in the first line.
Ok
My tests are succeeding locally. I have run `pants test ::` and `pytest`. I am not sure why these are failing; I feel like I missed something obvious, but any help would be appreciated. Thank you.
@dwight-foster maybe it was overkill, but since this package depends on specific hardware, I don't think it makes sense to run tests for it in our CI/CD. Since the existing test wasn't really doing anything, I just deleted it for now.
Ok, thank you.
Description
I created an integration to use LLMs built for, or converted to, Apple's MLX library. I often use my Mac for machine-learning prototyping and wanted to be able to use LlamaIndex tools with LLMs that are optimized to run on it. It uses mlx-lm specifically. I referenced the Anthropic and Hugging Face LLM integrations to create this one.
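For readers of the PR, intended usage would look something like the sketch below. The class name `MLXLLM` and the model identifier are assumptions not confirmed by this thread, and the package only works on Apple-silicon hardware, so the sketch degrades gracefully when mlx-lm or the integration is unavailable.

```python
# Hedged usage sketch; MLXLLM and the model name below are assumed, not
# confirmed by this PR. Requires Apple silicon and mlx-lm when run for real.
try:
    from llama_index.llms.mlx import MLXLLM
except ImportError:
    MLXLLM = None  # package or hardware unavailable; skip the demo


def demo() -> str:
    if MLXLLM is None:
        return "mlx unavailable"
    try:
        # Model identifier is illustrative only.
        llm = MLXLLM(model_name="mlx-community/quantized-gemma-2b-it")
        return str(llm.complete("Hello"))
    except Exception as exc:  # model download or hardware issues
        return f"mlx error: {exc}"
```

On a non-Apple machine this simply reports that MLX is unavailable rather than failing, which mirrors the CI decision above to skip hardware-dependent tests.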
Fixes # (issue)
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes, and provide instructions so we can reproduce them. Please also list any relevant details of your test configuration.
Suggested Checklist:
I ran `make format; make lint` to appease the lint gods