Time to give back!! #56
I am waiting for the image to upload. I used a custom version; if you want to take over this, I will happily release it anyway. @adamluzsi, thanks for all your help. |
Hey, I am fixing it so that it does not require sign-up. |
Hey @dekubu! So lovely to see you again! I find the idea fantastic. But I think we need to iterate on it because it is too chatty, and the examples are not necessarily complete or use the right API tools. Probably if we improve the comments of the examples in the wiki, we could then use them as file inputs for the GPT Assistant model. |
Way cool!
We could indeed improve accuracy by adding to the knowledge base. I have also built a little tool that could help. Let me run it against the code base and see the results.
All very exciting lol
Delaney Burke
Founder
vidtreon.com
|
Delaney / @dekubu, your strong passion and enthusiasm are infectious – it really made my day hearing how excited you are! 😄 |
This is the git repo where the examples are kept: |
Thanks for the links! Hmmmm, I learned about a super cool technique called SPRs! It might work, it might not, but I will give it a go. With LLMs you have the issue of the prompt size and the fact that after a while they just forget things lol. However, there is a far simpler way detailed below, but I don't know if it works with code; this would be a fantastic time to find out: https://github.com/daveshap/SparsePrimingRepresentations. I am mentioning this because I don't know how much of the source code and examples we can upload, but we shall see. |
Oh, as I think of things I will put them here. One thing we could do is enable this bot to act as a generator, like the Rails generator (based on principles of minimalism), so we can ask the bot for standard projects. Make sense? Since you, @adamluzsi, know this project well, while I am investigating the above, if you like the idea, can you give it some thought? |
GPT-4 Turbo has a 128K-token context limit. However, the Assistant instructions have an 8K limit, so we need to upload the examples as files to the assistant and keep the instruction/system prompt focused on how it should answer the queries. I think parts like "consummate expert on the rack-app gem" are great, as they reduce the chance of hallucination; however, we might need to tailor the expression to something for which GPT has more training data, such as
We need to:
|
Okay, so we have a problem. The UI only accepts 10 files and errors out on more, so I need to make a script in the wiki repo that builds up a single markdown file from them. Here is an example GPT I made with the inspiration you gave, plus the updates: |
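A minimal sketch of what such a concatenation script could look like (the method name and directory layout are assumptions, not the actual script in the wiki repo):

```ruby
# Join every markdown wiki page under `dir` into one knowledge document,
# promoting each file name to a top-level heading so sections stay
# distinguishable after concatenation.
def build_knowledge(dir)
  Dir.glob(File.join(dir, "*.md")).sort.map do |path|
    heading = File.basename(path, ".md").tr("-", " ")
    "# #{heading}\n\n#{File.read(path).strip}"
  end.join("\n\n")
end

# Usage (assumed checkout location):
#   File.write("knowledge.md", build_knowledge("rack-app.wiki"))
```

Writing a single `knowledge.md` this way sidesteps the 10-file upload limit entirely.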
I am going to delete my version, and let's concentrate on your version. At some point, if I help out enough, you can add me as a contributor on the AI side of the project lol. It would help with the job search, and if you have not worked it out yet, I do like the project and the approach!
Yes, that is indeed the problem.
I ran into this problem as well. GPT is great for new code, but solving the issue with existing code was a fun challenge.
I am proposing we use the SPR approach. Here's how: we flatten the git repository by reading in all the files and extracting their contents, then create SPRs for each class and example, and load the SPRs in as part of the context when we start the conversation.
Make sense?
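The flatten-then-summarize step described above could be sketched roughly like this (helper names and the prompt wording are assumptions for illustration; the actual SPR generation would be an LLM call on each prompt):

```ruby
# Step 1: flatten the repository into { path => source } for every Ruby file.
def flatten_repo(dir)
  Dir.glob(File.join(dir, "**", "*.rb")).sort.to_h do |path|
    [path, File.read(path)]
  end
end

# Step 2: build the prompt that would ask an LLM to compress one file into
# a Sparse Priming Representation (short declarative statements that can
# re-prime the model later as conversation context).
def spr_prompt(path, source)
  <<~PROMPT
    Compress the following Ruby file into a Sparse Priming Representation:
    a short list of declarative statements capturing its classes, methods,
    and intent, dense enough to re-prime an LLM in a later conversation.

    File: #{path}
    #{source}
  PROMPT
end
```

The resulting SPRs would then be concatenated and prepended to the conversation context, which keeps the prompt small compared to shipping the raw source.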
Delaney Burke
Founder
vidtreon.com
(The examples repo: https://github.com/rack-app/rack-app.wiki.git, rendered at https://github.com/rack-app/rack-app/wiki)
|
You know, you don't have to prove anything – in my opinion, you've already contributed so much! Your motivation inspired me to update our wiki pages while discussing various topics, and your feedback led to some fantastic quality-of-life improvements in the core rack-app project. 🙌 Can you make a pull request with the following commit:
|
Maybe reading all the files and content is a bit too much. We could start by using Ripper to extract the examples from all the RSpec files. (Probably a bash script that reads each file and adds an extra "#" in front of the titles can already achieve this for us, and then we have a really nice knowledge file already.) Or maybe in the first round it is enough to skip Ripper and instead just concat all the markdown documentation in the wiki, as GPT will be smart enough to know how to interpret it. |
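The heading-demotion trick mentioned above could look like this (a hypothetical Ruby helper standing in for the bash one-liner): prefixing every markdown heading with one extra "#" lets each page's titles nest under a per-file heading in the combined knowledge file.

```ruby
# Demote every markdown heading by one level by prepending an extra "#".
# In Ruby regexes, ^ matches at each line start, so no multiline flag is needed.
def demote_headings(markdown)
  markdown.gsub(/^(#+ )/) { "##{$1}" }
end
```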
I updated the GPT assistant template with a new concatenated |
Ok, I have been testing it. This is so cool; it has solidified the knowledge base concept for me, so I totally get how to make things better now! Can we put the knowledge base and the prompt into a place where we can both see them, please? |
One thing I have also noticed is that it regularly shows examples of plain Rack. I think we should fix that; what do you think? |
I have both the prompt and the knowledge.md in the wiki/GPT. For example, the stream block is only usable within an endpoint block, not at a class level. 🙈 |
Way cool! I will check that out, thanks. |
Could just be me, but I can't find the link? Do you have to expose it? |
Every day is a school day! lol, I have them now, so thanks! |
I wish I could conveniently access the OpenAI API from the GPT Assistant UI. |
Hey, I have been busy learning some new stuff. Talking of which: https://github.com/BuilderIO/gpt-crawler. Do you think this could help? We could point it at the wiki and blogs, etc.
The idea being we build a knowledge base of everything Rack::App.
I checked the GPT-crawler, and I like some of the perspectives it introduces, such as using JSON as a format for the knowledge file. |
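A rough sketch of that JSON knowledge-file idea (the field names here are assumptions modeled on gpt-crawler's page-per-entry output, not a finalized format):

```ruby
require "json"

# Each crawled page becomes one record; :title, :url, and :text are assumed
# field names mirroring gpt-crawler's output style.
def to_knowledge_json(pages)
  JSON.pretty_generate(
    pages.map { |page| { title: page[:title], url: page[:url], text: page[:text] } }
  )
end
```

A single JSON file like this stays under the assistant's file-count limit while remaining trivial to regenerate whenever the wiki changes.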
Happy end of the year holiday season @dekubu! |
@adamluzsi hey, hope all is well!
https://chat.openai.com/g/g-Coosp4Vb8-rack-app
I think this would be a great addition to the project.
Prompt:
Rack::App is the consummate expert on the rack-app gem, delivering advice with a blend of friendliness, informativeness, and authority. It is dedicated to the rack-app and Rack, avoiding discussions of other frameworks like Sinatra and focusing on minimalistic solutions. Rack::App will confidently advise on the use of extensions and mounts, and when faced with questions beyond its scope, it will acknowledge its limits. This GPT will engage users in a professional yet approachable manner, ensuring communications are clear and helpful, reinforcing its role as a trusted resource in the Ruby community.