Tiktoken and cost tracking issues on gpt-4o #2702

Closed
sonichi opened this issue May 16, 2024 Discussed in #2701 · 7 comments
Labels
dependencies Pull requests that update a dependency file good first issue Good for newcomers

Comments

@sonichi
Collaborator

sonichi commented May 16, 2024

Discussed in #2701

Originally posted by Nathan-Intergral May 16, 2024
Hey all,

I understand gpt-4o has just come out, so apologies if you are already on it; I just wanted to bring it to your attention. You are currently on an older version of tiktoken that doesn't support gpt-4o yet, so the chain crashes when you try to run a GroupChat instance. I was able to fix this by upgrading the dependency myself on my side.

Another issue is that your token_count_utils.py doesn't have gpt-4o either, so cost calculations don't work right now.

Any chance I could get a time estimate on when you will have a release updating these? Thanks!
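For context, cost tracking of this kind typically boils down to a per-model price table keyed by model name, which crashes (or silently skips) on a model the table doesn't know. A minimal sketch of that pattern, assuming a token_count_utils.py-style table; the per-1K prices below are illustrative placeholders, not official figures:

```python
# Hypothetical per-model price table used for cost tracking.
# Prices are (input, output) USD per 1K tokens -- illustrative values only.
PRICE_PER_1K = {
    "gpt-4": (0.03, 0.06),
    "gpt-4o": (0.005, 0.015),  # placeholder pricing for the new model
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate USD cost; raises KeyError for a model missing from the table."""
    in_price, out_price = PRICE_PER_1K[model]
    return (prompt_tokens * in_price + completion_tokens * out_price) / 1000
```

Until an entry like the hypothetical `"gpt-4o"` row is added, any lookup for the new model fails, which is why cost calculations break on release day.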

@sonichi sonichi added good first issue Good for newcomers dependencies Pull requests that update a dependency file labels May 16, 2024
@Hk669
Collaborator

Hk669 commented May 16, 2024

@sonichi do we not use the updated version of tiktoken?

@sonichi
Collaborator Author

sonichi commented May 16, 2024

We don't restrict the tiktoken version. So @Nathan-Intergral, could you elaborate on the old-version comment?

@Nathan-Intergral
Collaborator

@sonichi Yeah, the version is not restricted; it's just that the version you get when installing the dependencies doesn't work right now, so it's not a big deal. As I said in the original post, I simply upgraded the dependency in my personal project to fix the issue.

The only real issue is the second one I mentioned, in token_count_utils.py.
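One possible stopgap while token_count_utils.py catches up (a hypothetical sketch, not the project's actual code) is to map the unrecognized model name onto a known base model before the table lookup, so token-limit queries degrade gracefully instead of crashing:

```python
# Abbreviated, hypothetical tables: real token_count_utils.py has more entries.
KNOWN_LIMITS = {"gpt-4": 8192, "gpt-4-32k": 32768}
# Hypothetical fallback mapping: treat gpt-4o like gpt-4 until it has its
# own entry. This is a rough approximation, not gpt-4o's real context size.
FALLBACK = {"gpt-4o": "gpt-4"}

def get_max_token_limit(model: str) -> int:
    model = FALLBACK.get(model, model)
    try:
        return KNOWN_LIMITS[model]
    except KeyError:
        raise ValueError(f"Unknown model: {model}") from None
```

The proper fix is of course a real gpt-4o entry; a fallback like this only keeps the chain from crashing in the meantime.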

@Nathan-Intergral
Collaborator

Nathan-Intergral commented May 17, 2024

Also, I have been testing further with gpt-4o, and I seem to be getting this error fairly often.

openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

Here is the full stack trace:

Traceback (most recent call last):
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/src/group_chat/service.py", line 116, in process_metrics_callback
    await overseer.a_initiate_chat(manager, message=plan)
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1057, in a_initiate_chat
    await self.a_send(msg2send, recipient, silent=silent)
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 682, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 829, in a_receive
    reply = await self.a_generate_reply(sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1996, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 746, in a_run_chat
    reply = await speaker.a_generate_reply(sender=self)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1996, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1363, in a_generate_oai_reply
    return await asyncio.get_event_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1361, in _generate_oai_reply
    return self.generate_oai_reply(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1300, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1319, in _generate_oai_reply_from_client
    response = llm_client.create(
               ^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/oai/client.py", line 638, in create
    response = client.create(params)
               ^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/oai/client.py", line 285, in create
    response = completions.create(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 581, in create
    return self._post(
           ^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1233, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 922, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 998, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1046, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 998, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1046, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _request
    raise self._make_status_error_from_response(err.response) from None

I wasn't able to capture the response the model returned, but I'd be happy to try to get it if you could point me to where I should put a print line.
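Since the 500 is a transient server-side `model_error`, one client-side mitigation is a bounded retry with backoff around the call. A generic sketch, not AutoGen's built-in retry logic; in real usage you would pass `openai.InternalServerError` as the retryable exception rather than the placeholder `RuntimeError`:

```python
import time

def call_with_retries(fn, retries=3, base_delay=1.0, retryable=(RuntimeError,)):
    """Call fn(), retrying up to `retries` times on the given exception
    types with exponential backoff; re-raise after the final attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except retryable:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

Wrapping the `llm_client.create(...)` call this way would paper over intermittent 500s, though modifying the prompt (as the error message suggests) is the fix OpenAI recommends.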


@Hk669
Collaborator

Hk669 commented May 18, 2024

> (quoting @Nathan-Intergral's comment above, including the full stack trace)

Got it. Thanks.

@ekzhu
Collaborator

ekzhu commented May 21, 2024

#2717

@ekzhu ekzhu closed this as completed May 21, 2024
5 participants