
TypeError: Object of type ChatResponse is not JSON serializable #5

Open
suc1 opened this issue Dec 5, 2024 · 2 comments

Comments

@suc1 commented Dec 5, 2024

Your team did an excellent job.

However, when I ran the code from README.md, json.dumps() raised the following error:
TypeError: Object of type ChatResponse is not JSON serializable

Here’s the relevant snippet from wrapper.py (line 183):

response = self.client.chat(**ollama_kwargs)
logger.debug(
    "Received response: %s",
    json.dumps(response, indent=2, ensure_ascii=False),
)

If I comment out the logger.debug() line, everything works fine.
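A minimal standalone sketch of the failure and one possible fix, assuming ChatResponse in recent versions of the ollama Python client is a Pydantic model (the model name below is only an example):

import json
from ollama import Client

client = Client()
response = client.chat(
    model="llama3.2:1b",  # example model, assumed to be pulled locally
    messages=[{"role": "user", "content": "Hello"}],
)

# json.dumps(response, ...) raises TypeError because ChatResponse is not a
# plain dict; dumping its Pydantic representation works (assuming Pydantic v2).
print(json.dumps(response.model_dump(), indent=2, ensure_ascii=False))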

@davidaparicio (Owner)

Hello @suc1, thanks a lot for this feedback. I will take a look this weekend in order to fix it 👍

@suc1 (Author) commented Dec 19, 2024

@davidaparicio Thanks for your work.

Initially, I studied your code to understand the implementation. However, I found a simpler approach, since Ollama exposes an OpenAI-compatible API.

from swarm import Swarm

onLineLLM = False
if onLineLLM:
    # Hosted OpenAI: Swarm picks up OPENAI_API_KEY from the environment
    modelName = 'gpt-4o-mini'
    client = Swarm()
else:
    # Local Ollama through its OpenAI-compatible endpoint
    modelName = 'llama3.2:1b'       # https://ollama.com/search?c=tools
    from openai import OpenAI
    clientLLM = OpenAI(
        base_url="http://localhost:11434/v1",
        api_key="ollama"            # any non-empty string works for Ollama
    )
    client = Swarm(client=clientLLM)
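
For completeness, a short usage sketch on top of that client; the agent name, instructions, and prompt below are placeholders, and Agent/client.run follow the upstream Swarm API:

from swarm import Agent

agent = Agent(
    name="Assistant",
    model=modelName,                # model selected in the snippet above
    instructions="You are a helpful assistant.",
)

result = client.run(
    agent=agent,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(result.messages[-1]["content"])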
