Setup #10

How exactly do you set it up? Do I point MCP-Bridge at my LLM provider (OpenRouter) and then point Open WebUI at the bridge, or what?
Please refer to the "how it works" section for a diagram. Configure MCP-Bridge to point to your inference provider, then point your client at MCP-Bridge:

OWUI --> MCP-Bridge --> OpenRouter

Note that OpenRouter must support OpenAI-style tool calling for this to work, and I have not tested it.
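As a rough sketch of that first step, MCP-Bridge is configured through a JSON file whose `inference_server` block points at the provider's OpenAI-compatible endpoint. The field names below follow the project's README as I recall them, and the API key is a placeholder, so check the repo before copying:

```json
{
  "inference_server": {
    "base_url": "https://openrouter.ai/api/v1",
    "api_key": "sk-or-your-key-here"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```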
So our application should call MCP-Bridge as a "drop-in" for the OpenAI API? I am guessing scaling wouldn't be an issue, since deployment via Docker can take care of that.
Yes, you can use MCP-Bridge as a drop-in replacement for the OpenAI API.
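To illustrate, a client can swap in the bridge by changing only the `base_url` of the standard `openai` Python SDK. The host, port, and model name here are assumptions; use whatever your deployment actually exposes:

```python
from openai import OpenAI

# Point the stock OpenAI client at MCP-Bridge instead of api.openai.com.
# Host and port are assumptions; match them to your Docker deployment.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="anything",  # forwarded or ignored depending on your bridge config
)

# The request is proxied to the inference provider; the bridge handles
# injecting the MCP tools and the tool-call round trips.
response = client.chat.completions.create(
    model="gpt-4o",  # model name depends on the provider behind the bridge
    messages=[{"role": "user", "content": "What tools do you have available?"}],
)
print(response.choices[0].message.content)
```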
Does this work with Docker-based MCP servers? How does the docker command work inside a container?
@buryhuang there are a few ways to go about this. You can:

- mount the host's Docker socket (/var/run/docker.sock) into the MCP-Bridge container, or
- run Docker-in-Docker inside the container.

Both methods will require you to install the docker CLI in the container build file, and you would specify an STDIO server with the docker command (see the sketch below). I have considered adding special support to run docker containers directly, but I have not gotten around to it yet.
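A minimal, untested sketch of such an STDIO entry, assuming the socket-mount approach and a placeholder `mcp/fetch` image name: the `-i` flag keeps stdin open, which the STDIO transport needs.

```json
{
  "mcp_servers": {
    "fetch": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/fetch"]
    }
  }
}
```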