This package contains the LangChain integrations for OpenAI-compatible OPEA Microservices.

You can install the LangChain OPEA package in several ways:
To install the package from source, run:

```bash
pip install poetry && poetry install --with test
```
To install the package from a pre-built wheel:

- Build the wheel: ensure the wheel is built using Poetry.

  ```bash
  poetry build
  ```

- Install via the wheel file: install the package using the generated wheel file.

  ```bash
  pip install dist/langchain_opea-0.1.0-py3-none-any.whl
  ```
Additionally, you'll need to run the OpenAI-compatible model servers locally. Please refer to these instructions for guidance.
## ChatOPEA

The `ChatOPEA` class exposes OpenAI-compatible chat models from OPEA.

```python
from langchain_opea import ChatOPEA

llm = ChatOPEA(
    model="Intel/neural-chat-7b-v3-3",
    opea_api_key="my_secret_value",
    opea_api_base="http://localhost:9009/v1",
)
llm.invoke("Sing a ballad of LangChain.")
```
## OPEAEmbeddings

The `OPEAEmbeddings` class exposes OpenAI-compatible embeddings from OPEA.

```python
from langchain_opea import OPEAEmbeddings

embeddings = OPEAEmbeddings(
    model="BAAI/bge-large-en-v1.5",
    opea_api_key="my_secret_value",
    opea_api_base="http://localhost:6006/v1",
)
embeddings.embed_query("What is the meaning of life?")
```
## OPEALLM

The `OPEALLM` class exposes OpenAI-compatible LLMs from OPEA.

```python
from langchain_opea import OPEALLM

llm = OPEALLM(
    model="Intel/neural-chat-7b-v3-3",
    opea_api_key="my_secret_value",
    opea_api_base="http://localhost:9009/v1",
)
llm.invoke("The meaning of life is")
```
Check out the Samples for more examples using the LangChain OPEA package.