ModelScopeChatEndpoint
ModelScope (Home | GitHub) is built upon the notion of "Model-as-a-Service" (MaaS). It seeks to bring together the most advanced machine learning models from the AI community and to streamline the process of leveraging AI models in real-world applications. The core ModelScope library, open-sourced on GitHub, provides the interfaces and implementations that allow developers to perform model inference, training, and evaluation.
This guide will help you get started with the ModelScope Chat Endpoint.
Overview
Integration details
| Provider | Class | Package | Local | Serializable | Package downloads | Package latest |
| --- | --- | --- | --- | --- | --- | --- |
| ModelScope | ModelScopeChatEndpoint | langchain-modelscope-integration | ❌ | ❌ | | |
Setup
To access the ModelScope chat endpoint, you'll need to create a ModelScope account, generate an SDK token, and install the langchain-modelscope-integration package.
Credentials
Head to ModelScope to sign up and generate an SDK token. Once you've done this, set the MODELSCOPE_SDK_TOKEN environment variable:
```python
import getpass
import os

# Prompt for the token only if it isn't already set in the environment.
if not os.getenv("MODELSCOPE_SDK_TOKEN"):
    os.environ["MODELSCOPE_SDK_TOKEN"] = getpass.getpass(
        "Enter your ModelScope SDK token: "
    )
```
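If you keep credentials in a local .env file instead, a minimal sketch using the python-dotenv package (an assumption here, not a dependency of this integration) would be:

```python
# Hypothetical alternative: read MODELSCOPE_SDK_TOKEN from a local .env file.
# Assumes python-dotenv is installed (pip install python-dotenv).
from dotenv import load_dotenv

load_dotenv()  # populates os.environ from .env, including MODELSCOPE_SDK_TOKEN
```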
Installation
The LangChain ModelScope integration lives in the langchain-modelscope-integration
package:
```python
%pip install -qU langchain-modelscope-integration
```
Instantiation
Now we can instantiate our model object and generate chat completions:
```python
from langchain_modelscope import ModelScopeChatEndpoint

llm = ModelScopeChatEndpoint(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",
    temperature=0,
    max_tokens=1024,
    timeout=60,
    max_retries=2,
    # other params...
)
```
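With the model instantiated, here is a minimal sketch of generating a chat completion. The invoke call is the standard LangChain chat-model interface; the example messages below are illustrative:

```python
# Illustrative messages; any (role, content) pairs work here.
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```

Because ModelScopeChatEndpoint follows the common LangChain chat-model interface, the usual stream and batch methods should also be available.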