A step-by-step guide to integrating new language models into the system across the Python, Node.js, and Next.js stacks.
## Integration Flow Overview
### OpenRouter Integration (Python stack)

To route a new model through OpenRouter, update the following paths (see the sketch after this list):

- `src/chatflow_langchain/service/` (`custom_gpt`, `title`, `simple_chat`): add a service module for the new model
- `src/chatflow_langchain/controller/`: register and route the new model
- `src/prompt/langchain/`: add prompt logic/templates
- `src/customlib/langchain/`: callbacks and token tracking
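As a rough illustration of the service layer, here is a minimal sketch of what an OpenRouter-backed chat service might look like. The module path, class name, default model, and environment variable are assumptions made for illustration; the real services under `src/chatflow_langchain/service/` carry their own streaming callbacks and token tracking.

```python
# Hypothetical sketch only; not the actual service implementation.
import os

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate


class NewModelSimpleChatService:
    """Illustrative chat service for a model exposed through OpenRouter."""

    def __init__(self, model_name: str = "meta-llama/llama-3.1-70b-instruct"):
        # OpenRouter exposes an OpenAI-compatible endpoint, so ChatOpenAI
        # only needs its base_url and API key overridden.
        self.llm = ChatOpenAI(
            model=model_name,
            base_url="https://openrouter.ai/api/v1",
            api_key=os.environ["OPENROUTER_API_KEY"],
            temperature=0.7,
            streaming=True,
        )
        # Prompt templates for the real services live under src/prompt/langchain/.
        self.prompt = ChatPromptTemplate.from_messages(
            [("system", "You are a helpful assistant."), ("human", "{question}")]
        )

    def run(self, question: str) -> str:
        chain = self.prompt | self.llm
        return chain.invoke({"question": question}).content
```

A service like this would then be registered with the controller layer (`src/chatflow_langchain/controller/`) so incoming requests can be routed to it.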
### Direct SDK Integration (Python stack)

To integrate a model via its provider's SDK directly, update the following paths (see the sketch after this list):

- `requirements.txt`: add the required SDK (`openai`, `anthropic`, `mistral`, etc.)
- `src/chatflow_langchain/service/model/config.py`: add the model's configuration
- `chatopenai_cache.py`: OpenAI-specific cache logic
- `src/chatflow_langchain/controller/`: register and route the new model
- `src/prompt/langchain/`: add prompt logic/templates
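The direct-SDK path comes down to a per-model configuration entry plus a cached SDK client so the client is not rebuilt on every request. The sketch below assumes the `openai` SDK and an invented `MODEL_CONFIG` shape; the real structures in `config.py` and `chatopenai_cache.py` may differ.

```python
# Hypothetical sketch of a model registry (config.py-style) and a cached
# client factory (in the spirit of chatopenai_cache.py). Shapes are assumptions.
import os
from functools import lru_cache

from openai import OpenAI

# Per-model settings keyed by an internal model code (illustrative values).
MODEL_CONFIG = {
    "GPT_4O": {"model": "gpt-4o", "max_tokens": 4096},
    "GPT_4O_MINI": {"model": "gpt-4o-mini", "max_tokens": 4096},
}


@lru_cache(maxsize=None)
def get_openai_client() -> OpenAI:
    # Build the SDK client once and reuse it across requests.
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"])


def chat(model_code: str, user_message: str) -> str:
    cfg = MODEL_CONFIG[model_code]
    response = get_openai_client().chat.completions.create(
        model=cfg["model"],
        max_tokens=cfg["max_tokens"],
        messages=[{"role": "user", "content": user_message}],
    )
    return response.choices[0].message.content
```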
### Node.js Backend Model Configuration

On the Node.js backend, register the new model's constants and API key check (see the sketch after this list):

- `src/config/constants/aimodal.js`: add the model to `OPENROUTER_PROVIDER` and `MODAL_NAME`
- `src/config/constants/common.js`: add the model's `MODEL_CODE`
- `src/services/company.js`: update `checkApiKey(...)` to support the new model
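The sketch below shows, in rough terms, the kind of entries these constants files gain and how `checkApiKey(...)` might branch for the new provider. All keys, values, and the `checkApiKey` signature are assumptions; consult the actual files for the real shapes.

```js
// Hypothetical sketch only; real keys, values, and signatures live in the files above.

// src/config/constants/aimodal.js -- provider and model-name constants.
const OPENROUTER_PROVIDER = {
  NEW_PROVIDER: 'new-provider',   // illustrative provider slug
};

const MODAL_NAME = {
  NEW_MODEL: 'new-model-v1',      // illustrative display/lookup name
};

// src/config/constants/common.js -- internal model code.
const MODEL_CODE = {
  NEW_MODEL: 'NEW_MODEL_V1',      // illustrative internal code
};

// src/services/company.js -- assumed shape of the API key check; the real
// checkApiKey(...) signature and validation calls may differ.
async function checkApiKey(modelCode, apiKey) {
  if (modelCode === MODEL_CODE.NEW_MODEL) {
    // Validate with a cheap request against the provider's API here.
    return typeof apiKey === 'string' && apiKey.length > 0;
  }
  // ...existing provider checks...
  return false;
}

module.exports = { OPENROUTER_PROVIDER, MODAL_NAME, MODEL_CODE, checkApiKey };
```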
### Next.js Frontend Integration

On the Next.js frontend, add the model's assets, metadata, and capability flags (see the sketch after this list):

- `public/` directory: add the model's logo/icon
- `src/utils/constant.ts`: add the model's image and metadata
- `src/utils/helper.ts`: update the `allowImageGeneration` and `allowImageConversation` flags
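On the frontend the change is mostly metadata: a constants entry pointing at the logo added under `public/`, plus the capability flags. The sketch below invents the `AiModelMeta` shape and the flag logic purely for illustration; the real definitions in `constant.ts` and `helper.ts` may look different.

```ts
// Hypothetical sketch only; shapes and model codes are illustrative.

export interface AiModelMeta {
  code: string;   // should match the backend MODEL_CODE
  label: string;  // display name in the model picker
  image: string;  // logo served from public/
}

// constant.ts-style entry: model image and metadata.
export const NEW_MODEL: AiModelMeta = {
  code: 'NEW_MODEL_V1',
  label: 'New Model v1',
  image: '/new-model-logo.svg', // icon added under public/
};

// helper.ts-style capability flags for the new model (assumed semantics).
export const allowImageGeneration = (code: string): boolean =>
  ['DALL_E_3', NEW_MODEL.code].includes(code);

export const allowImageConversation = (code: string): boolean =>
  ['GPT_4O', NEW_MODEL.code].includes(code);
```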
## Summary of Changes by Path

| Path | Purpose |
|---|---|
| `src/chatflow_langchain/service/` | Define services per model |
| `src/chatflow_langchain/controller/` | Register & route models |
| `src/prompt/langchain/` | Add prompt logic/templates |
| `src/customlib/langchain/` | Callbacks & token tracking |
| `requirements.txt` | Add required SDKs |
| `chatopenai_cache.py` | OpenAI-specific cache logic |
| `src/config/constants/` | Node model constants |
| `src/services/company.js` | API key checker integration |
| `src/utils/constant.ts` | Model image and metadata |
| `public/` | Model logos/icons |