MoviePilot/test_openai_copilot_patch.py
jxxghp 34e7c4ac14 feat: enhance openai-compatible provider support and patch responses API instructions handling
- Add compatibility patch for langchain-openai responses API to ensure system messages are extracted as top-level instructions, addressing Codex endpoint requirements.
- Update provider list: add Alibaba, Volcengine, and Tencent TokenHub; adjust SiliconFlow and MiniMax endpoints; refine provider ordering and model list strategies.
- Extend models.dev-only listing logic for providers lacking stable models.list endpoints.
- Increase models.dev cache TTL for improved efficiency.
- Add tests for openai responses API and streaming compatibility patches.
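The first bullet above boils down to one transformation: a leading system message is pulled out of the message list and promoted to a top-level `instructions` field, which is how the Responses API expects system-level guidance. A minimal standalone sketch of that idea (not MoviePilot's actual helper; the function name and dict shapes here are illustrative):

```python
# Illustrative sketch only -- not the real _patch_openai_responses_instructions_support.
# It shows the shape of the transformation: a leading "system" message becomes
# a top-level "instructions" value, and the remaining messages become "input".
def extract_instructions(messages):
    """messages: list of {"role": ..., "content": ...} dicts."""
    if messages and messages[0].get("role") == "system":
        system, *rest = messages
        return {"instructions": system["content"], "input": rest}
    # No leading system message: nothing to hoist.
    return {"instructions": None, "input": list(messages)}

payload = extract_instructions([
    {"role": "system", "content": "Hello system"},
    {"role": "user", "content": "Hello user"},
])
print(payload["instructions"])  # -> Hello system
```

The real patch presumably performs this rewrite inside langchain-openai's request-building path so callers can keep passing ordinary `SystemMessage` objects, as the test below does.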
2026-04-30 11:32:55 +08:00

10 lines · 514 B · Python

import json

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

from app.agent.llm.helper import _patch_openai_responses_instructions_support

# Apply the compatibility patch before constructing the model so the patched
# payload-building logic is in effect.
_patch_openai_responses_instructions_support()

model = ChatOpenAI(
    model="gpt-4o",
    openai_api_key="sk-123",  # dummy key; no request is actually sent
    base_url="https://api.githubcopilot.com",
    stream_usage=True,
)

# _get_request_payload builds the request body locally (no network call),
# letting us inspect how the system message is serialized.
payload = model._get_request_payload([SystemMessage(content="Hello system"), HumanMessage(content="Hello user")])
print(json.dumps(payload, indent=2))