Files
MoviePilot/test_openai_stream_patch.py
jxxghp 34e7c4ac14 feat: enhance openai-compatible provider support and patch responses API instructions handling
- Add compatibility patch for langchain-openai responses API to ensure system messages are extracted as top-level instructions, addressing Codex endpoint requirements.
- Update provider list: add Alibaba, Volcengine, and Tencent TokenHub; adjust SiliconFlow and MiniMax endpoints; refine provider ordering and model list strategies.
- Extend models.dev-only listing logic for providers lacking stable models.list endpoints.
- Increase models.dev cache TTL for improved efficiency.
- Add tests for openai responses API and streaming compatibility patches.
2026-04-30 11:32:55 +08:00


import asyncio

from app.agent.llm.helper import LLMHelper


async def run():
    # Request a non-streaming LLM instance for an openai-compatible provider.
    llm = await LLMHelper.get_llm(
        streaming=False,
        provider="chatgpt",
        model="gpt-5.1-codex",
    )
    # Verify the streaming flag was honored by the patched provider.
    print("streaming:", llm.streaming)


asyncio.run(run())