This document is published with MrDoc.
codex
March 24, 2026, 03:44
admin
#### Install Codex

```shell
npm i -g @openai/codex
```

#### Install LiteLLM (used as a relay)

Docs: https://docs.litellm.ai/docs/

```shell
# Clear any proxy environment variables first
unset http_proxy https_proxy all_proxy HTTP_PROXY HTTPS_PROXY ALL_PROXY

uv init
uv add litellm
uv tool install 'litellm[proxy]'
```

#### Configure LiteLLM

```shell
vim litellm-config/config.yaml
```

```yaml
model_list:
  # GLM-4.7
  - model_name: glm-4.7
    litellm_params:
      model: zai/glm-4.7
      api_base: https://open.bigmodel.cn/api/coding/paas/v4
      api_key: xxxxxxxxxxxxxxxxxxx
  # GLM-5
  - model_name: glm-5.1
    litellm_params:
      model: zai/glm-5.1
      api_base: https://open.bigmodel.cn/api/coding/paas/v4
      api_key: xxxxxxxxxxxxxxxxxxx

litellm_settings:
  drop_params: True
```

#### Start LiteLLM

```shell
# Run in the foreground
litellm --config config.yaml --port 9002

# Or run in the background
nohup litellm --config config.yaml --port 9002 &
```

#### Configure Codex

```shell
vim .codex/config.toml
```

```toml
model = "glm-5.1"
model_provider = "litellm-glm"

[model_providers.litellm-glm]
name = "litellm-glm"
base_url = "http://127.0.0.1:9002"
experimental_bearer_token = "any-key"
requires_openai_auth = false
wire_api = "responses"

[projects."/Users/bao"]
trust_level = "trusted"

[projects."/Users/bao/auto-reply-workflow"]
trust_level = "trusted"

[tui.model_availability_nux]
"gpt-5.5" = 3
```