
[Doc][OPENAI] Both the embedding model and the summarization model use the OpenAI API; when deploying from source and running the startup script, it says model_path is missing, but these models have none. What should I fill in? The docs give no hint, and None does not work #2036

Open
CatyWong10086 opened this issue Sep 22, 2024 · 4 comments
Labels: documentation (Improvements or additions to documentation), Waiting for reply

Comments

@CatyWong10086

Search before asking

  • I have searched the issues and found no similar feature request.

Description

Both the embedding model and the summarization model use the OpenAI API. When deploying from source and running the startup script, it complains that model_path is missing, but these models have no local path. What should I fill in? The documentation gives no hint, and setting it to None does not work either.
(screenshot of the missing model_path error)

Documentation Links

https://docs.dbgpt.site/docs/faq/install

Are you willing to submit a PR?

  • Yes, I am willing to submit a PR!
@CatyWong10086 added the documentation (Improvements or additions to documentation) and Waiting for reply labels Sep 22, 2024
@Aries-ckt
Collaborator

For an OpenAI embedding model:

#EMBEDDING_MODEL=proxy_openai
#proxy_openai_proxy_server_url=http://xxx/api/openai/v1
#proxy_openai_proxy_api_key=sk-xx
#proxy_openai_proxy_backend=text-embedding-ada-002

For an OpenAI LLM service:

PROXY_API_KEY=sk-xxx
PROXY_SERVER_URL=http://xxx/api/openai/v1
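Taken together, the replies in this thread amount to roughly the following .env fragment. This is a sketch assembled from the comments above, not an official template; the URLs and sk-xx keys are placeholders to replace with your own values:

```shell
# Embedding model via an OpenAI-compatible endpoint
EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=http://xxx/api/openai/v1
proxy_openai_proxy_api_key=sk-xx
proxy_openai_proxy_backend=text-embedding-ada-002

# LLM service via the same kind of endpoint
LLM_MODEL=proxyllm
PROXY_API_KEY=sk-xxx
PROXY_SERVER_URL=http://xxx/api/openai/v1
```

With a proxied model there is no local model_path to set, which is why the settings above use server URLs and API keys instead.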

@ychuest

ychuest commented Sep 27, 2024

For an OpenAI embedding model:

#EMBEDDING_MODEL=proxy_openai
#proxy_openai_proxy_server_url=http://xxx/api/openai/v1
#proxy_openai_proxy_api_key=sk-xx
#proxy_openai_proxy_backend=text-embedding-ada-002

For an OpenAI LLM service:

PROXY_API_KEY=sk-xxx
PROXY_SERVER_URL=http://xxx/api/openai/v1

I fine-tuned qwen2.5-14B-instruct with swift and deployed it locally. I can reach the deployed fine-tuned model with:

curl http://172.16.15.242:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2_5-14b-instruct",
    "messages": [{"role": "user", "content": "Who are you?"}],
    "stream": true
  }'

But how do I get DB-GPT to call this locally deployed model service?
I previously tried configuring it the same way as the OpenAI API, but it errors out. The error message:
(screenshot of the error message)
How can I solve this? It has been puzzling me for a long time, and I hope it can be resolved.

@riverhell

LLM_MODEL=proxyllm
PROXY_SERVER_URL=http://172.16.15.242:8000/v1/chat/completions
PROXYLLM_BACKEND=qwen2_5-14b-instruct
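A minimal sketch of applying this reply, assuming the DB-GPT .env file sits in the current directory: append the three settings and confirm the backend name matches the "model" field that worked in the earlier curl test.

```shell
# Append the proxy settings from the reply above to .env.
# PROXYLLM_BACKEND must match the model name the local server serves.
cat >> .env <<'EOF'
LLM_MODEL=proxyllm
PROXY_SERVER_URL=http://172.16.15.242:8000/v1/chat/completions
PROXYLLM_BACKEND=qwen2_5-14b-instruct
EOF
grep '^PROXYLLM_BACKEND=' .env
```

If the grep line does not print the exact model name used in the curl test, DB-GPT will be asking the server for a model it does not serve.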

@ychuest

ychuest commented Sep 27, 2024

LLM_MODEL=proxyllm
PROXY_SERVER_URL=http://172.16.15.242:8000/v1/chat/completions
PROXYLLM_BACKEND=qwen2_5-14b-instruct

Thank you, solved!


4 participants