add api server #417

Open · wants to merge 2 commits into base: main
12 changes: 12 additions & 0 deletions README.md
@@ -146,6 +146,18 @@ WebUI(bot).run() # bot is the agent defined in the above code, we do not repeat the definition here for saving space.
```
Now you can chat with the Agent in the web UI. Please refer to the [examples](https://github.com/QwenLM/Qwen-Agent/blob/main/examples) directory for more usage examples.

If you need to provide API interfaces, you can quickly launch an API server using the following code:

```py
from qwen_agent.api import ChatApi
ChatApi(bot).run_apiserver() # bot is the agent defined in the above code; we do not repeat the definition here to save space.
```
Now you can chat with the Agent via the API:

```bash
curl -vvv http://127.0.0.1:8080/chat -H "Content-Type: application/json" -d '{"messages": [{"role":"user","content":"Introduce yourself"}],"stream":true}'
```
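When `"stream": true` is set, the response body is a Server-Sent Events stream: one JSON payload per `data:` line, terminated by a `data: [DONE]` sentinel. Below is a minimal client-side sketch of parsing such a body; the `parse_sse_body` helper is illustrative only, not part of Qwen-Agent:

```python
import json

def parse_sse_body(body: str):
    """Collect the JSON payloads from an SSE response body.

    Stops at the `data: [DONE]` sentinel emitted by the server.
    """
    events = []
    for line in body.splitlines():
        if not line.startswith('data: '):
            continue  # skip the blank separator lines between events
        payload = line[len('data: '):]
        if payload == '[DONE]':
            break
        events.append(json.loads(payload))
    return events

body = 'data: {"role": "assistant", "chunk": "Hi"}\n\ndata: [DONE]\n\n'
print(parse_sse_body(body))  # [{'role': 'assistant', 'chunk': 'Hi'}]
```

In a real client you would feed the helper lines read incrementally from the HTTP response rather than a complete string.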

# FAQ

## Do you have function calling (aka tool calling)?
15 changes: 14 additions & 1 deletion README_CN.md
@@ -17,10 +17,11 @@ Qwen-Agent是一个开发框架。开发者可基于本框架开发Agent应用

- 从 PyPI 安装稳定版本:
```bash
pip install -U "qwen-agent[rag,code_interpreter,python_executor,gui]"
pip install -U "qwen-agent[rag,code_interpreter,python_executor,gui,apiserver]"
# 或者,使用 `pip install -U qwen-agent` 来安装最小依赖。
# 可使用双括号指定如下的可选依赖:
# [gui] 用于提供基于 Gradio 的 GUI 支持;
# [apiserver] 用于提供apiserver的支持;
# [rag] 用于支持 RAG;
# [code_interpreter] 用于提供代码解释器相关支持;
# [python_executor] 用于支持 Qwen2.5-Math 基于工具的推理。
@@ -147,6 +148,18 @@ WebUI(bot).run() # bot is the agent defined in the above code, we do not repeat

现在您可以在Web UI中和Agent对话了。更多使用示例,请参阅[examples](./examples)目录。

如果需要提供api接口,可以使用以下代码快速启动apiserver:

```py
from qwen_agent.api import ChatApi
ChatApi(bot).run_apiserver() # bot is the agent defined in the above code; we do not repeat the definition here to save space.
```
现在您可以通过api接口和Agent对话了。

```bash
curl -vvv http://127.0.0.1:8080/chat -H "Content-Type: application/json" -d '{"messages": [{"role":"user","content":"介绍一下自己"}],"stream":true}'
```

# FAQ

## 支持函数调用(也称为工具调用)吗?
3 changes: 3 additions & 0 deletions qwen_agent/api/__init__.py
@@ -0,0 +1,3 @@
from .chat_api import ChatApi

__all__ = ['ChatApi']
33 changes: 33 additions & 0 deletions qwen_agent/api/api_server.py
@@ -0,0 +1,33 @@

import json
from flask import Flask, Response, request, jsonify


app = Flask("apiserver")


def event_stream(items):
    for item in items:
        msg = json.dumps(item, ensure_ascii=False)
        yield f'data: {msg}\n\n'
    yield 'data: [DONE]\n\n'


@app.route("/chat", methods=['POST'])
def chat():
    if not request.is_json:
        return jsonify({"status": 400}), 400
    data = request.json
    if 'messages' not in data:
        return jsonify({"status": 400}), 400
    messages = data['messages']
    stream = data.get("stream", False)
    if stream:
        return Response(event_stream(app.chat.chat(messages, stream)), mimetype='text/event-stream')
    else:
        return jsonify({"messages": list(app.chat.chat(messages, stream))}), 200


def start_apiserver(chat, server_host='0.0.0.0', server_port=8080):
    app.chat = chat
    app.run(server_host, server_port)
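For reference, `event_stream` serializes each item as one SSE `data:` event and then emits a `[DONE]` sentinel so clients can detect the end of the stream. A standalone sketch (mirroring the generator in this file, not importing the PR's module) of what a one-item stream produces:

```python
import json

def event_stream(items):
    # standalone mirror of the PR's generator: one `data:` event per item,
    # then a `[DONE]` sentinel marking the end of the stream
    for item in items:
        msg = json.dumps(item, ensure_ascii=False)
        yield f'data: {msg}\n\n'
    yield 'data: [DONE]\n\n'

lines = list(event_stream([{'role': 'assistant', 'content': 'hi'}]))
print(lines[0])  # data: {"role": "assistant", "content": "hi"}
```

The double `\n\n` is what separates events in the `text/event-stream` format.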
71 changes: 71 additions & 0 deletions qwen_agent/api/chat_api.py
@@ -0,0 +1,71 @@

Collaborator: This feature is better registered as an llm class (refer to the existing pattern): when an agent is configured to use this llm, the llm server would start automatically and interaction would go through it.

Author: Isn't LLM_REGISTRY for adding supported backend llm interfaces? What this PR adds is an API exposed to frontend callers after qwen-agent has been integrated.

Author: It is at the same level as WebUI: WebUI provides a web interface for users, while this PR provides an HTTP API for users to call. Concretely, it implements what was mentioned in #247 (comment):

> (Recommended, but qwen-agent does not ship this feature yet; in development) Learn FastAPI and use it to wrap qwen-agent into an HTTP API service (if you run into blocking issues, you can solve them with multiple threads).

Collaborator: Got it. I will test it as soon as possible; if there are no problems, I will merge this PR.


class ChatApi:

    def __init__(self, agent):
        self.agent = agent

    def run_webui(self, chatbot_config=None, server_name='0.0.0.0', server_port=7860):
        from qwen_agent.gui import WebUI
        WebUI(
            self.agent,
            chatbot_config=chatbot_config,
        ).run(server_name=server_name, server_port=server_port)

    def run_apiserver(self, server_name='0.0.0.0', server_port=8080):
        from .api_server import start_apiserver
        start_apiserver(self, server_name, server_port)

    def add_type(self, item):
        if 'function_call' in item:
            item['type'] = 'function_call'
        elif item['role'] == 'function':
            item['type'] = 'function_call_output'
        elif 'chunk' in item:
            item['type'] = 'chunk'
        else:
            item['type'] = 'message'
        return item

    def gen_stream(self, response, add_full_msg=False):
        last = []
        last_msg = ""
        for rsp in response:
            now = rsp[-1]
            now_is_msg = 'function_call' not in now and now['role'] != 'function'
            is_new_line = len(last) != len(rsp)
            if is_new_line and last:
                res = self.add_type(last[-1])
                last_msg = ''
                if add_full_msg or res['type'] != 'message':
                    yield res
            if now_is_msg:
                msg = now['content']
                assert msg.startswith(last_msg)
                stream_msg = msg[len(last_msg):]
                yield self.add_type({'role': now['role'], 'content': '', 'chunk': stream_msg})
                last_msg = msg
            last = rsp

        if last:
            res = self.add_type(last[-1])
            last_msg = ''
            if add_full_msg or res['type'] != 'message':
                yield res

    def gen(self, response):
        last = None
        for rsp in response:
            if last is not None:
                if len(last) != len(rsp):
                    yield last[-1]
            last = rsp
        if last:
            yield last[-1]

    def chat(self, messages, stream=True):
        response = self.agent.run(messages)
        if stream:
            return self.gen_stream(response)
        else:
            return self.gen(response)
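The core of `gen_stream` is turning the agent's cumulative message snapshots into incremental chunks: each new snapshot's content must extend the previous one (hence the `assert msg.startswith(last_msg)`), and only the unsent suffix is emitted. A simplified standalone sketch of that delta logic, reduced to content strings only (not the PR's full method):

```python
def deltas(snapshots):
    """Yield only the newly appended suffix of each cumulative snapshot."""
    sent = ''
    for content in snapshots:
        assert content.startswith(sent)  # snapshots must grow monotonically
        yield content[len(sent):]
        sent = content

chunks = list(deltas(['Hel', 'Hello', 'Hello!']))
print(chunks)  # ['Hel', 'lo', '!']
```

Concatenating the chunks reproduces the final message, which is why the server can stream them as they arrive.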
3 changes: 3 additions & 0 deletions setup.py
@@ -103,6 +103,9 @@ def read_description() -> str:
            'gradio-client==1.4.0',
            'modelscope_studio==1.0.0-beta.8',
        ],
        'apiserver': [
            'flask>=3'
        ],
    },
    url='https://github.com/QwenLM/Qwen-Agent',
)