[Bug] Running GLM-4.7-Flash with the latest latest-cu128 image, using it from opencode raises the error below! #4475
Description
Checklist
- 1. I have searched related issues but cannot get the expected help.
- 2. The bug has not been fixed in the latest version.
- 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
Is something wrong with my parameters? My full configuration is:
```yaml
environment:
  CUDA_VISIBLE_DEVICES: "0,1,2,3"
  CUDA_DEVICE_ORDER: "PCI_BUS_ID"
  NCCL_DEBUG: "WARN"
  NCCL_P2P_LEVEL: "NVL"
  NCCL_P2P_DISABLE: "0"
  NCCL_SHM_DISABLE: "0"
  NCCL_IB_DISABLE: "0"
  NCCL_SOCKET_IFNAME: "eth0"
  NCCL_TIMEOUT: "1800"
  NCCL_NVLS_ENABLE: "1"
  CUDA_LAUNCH_BLOCKING: "0"
  CUDA_DEVICE_MAX_CONNECTIONS: "32"
  OMP_NUM_THREADS: "8"
  OMP_PROC_BIND: "close"
  OMP_PLACES: "cores"
  TOKENIZERS_PARALLELISM: "false"
ipc: host
shm_size: "32gb"
cpuset: "0-23,48-71"
ulimits:
  memlock:
    soft: -1
    hard: -1
  nofile:
    soft: 65536
    hard: 65536
ports:
  - "9991:8000"
volumes:
  - /data/ai/models/vllm:/models
  - /tmp/lmdeploy_cache_1:/root/.cache
command: >
  lmdeploy serve api_server
  /models/GLM-4.7-Flash-AWQ
  --model-name GLM_4.7_30B
  --server-name 0.0.0.0
  --server-port 8000
  --backend turbomind
  --tp 4
  --session-len 102400
  --log-level INFO
  --communicator nccl
```
When used from opencode, I only sent a single "你好" (hello) message, but the server produced the error below.
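As a possible client-side workaround to try (my own sketch, not a confirmed fix — `normalize_content` is a hypothetical helper, not part of lmdeploy or opencode): coerce plain-string message `content` into the OpenAI-style list-of-parts shape before sending. Whether the GLM-4.7-Flash chat template accepts that shape is an assumption to verify.

```python
# Hypothetical shim: normalize message content before POSTing to the server.
# The traceback suggests the chat template fails when content is a bare str.
def normalize_content(messages):
    normalized = []
    for message in messages:
        content = message.get("content")
        if isinstance(content, str):
            # wrap the bare string in an OpenAI-style text part
            message = {**message, "content": [{"type": "text", "text": content}]}
        normalized.append(message)
    return normalized

print(normalize_content([{"role": "user", "content": "你好"}]))
```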
Reproduction
Send a request from opencode using the configuration above.
Environment
docker: latest-cu128
Error traceback
ERROR: Exception in ASGI application
+ Exception Group Traceback (most recent call last):
| File "/opt/py3/lib/python3.10/site-packages/starlette/_utils.py", line 81, in collapse_excgroups
| yield
| File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 274, in __call__
| async with anyio.create_task_group() as task_group:
| File "/opt/py3/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 783, in __aexit__
| raise BaseExceptionGroup(
| exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/opt/py3/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 410, in run_asgi
| result = await app( # type: ignore[func-returns-value]
| File "/opt/py3/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
| return await self.app(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/fastapi/applications.py", line 1159, in __call__
| await super().__call__(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/applications.py", line 90, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
| raise exc
| File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
| await self.app(scope, receive, _send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/cors.py", line 88, in __call__
| await self.app(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
| await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| raise exc
| File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
| await app(scope, receive, sender)
| File "/opt/py3/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
| await self.app(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 660, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 680, in app
| await route.handle(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
| await self.app(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/fastapi/routing.py", line 134, in app
| await wrap_app_handling_exceptions(app, request)(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| raise exc
| File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
| await app(scope, receive, sender)
| File "/opt/py3/lib/python3.10/site-packages/fastapi/routing.py", line 121, in app
| await response(scope, receive, send)
| File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 273, in __call__
| with collapse_excgroups():
| File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
| self.gen.throw(typ, value, traceback)
| File "/opt/py3/lib/python3.10/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
| raise exc
| File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 277, in wrap
| await func()
| File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 250, in stream_response
| async for chunk in self.body_iterator:
| File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/openai/api_server.py", line 516, in completion_stream_generator
| async for res in result_generator:
| File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/core/async_engine.py", line 335, in generate
| prompt_input = await self.prompt_processor.get_prompt_input(prompt=prompt,
| File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/processors/multimodal.py", line 206, in get_prompt_input
| return await self._get_text_prompt_input(prompt=prompt,
| File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/processors/multimodal.py", line 333, in _get_text_prompt_input
| prompt = chat_template.messages2prompt(prompt,
| File "/opt/py3/lib/python3.10/site-packages/lmdeploy/model.py", line 724, in messages2prompt
| prompt = self.tokenizer.apply_chat_template(messages,
| File "/opt/py3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3063, in apply_chat_template
| rendered_chat, generation_indices = render_jinja_template(
| File "/opt/py3/lib/python3.10/site-packages/transformers/utils/chat_template_utils.py", line 555, in render_jinja_template
| rendered_chat = compiled_template.render(
| File "/opt/py3/lib/python3.10/site-packages/jinja2/environment.py", line 1295, in render
| self.environment.handle_exception()
| File "/opt/py3/lib/python3.10/site-packages/jinja2/environment.py", line 942, in handle_exception
| raise rewrite_traceback_stack(source=source)
| File "<template>", line 56, in top-level template code
| File "/opt/py3/lib/python3.10/site-packages/jinja2/sandbox.py", line 399, in call
| if not __self.is_safe_callable(__obj):
| File "/opt/py3/lib/python3.10/site-packages/jinja2/sandbox.py", line 265, in is_safe_callable
| getattr(obj, "unsafe_callable", False) or getattr(obj, "alters_data", False)
| jinja2.exceptions.UndefinedError: 'str object' has no attribute 'items'
+------------------------------------
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/py3/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 410, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/opt/py3/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
return await self.app(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/fastapi/applications.py", line 1159, in __call__
await super().__call__(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/applications.py", line 90, in __call__
await self.middleware_stack(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/cors.py", line 88, in __call__
await self.app(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/opt/py3/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 660, in __call__
await self.middleware_stack(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 680, in app
await route.handle(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/fastapi/routing.py", line 134, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/opt/py3/lib/python3.10/site-packages/fastapi/routing.py", line 121, in app
await response(scope, receive, send)
File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 273, in __call__
with collapse_excgroups():
File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "/opt/py3/lib/python3.10/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
raise exc
File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 277, in wrap
await func()
File "/opt/py3/lib/python3.10/site-packages/starlette/responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/openai/api_server.py", line 516, in completion_stream_generator
async for res in result_generator:
File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/core/async_engine.py", line 335, in generate
prompt_input = await self.prompt_processor.get_prompt_input(prompt=prompt,
File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/processors/multimodal.py", line 206, in get_prompt_input
return await self._get_text_prompt_input(prompt=prompt,
File "/opt/py3/lib/python3.10/site-packages/lmdeploy/serve/processors/multimodal.py", line 333, in _get_text_prompt_input
prompt = chat_template.messages2prompt(prompt,
File "/opt/py3/lib/python3.10/site-packages/lmdeploy/model.py", line 724, in messages2prompt
prompt = self.tokenizer.apply_chat_template(messages,
File "/opt/py3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3063, in apply_chat_template
rendered_chat, generation_indices = render_jinja_template(
File "/opt/py3/lib/python3.10/site-packages/transformers/utils/chat_template_utils.py", line 555, in render_jinja_template
rendered_chat = compiled_template.render(
File "/opt/py3/lib/python3.10/site-packages/jinja2/environment.py", line 1295, in render
self.environment.handle_exception()
File "/opt/py3/lib/python3.10/site-packages/jinja2/environment.py", line 942, in handle_exception
raise rewrite_traceback_stack(source=source)
File "<template>", line 56, in top-level template code
File "/opt/py3/lib/python3.10/site-packages/jinja2/sandbox.py", line 399, in call
if not __self.is_safe_callable(__obj):
File "/opt/py3/lib/python3.10/site-packages/jinja2/sandbox.py", line 265, in is_safe_callable
getattr(obj, "unsafe_callable", False) or getattr(obj, "alters_data", False)
jinja2.exceptions.UndefinedError: 'str object' has no attribute 'items'
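The final `UndefinedError` can be reproduced standalone, without the server. This is a sketch: the real chat template ships inside the GLM-4.7-Flash-AWQ model repo, and the `.items()` call below only mimics what its line 56 appears to do — calling a mapping method on message content that arrived as a plain string.

```python
from jinja2 import Environment
from jinja2.exceptions import UndefinedError

env = Environment()
# Hypothetical template fragment that iterates content as a mapping.
template = env.from_string("{% for k, v in content.items() %}{{ k }}{% endfor %}")

try:
    template.render(content="你好")  # plain-string content, as opencode sends it
except UndefinedError as exc:
    print(exc)  # 'str object' has no attribute 'items'
```

In Jinja2, accessing a missing attribute yields an `Undefined` object, and calling it raises exactly this error, which points at the chat template expecting structured (dict-like) content rather than a string.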