
fix: handle GPT-5.x models not supporting the stop API parameter#5144

Open
lucasgomide wants to merge 1 commit into main from luzk/gpt5-stop-parameter-support

Conversation

@lucasgomide
Contributor

@lucasgomide lucasgomide commented Mar 27, 2026

Note

Medium Risk
Changes request parameter construction and retry logic for LiteLLM/OpenAI calls; incorrect model detection could alter stop-word behavior or truncation for some models.

Overview
Prevents GPT‑5 family models from receiving the unsupported stop API parameter by gating it behind supports_stop_words() in both the LiteLLM fallback (LLM._prepare_completion_params) and native OpenAI provider (OpenAICompletion.supports_stop_words).

Improves the “retry without stop” detection to also match LiteLLM’s UnsupportedParamsError message format, and adds unit/integration coverage (including a VCR cassette) to ensure GPT‑5 calls succeed while still applying stop words client-side.

Written by Cursor Bugbot for commit 5ed8633.

@mintlify

mintlify bot commented Mar 27, 2026

Preview deployment for your docs.

| Project | Status | Preview | Updated (UTC) |
| --- | --- | --- | --- |
| crewai | 🟢 Ready | View Preview | Mar 27, 2026, 3:05 PM |

@lucasgomide lucasgomide changed the title Luzk/gpt5 stop parameter support fix: handle GPT-5.x models not supporting the stop API parameter Mar 27, 2026
@lucasgomide lucasgomide force-pushed the luzk/gpt5-stop-parameter-support branch from f13d617 to 31f52ce Compare March 27, 2026 15:05
@lucasgomide lucasgomide force-pushed the luzk/gpt5-stop-parameter-support branch from 31f52ce to 0553210 Compare March 27, 2026 16:20

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


@lucasgomide lucasgomide force-pushed the luzk/gpt5-stop-parameter-support branch from 0553210 to 73b6698 Compare March 27, 2026 16:30
GPT-5.x models reject the `stop` parameter at the API level with "Unsupported parameter: 'stop' is not supported with this model". This breaks CrewAI executions when routing through LiteLLM (e.g. via OpenAI-compatible gateways like Asimov), because the LiteLLM fallback path always includes `stop` in the API request params.

The native OpenAI provider was unaffected because it never sends `stop` to the API — it applies stop words client-side via `_apply_stop_words()`. However, when the request goes through LiteLLM (custom endpoints, proxy gateways), `stop` is sent as an API parameter and GPT-5.x rejects it.
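Client-side stop-word application can be sketched as follows — a simplified stand-in for `_apply_stop_words()`, not the provider's actual code:

```python
# Hypothetical sketch of client-side stop-word handling: truncate the
# completion text at the earliest stop word, mimicking what the API's
# `stop` parameter would do server-side.
def apply_stop_words(text: str, stop_words: list[str]) -> str:
    for word in stop_words:
        idx = text.find(word)
        if idx != -1:
            text = text[:idx]
    return text
```

Because the truncation happens after the response is received, it works identically whether or not the model accepted a `stop` parameter in the request.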

Additionally, the existing retry logic that catches this error only matched the OpenAI API error format ("Unsupported parameter") but missed LiteLLM's own pre-validation error format ("does not support parameters"), so the self-healing retry never triggered for LiteLLM-routed calls.
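The broadened detection could look like this sketch — the marker strings come from the two error formats quoted above, but the function name and structure are hypothetical:

```python
# Hypothetical sketch: recognize "model rejected `stop`" errors in both
# the OpenAI API format and LiteLLM's pre-validation format, so the
# call can be retried with the parameter removed.
UNSUPPORTED_STOP_MARKERS = (
    "Unsupported parameter",        # OpenAI API error format
    "does not support parameters",  # LiteLLM UnsupportedParamsError format
)


def should_retry_without_stop(error_message: str) -> bool:
    return "stop" in error_message and any(
        marker in error_message for marker in UNSUPPORTED_STOP_MARKERS
    )
```

Requiring the literal substring `stop` alongside a marker keeps the retry from firing on unrelated unsupported-parameter errors.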
@lucasgomide lucasgomide force-pushed the luzk/gpt5-stop-parameter-support branch from 73b6698 to 5ed8633 Compare March 27, 2026 19:32

From the added test coverage:

```python
with patch("litellm.completion", side_effect=mock_completion):
    with caplog.at_level(logging.INFO):
        result = llm.call("What is the capital of France?")
```
