fix(llm): prioritize tool calls over text when available_functions is None#4872

Open
alvinttang wants to merge 1 commit into crewAIInc:main from alvinttang:fix/tool-calls-precedence-4788
Conversation


@alvinttang alvinttang commented Mar 14, 2026

Summary

Fixes #4788 — native tool calls are silently discarded when the LLM returns both text content and tool calls and available_functions is None.

Root cause: In _handle_non_streaming_response (and its async counterpart), the check at step 5 used (not tool_calls or not available_functions) and text_response, which matched when available_functions=None even if tool_calls were present — returning the text response and discarding the tool calls.

Fix: Reorder the conditional checks so that the "tool calls without available_functions" path is evaluated before the text-return path. The condition for returning text is also tightened from (not tool_calls or not available_functions) to just not tool_calls, making the logic clearer and eliminating the ambiguity.

Both sync (_handle_non_streaming_response) and async (_ahandle_non_streaming_response) code paths are fixed.

Changes

  • lib/crewai/src/crewai/llm.py: Swap steps 5 and 6 in both sync and async response handlers so tool calls take precedence over text responses when available_functions is None.

How it was before (buggy)

# Step 5 — matches when tool_calls exist but available_functions is None, discarding tool calls
if (not tool_calls or not available_functions) and text_response:
    return text_response  # BUG: tool calls lost

# Step 6 — never reached in the buggy scenario
if tool_calls and not available_functions:
    return tool_calls

How it is now (fixed)

# Step 5 — tool calls without available_functions: return them for external execution
if tool_calls and not available_functions:
    return tool_calls

# Step 6 — no tool calls: return text
if not tool_calls and text_response:
    return text_response
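The difference between the two orderings can be exercised in isolation. The sketch below is illustrative only — the function names are stand-ins, not the actual crewai handlers — but the conditions are copied verbatim from the snippets above:

```python
def old_dispatch(tool_calls, text_response, available_functions):
    # Buggy order: the text check fires first, and its condition also
    # matches when tool_calls exist but available_functions is None.
    if (not tool_calls or not available_functions) and text_response:
        return text_response
    if tool_calls and not available_functions:
        return tool_calls
    return None

def new_dispatch(tool_calls, text_response, available_functions):
    # Fixed order: tool calls take precedence; text is returned only
    # when there are no tool calls at all.
    if tool_calls and not available_functions:
        return tool_calls
    if not tool_calls and text_response:
        return text_response
    return None

calls = [{"id": "1", "function": {"name": "search", "arguments": "{}"}}]

# Model returned both text and tool calls, available_functions=None:
assert old_dispatch(calls, "thinking...", None) == "thinking..."  # bug: calls lost
assert new_dispatch(calls, "thinking...", None) == calls          # fix: calls returned

# Pure text responses behave the same under both orderings:
assert new_dispatch(None, "plain answer", None) == "plain answer"
```

Note that the old condition `(not tool_calls or not available_functions)` is true whenever either operand is missing, which is why the text branch swallowed the tool-call case.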

Test plan

  • Verify with an LLM that returns both text + tool calls (e.g. Claude via OpenRouter) that tool calls are properly returned and executed
  • Verify that pure text responses (no tool calls) still work correctly
  • Verify that tool calls with available_functions provided still execute normally
  • Run existing test suite: pytest lib/crewai/tests/
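The first three bullets can also be captured as a small pytest-style regression test. This is a sketch under assumptions — `dispatch` here is a stand-in for the fixed branch logic in `_handle_non_streaming_response`, not the real method, and these test names do not exist in the suite:

```python
def dispatch(tool_calls, text_response, available_functions):
    """Stand-in for the fixed precedence: tool calls checked before text."""
    if tool_calls and not available_functions:
        return tool_calls
    if not tool_calls and text_response:
        return text_response
    return None

CALLS = [{"function": {"name": "lookup", "arguments": "{}"}}]

def test_tool_calls_win_over_text():
    # The #4788 regression: both text and tool calls, no available_functions.
    assert dispatch(CALLS, "partial text", None) == CALLS

def test_pure_text_still_returned():
    # No tool calls at all: the text fallback is unchanged.
    assert dispatch(None, "plain answer", None) == "plain answer"

def test_nothing_to_return():
    assert dispatch(None, None, None) is None
```

A real test would instead mock the provider response object and assert on the handler's return value; the assertions above only pin down the intended precedence.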

Note: PR #4806 addresses the same issue. This PR takes a minimal approach — it reorders the checks without adding emit-call-event side effects to the tool-call return path, keeping the behavior change focused on the fix.

🤖 Generated with Claude Code


Note

Medium Risk
Changes LLM response handling to return tool_calls instead of text when tool calls are present but available_functions is missing, which can change return types for some callers.

Overview
Fixes non-streaming LLM response handling so native tool_calls are no longer dropped when the model returns both text and tool calls but available_functions is None.

In both _handle_non_streaming_response and _ahandle_non_streaming_response, the conditional order is swapped and the text-return condition is tightened to not tool_calls, ensuring tool calls are returned for external execution before falling back to the text response.

Written by Cursor Bugbot for commit d97a9bf.

…ctions is None

When an LLM returns both text content and tool calls, the response
handler previously checked for text responses first (step 5), which
caused tool calls to be silently discarded when available_functions
was None. This reorders the checks so tool calls take precedence
over text responses, ensuring they are properly returned to the
caller (e.g. crew_agent_executor) for execution.

Fixes both sync (_handle_non_streaming_response) and async
(_ahandle_non_streaming_response) code paths.

Fixes crewAIInc#4788

Development

Successfully merging this pull request may close these issues.

[BUG] Native tool calls are discarded if LLM returns a text response