community[patch]: Invoke on_llm_new_token callback before yielding chunk (#24938)

**Description**: Invoke the `on_llm_new_token` callback before yielding the chunk in streaming mode.
**Issue**: [#16913](#16913)
anneli-samuel authored Aug 1, 2024
1 parent ff6274d commit 2204d8c
Showing 1 changed file with 1 addition and 1 deletion.
libs/community/langchain_community/chat_models/mlx.py
@@ -186,9 +186,9 @@ def _stream(
             # yield text, if any
             if text:
                 chunk = ChatGenerationChunk(message=AIMessageChunk(content=text))
-                yield chunk
                 if run_manager:
                     run_manager.on_llm_new_token(text, chunk=chunk)
+                yield chunk

             # break if stop sequence found
             if token == eos_token_id or (stop is not None and text in stop):
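For context, here is a minimal, hypothetical sketch (not the actual `mlx.py` code) of the pattern this change enforces: in a streaming generator, notify the callback before yielding each chunk, so the callback has already run by the time the consumer receives the text. The names `stream_tokens` and `on_new_token` are illustrative stand-ins, not part of langchain_community.

```python
from typing import Callable, Iterator, Optional


def stream_tokens(
    tokens: list[str],
    on_new_token: Optional[Callable[[str], None]] = None,
) -> Iterator[str]:
    """Yield tokens one at a time, invoking the callback before each yield."""
    for text in tokens:
        if on_new_token is not None:
            # Callback fires first, mirroring the reordering in this commit.
            on_new_token(text)
        yield text


received: list[str] = []
for chunk in stream_tokens(["Hello", ", ", "world"], received.append):
    # Because the callback ran before the yield, the token is already
    # recorded by the time the consumer sees the chunk.
    assert chunk in received
    print(chunk, end="")
print()
```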
