docs: Standardize Tongyi #25103

Merged: 2 commits, Aug 6, 2024
89 changes: 79 additions & 10 deletions libs/community/langchain_community/llms/tongyi.py
@@ -158,33 +158,102 @@ async def agenerate_with_last_element_mark(


class Tongyi(BaseLLM):
"""Tongyi Qwen large language models.
"""Tongyi completion model integration.

To use, you should have the ``dashscope`` python package installed, and the
environment variable ``DASHSCOPE_API_KEY`` set with your API key, or pass
it as a named parameter to the constructor.
Setup:
Install ``dashscope`` and set environment variables ``DASHSCOPE_API_KEY``.

Example:
.. code-block:: bash

pip install dashscope
export DASHSCOPE_API_KEY="your-api-key"

Key init args — completion params:
model: str
Name of Tongyi model to use.
top_p: float
Total probability mass of tokens to consider at each step.
streaming: bool
Whether to stream the results or not.

Key init args — client params:
api_key: Optional[str]
Dashscope API KEY. If not passed in will be read from env var DASHSCOPE_API_KEY.
max_retries: int
Maximum number of retries to make when generating.

See full list of supported init args and their descriptions in the params section.

Instantiate:
.. code-block:: python

from langchain_community.llms import Tongyi
tongyi = tongyi()
"""

llm = Tongyi(
model="qwen-max",
# top_p="...",
# api_key="...",
# other params...
)

Invoke:
.. code-block:: python

messages = [
("system", "你是一名专业的翻译家,可以将用户的中文翻译为英文。"),
("human", "我喜欢编程。"),
]
llm.invoke(messages)

.. code-block:: python

'I enjoy programming.'

Stream:
.. code-block:: python

for chunk in llm.stream(messages):
print(chunk)

.. code-block:: python

I
enjoy
programming
.

Async:
.. code-block:: python

await llm.ainvoke(messages)

# stream:
# async for chunk in llm.astream(messages):
# print(chunk)

# batch:
# await llm.abatch([messages])

.. code-block:: python

'I enjoy programming.'

""" # noqa: E501

@property
def lc_secrets(self) -> Dict[str, str]:
return {"dashscope_api_key": "DASHSCOPE_API_KEY"}

client: Any #: :meta private:
model_name: str = Field(default="qwen-plus", alias="model")
"""Model name to use."""
model_kwargs: Dict[str, Any] = Field(default_factory=dict)

top_p: float = 0.8
"""Total probability mass of tokens to consider at each step."""

dashscope_api_key: Optional[str] = Field(default=None, alias="api_key")
"""DashScope API key provided by Alibaba Cloud."""

streaming: bool = False
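The `Field(..., alias=...)` declarations above are what let callers write `Tongyi(model=..., api_key=...)` instead of the longer field names. A minimal sketch of that behavior, using a toy Pydantic v2 model (`AliasedConfig` is a hypothetical stand-in for the Tongyi fields, not the real class):

```python
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field


class AliasedConfig(BaseModel):
    """Toy stand-in for the aliased Tongyi fields above."""

    # populate_by_name also accepts the original field names;
    # protected_namespaces=() silences pydantic's "model_" prefix warning.
    model_config = ConfigDict(populate_by_name=True, protected_namespaces=())

    model_name: str = Field(default="qwen-plus", alias="model")
    dashscope_api_key: Optional[str] = Field(default=None, alias="api_key")


# Callers can pass the short aliases...
a = AliasedConfig(model="qwen-max", api_key="sk-test")
# ...or the full field names; both populate the same attributes.
b = AliasedConfig(model_name="qwen-max", dashscope_api_key="sk-test")
print(a.model_name, a.dashscope_api_key)
```

Whichever spelling is used at the call site, the value lands on the canonical attribute, which is why the test at the bottom of this PR can assert on `llm.model_name` after constructing with `model=`.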
@@ -202,7 +271,7 @@ def _llm_type(self) -> str:
def validate_environment(cls, values: Dict) -> Dict:
"""Validate that api key and python package exists in environment."""
values["dashscope_api_key"] = get_from_dict_or_env(
values, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"
)
try:
import dashscope
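Passing a list of keys to the lookup in `validate_environment` is what makes the `api_key` alias work even before Pydantic sees the values: each dict key is tried in order, then the environment variable. A hand-rolled sketch of that lookup order (`lookup_with_fallback` is a hypothetical helper mimicking `get_from_dict_or_env`, not the langchain_core implementation):

```python
import os


def lookup_with_fallback(data, keys, env_key, default=None):
    # Try each dict key in turn, then fall back to the environment
    # variable, mirroring the order used in validate_environment above.
    for key in keys:
        if data.get(key):
            return data[key]
    if env_key in os.environ:
        return os.environ[env_key]
    if default is not None:
        return default
    raise ValueError(f"Did not find any of {keys}; set the {env_key} env var.")


# The caller passed the short alias rather than dashscope_api_key:
values = {"api_key": "sk-test"}
print(lookup_with_fallback(values, ["dashscope_api_key", "api_key"], "DASHSCOPE_API_KEY"))
# prints: sk-test
```

Because the explicit keys are checked before the environment, a key passed to the constructor always wins over `DASHSCOPE_API_KEY` in the environment.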
7 changes: 7 additions & 0 deletions libs/community/tests/integration_tests/llms/test_tongyi.py
@@ -27,3 +27,10 @@ def test_tongyi_generate_stream() -> None:
print(output) # noqa: T201
assert isinstance(output, LLMResult)
assert isinstance(output.generations, list)


def test_tongyi_with_param_alias() -> None:
"""Test tongyi parameters alias"""
llm = Tongyi(model="qwen-max", api_key="your-api_key") # type: ignore[call-arg]
assert llm.model_name == "qwen-max"
assert llm.dashscope_api_key == "your-api_key"