From fa970a45c42ffa40e7b1c98593867801280b92b5 Mon Sep 17 00:00:00 2001 From: Bagatur Date: Mon, 8 Jul 2024 07:49:15 -0700 Subject: [PATCH 1/3] docs: together standard page --- docs/docs/integrations/chat/together.ipynb | 218 +++++++++++++++++---- 1 file changed, 180 insertions(+), 38 deletions(-) diff --git a/docs/docs/integrations/chat/together.ipynb b/docs/docs/integrations/chat/together.ipynb index 4a3b07e57d149..6adc071051fb1 100644 --- a/docs/docs/integrations/chat/together.ipynb +++ b/docs/docs/integrations/chat/together.ipynb @@ -1,103 +1,245 @@ { "cells": [ + { + "cell_type": "raw", + "id": "afaf8039", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Together\n", + "---" + ] + }, { "cell_type": "markdown", - "id": "2970dd75-8ebf-4b51-8282-9b454b8f356d", + "id": "e49f1e0d", "metadata": {}, "source": [ - "# Together AI\n", + "# ChatTogether\n", + "\n", + "- TODO: Make sure API reference link is correct.\n", + "\n", + "This page will help you get started with Together AI [chat models](/docs/concepts/#chat-models). For detailed documentation of all ChatTogether features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html).\n", + "\n", + "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models)\n", + "\n", + "## Overview\n", + "### Integration details\n", + "\n", + "- TODO: Fill in table features.\n", + "- TODO: Remove JS support link if not relevant, otherwise ensure link is correct.\n", + "- TODO: Make sure API reference links are correct.\n", + "\n", + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/chat/together) | Package downloads | Package latest |\n", + "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", + "| [ChatTogether](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html) | [langchain-together](https://api.python.langchain.com/en/latest/together_api_reference.html) | ✅/❌ | beta/❌ | ✅/❌ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-together?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-together?style=flat-square&label=%20) |\n", "\n", - "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n", + "### Model features\n", + "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n", + "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n", + "| ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | \n", "\n", - "This example goes over how to use LangChain to interact with Together AI models." 
+ "## Setup\n", + "\n", + "- TODO: Update with relevant info.\n", + "\n", + "To access Together models you'll need to create a/an Together account, get an API key, and install the `langchain-together` integration package.\n", + "\n", + "### Credentials\n", + "\n", + "- TODO: Update with relevant info.\n", + "\n", + "Head to (TODO: link) to sign up to Together and generate an API key. Once you've done this set the TOGETHER_API_KEY environment variable:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "433e8d2b-9519-4b49-b2c4-7ab65b046c94", + "metadata": {}, + "outputs": [], + "source": [ + "import getpass\n", + "import os\n", + "\n", + "os.environ[\"TOGETHER_API_KEY\"] = getpass.getpass(\"Enter your Together API key: \")" ] }, { "cell_type": "markdown", - "id": "1c47fc36", + "id": "72ee0c4b-9764-423a-9dbf-95129e185210", "metadata": {}, "source": [ - "## Installation" + "If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:" ] }, { "cell_type": "code", "execution_count": null, - "id": "1ecdb29d", + "id": "a15d341e-3e26-4ca3-830b-5aab30ed66de", "metadata": {}, "outputs": [], "source": [ - "%pip install --upgrade langchain-together" + "# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n", + "# os.environ[\"LANGSMITH_TRACING\"] = \"true\"" ] }, { "cell_type": "markdown", - "id": "89883202", + "id": "0730d6a1-c893-4840-9817-5e5251676d5d", "metadata": {}, "source": [ - "## Environment\n", + "### Installation\n", "\n", - "To use Together AI, you'll need an API key which you can find here:\n", - "https://api.together.ai/settings/api-keys. This can be passed in as an init param\n", - "``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n" + "The LangChain Together integration lives in the `langchain-together` package:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "652d6238-1f87-422a-b135-f5abbb8652fc", + "metadata": {}, + "outputs": [], + "source": [ + "%pip install -qU langchain-together" ] }, { "cell_type": "markdown", - "id": "8304b4d9", + "id": "a38cde65-254d-4219-a441-068766c0d4b5", "metadata": {}, "source": [ - "## Example" + "## Instantiation\n", + "\n", + "Now we can instantiate our model object and generate chat completions:\n", + "\n", + "- TODO: Update model instantiation with relevant params." 
] }, { "cell_type": "code", "execution_count": null, - "id": "637bb53f", + "id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae", "metadata": {}, "outputs": [], "source": [ - "# Querying chat models with Together AI\n", - "\n", "from langchain_together import ChatTogether\n", "\n", - "# choose from our 50+ models here: https://docs.together.ai/docs/inference-models\n", - "chat = ChatTogether(\n", - " # together_api_key=\"YOUR_API_KEY\",\n", - " model=\"meta-llama/Llama-3-70b-chat-hf\",\n", - ")\n", - "\n", - "# stream the response back from the model\n", - "for m in chat.stream(\"Tell me fun things to do in NYC\"):\n", - " print(m.content, end=\"\", flush=True)\n", + "llm = ChatTogether(\n", + " model=\"model-name\",\n", + " temperature=0,\n", + " max_tokens=None,\n", + " timeout=None,\n", + " max_retries=2,\n", + " # other params...\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "2b4f3e15", + "metadata": {}, + "source": [ + "## Invocation\n", "\n", - "# if you don't want to do streaming, you can use the invoke method\n", - "# chat.invoke(\"Tell me fun things to do in NYC\")" + "- TODO: Run cells so output can be seen." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "62e0dbc3", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "messages = [\n", + " (\n", + " \"system\",\n", + " \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n", + " ),\n", + " (\"human\", \"I love programming.\"),\n", + "]\n", + "ai_msg = llm.invoke(messages)\n", + "ai_msg" ] }, { "cell_type": "code", "execution_count": null, - "id": "e7b7170d-d7c5-4890-9714-a37238343805", + "id": "d86145b3-bfef-46e8-b227-4dda5c9c2705", "metadata": {}, "outputs": [], "source": [ - "# Querying code and language models with Together AI\n", + "print(ai_msg.content)" + ] + }, + { + "cell_type": "markdown", + "id": "18e2bfc0-7e78-4528-a73f-499ac150dca8", + "metadata": {}, + "source": [ + "## Chaining\n", + "\n", + "We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:\n", "\n", - "from langchain_together import Together\n", + "- TODO: Run cells so output can be seen." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b", + "metadata": {}, + "outputs": [], + "source": [ + "from langchain_core.prompts import ChatPromptTemplate\n", "\n", - "llm = Together(\n", - " model=\"codellama/CodeLlama-70b-Python-hf\",\n", - " # together_api_key=\"...\"\n", + "prompt = ChatPromptTemplate.from_messages(\n", + " [\n", + " (\n", + " \"system\",\n", + " \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n", + " ),\n", + " (\"human\", \"{input}\"),\n", + " ]\n", ")\n", "\n", - "print(llm.invoke(\"def bubble_sort(): \"))" + "chain = prompt | llm\n", + "chain.invoke(\n", + " {\n", + " \"input_language\": \"English\",\n", + " \"output_language\": \"German\",\n", + " \"input\": \"I love programming.\",\n", + " }\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "d1ee55bc-ffc8-4cfa-801c-993953a08cfd", + "metadata": {}, + "source": [ + "## TODO: Any functionality specific to this model provider\n", + "\n", + "E.g. creating/using finetuned models via this provider. Delete if not relevant." 
+ ] + }, + { + "cell_type": "markdown", + "id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3", + "metadata": {}, + "source": [ + "## API reference\n", + "\n", + "For detailed documentation of all ChatTogether features and configurations head to the API reference: https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html" ] } ], "metadata": { "kernelspec": { - "display_name": ".venv", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -111,7 +253,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.4" + "version": "3.11.9" } }, "nbformat": 4, From 29255d9dae7c98ba0d6086776c9b997811ff3f26 Mon Sep 17 00:00:00 2001 From: isaac hershenson Date: Wed, 24 Jul 2024 11:27:27 -0700 Subject: [PATCH 2/3] first draft --- docs/docs/integrations/chat/together.ipynb | 104 ++++++++++++--------- 1 file changed, 61 insertions(+), 43 deletions(-) diff --git a/docs/docs/integrations/chat/together.ipynb b/docs/docs/integrations/chat/together.ipynb index 6adc071051fb1..d2444c5012034 100644 --- a/docs/docs/integrations/chat/together.ipynb +++ b/docs/docs/integrations/chat/together.ipynb @@ -17,44 +17,35 @@ "source": [ "# ChatTogether\n", "\n", - "- TODO: Make sure API reference link is correct.\n", "\n", - "This page will help you get started with Together AI [chat models](/docs/concepts/#chat-models). For detailed documentation of all ChatTogether features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html).\n", + "This page will help you get started with Together AI [chat models](../../concepts.mdx#chat-models). For detailed documentation of all ChatTogether features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html).\n", "\n", "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models)\n", "\n", "## Overview\n", "### Integration details\n", "\n", - "- TODO: Fill in table features.\n", - "- TODO: Remove JS support link if not relevant, otherwise ensure link is correct.\n", - "- TODO: Make sure API reference links are correct.\n", - "\n", - "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/chat/together) | Package downloads | Package latest |\n", + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/chat/togetherai) | Package downloads | Package latest |\n", "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", - "| [ChatTogether](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html) | [langchain-together](https://api.python.langchain.com/en/latest/together_api_reference.html) | ✅/❌ | beta/❌ | ✅/❌ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-together?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-together?style=flat-square&label=%20) |\n", + "| [ChatTogether](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html) | [langchain-together](https://api.python.langchain.com/en/latest/together_api_reference.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-together?style=flat-square&label=%20) | ![PyPI - 
Version](https://img.shields.io/pypi/v/langchain-together?style=flat-square&label=%20) |\n", "\n", "### Model features\n", - "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n", + "| [Tool calling](../../how_to/tool_calling.ipynb) | [Structured output](../../how_to/structured_output.ipynb) | JSON mode | [Image input](../../how_to/multimodal_inputs.ipynb) | Audio input | Video input | [Token-level streaming](../../how_to/chat_streaming.ipynb) | Native async | [Token usage](../../how_to/chat_token_usage_tracking.ipynb) | [Logprobs](../../how_to/logprobs.ipynb) |\n", "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n", - "| ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | ✅/❌ | \n", + "| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ | ✅ | \n", "\n", "## Setup\n", "\n", - "- TODO: Update with relevant info.\n", - "\n", "To access Together models you'll need to create a/an Together account, get an API key, and install the `langchain-together` integration package.\n", "\n", "### Credentials\n", "\n", - "- TODO: Update with relevant info.\n", - "\n", - "Head to (TODO: link) to sign up to Together and generate an API key. Once you've done this set the TOGETHER_API_KEY environment variable:" + "Head to [this page](https://api.together.ai) to sign up to Together and generate an API key. Once you've done this set the TOGETHER_API_KEY environment variable:" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 1, "id": "433e8d2b-9519-4b49-b2c4-7ab65b046c94", "metadata": {}, "outputs": [], @@ -75,7 +66,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 2, "id": "a15d341e-3e26-4ca3-830b-5aab30ed66de", "metadata": {}, "outputs": [], @@ -96,10 +87,21 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 3, "id": "652d6238-1f87-422a-b135-f5abbb8652fc", "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.0\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m24.1.2\u001b[0m\n", + "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip install --upgrade pip\u001b[0m\n", + "Note: you may need to restart the kernel to use updated packages.\n" + ] + } + ], "source": [ "%pip install -qU langchain-together" ] @@ -118,7 +120,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 5, "id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae", "metadata": {}, "outputs": [], @@ -126,7 +128,7 @@ "from langchain_together import ChatTogether\n", "\n", "llm = ChatTogether(\n", - " model=\"model-name\",\n", + " model=\"meta-llama/Llama-3-70b-chat-hf\",\n", " temperature=0,\n", " max_tokens=None,\n", " timeout=None,\n", @@ -140,19 +142,28 @@ "id": "2b4f3e15", "metadata": {}, "source": [ - "## Invocation\n", - "\n", - "- TODO: Run cells so output can be seen." 
+ "## Invocation" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 6, "id": "62e0dbc3", "metadata": { "tags": [] }, - "outputs": [], + "outputs": [ + { + "data": { + "text/plain": [ + "AIMessage(content=\"J'adore la programmation.\", response_metadata={'token_usage': {'completion_tokens': 9, 'prompt_tokens': 35, 'total_tokens': 44}, 'model_name': 'meta-llama/Llama-3-70b-chat-hf', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-79efa49b-dbaf-4ef8-9dce-958533823ef6-0', usage_metadata={'input_tokens': 35, 'output_tokens': 9, 'total_tokens': 44})" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "messages = [\n", " (\n", @@ -167,10 +178,18 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 7, "id": "d86145b3-bfef-46e8-b227-4dda5c9c2705", "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "J'adore la programmation.\n" + ] + } + ], "source": [ "print(ai_msg.content)" ] @@ -182,17 +201,26 @@ "source": [ "## Chaining\n", "\n", - "We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:\n", - "\n", - "- TODO: Run cells so output can be seen." + "We can [chain](../../how_to/sequence.ipynb) our model with a prompt template like so:" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 8, "id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b", "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/plain": [ + "AIMessage(content='Ich liebe das Programmieren.', response_metadata={'token_usage': {'completion_tokens': 7, 'prompt_tokens': 30, 'total_tokens': 37}, 'model_name': 'meta-llama/Llama-3-70b-chat-hf', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-80bba5fa-1723-4242-8d5a-c09b76b8350b-0', usage_metadata={'input_tokens': 30, 'output_tokens': 7, 'total_tokens': 37})" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "from langchain_core.prompts import ChatPromptTemplate\n", "\n", @@ -216,16 +244,6 @@ ")" ] }, - { - "cell_type": "markdown", - "id": "d1ee55bc-ffc8-4cfa-801c-993953a08cfd", - "metadata": {}, - "source": [ - "## TODO: Any functionality specific to this model provider\n", - "\n", - "E.g. creating/using finetuned models via this provider. Delete if not relevant." - ] - }, { "cell_type": "markdown", "id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3", From 53352a706a2bf11884a89113fd2a9a64638dc951 Mon Sep 17 00:00:00 2001 From: isaac hershenson Date: Thu, 25 Jul 2024 10:13:59 -0700 Subject: [PATCH 3/3] inference to chat --- docs/docs/integrations/chat/together.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/docs/integrations/chat/together.ipynb b/docs/docs/integrations/chat/together.ipynb index d2444c5012034..87a0f1c39e1d0 100644 --- a/docs/docs/integrations/chat/together.ipynb +++ b/docs/docs/integrations/chat/together.ipynb @@ -20,7 +20,7 @@ "\n", "This page will help you get started with Together AI [chat models](../../concepts.mdx#chat-models). 
For detailed documentation of all ChatTogether features and configurations head to the [API reference](https://api.python.langchain.com/en/latest/chat_models/langchain_together.chat_models.ChatTogether.html).\n", "\n", - "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models)\n", + "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/chat-models)\n", "\n", "## Overview\n", "### Integration details\n",
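
---

Note: the rewritten page keeps the `invoke` and chaining examples but drops the standalone streaming cell that the original notebook had. As a complement to the patched docs above, here is a minimal streaming sketch in the same style — assuming `TOGETHER_API_KEY` is already exported and `langchain-together` is installed; the model name is the one used in the instantiation cell, and the prompt string is purely illustrative.

```python
from langchain_together import ChatTogether

# Same model as the instantiation cell in the patch; any model listed at
# https://docs.together.ai/docs/chat-models can be substituted.
llm = ChatTogether(
    model="meta-llama/Llama-3-70b-chat-hf",
    temperature=0,
)

# Token-level streaming (advertised in the model features table): each chunk
# is a message chunk whose `content` is printed as it arrives.
for chunk in llm.stream("Tell me fun things to do in NYC"):
    print(chunk.content, end="", flush=True)
```

This mirrors the `chat.stream(...)` loop removed in the first patch, updated to the `llm` naming and model configuration used in the rewritten page.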