docs: integrations references update (langchain-ai#25217)
Added missed provider pages. Fixed formats and added descriptions and
links.

---------

Co-authored-by: Chester Curme <[email protected]>
2 people authored and olgamurraft committed Aug 16, 2024
1 parent d0166ce commit d8dbe7f
Showing 9 changed files with 210 additions and 27 deletions.
22 changes: 18 additions & 4 deletions docs/docs/integrations/chat/llamacpp.ipynb
@@ -4,9 +4,23 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatLlamaCpp\n",
"\n",
"This notebook provides a quick overview for getting started with chat model intergrated with [llama cpp python](https://github.com/abetlen/llama-cpp-python)."
"# Llama.cpp\n",
"\n",
">The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`\n",
">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
">\n",
">This package provides:\n",
">\n",
"> - Low-level access to C API via ctypes interface.\n",
"> - High-level Python API for text completion\n",
"> - `OpenAI`-like API\n",
"> - `LangChain` compatibility\n",
"> - `LlamaIndex` compatibility\n",
"> - OpenAI compatible web server\n",
"> - Local Copilot replacement\n",
"> - Function Calling support\n",
"> - Vision API support\n",
"> - Multiple Models\n"
]
},
{
@@ -410,7 +424,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.8"
"version": "3.10.12"
}
},
"nbformat": 4,
2 changes: 1 addition & 1 deletion docs/docs/integrations/chat/octoai.ipynb
@@ -99,7 +99,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.7"
"version": "3.10.12"
},
"vscode": {
"interpreter": {
30 changes: 22 additions & 8 deletions docs/docs/integrations/chat/perplexity.ipynb
@@ -17,7 +17,7 @@
"source": [
"# ChatPerplexity\n",
"\n",
"This notebook covers how to get started with Perplexity chat models."
"This notebook covers how to get started with `Perplexity` chat models."
]
},
{
@@ -37,17 +37,31 @@
"from langchain_core.prompts import ChatPromptTemplate"
]
},
{
"cell_type": "markdown",
"id": "b26e2035-2f81-4451-ba44-fa2e2d5aeb62",
"metadata": {},
"source": [
"The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d986aac6-1bae-4608-8514-d3ba5b35b10e",
"metadata": {},
"outputs": [],
"source": [
"chat = ChatPerplexity(\n",
" temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "97a8ce3a",
"metadata": {},
"source": [
"The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
"\n",
"```python\n",
"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\")\n",
"```\n",
"\n",
"You can check a list of available models [here](https://docs.perplexity.ai/docs/model-cards). For reproducibility, we can set the API key dynamically by taking it as an input in this notebook."
]
},
@@ -221,7 +235,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
"version": "3.10.12"
}
},
"nbformat": 4,
44 changes: 34 additions & 10 deletions docs/docs/integrations/providers/llamacpp.mdx
@@ -1,26 +1,50 @@
# Llama.cpp

This page covers how to use [llama.cpp](https://github.com/ggerganov/llama.cpp) within LangChain.
It is broken into two parts: installation and setup, and then references to specific Llama-cpp wrappers.
>The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`
>[llama.cpp](https://github.com/ggerganov/llama.cpp).
>
>This package provides:
>
> - Low-level access to C API via ctypes interface.
> - High-level Python API for text completion
> - `OpenAI`-like API
> - `LangChain` compatibility
> - `LlamaIndex` compatibility
> - OpenAI compatible web server
> - Local Copilot replacement
> - Function Calling support
> - Vision API support
> - Multiple Models
## Installation and Setup
- Install the Python package with `pip install llama-cpp-python`

- Install the Python package
```bash
pip install llama-cpp-python
```
- Download one of the [supported models](https://github.com/ggerganov/llama.cpp#description) and convert it to the llama.cpp format per the [instructions](https://github.com/ggerganov/llama.cpp)

## Wrappers

### LLM
## Chat models

See a [usage example](/docs/integrations/chat/llamacpp).

```python
from langchain_community.chat_models import ChatLlamaCpp
```
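
A minimal sketch, assuming a GGUF model has already been downloaded locally; the model path and generation parameters below are placeholders:

```python
from langchain_community.chat_models import ChatLlamaCpp

# Placeholder path to a locally downloaded GGUF model file.
llm = ChatLlamaCpp(
    model_path="/path/to/model.gguf",
    temperature=0.5,
    max_tokens=256,
)
print(llm.invoke("Name one benefit of running an LLM locally.").content)
```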

## LLMs

See a [usage example](/docs/integrations/llms/llamacpp).

There exists a LlamaCpp LLM wrapper, which you can access with
```python
from langchain_community.llms import LlamaCpp
```
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/llamacpp)
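
A minimal completion sketch, again assuming a locally converted GGUF model (the path is a placeholder):

```python
from langchain_community.llms import LlamaCpp

# Placeholder path to a GGUF model converted per the llama.cpp instructions.
llm = LlamaCpp(model_path="/path/to/model.gguf", temperature=0.75, max_tokens=256)
print(llm.invoke("Q: What is the capital of France? A:"))
```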

### Embeddings
## Embedding models

See a [usage example](/docs/integrations/text_embedding/llamacpp).

There exists a LlamaCpp Embeddings wrapper, which you can access with
```python
from langchain_community.embeddings import LlamaCppEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/llamacpp)
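
A minimal sketch, assuming the same kind of locally downloaded GGUF model (the path is a placeholder):

```python
from langchain_community.embeddings import LlamaCppEmbeddings

# Placeholder path to a local GGUF model file.
embeddings = LlamaCppEmbeddings(model_path="/path/to/model.gguf")
query_vector = embeddings.embed_query("What is llama.cpp?")
doc_vectors = embeddings.embed_documents(["llama.cpp runs GGUF models locally."])
```
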
21 changes: 21 additions & 0 deletions docs/docs/integrations/providers/maritalk.mdx
@@ -0,0 +1,21 @@
# MariTalk

>[MariTalk](https://www.maritaca.ai/en) is an LLM-based chatbot trained to meet the needs of Brazil.
## Installation and Setup

You need to get a MariTalk API key.

You also need to install the `httpx` Python package.

```bash
pip install httpx
```

## Chat models

See a [usage example](/docs/integrations/chat/maritalk).

```python
from langchain_community.chat_models import ChatMaritalk
```
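
A minimal sketch; the API key is a placeholder and the model name is an assumption based on the linked example:

```python
from langchain_community.chat_models import ChatMaritalk

# API key and model name are placeholders.
chat = ChatMaritalk(
    model="sabia-2-medium",
    api_key="YOUR_MARITALK_API_KEY",
    temperature=0.7,
    max_tokens=100,
)
print(chat.invoke("Olá! Pode me apresentar o MariTalk?").content)
```
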
34 changes: 34 additions & 0 deletions docs/docs/integrations/providers/mlx.mdx
@@ -0,0 +1,34 @@
# MLX

>[MLX](https://ml-explore.github.io/mlx/build/html/index.html) is a `NumPy`-like array framework
> designed for efficient and flexible machine learning on `Apple` silicon,
> brought to you by `Apple machine learning research`.

## Installation and Setup

Install several Python packages:

```bash
pip install mlx-lm transformers huggingface_hub
```


## Chat models


See a [usage example](/docs/integrations/chat/mlx).

```python
from langchain_community.chat_models.mlx import ChatMLX
```
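
`ChatMLX` wraps an `MLXPipeline` LLM (see the LLMs section below). A minimal sketch; the model id is a placeholder for any MLX-converted model on the Hugging Face Hub:

```python
from langchain_community.chat_models.mlx import ChatMLX
from langchain_community.llms.mlx_pipeline import MLXPipeline

# Placeholder model id; any mlx-community model should work.
llm = MLXPipeline.from_model_id(
    "mlx-community/quantized-gemma-2b-it",
    pipeline_kwargs={"max_tokens": 64, "temp": 0.1},
)
chat = ChatMLX(llm=llm)
print(chat.invoke("What is MLX in one sentence?").content)
```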

## LLMs

### MLX Local Pipelines

See a [usage example](/docs/integrations/llms/mlx_pipelines).

```python
from langchain_community.llms.mlx_pipeline import MLXPipeline
```
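
A minimal sketch of using the pipeline directly as an LLM (the model id is a placeholder):

```python
from langchain_community.llms.mlx_pipeline import MLXPipeline

# Placeholder model id; the model is downloaded from the Hugging Face Hub on first use.
pipe = MLXPipeline.from_model_id(
    "mlx-community/quantized-gemma-2b-it",
    pipeline_kwargs={"max_tokens": 64, "temp": 0.1},
)
print(pipe.invoke("Once upon a time"))
```
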
37 changes: 37 additions & 0 deletions docs/docs/integrations/providers/octoai.mdx
@@ -0,0 +1,37 @@
# OctoAI

>[OctoAI](https://docs.octoai.cloud/docs) offers easy access to efficient compute
> and enables users to integrate their choice of AI models into applications.
> The `OctoAI` compute service helps you run, tune, and scale AI applications easily.

## Installation and Setup

- Install the `openai` Python package:
```bash
pip install openai
```
- Register on `OctoAI` and get an API Token from [your OctoAI account page](https://octoai.cloud/settings).


## Chat models

See a [usage example](/docs/integrations/chat/octoai).

```python
from langchain_community.chat_models import ChatOctoAI
```
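
A minimal sketch, assuming the `OCTOAI_API_TOKEN` environment variable is set; the model name is a placeholder:

```python
from langchain_community.chat_models import ChatOctoAI

# Placeholder model name; pick any chat model hosted on OctoAI.
chat = ChatOctoAI(model_name="mixtral-8x7b-instruct", max_tokens=300)
print(chat.invoke("Tell me briefly about Leonardo da Vinci.").content)
```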

## LLMs

See a [usage example](/docs/integrations/llms/octoai).

```python
from langchain_community.llms.octoai_endpoint import OctoAIEndpoint
```
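
A minimal sketch, assuming `OCTOAI_API_TOKEN` is set; the model name and sampling parameters are placeholders:

```python
from langchain_community.llms.octoai_endpoint import OctoAIEndpoint

# Placeholder model name and sampling parameters.
llm = OctoAIEndpoint(
    model_name="llama-2-13b-chat-fp16",
    max_tokens=200,
    temperature=0.75,
)
print(llm.invoke("Who was Leonardo da Vinci?"))
```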

## Embedding models

```python
from langchain_community.embeddings.octoai_embeddings import OctoAIEmbeddings
```
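
A minimal sketch, assuming `OCTOAI_API_TOKEN` is set and the default embedding endpoint suffices:

```python
from langchain_community.embeddings.octoai_embeddings import OctoAIEmbeddings

embeddings = OctoAIEmbeddings()
vector = embeddings.embed_query("What is OctoAI?")
```
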
25 changes: 25 additions & 0 deletions docs/docs/integrations/providers/perplexity.mdx
@@ -0,0 +1,25 @@
# Perplexity

>[Perplexity](https://www.perplexity.ai/pro) is the most powerful way to search
> the internet with unlimited Pro Search, upgraded AI models, unlimited file upload,
> image generation, and API credits.
>
> You can check a [list of available models](https://docs.perplexity.ai/docs/model-cards).
## Installation and Setup

Install the `openai` Python package:

```bash
pip install openai
```

Get your API key from [here](https://docs.perplexity.ai/docs/getting-started).

## Chat models

See a [usage example](/docs/integrations/chat/perplexity).

```python
from langchain_community.chat_models import ChatPerplexity
```
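
A minimal sketch, assuming `PPLX_API_KEY` is set in the environment; the model name follows the notebook example above:

```python
from langchain_community.chat_models import ChatPerplexity

chat = ChatPerplexity(temperature=0, model="llama-3-sonar-small-32k-online")
print(chat.invoke("How many stars are there in our galaxy?").content)
```
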
22 changes: 18 additions & 4 deletions docs/docs/integrations/text_embedding/llamacpp.ipynb
@@ -4,9 +4,23 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Llama-cpp\n",
"# Llama.cpp\n",
"\n",
"This notebook goes over how to use Llama-cpp embeddings within LangChain"
">The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`\n",
">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
">\n",
">This package provides:\n",
">\n",
"> - Low-level access to C API via ctypes interface.\n",
"> - High-level Python API for text completion\n",
"> - `OpenAI`-like API\n",
"> - `LangChain` compatibility\n",
"> - `LlamaIndex` compatibility\n",
"> - OpenAI compatible web server\n",
"> - Local Copilot replacement\n",
"> - Function Calling support\n",
"> - Vision API support\n",
"> - Multiple Models\n"
]
},
{
@@ -80,9 +94,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
