Examples and README updates

Co-authored-by: fujitatomoya <[email protected]>
Co-authored-by: Michael Yang <[email protected]>
1 parent 139c89e · commit 64c1eb7 · 28 changed files with 457 additions and 282 deletions.
# Running Examples

Run the examples in this directory with:

```sh
# Run example
python3 examples/<example>.py
```
### Chat - Chat with a model
- [chat.py](chat.py)
- [async-chat.py](async-chat.py)
- [chat-stream.py](chat-stream.py) - Streamed outputs
- [chat-with-history.py](chat-with-history.py) - Chat with a model while maintaining the conversation history
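The history pattern in chat-with-history.py boils down to keeping one `messages` list and appending both the user turn and the model's reply after every call. A minimal offline sketch of that bookkeeping — `fake_chat` is a stand-in for `ollama.chat` so it runs without a server, and the model name is illustrative:

```python
# Sketch of the chat-with-history pattern: one shared messages list,
# appended to after every turn. fake_chat stands in for ollama.chat
# (same messages shape) so the sketch runs offline.

def fake_chat(model, messages):
    # Stand-in response: echoes the latest user message.
    return {'message': {'role': 'assistant',
                        'content': f"echo: {messages[-1]['content']}"}}

history = []

def ask(prompt, chat_fn=fake_chat):
    # Record the user turn, send the full history, record the reply.
    history.append({'role': 'user', 'content': prompt})
    reply = chat_fn('llama3.2', history)['message']['content']
    history.append({'role': 'assistant', 'content': reply})
    return reply

print(ask('Why is the sky blue?'))
print(len(history))  # both turns are retained
```

Swapping `fake_chat` for the real client keeps the rest of the loop unchanged, since the model only ever sees what is in `history`.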
### Generate - Generate text with a model
- [generate.py](generate.py)
- [async-generate.py](async-generate.py)
- [generate-stream.py](generate-stream.py) - Streamed outputs
- [fill-in-middle.py](fill-in-middle.py) - Given a prefix and suffix, fill in the middle
### Tools/Function Calling - Call a function with a model
- [tools.py](tools.py) - Simple example of Tools/Function Calling
- [async-tools.py](async-tools.py)
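The execution side of tool calling is a lookup-and-dispatch step: map tool names to Python functions, then invoke whatever call the model returns. A minimal offline sketch — `add_two_numbers` is illustrative, and the `tool_call` dict is hard-coded here, whereas in tools.py it comes from the model's chat response:

```python
# Sketch of the dispatch step in tool calling: resolve the tool name the
# model asked for and call the matching Python function with its arguments.

def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Registry of callables the model is allowed to invoke.
available_functions = {'add_two_numbers': add_two_numbers}

# Shape of a single tool call; hard-coded here instead of coming from a
# model response.
tool_call = {'function': {'name': 'add_two_numbers',
                          'arguments': {'a': 1, 'b': 2}}}

fn = available_functions[tool_call['function']['name']]
result = fn(**tool_call['function']['arguments'])
print(result)
```

Keeping the registry explicit (rather than `eval`-ing the returned name) is the usual safety choice: the model can only trigger functions you listed.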
### Multimodal with Images - Chat with a multimodal (image chat) model
- [multimodal_chat.py](multimodal_chat.py)
- [multimodal_generate.py](multimodal_generate.py)
### Ollama List - List all downloaded models and their properties
- [list.py](list.py)
### Ollama ps - Show model status with CPU/GPU usage
- [ps.py](ps.py)
### Ollama Pull - Pull a model from Ollama
Requirement: `pip install tqdm`
- [pull.py](pull.py)
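pull.py streams progress events and feeds them to tqdm progress bars. The bookkeeping can be sketched without tqdm or a server — the `status`/`completed`/`total` fields match the streamed pull events, but the events below are fabricated, not real server output:

```python
# Sketch of the progress bookkeeping behind pull.py, offline. Streamed pull
# events carry a 'status' string and, for download steps, 'completed'/'total'
# byte counts. The stream below is fabricated for illustration.

fake_stream = [
    {'status': 'pulling manifest'},
    {'status': 'pulling layer', 'completed': 50, 'total': 200},
    {'status': 'pulling layer', 'completed': 200, 'total': 200},
    {'status': 'success'},
]

def progress_lines(stream):
    # Turn each event into a printable line; percentage only when the
    # event reports byte counts.
    lines = []
    for event in stream:
        if 'total' in event:
            pct = 100 * event['completed'] // event['total']
            lines.append(f"{event['status']}: {pct}%")
        else:
            lines.append(event['status'])
    return lines

for line in progress_lines(fake_stream):
    print(line)
```

In the real example the same loop body updates a tqdm bar per layer digest instead of appending strings.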
### Ollama Create - Create a model from a Modelfile
```sh
python create.py <model> <modelfile>
```
- [create.py](create.py)

See [ollama/docs/modelfile.md](https://github.com/ollama/ollama/blob/main/docs/modelfile.md) for more information on the Modelfile format.
### Ollama Embed - Generate embeddings with a model
- [embed.py](embed.py)
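Embedding vectors are typically compared with cosine similarity. A self-contained sketch — the vectors here are toy values standing in for rows of the embeddings a real model would return:

```python
import math

# Cosine similarity between two embedding vectors. With real output you
# would pass the model's embedding vectors; toy values are used here.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
v3 = [-0.3, 0.1, 0.0]

print(cosine_similarity(v1, v2))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3))  # dissimilar vectors score lower
```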
**examples/async-chat.py**
```python
import asyncio

from ollama import AsyncClient


async def main():
    messages = [
        {
            'role': 'user',
            'content': 'Why is the sky blue?',
        },
    ]

    client = AsyncClient()
    response = await client.chat('llama3.2', messages=messages)
    print(response['message']['content'])


if __name__ == '__main__':
    asyncio.run(main())
```
**examples/async-generate.py**
```python
import asyncio

import ollama


async def main():
    client = ollama.AsyncClient()
    response = await client.generate('llama3.2', 'Why is the sky blue?')
    print(response['response'])


if __name__ == '__main__':
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print('\nGoodbye!')
```