Add AWS Bedrock model support and update dependencies (#83)
* Add AWS Bedrock model support and update dependencies

* Add AWS Bedrock model support and update dependencies

* Fix environment variable reference in LLM availability message

* Update LLM availability message to clarify environment variable requirements

* Fix linter errors

---------

Co-authored-by: Joshua Carroll <[email protected]>
madtank and JoshuaC215 authored Nov 6, 2024
1 parent 8e5d219 commit 017a38e
Showing 5 changed files with 98 additions and 8 deletions.
4 changes: 4 additions & 0 deletions README.md
@@ -111,6 +111,10 @@ With that said, there are several other interesting projects in this space that
 # See: https://docs.anthropic.com/en/api/getting-started
 ANTHROPIC_API_KEY=your_anthropic_key
 
+# Optional, to enable AWS Bedrock models Haiku
+# See: https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html
+USE_AWS_BEDROCK=true
+
 # Optional, to enable simple header-based auth on the service
 AUTH_SECRET=any_string_you_choose
 
1 change: 1 addition & 0 deletions pyproject.toml
@@ -36,6 +36,7 @@ dependencies = [
     "setuptools ~=74.0.0",
     "streamlit ~=1.37.0",
     "uvicorn ~=0.30.5",
+    "langchain-aws>=0.2.6",
 ]
 
 [project.optional-dependencies]
9 changes: 7 additions & 2 deletions src/agents/models.py
@@ -1,6 +1,7 @@
 import os
 
 from langchain_anthropic import ChatAnthropic
+from langchain_aws import ChatBedrock
 from langchain_core.language_models.chat_models import BaseChatModel
 from langchain_google_genai import ChatGoogleGenerativeAI
 from langchain_groq import ChatGroq
@@ -21,9 +22,13 @@
     models["claude-3-haiku"] = ChatAnthropic(
         model="claude-3-haiku-20240307", temperature=0.5, streaming=True
     )
+if os.getenv("USE_AWS_BEDROCK") == "true":
+    models["bedrock-haiku"] = ChatBedrock(
+        model_id="anthropic.claude-3-5-haiku-20241022-v1:0", temperature=0.5
+    )
 
 if not models:
-    print("No LLM available. Please set API keys to enable at least one LLM.")
+    print("No LLM available. Please set environment variables to enable at least one LLM.")
     if os.getenv("MODE") == "dev":
-        print("FastAPI initialized failed. Please use Ctrl + C to exit uvicorn.")
+        print("FastAPI initialization failed. Please use Ctrl + C to exit uvicorn.")
     exit(1)
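For readers unfamiliar with the new dependency, here is a minimal standalone sketch (not part of this commit) of calling the Bedrock-hosted model with the same model_id and temperature that the diff above registers. It assumes AWS credentials and a region are already configured for boto3, for example via AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION:

```python
# Hypothetical usage sketch, not included in this commit.
# Assumes AWS credentials and a default region are configured for boto3.
from langchain_aws import ChatBedrock

llm = ChatBedrock(
    model_id="anthropic.claude-3-5-haiku-20241022-v1:0",  # same model the diff registers
    temperature=0.5,
)
print(llm.invoke("Say hello from Bedrock.").content)
```

In the service itself, the model is only registered when USE_AWS_BEDROCK=true, so selecting the new "AWS Bedrock Haiku (streaming)" option in the Streamlit app requires that variable plus valid AWS credentials.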
1 change: 1 addition & 0 deletions src/streamlit_app.py
@@ -69,6 +69,7 @@ async def main() -> None:
         "Gemini 1.5 Flash (streaming)": "gemini-1.5-flash",
         "Claude 3 Haiku (streaming)": "claude-3-haiku",
         "llama-3.1-70b on Groq": "llama-3.1-70b",
+        "AWS Bedrock Haiku (streaming)": "bedrock-haiku",
     }
     # Config options
     with st.sidebar:
91 changes: 85 additions & 6 deletions uv.lock

Some generated files are not rendered by default.
