IMPORTANT: Please read the following before proceeding. This Community AMP is a third party software package that is not validated or maintained by Cloudera, Inc. ("Cloudera"). By configuring and launching this Community AMP, you will cause such third party software to be downloaded and installed into your environment direct from third party repositories.
If you do not wish to download and install the third party software packages, do not configure, launch or otherwise use this Community AMP. By configuring, launching or otherwise using the Community AMP, you acknowledge the foregoing statement and agree that Cloudera is not responsible or liable in any way for the third party software packages.
Copyright (c) 2024 - Cloudera, Inc. All rights reserved.
This AMP implements A.Y.A., an agentic workflow assistant, to demonstrate how agentic workflows can help you interact with a wide and complex series of AI agents and tools.
By integrating multiple agents into a single agentic application, users can spend less time looking for the right tools and more time discovering insights and making actionable decisions.
As domain data tools mature and gain more advanced AI capabilities, so does an application like AYA: AYA inherits the capabilities and power of each agent that is registered with it. Each agent application can then focus on deepening its own capabilities instead of handling general-purpose tasks that other domains can cover.
Interface
The AYA chat interface is built using:
- Gradio for UI
- Crew AI for agentic workflow orchestration
- AWS Bedrock for Primary AYA Agent LLM inference
- Default: anthropic.claude-3-sonnet
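As a sketch of how an LLM inference call to AWS Bedrock might look with boto3 (the helper names here are illustrative, and the `invoke` function assumes AWS credentials are available in the environment; the request body follows the Anthropic Messages format that Bedrock's Claude models accept):

```python
import json

def build_claude_body(prompt, max_tokens=512):
    # Request body in the Anthropic Messages format used by
    # Claude models on Bedrock.
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def invoke(prompt, model_id="anthropic.claude-3-sonnet-20240229-v1:0"):
    # Requires valid AWS credentials and Bedrock model access.
    import boto3
    client = boto3.client("bedrock-runtime")
    resp = client.invoke_model(modelId=model_id, body=build_claude_body(prompt))
    return json.loads(resp["body"].read())["content"][0]["text"]
```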
Under the Hood
To build AYA's responses to user requests, we use CrewAI, a framework that helps organize and launch agentic workflows.
The CrewAI libraries give us a convenient way to describe all the critical parts of AYA:
- The primary AYA Agent
- AYA's primary task goals
- All the agent applications AYA can use
These descriptions are put together and presented to the LLM backing AYA to determine the right course of action for a user request.
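The assembly step can be sketched in plain Python. This is a simplified illustration of the pattern, not the actual CrewAI internals: each registered agent contributes a description, and the combined text is what the backing LLM sees when deciding how to route a request.

```python
from dataclasses import dataclass

@dataclass
class RegisteredAgent:
    name: str
    description: str

def build_routing_prompt(task_goal, agents, user_request):
    # Combine the primary task goal, the registered agent descriptions,
    # and the user's request into a single prompt for the backing LLM.
    tool_lines = "\n".join(f"- {a.name}: {a.description}" for a in agents)
    return (
        f"Goal: {task_goal}\n"
        f"Available agents:\n{tool_lines}\n"
        f"User request: {user_request}\n"
        "Decide which agent(s) to use and in what order."
    )

agents = [
    RegisteredAgent("consumption", "Computes weekly consumption changes"),
    RegisteredAgent("tickets", "Lists recent support tickets for a customer"),
]
prompt = build_routing_prompt("Assist the user", agents, "Why did usage drop?")
```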
Sample agents have been pre-built and included here for AYA to use. These agents provide insights on a variety of data types using a variety of techniques. Each also represents an agent capability that could be readily improved via generative AI techniques and domain-specific operations:
- Source Data: CSV table containing sample product consumption time data.
- Current Agent Capabilities: Computes weekly consumption changes for all customers.
- Source Data: CSV table containing Support Ticket metadata for all customers
- Current Agent Capabilities: Returns lists of recent support tickets for a customer.
- Source Data: Support case history information. Files representing the entirety of a support case, with all comments and status.
- Current Agent Capabilities: Returns an AI-generated summary of a support case.
- Source Data: Knowledge graph of customer account metadata. (Products, Account management personnel, Entitlements)
- Current Agent Capabilities: Generates and executes knowledge graph Cypher queries from natural language queries.
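As an illustration of the first capability above, here is a minimal sketch of computing weekly consumption changes from a CSV table. The column names and sample rows are hypothetical, not the actual schema shipped with the AMP:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of the product consumption table
SAMPLE_CSV = """customer,week,units
acme,2024-01,100
acme,2024-02,140
globex,2024-01,80
globex,2024-02,60
"""

def weekly_changes(csv_text):
    # Group consumption rows by customer, then compute
    # week-over-week deltas in units consumed.
    by_customer = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_customer[row["customer"]].append((row["week"], float(row["units"])))
    changes = {}
    for customer, rows in by_customer.items():
        rows.sort()  # chronological by week label
        changes[customer] = [b[1] - a[1] for a, b in zip(rows, rows[1:])]
    return changes
```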
Deploy this repository as a Cloudera AI Workspace AMP Project.
- CML CPU workloads with resource profiles up to (2 vCPU / 4 GiB Memory) will be provisioned
PBJ Workbench - Python 3.10
The primary interface application and underlying agent applications use AWS Bedrock for LLM inference calls. Please ensure your AWS account has access to the selected AWS Bedrock model. The following environment variables are required to make those generations:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_DEFAULT_REGION
AWS_BEDROCK_MODEL
- Default: anthropic.claude-3-sonnet-20240229-v1:0
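A quick way to verify these variables are set before launching is a small check like the one below (a helper sketch for illustration, not part of the AMP itself):

```python
import os

REQUIRED_VARS = [
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_DEFAULT_REGION",
    "AWS_BEDROCK_MODEL",
]

def missing_env_vars(env=os.environ):
    # Return the names of required variables that are unset or empty.
    return [name for name in REQUIRED_VARS if not env.get(name)]
```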
The AWS identity you set here must have sufficient AWS IAM permissions to call the Amazon Bedrock service.
For example, to grant full Bedrock access to your identity, you can:
- Open the AWS IAM Console
- Find your Role
- Select Add Permissions > Create Inline Policy to attach new inline permissions, open the JSON editor, and paste in the example policy below:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockFullAccess",
      "Effect": "Allow",
      "Action": ["bedrock:*"],
      "Resource": "*"
    }
  ]
}