Organizations need seamless access to their structured data repositories to power intelligent AI agents. However, when these resources span multiple AWS accounts, integration challenges arise. This post explores a practical solution for connecting Amazon Bedrock agents to knowledge bases backed by Amazon Redshift clusters residing in different AWS accounts.
The challenge
Organizations that build AI agents using Amazon Bedrock often maintain their structured data in Amazon Redshift clusters. When these data repositories live in separate AWS accounts from the AI agents, they face a significant limitation: Amazon Bedrock Knowledge Bases doesn't natively support cross-account Redshift integration.
This creates a challenge for enterprises with multi-account architectures that want to:
- Leverage existing structured data in Redshift for their AI agents.
- Maintain separation of concerns across different AWS accounts.
- Avoid duplicating data across accounts.
- Ensure proper security and access controls.
Solution overview
Our solution enables cross-account knowledge base integration through a serverless architecture that maintains secure access controls while allowing AI agents to query structured data. The approach uses AWS Lambda as an intermediary to facilitate secure cross-account data access.

The action flow, as shown in the preceding diagram, is as follows:
- Users enter their natural language question in the Amazon Bedrock agent, which is configured in the agent account.
- The agent invokes a Lambda function through an action group, which provides access to the Amazon Bedrock knowledge base configured in the agent-kb account.
- The action group Lambda function running in the agent account assumes an IAM role created in the agent-kb account to connect to the knowledge base in that account.
- The Amazon Bedrock knowledge base in the agent-kb account uses an IAM role created in the same account to access and query the Amazon Redshift data warehouse.
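The cross-account hop in the last two steps can be sketched in Python. This is a minimal illustration, not the repository's actual Lambda code; the account number, role name, and model are the example values used in this post, the knowledge base ID is a placeholder, and it assumes boto3 is available in the Lambda runtime.

```python
# Example values from this post -- replace with your own.
AGENT_KB_ACCOUNT = "999999999999"
KB_ACCESS_ROLE = "bedrock_kb_access_role"
KNOWLEDGE_BASE_ID = "XXXXXXXXXX"  # placeholder: the ID of your knowledge base
MODEL_ARN = "arn:aws:bedrock:us-west-2::foundation-model/meta.llama3-1-70b-instruct-v1:0"


def build_assume_role_arn(account_id: str, role_name: str) -> str:
    """ARN of the cross-account role the Lambda function assumes."""
    return f"arn:aws:iam::{account_id}:role/{role_name}"


def query_knowledge_base(question: str) -> str:
    """Assume the agent-kb role, then query the knowledge base with the
    resulting temporary credentials. Requires boto3 and AWS credentials."""
    import boto3  # imported here so the pure helper above stays dependency-free

    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=build_assume_role_arn(AGENT_KB_ACCOUNT, KB_ACCESS_ROLE),
        RoleSessionName="bedrock-kb-cross-account",
    )["Credentials"]

    kb_client = boto3.client(
        "bedrock-agent-runtime",
        region_name="us-west-2",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    response = kb_client.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )
    return response["output"]["text"]


print(build_assume_role_arn(AGENT_KB_ACCOUNT, KB_ACCESS_ROLE))
```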
The solution comprises these key components:
- Amazon Bedrock agent in the agent account that handles user interactions.
- Amazon Redshift Serverless workgroup in a VPC private subnet in the agent-kb account containing the structured data.
- Amazon Bedrock knowledge base using the Amazon Redshift Serverless workgroup as a structured data source.
- Lambda function in the agent account.
- Action group configuration to connect the agent in the agent account to the Lambda function.
- IAM roles and policies that enable secure cross-account access.
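The cross-account piece of the last component is a trust relationship: the role in the agent-kb account must trust the Lambda execution role in the agent account. The following is a hypothetical sketch of what such a trust policy for bedrock_kb_access_role could look like, using the example account number and role name from this post; the create_bedrock_agent_kb_roles_policies.sh script described later generates the actual policies.

```python
import json

# Example account number and role name from this post.
AGENT_ACCOUNT = "111122223333"
LAMBDA_ROLE = "lambda_bedrock_kb_query_role"

# Trust policy for bedrock_kb_access_role in the agent-kb account:
# only the agent account's Lambda execution role may assume it.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": f"arn:aws:iam::{AGENT_ACCOUNT}:role/{LAMBDA_ROLE}"
            },
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```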
Prerequisites
This solution requires you to have the following:
- Two AWS accounts. Create an AWS account if you do not have one. The specific permissions required for both accounts will be set up in subsequent steps.
- Install the AWS CLI (version 2.24.22 at the time of writing).
- Set up authentication using IAM user credentials for the AWS CLI for each account.
- Make sure you have jq installed. jq is a lightweight command-line JSON processor. For example, on a Mac you can install it with the command brew install jq (jq 1.7.1 at the time of writing).
- Navigate to the Amazon Bedrock console and make sure you enable access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account and to the us.amazon.nova-pro-v1:0 model in the agent account in the US West (Oregon) us-west-2 AWS Region.
Assumption
Let's call the AWS account profile that has the Amazon Bedrock agent the agent profile. Similarly, let the AWS account profile that has the Amazon Bedrock knowledge base with Amazon Redshift Serverless and the structured data source be called agent-kb. We will use the US West (Oregon) us-west-2 AWS Region, but feel free to choose another Region as necessary (the prerequisites apply to whichever Region you deploy this solution in). We will use the meta.llama3-1-70b-instruct-v1:0 model for the agent-kb account. This model is available on demand in us-west-2. You are free to choose other models with cross-Region inference, but that would mean changing the roles and policies accordingly and enabling model access in all Regions where they are available. Based on our model choice, this solution must be deployed in us-west-2. For the agent, we will use an Amazon Bedrock agent-optimized model such as us.amazon.nova-pro-v1:0.
Implementation walkthrough
The following is a step-by-step implementation guide. Make sure to perform all steps in the same AWS Region in both accounts.
These steps deploy and test an end-to-end solution from scratch; if you are already running some of these components, you can skip those steps.
- Make a note of the AWS account numbers in the agent and agent-kb account. In the implementation steps we will refer them as follows:
| Profile | AWS account | Description |
|---|---|---|
| agent | 111122223333 | Account for the Bedrock agent |
| agent-kb | 999999999999 | Account for the Bedrock knowledge base |
Note: These steps use example profile names and account numbers; replace them with actual values before running.
- Create the Amazon Redshift Serverless workgroup in the agent-kb account:
- Log on to the agent-kb account
- Follow the workshop link to create the Amazon Redshift Serverless workgroup in private subnet
- Make a note of the namespace, workgroup, and other details and follow the rest of the hands-on workshop instructions.
- Set up your data warehouse in the agent-kb account.
- Create your AI knowledge base in the agent-kb account. Make a note of the knowledge base ID.
- Train your AI Assistant in the agent-kb account.
- Test natural language queries in the agent-kb account. You can find the code in the aws-samples GitHub repository: sample-for-amazon-bedrock-agent-connect-cross-account-kb.
- Create necessary roles and policies in both the accounts. Run the script create_bedrock_agent_kb_roles_policies.sh with the following input parameters.
| Input parameter | Value | Description |
|---|---|---|
| --agent-kb-profile | agent-kb | The agent knowledge base profile that you set up with the AWS CLI with aws_access_key_id and aws_secret_access_key, as mentioned in the prerequisites |
| --lambda-role | lambda_bedrock_kb_query_role | The IAM role in the agent account that the Bedrock agent action group Lambda function will assume to connect to Redshift cross-account |
| --kb-access-role | bedrock_kb_access_role | The IAM role in the agent-kb account that lambda_bedrock_kb_query_role in the agent account assumes to connect to Redshift cross-account |
| --kb-access-policy | bedrock_kb_access_policy | IAM policy attached to the IAM role bedrock_kb_access_role |
| --lambda-policy | lambda_bedrock_kb_query_policy | IAM policy attached to the IAM role lambda_bedrock_kb_query_role |
| --knowledge-base-id | XXXXXXXXXX | Replace with the actual knowledge base ID created in Step 4 |
| --agent-account | 111122223333 | Replace with the 12-digit AWS account number where the Bedrock agent is running (agent account) |
| --agent-kb-account | 999999999999 | Replace with the 12-digit AWS account number where the Bedrock knowledge base is running (agent-kb account) |
- Download the script (create_bedrock_agent_kb_roles_policies.sh) from the aws-samples GitHub repository.
- Open Terminal on Mac or a similar bash shell on other platforms.
- Locate and change the directory to the downloaded location, and provide executable permissions.
- If you are still not clear on the script usage or inputs, run the script with the --help option to display the usage:
./create_bedrock_agent_kb_roles_policies.sh --help
- Run the script with the right input parameters as described in the previous table.

- The script, on successful execution, shows a summary of the IAM roles and policies created in both accounts.

- Log on to both the agent and agent-kb accounts to verify that the IAM roles and policies were created.
- For the agent account: Make a note of the ARN of the lambda_bedrock_kb_query_role as that will be the value of the CloudFormation stack parameter AgentLambdaExecutionRoleArn in the next step.
- For the agent-kb account: Make a note of the ARN of the bedrock_kb_access_role as that will be the value of the CloudFormation stack parameter TargetRoleArn in the next step.
- Run the AWS CloudFormation script to create a Bedrock agent:
- Download the CloudFormation script: cloudformation_bedrock_agent_kb_query_cross_account.yaml from the aws-samples GitHub repository.
- Log on to the agent account, navigate to the CloudFormation console, and verify that you are in the us-west-2 (Oregon) Region. Choose Create stack, then With new resources (standard).

- In the Specify template section, choose Upload a template file, then Choose file, and select the file you downloaded earlier. Then choose Next.

- Enter the following stack details and choose Next.
| Parameter | Value | Description |
|---|---|---|
| Stack name | bedrock-agent-connect-kb-cross-account-agent | You can choose any name |
| AgentFoundationModelId | us.amazon.nova-pro-v1:0 | Do not change |
| AgentLambdaExecutionRoleArn | arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role | Replace with your agent account number |
| BedrockAgentDescription | Agent to query inventory data from Redshift Serverless database | Keep this as default |
| BedrockAgentInstructions | You are an assistant that helps users query inventory data from our Redshift Serverless database using the action group. | Do not change |
| BedrockAgentName | bedrock_kb_query_cross_account | Keep this as default |
| KBFoundationModelId | meta.llama3-1-70b-instruct-v1:0 | Do not change |
| KnowledgeBaseId | XXXXXXXXXX | Knowledge base ID from Step 4 |
| TargetRoleArn | arn:aws:iam::999999999999:role/bedrock_kb_access_role | Replace with your agent-kb account number |
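If you prefer to create the stack programmatically instead of through the console, the same parameters can be passed with boto3. This is a hedged sketch, not part of the post's repository; the ARNs are the example values and must be replaced, and the CAPABILITY_NAMED_IAM capability is an assumption based on the stack creating named IAM roles.

```python
def to_cfn_parameters(params: dict) -> list:
    """Convert a plain dict to the Parameters list shape CloudFormation expects."""
    return [{"ParameterKey": k, "ParameterValue": v} for k, v in params.items()]


# Values from the table above (example account numbers; KnowledgeBaseId is a placeholder).
stack_params = {
    "AgentFoundationModelId": "us.amazon.nova-pro-v1:0",
    "AgentLambdaExecutionRoleArn": "arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role",
    "KBFoundationModelId": "meta.llama3-1-70b-instruct-v1:0",
    "KnowledgeBaseId": "XXXXXXXXXX",
    "TargetRoleArn": "arn:aws:iam::999999999999:role/bedrock_kb_access_role",
}


def create_stack(template_path: str):
    """Create the stack in the agent account. Requires boto3 and AWS credentials."""
    import boto3  # imported here so the pure helper above stays dependency-free

    cfn = boto3.client("cloudformation", region_name="us-west-2")
    with open(template_path) as f:
        body = f.read()
    return cfn.create_stack(
        StackName="bedrock-agent-connect-kb-cross-account-agent",
        TemplateBody=body,
        Parameters=to_cfn_parameters(stack_params),
        Capabilities=["CAPABILITY_NAMED_IAM"],  # assumption: stack creates named IAM roles
    )


print(len(to_cfn_parameters(stack_params)))
```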
- Complete the acknowledgement and choose Next.

- Scroll down through the page and choose Submit.

- You will see the CloudFormation stack being created, as shown by the status CREATE_IN_PROGRESS.

- It will take a few minutes for the status to change to CREATE_COMPLETE, indicating creation of all resources. Choose the Outputs tab and make a note of the resources that were created.

In summary, the CloudFormation script does the following in the agent account:
- Creates a Bedrock agent
- Creates an action group
- Creates a Lambda function that is invoked by the Bedrock action group
- Defines the OpenAPI schema
- Creates the necessary roles and permissions for the Bedrock agent
- Prepares the Bedrock agent so that it is ready to test
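The OpenAPI schema mentioned above tells the agent which operation the action group exposes. The actual schema ships inside the CloudFormation template; the following is only an illustrative sketch of what such a schema could look like, with a hypothetical /query path and parameter names.

```python
import json

# Illustrative OpenAPI schema for an action group with a single query operation.
# The path, operationId, and property names here are hypothetical.
openapi_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Knowledge base query API", "version": "1.0.0"},
    "paths": {
        "/query": {
            "post": {
                "operationId": "queryKnowledgeBase",
                "description": "Query inventory data in the cross-account knowledge base",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {
                                    "question": {
                                        "type": "string",
                                        "description": "Natural language question",
                                    }
                                },
                                "required": ["question"],
                            }
                        }
                    },
                },
                "responses": {
                    "200": {"description": "Answer generated from the knowledge base"}
                },
            }
        }
    },
}

print(json.dumps(openapi_schema, indent=2)[:60])
```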
- Check for model access in Oregon (us-west-2):
- Verify Nova Pro (us.amazon.nova-pro-v1:0) model access in the agent account. Navigate to the Amazon Bedrock console and choose Model access under Configure and learn. Search for the model name Nova Pro to verify access. If access is not enabled, enable it.
- Verify access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account. This should already be enabled because we set up the knowledge base earlier.
- Run the agent. Log on to the agent account, navigate to the Amazon Bedrock console, and choose Agents under Build.

- Choose the name of the agent and choose Test. You can test the questions listed on the workshop's Stage 4: Test Natural Language Queries page. For example:
- Who are the top 5 customers in Saudi Arabia?
- Who are the top parts suppliers in the United States by volume?
- What is the total revenue by region for the year 1998?
- Which products have the highest profit margins?
- Show me orders with the highest priority from the last quarter of 1997.

- Choose Show trace to investigate the agent traces.
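The console test above can also be scripted. The following is a hedged sketch of calling the agent programmatically through boto3's invoke_agent API; the agent ID and alias ID are placeholders you would take from the CloudFormation outputs, and the code requires boto3 and credentials for the agent account.

```python
def ask_agent(agent_id: str, agent_alias_id: str, question: str) -> str:
    """Invoke the Bedrock agent and collect the streamed answer.
    agent_id and agent_alias_id are placeholders from the stack outputs."""
    import boto3  # imported here so the module loads without boto3 installed

    client = boto3.client("bedrock-agent-runtime", region_name="us-west-2")
    response = client.invoke_agent(
        agentId=agent_id,
        agentAliasId=agent_alias_id,
        sessionId="test-session-1",
        inputText=question,
    )
    # The completion arrives as an event stream of byte chunks.
    chunks = []
    for event in response["completion"]:
        if "chunk" in event:
            chunks.append(event["chunk"]["bytes"].decode("utf-8"))
    return "".join(chunks)


# Sample questions from the workshop's test page.
questions = [
    "Who are the top 5 customers in Saudi Arabia?",
    "What is the total revenue by region for the year 1998?",
]
print(len(questions))
```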

Some recommended best practices:
- Phrase your question to be more specific.
- Use terminology that matches your table descriptions.
- Try questions similar to your curated examples.
- Verify that your question relates to data that exists in the TPC-H dataset.
- Use Amazon Bedrock Guardrails to add configurable safeguards to questions and responses.
Clean up resources
It is recommended that you clean up any resources you no longer need to avoid unnecessary charges:
- Navigate to the CloudFormation console in both the agent and agent-kb accounts, search for the stack, and choose Delete.
- S3 buckets need to be deleted separately.

- To delete the roles and policies created in both accounts, download the script delete-bedrock-agent-kb-roles-policies.sh from the aws-samples GitHub repository.
- Open Terminal on Mac or a similar bash shell on other platforms.
- Locate and change the directory to the downloaded location, and provide executable permissions.
- If you are still not clear on the script usage or inputs, run the script with the --help option to display the usage:
./delete-bedrock-agent-kb-roles-policies.sh --help
- Run the script delete-bedrock-agent-kb-roles-policies.sh with the same values for the same input parameters as in Step 7 when running the create_bedrock_agent_kb_roles_policies.sh script. Note: Enter the correct account numbers for agent-account and agent-kb-account before running. The script will ask for a confirmation; enter yes and press Enter.

Summary
This solution demonstrates how the Amazon Bedrock agent in the agent account can query the Amazon Bedrock knowledge base in the agent-kb account.
Conclusion
This solution uses Amazon Bedrock Knowledge Bases for structured data to create a more integrated approach to cross-account data access. The knowledge base in agent-kb account connects directly to Amazon Redshift Serverless in a private VPC. The Amazon Bedrock agent in the agent account invokes an AWS Lambda function as part of its action group to make a cross-account connection to retrieve response from the structured knowledge base.
This architecture offers several advantages:
- Uses Amazon Bedrock Knowledge Bases capabilities for structured data
- Provides a more seamless integration between the agent and the data source
- Maintains proper security boundaries between accounts
- Reduces the complexity of direct database access code
As Amazon Bedrock continues to evolve, you can take advantage of future enhancements to knowledge base functionality while maintaining your multi-account architecture.
About the Authors
Kunal Ghosh is an expert in AWS technologies. He is passionate about building efficient and effective solutions on AWS, especially involving generative AI, analytics, data science, and machine learning. Besides family time, he likes reading, swimming, biking, and watching movies, and he is a foodie.
Arghya Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers adopt and use the AWS Cloud. He is focused on big data, data lakes, streaming and batch analytics services, and generative AI technologies.
Indranil Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers in the hi-tech and semi-conductor sectors solve complex business problems using the AWS Cloud. His special interests are in the areas of legacy modernization and migration, building analytics platforms and helping customers adopt cutting edge technologies such as generative AI.
Vinayak Datar is Sr. Solutions Manager based in Bay Area, helping enterprise customers accelerate their AWS Cloud journey. He’s focusing on helping customers to convert ideas from concepts to working prototypes to production using AWS generative AI services.
