Using Amazon Bedrock

Amazon Bedrock is a managed service for foundation models and for custom models pre-trained from those foundation models. It supports models from many vendors (though not Gemini or OpenAI).

To see the models currently supported for on-demand or provisioned throughput, see: https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html

(This page also shows the modelId, which you will need in order to set a model up as a vendor in LLMAsAService.)

LLM Service Setup

To use a Bedrock model that you have access to, you need its modelId. These are listed at https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html
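
Both modelIds used in this article illustrate the vendor.model-version naming pattern you will find on that page:

anthropic.claude-v2
amazon.titan-text-lite-v1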

1. Inside LLMAsAService (https://dev.llmasaservice.io/services), click Add Service...

2. Change the vendor to Amazon and answer Replace in the confirm dialog.

3. Change the modelId and region values in the Request Body.

For example, for us-east-1 and Anthropic, use the following (but leave everything else alone for now):

"region":"us-east-1",

"modelId":"anthropic.claude-v2",


4. Click Save and Close. Then Edit the newly created draft service (it needs to be saved before we can add the API keys, as described below).

5. Click on the "Add or Replace API Key" button and follow the instructions in the Authentication section below.

6. Click the Test button to confirm it works, then set the status to Active to make it accessible from production calls.

Authentication

AWS doesn't use API keys like the other vendors. Instead, it offers a way to call its services using an AWS Access Key and Secret Access Key, which you generate as a user from the IAM panel in the AWS console.

Navigate to your region's AWS control panel after logging into the AWS console:

(note: this is for US-EAST-1; you will need to use your region)

Create a new user without login credentials, but add an access key specifically designed for programmatic access.
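
If you want that user to have least-privilege permissions, an IAM policy along these lines should suffice. This is a sketch rather than an LLMAsAService requirement: bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream are the standard Bedrock invocation actions, and you may wish to scope the Resource to specific model ARNs instead of the wildcard assumed here:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}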


This access key comes in two parts: an AWS Access Key ID and an AWS Secret Access Key. You will need to copy these into the "Add API Key" dialog box when creating the service in LLMAsAService's control panel.

Once you have added the key, it is securely stored and used to authenticate calls to AWS Bedrock through our service.

Model Access

To use models in AWS Bedrock you need to request model access. This is typically just accepting an EULA and takes minutes.

(Access the Bedrock service and choose Base Models from the left menu panel. This link is for the US-EAST-1 region; replace with your region.)

Titan Model Limitations

Amazon has its Titan series of large language models. These do not support system messages, so if you use our policy and brand injection feature (defined in the project), calls to Titan models will fail. Our policy injection feature makes sure EVERY prompt has a preamble set of instructions that the LLM must follow (e.g. "Don't talk about politics or religion."). There isn't a way for Titan to accept these instructions, so we have given you a way to acknowledge that the policy won't be sent to models that don't support it: add "noSystemMessage": true to the Request Body.

Without this setting, you will get 500 status code errors saying "Validation Error: Model Doesn't Support System Messages." With the property added, we DO NOT SEND the policy. Consider using a different model if this is unacceptable.

{
  "region": "us-east-1",
  "modelId": "amazon.titan-text-lite-v1",
  "stream": true,
  "inferenceConfig": {
    "maxTokens": 4000,
    "temperature": 0.1,
    "topP": 1
  },
  "noSystemMessage": true
}
