Configuring Cloudera AI Inference service and Amazon Bedrock to set up Cloudera Copilot
To use Cloudera Copilot, a Site Administrator must configure credentials in either the Cloudera AI Inference service or Amazon Bedrock, depending on where the custom model is deployed.
To configure Cloudera Copilot to use Cloudera AI Inference service:
- Set up Cloudera AI Inference Service. Recommended models are:
  - Llama 2 70b Chat
  - CodeLlama 34b Instruct
  - Mistral 7b
- Deploy a model to Cloudera AI Inference Service.
- Navigate to and generate an Access key and Private key pair. Download the Credentials file for future reference.
- Navigate to and add the following environment variables:
  - CDP_PRIVATE_KEY: Private key generated in the preceding step
  - CDP_ACCESS_KEY_ID: Access key ID generated in the preceding step
  - CDP_REGION: Region of your CDP deployment
  - CDP_ENDPOINT_URL: Endpoint URL of your CDP deployment
  - ENDPOINT_URL: Base URL for other legacy CDP API services. The CDP configuration process does not prompt for this value.
- Install the CDP CLI to authorize requests to Cloudera AI Inference Service endpoints:
  - Create a new project in your ML Workspace.
  - Start a new JupyterLab session.
  - Open Terminal Access. A terminal window appears.
  - Run the `pip install cdpcli` command.
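The terminal steps above amount to something like the following shell session. The variable names come from the configuration step earlier; the values are placeholders for the keys you generated, and exporting them in the terminal is shown here only as a quick way to try them out in a single session rather than through the workspace settings UI.

```shell
# Install the CDP CLI inside the JupyterLab session's terminal.
pip install cdpcli

# Placeholder values -- substitute the credentials generated earlier.
export CDP_ACCESS_KEY_ID="<your-access-key-id>"
export CDP_PRIVATE_KEY="<your-private-key>"
export CDP_REGION="<your-cdp-region>"
export CDP_ENDPOINT_URL="<your-cdp-endpoint-url>"
export ENDPOINT_URL="<your-legacy-cdp-api-base-url>"
```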
To configure Cloudera Copilot to use Amazon Bedrock:
- Generate a pair of Access and Secret keys through AWS IAM.
- Navigate to and add the following environment variables:
  - AWS_SECRET_ACCESS_KEY
  - AWS_ACCESS_KEY_ID
  - AWS_DEFAULT_REGION
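The AWS SDK reads the three variables above from the environment, so a session will fail at request time if any of them is missing. The hypothetical helper below (not part of the Cloudera Copilot product) is a minimal sketch of a pre-flight check you could run in a notebook before starting a Copilot session.

```python
import os

# Variables required for Amazon Bedrock access, as listed in the step above.
REQUIRED_AWS_VARS = (
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_DEFAULT_REGION",
)


def missing_aws_credentials(environ=os.environ):
    """Return the names of any required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_AWS_VARS if not environ.get(name)]


# Usage: report a problem before the first Bedrock call is attempted.
missing = missing_aws_credentials()
if missing:
    print("Set these variables before starting Copilot:", ", ".join(missing))
```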