Launching RAG Studio within a project
RAG Studio is compatible with Cloudera AI Inference service, AWS Bedrock, and Azure OpenAI, enabling you to select different LLM and embedding models tailored to your specific needs.
RAG Studio integrates with three major enterprise inference services:
- AWS Bedrock: Provides scalable cloud-based inference. An invocation sketch follows this list.
- Cloudera AI Inference service: Offers enterprise-grade deployment options.
- Azure OpenAI: Enables the use of natural language processing models provided by OpenAI.
- Host names: For air-gapped installations that use a proxy setup, you must whitelist the required URLs in your firewall rules. For the list of host names to whitelist, see Host names and endpoints required for AI Studios. A reachability-check sketch also follows this list.
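Before selecting AWS Bedrock in RAG Studio, you can confirm that your AWS credentials, region, and model access are in order by invoking a model directly with boto3. This is only an illustrative sketch, independent of RAG Studio itself; the region and model ID (amazon.titan-embed-text-v1) are assumptions, so substitute values enabled in your AWS account.

```python
import json

import boto3  # AWS SDK for Python; assumes credentials are already configured

# Assumed region and model ID -- replace with values enabled in your account.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-embed-text-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Connectivity check for RAG Studio setup"}),
)

# The response body is a streaming object; read and decode it to confirm
# that the model returned an embedding.
payload = json.loads(response["body"].read())
print("Embedding dimensions:", len(payload["embedding"]))
```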
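For air-gapped, proxy-based installations, a quick reachability check can confirm that your firewall rules allow the required endpoints before you launch RAG Studio. The hostnames below are placeholders only; use the actual list from Host names and endpoints required for AI Studios. A minimal sketch, assuming the requests package is installed and an HTTPS_PROXY environment variable points at your proxy:

```python
import os

import requests  # assumes the requests package is installed

# Placeholder hostnames -- replace with the endpoints listed in
# "Host names and endpoints required for AI Studios" for your deployment.
REQUIRED_ENDPOINTS = [
    "https://bedrock-runtime.us-east-1.amazonaws.com",
    "https://example-resource.openai.azure.com",
]

# Route the check through the same proxy the cluster will use, if one is set.
proxy = os.environ.get("HTTPS_PROXY")
proxies = {"https": proxy} if proxy else None

for url in REQUIRED_ENDPOINTS:
    try:
        # Any HTTP response (even 403 or 404) proves the host is reachable;
        # a timeout or connection error suggests a missing firewall rule.
        resp = requests.head(url, proxies=proxies, timeout=10)
        print(f"{url}: reachable (HTTP {resp.status_code})")
    except requests.RequestException as exc:
        print(f"{url}: NOT reachable ({exc})")
```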