Cloudera on Cloud: February 2026 Release Summary
This release summary of Cloudera on cloud describes the major features introduced in the Management Console, Data Hub, and the data services.
Cloudera AI
Cloudera AI 2.0.55-b193 introduces the following changes:
Cloudera AI Workbench
- Improved application responsiveness by optimizing database query performance across projects and jobs, significantly reducing page load times for environments with large datasets.
Cloudera AI Control Plane
- Improved consistency across the administrative experience for AI Workbench, AI Registry, and the AI Inference services.
- Added support for Amazon EKS 1.33.
- Added support for Azure AKS 1.33.
- Improved the reliability of Cloudera AI Inference service upgrade processes.
- Cloudera AI now automatically performs side-by-side upgrades for incompatible environments, ensuring a seamless one-click experience with built-in rollback support in case of failure. To upgrade a workbench, Cloudera users must hold both the MLAdmin and EnvironmentAdmin roles, and those performing an upgrade must also possess the necessary cloud provider permissions to execute backup and restore operations for the underlying storage and metadata databases. For more information, see Upgrading Cloudera AI Workbenches.
Cloudera AI Inference service
- Cloudera AI Inference service now provides a production-grade serving environment for hosting applications. Applications deployed on Cloudera AI Inference service can scale alongside Model Endpoints, providing a scalable solution for various components. For more information, see Serving Applications on Cloudera AI Inference service (Technical Preview).
- Cloudera AI Inference service now supports AWS on-demand capacity reservations and capacity blocks to ensure compute availability for inference workloads. For more information, see Configuring AWS on-demand capacity reservations and capacity blocks.
- Cloudera AI now supports deploying Hugging Face reranking models using the API.
- Cloudera AI now supports deploying Hugging Face embedding models using the API.
- You can now manually specify model tasks (such as EMBED, RANK, or CLASSIFICATION) using the API during deployment, enabling broader vLLM support for architectures like BertModel or ModernBertForTokenClassification that serve tasks like embedding and reranking, respectively.
- You can now manage Cloudera AI Inference service logging globally using the Serving API ConfigMap, allowing administrators to enable logging and define a storage bucket across all endpoints simultaneously for consistent data collection.
- Added Fine Grained Authorization support. For more information, see Configuring Fine-grained Access Control.
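To make the manual task override described above concrete, the sketch below builds the kind of deployment request body such an API might accept. This is a minimal illustration only: every field name, the payload shape, and the example model ID are assumptions, not the documented Cloudera AI Inference service API schema.

```python
import json

def build_deploy_request(model_id: str, task: str) -> dict:
    """Assemble a hypothetical deployment payload with an explicit task override.

    The field names here are illustrative placeholders; consult the Cloudera AI
    Inference service API documentation for the real schema.
    """
    allowed_tasks = {"EMBED", "RANK", "CLASSIFICATION"}
    if task not in allowed_tasks:
        raise ValueError(f"task must be one of {sorted(allowed_tasks)}")
    return {
        "source": "huggingface",   # model provider (assumed field)
        "model_id": model_id,      # e.g. a BertModel-based embedding model
        "task": task,              # manual task override for vLLM serving
    }

# Example: request an embedding deployment for a hypothetical model ID.
request = build_deploy_request("BAAI/bge-base-en-v1.5", "EMBED")
print(json.dumps(request, indent=2))
```

Validating the task value client-side, as above, mirrors the idea that only a fixed set of task names (EMBED, RANK, CLASSIFICATION) is meaningful to the serving layer.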
Cloudera AI Registry
- Cloudera AI Inference service now supports direct deployment for XGBoost, PyTorch, and TensorFlow models using the AI Registry. For more information, see Deploying Additional Model Frameworks.
- Cloudera AI Registry now displays structured metadata and comprehensive lineage tracking (provider, model ID, and SHA) for all models imported from Hugging Face and NVIDIA NGC.
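The lineage metadata mentioned above can be pictured as a small record per imported model. The shape below is an assumption based only on the attributes the release notes name (provider, model ID, and SHA); the actual AI Registry schema may differ.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelLineage:
    """Illustrative lineage record for a model imported into the AI Registry.

    Field names are hypothetical, derived from the attributes the release
    notes mention, not from the real registry schema.
    """
    provider: str   # e.g. "huggingface" or "ngc"
    model_id: str   # upstream identifier of the imported model
    sha: str        # content hash pinning the exact imported revision

# Example entry for a hypothetical Hugging Face import.
entry = ModelLineage(provider="huggingface",
                     model_id="BAAI/bge-base-en-v1.5",
                     sha="a5beb1e3")
print(entry)
```

Pinning the SHA alongside the provider and model ID is what makes the lineage reproducible: the same record always resolves to the same model revision.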
Cloudera AI Studios
Agent Studio 2.2.0 introduces the following changes:
- Role-Based Access Control: Agent Studio now includes a comprehensive Role-Based Access Control (RBAC) system to ensure secure access to workflows, models, and tools. This system allows for fine-grained control over who can view, edit, deploy, and delete resources within Agent Studio. For more information, see Managing user access with Role-Based Access Control (RBAC).
- Stopping Workflows: A running workflow can be stopped to safely terminate AI agents and halt active processes. For more information, see Stopping a workflow.
- Workflow Evaluations: A new Evaluations feature in Agent Studio provides a built-in suite of diagnostic and quality-assurance tools to measure the performance, accuracy, and safety of agentic workflows. Users can now assess workflows during the development phase in Studio and audit historical runs in Deployed Workflows. For more information, see Managing Workflow Evaluations.
RAG Studio 2.2.0 introduces the following changes:
- RAG Studio installation using ML Runtime image: RAG Studio is now exclusively shipped as a prebuilt, containerized ML Runtime Image for both on-premises and cloud deployments. This streamlined approach eliminates the need for compiling raw source code within the customer environment, providing a consistent, reliable, and production-ready installation. For more information, see Deploying RAG Studio using the ML Runtime Image.
- Deprecation Notice: The existing methods of deployment (from source code) and AMP mode are now deprecated. The latest releases will be available only through an ML Runtime Image.
For more information about known issues, fixed issues, and behavioral changes, see the Cloudera AI and Cloudera AI Studios Release Notes.
Cloudera Data Warehouse
Cloudera Data Warehouse 1.11.4-b6 is a hotfix release that does not introduce new features but contains a number of fixes. For more information, see the Cloudera Data Warehouse Release Notes.
Cloudera Management Console
This release of the Cloudera Management Console service introduces the following changes:
Load Balancer for FreeIPA
A Network Load Balancer is added to FreeIPA in Cloudera environments to prevent disruption to Cloudera Data Hub clusters and Data Services caused by changing FreeIPA node IP addresses during operations like repair or upgrade. For new environments, the Load Balancer is automatically added during environment creation. For existing environments, the Load Balancer is automatically added during FreeIPA upgrade.
For more information, see the Load Balancer for FreeIPA documentation.
Cloudera Observability
This release of the Cloudera Observability service introduces the following changes:
Managing cluster reports
Cluster reports in Cloudera Observability help you track job and query changes by comparing current performance with historical data. You can configure these digests to arrive daily, weekly, or monthly through email to identify potential problems. For more information, see the Managing cluster reports documentation.
Cloudera Operational Database
Cloudera Operational Database 1.58 introduces the following changes:
Deprecation of OMID service from the Cloudera Operational Database
The OMID service is removed from the default installation of Cloudera Operational Database because OMID appears to be unused in these environments. Consequently, the OMID service is no longer automatically deployed, effective immediately.
Users who still require OMID are not losing access to the service. You can still install OMID manually through Cloudera Manager if needed for your operations.
