Cloudera AI
Top Tasks
Top Tasks for Cloudera AI
Exploratory Data Science and Visualization
Analytical Applications
Monitoring Active Models
Release Notes
What's New
Older releases
January 29, 2025
November 21, 2024
October 10, 2024
October 8, 2024
September 26, 2024
July 17, 2024
June 20, 2024
June 11, 2024
May 29, 2024
May 15, 2024
April 25, 2024
March 6, 2024
February 20, 2024
February 8, 2024
January 23, 2024
December 15, 2023
November 15, 2023
October 19, 2023
August 31, 2023
July 25, 2023
July 12, 2023
May 31, 2023
May 16, 2023
April 26, 2023
April 5, 2023
March 27, 2023
February 14, 2023
February 10, 2023
February 7, 2023
November 29, 2022
October 19, 2022
September 27, 2022
August 30, 2022
July 21, 2022
May 31, 2022
April 21, 2022
March 14, 2022
February 10, 2022
January 12, 2022
December 15, 2021
December 13, 2021
October 27, 2021
October 18, 2021
August 31, 2021
August 23, 2021
July 8, 2021
June 24, 2021
May 13, 2021
March 22, 2021
February 3, 2021
December 21, 2020
November 23, 2020
October 29, 2020
August 04, 2020
June 30, 2020
June 9, 2020
May 5, 2020
April 14, 2020
March 16, 2020
February 13, 2020
January 30, 2020
December 19, 2019
November 1, 2019
September 23, 2019
August 22, 2019
Compatibility for Cloudera AI and Runtime components
CVE-2021-44228 Remediation for Cloudera AI Data Service
Known Issues and Limitations
ML Runtimes Release Notes
ML Runtimes What's New
What's new in ML Runtimes version 2025.01.2
What's new in ML Runtimes version 2025.01.1
What's New in ML Runtimes version 2024.10.1
What's New in ML Runtimes older releases
What's New in ML Runtimes version 2024.05.2
What's New in ML Runtimes version 2024.05.1
What's New in ML Runtimes version 2024.02.1
ML Runtimes Version 2023.12.1
ML Runtimes Version 2023.08.2
ML Runtimes Version 2023.08.1
ML Runtimes Version 2023.05.2
ML Runtimes Version 2023.05.1
ML Runtimes Version 2022.11.2
ML Runtimes Version 2022.11
ML Runtimes Version 2022.04
ML Runtimes Version 2021.12
ML Runtimes Version 2021.09.02
ML Runtimes Version 2021.09
ML Runtimes Version 2021.06
ML Runtimes Version 2021.04
ML Runtimes Version 2021.02
ML Runtimes Version 2020.11
ML Runtimes Known Issues and Limitations
Known Issues and Limitations in ML Runtimes version 2025.01.2
Known Issues and Limitations in ML Runtimes version 2025.01.1
Known Issues and Limitations in ML Runtimes version 2024.10.1
Known Issues and Limitations in ML Runtimes version 2024.05.2
Known Issues and Limitations in ML Runtimes older releases
ML Runtimes Pre-installed Packages
ML Runtimes Pre-installed Packages overview
ML Runtimes 2025.01.2
Python 3.12 Libraries for Conda
Python 3.12 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Scala 2.12 Libraries for Workbench
Python 3.12 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
R 4.4 Libraries
ML Runtimes 2025.01.1
Python 3.12 Libraries for Conda
Python 3.12 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Scala 2.12 Libraries for Workbench
Python 3.12 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
R 4.4 Libraries
ML Runtimes 2024.10.1
Python 3.10 Libraries for Conda
Python 3.12 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.12 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.4 Libraries
ML Runtimes 2024.05.2
Python 3.10 Libraries for Conda
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.4 Libraries
ML Runtimes 2024.05.1
Python 3.10 Libraries for Conda
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.4 Libraries
ML Runtimes 2024.02.1
Python 3.10 Libraries for Conda
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.3 Libraries
ML Runtimes 2023.12.1
Python 3.10 Libraries for Conda
Python 3.11 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.11 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.3 Libraries
ML Runtimes 2023.08.2
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.3 Libraries
R 4.1 Libraries
R 4.0 Libraries
R 3.6 Libraries
ML Runtimes 2023.08
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.3 Libraries
R 4.1 Libraries
R 4.0 Libraries
R 3.6 Libraries
ML Runtimes 2023.05
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.10 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.0 Libraries
R 4.1 Libraries
R 3.6 Libraries
ML Runtimes 2022.11
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.9.6 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.1 Libraries
R 4.0 Libraries
R 3.6 Libraries
Python 3.9 Libraries for PBJ Workbench
Python 3.8 Libraries for PBJ Workbench
Python 3.7 Libraries for PBJ Workbench
PBJ R 4.1 Libraries
PBJ R 4.0 Libraries
PBJ R 3.6 Libraries
ML Runtimes 2022.04
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.9.6 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.0 Libraries
R 4.1 Libraries
R 3.6 Libraries
ML Runtimes 2021.12
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.9.6 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
R 4.0 Libraries
R 4.1 Libraries
R 3.6 Libraries
ML Runtimes 2021.09
Python 3.9 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.6 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Python 3.9.6 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.6 Libraries for JupyterLab
R 4.0 Libraries
R 3.6 Libraries
ML Runtimes 2021.06
Python 3.8.6 Libraries for Workbench
Python 3.7.9 Libraries for Workbench
Python 3.6.12 Libraries for Workbench
Python 3.8.6 Libraries for JupyterLab
Python 3.7.9 Libraries for JupyterLab
Python 3.6.12 Libraries for JupyterLab
R 4.0 Libraries
R 3.6 Libraries
ML Runtimes 2021.04
RAPIDS Runtime PIP Python 3.7.8 Libraries for Workbench
RAPIDS Runtime PIP Python 3.8.6 Libraries for Workbench
RAPIDS Runtime PIP Python 3.7.8 Libraries for JupyterLab
RAPIDS Runtime PIP Python 3.8.6 Libraries for JupyterLab
ML Runtimes 2021.02
Python 3.8 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.6 Libraries for Workbench
Python 3.8 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.6 Libraries for JupyterLab
R 4.0 Libraries
R 3.6 Libraries
ML Runtimes 2020.11
Product Overview
Cloudera AI Overview
Planning
Architecture Overview
Provisioning
Cloudera AI Architecture
ML Runtimes
Spark on Kubernetes
Autoscaling Workloads with Kubernetes
Autoscale Groups
Critical and Non-critical Pods
AWS Cloud
AWS Account Prerequisites
Limitations on AWS
Network Planning for Cloudera AI on AWS
AWS IAM restricted roles and policies for compute and Cloudera AI
Create IAM roles and instance profile pair
Create role and policy used to deploy Cloudera environments for Cloudera AI
Use a non-transparent proxy with Cloudera AI on AWS environments
Certified scale limitations for Cloudera AI Workbenches on AWS
Azure Cloud
Azure Account Requirements
Limitations on Azure
Network Planning for Cloudera AI on Azure
Azure Environment Setup
Create subnets
Create Azure Files Storage Account and File Share
Create Azure NetApp Files Account, Capacity Pool and Volume
Other NFS Options
Set up minimum permissions
Migrating from generic NFS to Azure Files NFS in Cloudera AI
Backing up workbench
Restoring old data to a new workbench
How To
Cloudera AI Workbenches
Provisioning Cloudera AI Workbenches
Configuring User access to Cloudera AI
Granting Cloudera Users access to Cloudera AI Workbenches
Grant remote access to Cloudera AI Workbench
Accessing Cloudera AI Workbenches via SOCKS Proxy
Monitoring Cloudera AI Workbenches
Suspend and resume Cloudera AI Workbenches
Backing up Cloudera AI Workbenches
Workbench backup and restore prerequisites
Backing up a Cloudera AI Workbench
Restore a Cloudera AI Workbench
Restoring to a different environment
Removing Cloudera AI Workbenches
Upgrading Cloudera AI Workbenches
Cloudera AI upgrades using Backup/Restore
Tagging disks to avoid garbage collection
Modify Instance Group Type
User Roles and Team Accounts
User Roles
Business Users and Cloudera AI
Managing your Personal Account
Creating a Team
Managing a Team Account
Projects
Collaboration Models
Sharing Job and Session Console Outputs
Managing Projects
Creating a Project with Legacy Engine Variants
Creating a Project with ML Runtimes variants
Creating a project from a password-protected Git repo
Configuring Project-level Runtimes
Adding Project Collaborators
Modifying Project Settings
Managing Project Files
Custom Template Projects
Deleting a Project
Native Workbench Console and Editor
Launch a Session
Run Code
Access the Terminal
Stop a Session
Workbench editor file types
Environmental Variables
Third-party Editors
Modes of configuration
Configure a browser-based IDE as an Editor
Testing a browser-based IDE in a Session
Configuring a browser-based IDE at the Project level
Legacy Engine level configuration
Configuring a local IDE using an SSH gateway
Configure PyCharm as a local IDE
Add Cloudera AI as an Interpreter for PyCharm
Configure PyCharm to use Cloudera AI as the remote console
(Optional) Configure the Sync between Cloudera AI and PyCharm
Configure VS Code as a local IDE
Download cdswctl and add an SSH Key
Initialize an SSH connection to Cloudera AI for VS Code
Setting up VS Code
(Optional) Using VS Code with Python
(Optional) Using VS Code with R
(Optional) Using VS Code with Jupyter
(Optional) Using VS Code with Git integration
Limiting files in Explorer view
Git for Collaboration
Linking an existing Project to a Git remote
Embedded Web Applications
Example: A Shiny Application
Example: Flask application
Runtimes
Managing ML Runtimes
Adding new ML Runtimes
Adding Custom ML Runtimes through the Runtime Catalog
Adding ML Runtimes using Runtime Repo files
ML Runtimes versus Legacy Engine
Using Runtime Catalog
Changing Docker credential setting for ML Runtime
Disabling and Deleting Runtimes
PBJ Workbench
Dockerfile compatible with PBJ Workbench
PBJ Runtimes and Models
Example models with PBJ Runtimes
Using ML Runtimes Addons
Adding Hadoop CLI to ML Runtime Sessions
Adding Spark to ML Runtime Sessions
Turning off ML Runtimes Addons
ML Runtimes NVIDIA GPU Edition
Testing ML Runtime GPU Setup
ML Runtimes NVIDIA RAPIDS Edition
Using Editors for ML Runtimes
Using JupyterLab with ML Runtimes
Installing a Jupyter extension
Installing a Jupyter kernel
Using Conda Runtime
Installing Additional ML Runtimes Packages
Restrictions for upgrading R and Python packages
Custom Runtime Addons with Cloudera AI
ML Runtimes Environment Variables
ML Runtimes Environment Variables List
Accessing Environmental Variables from Projects
Customized Runtimes
Creating Customized ML Runtimes
Create a Dockerfile for the Custom Runtime Image
Metadata for Custom ML Runtimes
Editor Customization
Build the New Docker Image
Distribute the Image
Adding a new customized ML Runtime through the Runtime Catalog
Limitations
Adding Docker registry credentials and certificates
Pre-Installed Packages in ML Runtimes
Legacy Engines
Managing Engines
Creating Resource profiles
Configuring the engine environment
Set up a custom repository location
Burstable CPUs
Installing additional packages
Using Conda to manage dependencies
Engine environment variables
Accessing environmental variables from projects
Customized engine images
Creating a customized engine image
Create a Dockerfile for the custom image
Build the new Docker image
Distribute the image
Including images in allowlist for Cloudera AI projects
Add Docker registry credentials
Limitations with customized engines
End-to-end example: MeCab
Pre-Installed Packages in engines
Base Engine 15-cml-2021.09-1
Base Engine 14-cml-2021.05-1
Base Engine 13-cml-2020.08-1
Base Engine 12-cml-2020.06-2
Base Engine 11-cml1.4
Base Engine 10-cml1.3
Base Engine 9-cml1.2
Spark
Spark on Cloudera AI
Apache Spark supported versions
Spark configuration files
Managing memory available for Spark drivers
Managing dependencies for Spark 2 jobs
Spark Log4j Configuration
Setting up an HTTP Proxy for Spark 2
Spark web UIs
Using Spark 2 from Python
Example: Monte Carlo estimation
Example: Locating and adding JARs to Spark 2 configuration
Using Spark 3 from R
Using Spark 2 from Scala
Managing dependencies for Spark 2 and Scala
GPUs
Using GPUs for Cloudera AI projects
Testing GPU Setup
Testing ML Runtime GPU Setup
Experiments
Experiments with MLflow
Cloudera AI Experiment Tracking through MLflow API
Running an Experiment using MLflow
Visualizing Experiment Results
Using an MLflow Model Artifact in a Model REST API
Deploying an MLflow model as a Cloudera AI Model REST API
Automatic Logging
Setting Permissions for an Experiment
MLflow transformers
Evaluating LLM with MLflow
Using Heuristic-based metrics
Using LLM-as-a-Judge metrics
Known issues and limitations
Exploratory Data Science and Visualization
Prerequisites for Cloudera AI discovery and exploration
Starting Data Discovery and Visualization
Working with Data Discovery and Visualization
Set up a Hive or Impala data connection manually
Setting up a Spark data connection
Setting up Amazon S3 data connection
Setting up a data connection to Cloudera Data Hub
Data connection management
Setting the workload password
Setting up a Custom Data Connection
Custom Data Connection Development
Developing and testing your first custom connection
Loading custom connections
Using data connection snippets
Managing default and backup data connections
API Permissions For Projects
Troubleshooting: 401 Unauthorized
Troubleshooting: 401 Unauthorized when accessing Hive
Troubleshooting: Existing connection name
Troubleshooting: Empty data page
Troubleshooting: Some connections not shown
Models
Cloudera AI Project Lifecycle
Models
Models - Concepts and Terminology
Challenges with Machine Learning in production
Challenges with model deployment and serving
Challenges with model monitoring
Challenges with model governance
Model visibility
Model explainability, interpretability, and reproducibility
Model governance using Apache Atlas
Using Cloudera AI Registry
Setting up Cloudera AI Registry
Creating a Cloudera AI Registry
Creating a Cloudera AI Registry on an Azure UDR Private Cluster
Setting up access for Cloudera AI Registry in a RAZ-enabled environment
Setting up access for Cloudera AI Registry in a non-RAZ-enabled environment
Synchronizing Cloudera AI Registry with a workbench
Viewing details for Cloudera AI Registries
Cloudera AI Registry permissions
Model access control
Deleting Cloudera AI Registry
Force delete a Cloudera AI Registry
Registering and deploying models with Cloudera AI Registry
Creating a model using MLflow
Registering a model using the AI Registry user interface
Registering a model using MLflow SDK
Using MLflow SDK to register customized models
Viewing registered model information
Creating a new version of a registered model
Deploying a model from the AI Registry page
Deploying a model from the Cloudera AI Registry using APIv2
Deploying a model from the destination Project page
Viewing Details for Cloudera AI Registry
Delete a model from Cloudera AI Registry
Disabling Cloudera AI Registry
Upgrade Cloudera AI Registry
Roll back the registry upgrade
Cloudera AI Registry standalone API
Prerequisites for Cloudera AI Registry standalone API
Authenticating clients for interacting with Cloudera AI Registry API
Role-based authorization
Using the REST Client
Cloudera AI Registry CLI Client
Known issues with Cloudera AI Registry standalone API
Updating Cloudera AI Registry configuration
Troubleshooting issues with Cloudera AI Registry API
Cloudera AI Inference service cannot discover Cloudera AI Registry
Debugging the model import failure
Creating and deploying a Model
Usage guidelines for deploying models with Cloudera AI
Known Issues and Limitations with Model Builds and Deployed Models
Request/Response Formats (JSON)
Testing calls to a Model
Securing Models
Access Keys for Models
API Key for Models
Enabling authentication
Generating an API key
Managing API Keys
Workflows for active Models
Technical metrics for Models
Debugging issues with Models
Deleting a Model
Configuring model request payload size
Managing Cloudera Copilot
Prerequisites to set up Cloudera Copilot
Configuring Cloudera Copilot
Using Cloudera Copilot
Example - Model training and deployment (Iris)
Training the Model
Deploying the Model
Model Hub
Using Model Hub
Role-based authorization in Model Hub
Importing models from NVIDIA NGC
Importing models from Hugging Face (Technical Preview)
Registered Models
Using Registered Models
Deploying a model from Registered Models
Importing a Hugging Face Model (Technical Preview)
Viewing details of a registered model
Editing model visibility
Deleting a registered model version
Cloudera AI Inference service
Using Cloudera AI Inference service
Key Features
Key Applications
Terminology
Limitations and Restrictions
Supported Model Artifact Formats
Prerequisites for setting up Cloudera AI Inference service
Authentication of Cloudera AI Inference service
Authorization of Cloudera AI Inference service
Importing Models
Register an ONNX model to Cloudera AI Registry
Cloudera AI Inference service Concepts
Cloudera AI Inference service Configuration and Sizing
Managing Cloudera AI Inference service
Managing Cloudera AI Inference service using Cloudera CLI
Creating a Cloudera AI Inference service instance
Listing Cloudera AI Inference service instances
Describing Cloudera AI Inference service instance
Managing Node Groups
Adding a Node Group to an existing instance
Modifying a node group in an existing instance
Deleting a node group from an existing Cloudera AI Inference service instance
Deleting Cloudera AI Inference service instance
Obtaining Control Plane Audit Logs for Cloudera AI Inference service
Obtaining the kubeconfig for the Cloudera AI Inference service Cluster
Obtaining kubeconfig on AWS
Obtaining kubeconfig on Azure
Managing Cloudera AI Inference service using the UI
Creating a Cloudera AI Inference service instance using the UI
Listing Cloudera AI Inference service instances using the UI
Viewing details of a Cloudera AI Inference service instance using the UI
Managing node groups using the UI
Adding a node group to an existing instance using the UI
Modifying a node group in an existing instance using the UI
Deleting a node group from an existing Cloudera AI Inference service instance using the UI
Deleting Cloudera AI Inference service instances using the UI
Obtaining Control Plane Audit Logs for Cloudera AI Inference service using the UI
Obtaining the kubeconfig of Cloudera AI Inference service using the UI
Managing Model Endpoints using UI
Creating a Model Endpoint using UI
Listing Model Endpoints using UI
Viewing details of a Model Endpoint using UI
Editing a Model Endpoint Configuration using UI
Managing Model Endpoints using API
Preparing to interact with the Cloudera AI Inference service API
Creating a Model Endpoint using API
Listing Model Endpoints using API
Describing a Model Endpoint using API
Deleting a Model Endpoint using API
Autoscaling Model Endpoints using API
Tuning auto-scaling sensitivity using the API
Running Models on GPU
Deploying models with Canary deployment using API
Interacting with Model Endpoints
Making an inference call to a Model Endpoint with an OpenAI API
Cloudera AI Inference service using OpenAI Python SDK client in a Cloudera AI Workbench Session
Cloudera AI Inference service using OpenAI Python SDK client on a local machine
OpenAI Inference Protocol Using Curl
Making an inference call to a Model Endpoint with Open Inference Protocol
Open Inference Protocol Using Python SDK
Open Inference Protocol Using Curl
Deploying Predictive Models
Accessing Cloudera AI Inference service Metrics
Known issues
Model Governance
Enabling model governance
Registering training data lineage using a linking file
Viewing lineage for a model deployment in Atlas
Model Metrics
Enabling model metrics
Tracking model metrics without deploying a model
Tracking metrics for deployed models
Applications
Analytical Applications
Securing Applications
Limitations with Analytical Applications
Monitoring applications
Jobs and Pipelines
Creating a Job
Creating a Pipeline
Viewing Job History
Jobs API
Data Access
Upload and work with local files
Connect to Cloudera Data Lake
Setup Data Lake Access
Example: Connect a Spark session to the Data Lake
Use Direct Reader Mode with PySpark
Use Direct Reader Mode with SparklyR
Create an Iceberg data connection
Accessing Data from HDFS
Connecting to Cloudera Data Warehouse
Accessing data with Spark
Use JDBC Connection with PySpark
Connect to a Cloudera Data Hub cluster
Connecting to external Amazon S3 buckets
Connect to External SQL Databases
Accessing Ozone storage
Connecting to Ozone filesystem
Accessing local files in Ozone
Distributed Computing
Distributed Computing with Workers
Workers API
Worker Network Communication
Accelerators for ML Projects (AMP)
Cloudera Accelerators for Machine Learning Projects
HuggingFace Spaces and Community AMPs
Creating New AMPs
Creating New AMPs using API
Custom AMP Catalog
Add a catalog
Catalog File Specification
AMP Project Specification
Restarting a failed AMP setup
AMPs in airgapped environments
AMPs in airgapped environment with Proxy
AMPs in fully airgapped environments
Configuring Traefik readTimeout for large file uploads
Site Administration
Managing Users
Service Accounts
Creating a machine user and synchronizing to workbench
Sync machine users from the Synced team
Run workloads using a service account
Configuring Quotas
Creating Resource profiles
Disable or deprecate Runtime addons
Onboarding Business Users
Adding a Collaborator
User Roles
Business Users and Cloudera AI
Managing your Personal Account
Creating a Team
Managing a Team Account
Managing a Synced Team
Monitoring User Activity
Tracked User events
Monitoring User Events
Monitoring active Models
Monitoring and alerts
Application polling endpoint
Choosing default engine
Controlling User access to features
Cloudera AI email notifications
Downloading diagnostic bundles for a workbench
Web session timeouts
Project garbage collection
Ephemeral storage
Installing a non-transparent proxy in a Cloudera AI environment
Ports used by Cloudera AI
Export Usage List
Private cluster support
Enable a private cluster
User Defined Routing (UDR)
Embed a Cloudera AI application in an external website
Setting up Cloudera AI Workbenches for high volume Workloads
Host name required by Learning Hub
Security
Configuring external authentication with LDAP and SAML
Configuring LDAP/Active Directory authentication
LDAP general settings
LDAP group settings
Test LDAP Configuration
Configuring SAML authentication
Configuration options
Configuring HTTP Headers for Cloudera AI
Enable HTTP security headers
Enable HTTP Strict Transport Security (HSTS)
Enable Cross-Origin Resource Sharing (CORS)
SSH Keys
Personal key
Team key
Adding an SSH key to GitHub
Creating an SSH tunnel
Hadoop authentication for Cloudera AI Workbenches
Cloudera AI and outbound network access
Non-transparent proxy and egress trusted list
Troubleshooting
Recommended troubleshooting workflow
Preflight checks
Instance type preflight check fails
Cloudera AI service with Data Lake upgrades
Debugging common Cloudera AI errors
Troubleshooting Cloudera AI Workbenches on AWS
Troubleshooting Cloudera AI Workbenches on Azure
Logs for Cloudera AI Workbenches
Downloading diagnostic bundles for a workbench
Troubleshooting Issues with Workloads
Troubleshooting Spark issues
Troubleshooting Kerberos issues
Handling project volume size increase in Cloudera AI
Reference
API
Cloudera AI API v2
API v2 usage
REST API v2 for AI Registry
REST API v2 for AI Workbench
REST API v2 for Cloudera AI Inference service
CLI
Command Line Tools in Cloudera AI
cdswctl Command Line Interface Client
Download and configure cdswctl
Initialize an SSH Endpoint
Log into cdswctl
Prepare to manage models using the model CLI
Create a model using the CLI
Build and deployment commands for models
Deploy a new model with updated resources
View replica logs for a model
Using ML Runtimes with cdswctl
Querying the engine type
Listing runtimes
Starting sessions and creating SSH endpoints
Creating a model
cdswctl command reference
Azure NetApp files management with the CLI
Visualizations
Built-in Cloudera AI Visualizations
Simple Plots
Saved Images
HTML Visualizations
IFrame Visualizations
Grid Displays
Documenting Your Analysis
Cloudera Data Visualization for Cloudera AI
Jupyter Magics
Jupyter Magic Commands
Python
Scala
Configuring model request payload size
Configuring Project-level Runtimes
Configuring Quotas
Configuring SAML authentication
Configuring the engine environment
Configuring Traefik readTimeout for large file uploads
Configuring User access to Cloudera AI
Connect to a Cloudera Data Hub cluster
Connect to Cloudera Data Lake
Connect to External SQL Databases
Connecting to Cloudera Data Warehouse
Connecting to external Amazon S3 buckets
Connecting to Ozone filesystem
Controlling User access to features
Create a Dockerfile for the custom image
Create a Dockerfile for the Custom Runtime Image
Create a model using the CLI
Create an Iceberg data connection
Create Azure Files Storage Account and File Share
Create Azure NetApp Files Account, Capacity Pool and Volume
Create IAM roles and instance profile pair
Create role and policy used to deploy Cloudera environments for Cloudera AI
Create subnets
Creating a Cloudera AI Inference service instance
Creating a Cloudera AI Inference service instance using the UI
Creating a Cloudera AI Registry
Creating a Cloudera AI Registry on an Azure UDR Private Cluster
Creating a customized engine image
Creating a Job
Creating a machine user and synchronizing to workbench
Creating a model
Creating a Model Endpoint using API
Creating a Model Endpoint using UI
Creating a model using MLflow
Creating a new version of a registered model
Creating a Pipeline
Creating a project from a password-protected Git repo
Creating a Project with Legacy Engine Variants
Creating a Project with ML Runtimes variants
Creating a Team
Creating a Team
Creating an SSH tunnel
Creating and deploying a Model
Creating Customized ML Runtimes
Creating New AMPs
Creating New AMPs using API
Creating Resource profiles
Creating Resource profiles
Critical and Non-critical Pods
Custom AMP Catalog
Custom Data Connection Development
Custom Runtime Addons with Cloudera AI
Custom Template Projects
Customized engine images
Customized Runtimes
CVE-2021-44228 Remediation for Cloudera AI Data Service
Data Access
Data Access
Data connection management
Debugging common Cloudera AI errors
Debugging issues with Models
Debugging the model import failure
December 13, 2021
December 15, 2021
December 15, 2023
December 19, 2019
December 21, 2020
Delete a model from Cloudera AI Registry
Deleting a Model
Deleting a Model Endpoint using API
Deleting a node group from an existing Cloudera AI Inference service instance
Deleting a node group from an existing Cloudera AI Inference service instance using the UI
Deleting a Project
Deleting a registered model version
Deleting Cloudera AI Inference service instance
Deleting Cloudera AI Inference service instances using the UI
Deleting Cloudera AI Registry
Deploy a new model with updated resources
Deploying a model from Registered Models
Deploying a model from the AI Registry page
Deploying a model from the Cloudera AI Registry using APIv2
Deploying a model from the destination Project page
Deploying an MLflow model as a Cloudera AI Model REST API
Deploying models with Canary deployment using API
Deploying Predictive Models
Deploying the Model
Describing a Model Endpoint using API
Describing Cloudera AI Inference service instance
Developing and testing your first custom connection
Disable or deprecate Runtime addons
Disabling and Deleting Runtimes
Disabling Cloudera AI Registry
Distribute the Image
Distribute the image
Distributed Computing
Distributed Computing with Workers
Dockerfile compatible with PBJ Workbench
Documenting Your Analysis
Download and configure cdswctl
Download cdswctl and add an SSH Key
Downloading diagnostic bundles for a workbench
Downloading diagnostic bundles for a workbench
Editing a Model Endpoint Configuration using UI
Editing model visibility
Editor Customization
Embed a Cloudera AI application in an external website
Embedded Web Applications
Enable a private cluster
Enable Cross-Origin Resource Sharing (CORS)
Enable HTTP security headers
Enable HTTP Strict Transport Security (HSTS)
Enabling authentication
Enabling model governance
Enabling model metrics
End-to-end example: MeCab
Engine environment variables
Engine environment variables
Environmental Variables
Ephemeral storage
Evaluating LLM with MLFlow
Example - Model training and deployment (Iris)
Example models with PBJ Runtimes
Example: A Shiny Application
Example: Connect a Spark session to the Data Lake
Example: Flask application
Example: Locating and adding JARs to Spark 2 configuration
Example: Monte Carlo estimation
Experiments
Experiments with MLflow
Exploratory Data Science and Visualization
Exploratory Data Science and Visualization
Exploratory Data Science and Visualization
Export Usage List
February 10, 2022
February 10, 2023
February 13, 2020
February 14, 2023
February 20, 2024
February 3, 2021
February 7, 2023
February 8, 2024
Force delete a Cloudera AI Registry
Generating an API key
Git for Collaboration
GPUs
Grant remote access to Cloudera AI Workbench
Granting Cloudera Users access to Cloudera AI Workbenches
Grid Displays
Hadoop authentication for Cloudera AI Workbenches
Handling project volume size increase in Cloudera AI
Host name required by Learning Hub
HTML Visualizations
HuggingFace Spaces and Community AMPs
IFrame Visualizations
Importing a Hugging Face Model (Technical Preview)
Importing Models
Importing models from Hugging Face (Technical Preview)
Importing models from NVIDIA NGC
Including images in allowlist for Cloudera AI projects
Initialize an SSH connection to Cloudera AI for VS code
Initialize an SSH Endpoint
Installing a Jupyter extension
Installing a Jupyter kernel
Installing a non-transparent proxy in a Cloudera AI environment
Installing Additional ML Runtimes Packages
Installing additional packages
Instance type preflight check fails
Interacting with Model Endpoints
January 12, 2022
January 23, 2024
January 29, 2025
January 30, 2020
Jobs and Pipelines
Jobs API
July 12, 2023
July 17, 2024
July 21, 2022
July 25, 2023
July 8, 2021
June 11, 2024
June 20, 2024
June 24, 2021
June 30, 2020
June 9, 2020
Jupyter Magic Commands
Jupyter Magics
Key Applications
Key Features
Known issues
Known Issues and Limitations
Known issues and limitations
Known Issues and Limitations in ML Runtimes older releases
Known Issues and Limitations in ML Runtimes version 2024.05.02
Known Issues and Limitations in ML Runtimes version 2024.10.01
Known Issues and Limitations in ML Runtimes version 2025.01.01
Known Issues and Limitations in ML Runtimes version 2025.01.2
Known Issues and Limitations with Model Builds and Deployed Models
Known issues with Cloudera AI Registry standalone API
Launch a Session
LDAP general settings
LDAP group settings
Legacy Engine level configuration
Legacy Engines
Limitations
Limitations and Restrictions
Limitations on AWS
Limitations on Azure
Limitations with Analytical Applications
Limitations with customized engines
Limiting files in Explorer view
Linking an existing Project to a Git remote
Listing Cloudera AI Inference service instances
Listing Cloudera AI Inference service instances using the UI
Listing Model Endpoints using API
Listing Model Endpoints using UI
Listing runtimes
Loading custom connections
Log into cdswctl
Logs for Cloudera AI Workbenches
Making an inference call to a Model Endpoint with an OpenAI API
Making an inference call to a Model Endpoint with Open Inference Protocol
Managing a Synced Team
Managing a Team Account
Managing a Team Account
Managing API Keys
Managing Cloudera AI Inference service
Managing Cloudera AI Inference service using Cloudera CLI
Managing Cloudera AI Inference service using the UI
Managing Cloudera Copilot
Managing default and backup data connections
Managing dependencies for Spark 2 and Scala
Managing dependencies for Spark 2 jobs
Managing Engines
Managing memory available for Spark drivers
Managing ML Runtimes
Managing Model Endpoints using API
Managing Model Endpoints using UI
Managing Node Groups
Managing node groups using the UI
Managing Project Files
Managing Projects
Managing Users
Managing your Personal Account
Managing your Personal Account
March 14, 2022
March 16, 2020
March 22, 2021
March 27, 2023
March 6, 2024
May 13, 2021
May 15, 2024
May 16, 2023
May 29, 2024
May 31, 2022
May 31, 2023
May 5, 2020
Metadata for Custom ML Runtimes
Migrating from generic NFS to Azure Files NFS in Cloudera AI
ML Runtimes
ML Runtimes 2020.11
ML Runtimes 2021.02
ML Runtimes 2021.04
ML Runtimes 2021.06
ML Runtimes 2021.09
ML Runtimes 2021.12
ML Runtimes 2022.04
ML Runtimes 2022.11
ML Runtimes 2023.05
ML Runtimes 2023.08
ML Runtimes 2023.08.2
ML Runtimes 2023.12.1
ML Runtimes 2024.02.1
ML Runtimes 2024.05.1
ML Runtimes 2024.05.2
ML Runtimes 2024.10.1
ML Runtimes 2025.01.1
ML Runtimes 2025.01.2
ML Runtimes Environment Variables
ML Runtimes Environment Variables List
ML Runtimes Known Issues and Limitations
ML Runtimes NVIDIA GPU Edition
ML Runtimes NVIDIA RAPIDS Edition
ML Runtimes Pre-installed Packages
ML Runtimes Pre-installed Packages overview
ML Runtimes Release Notes
ML Runtimes Version 2020.11
ML Runtimes Version 2021.02
ML Runtimes Version 2021.04
ML Runtimes Version 2021.06
ML Runtimes Version 2021.09
ML Runtimes Version 2021.09.02
ML Runtimes Version 2021.12
ML Runtimes Version 2022.04
ML Runtimes Version 2022.11
ML Runtimes Version 2022.11.2
ML Runtimes Version 2023.05.1
ML Runtimes Version 2023.05.2
ML Runtimes Version 2023.08.1
ML Runtimes Version 2023.08.2
ML Runtimes Version 2023.12.1
ML Runtimes versus Legacy Engine
ML Runtimes What's New
MLflow transformers
Model access control
Model explainability, interpretability, and reproducibility
Model Governance
Model governance using Apache Atlas
Model Hub
Model Metrics
Model visibility
Models
Models
Models - Concepts and Terminology
Modes of configuration
Modify Instance Group Type
Modifying a node group in an existing instance
Modifying a node group in an existing instance using the UI
Modifying Project Settings
Monitoring Active Models
Monitoring active Models
Monitoring and alerts
Monitoring applications
Monitoring Cloudera AI Workbenches
Monitoring User Activity
Monitoring User Events
Native Workbench Console and Editor
Network Planning for Cloudera AI on AWS
Network Planning for Cloudera AI on Azure
Non-transparent proxy and egress trusted list
November 1, 2019
November 15, 2023
November 21, 2024
November 23, 2020
November 29, 2022
Obtaining Control Plane Audit Logs for Cloudera AI Inference service
Obtaining Control Plane Audit Logs for Cloudera AI Inference service using the UI
Obtaining kubeconfig on AWS
Obtaining kubeconfig on Azure
Obtaining the kubeconfig for the Cloudera AI Inference service Cluster
Obtaining the kubeconfig of Cloudera AI Inference service using the UI
October 10, 2024
October 18, 2021
October 19, 2022
October 19, 2023
October 27, 2021
October 29, 2020
October 8, 2024
Older releases
Onboarding Business Users
Open Inference Protocol Using Curl
Open Inference Protocol Using Python SDK
OpenAI Inference Protocol Using Curl
Other NFS Options
PBJ R 3.6 Libraries
PBJ R 4.0 Libraries
PBJ R 4.1 Libraries
PBJ Runtimes and Models
PBJ Workbench
Personal key
Ports used by Cloudera AI
Pre-Installed Packages in engines
Pre-Installed Packages in ML Runtimes
Preflight checks
Prepare to manage models using the model CLI
Preparing to interact with the Cloudera AI Inference service API
Prerequisites for Cloudera AI discovery and exploration
Prerequisites for Cloudera AI Registry standalone API
Prerequisites for setting up Cloudera AI Inference service
Prerequisites to set up Cloudera Copilot
Private cluster support
Product Overview
Project garbage collection
Projects
Provisioning
Provisioning Cloudera AI Workbenches
Python
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for Conda
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for JupyterLab
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.10 Libraries for Workbench
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for JupyterLab
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.11 Libraries for Workbench
Python 3.12 Libraries for Conda
Python 3.12 Libraries for Conda
Python 3.12 Libraries for JupyterLab
Python 3.12 Libraries for JupyterLab
Python 3.12 Libraries for JupyterLab
Python 3.12 Libraries for Workbench
Python 3.12 Libraries for Workbench
Python 3.12 Libraries for Workbench
Python 3.6 Libraries for JupyterLab
Python 3.6 Libraries for JupyterLab
Python 3.6 Libraries for Workbench
Python 3.6 Libraries for Workbench
Python 3.6.12 Libraries for JupyterLab
Python 3.6.12 Libraries for Workbench
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for JupyterLab
Python 3.7 Libraries for PBJ Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7 Libraries for Workbench
Python 3.7.9 Libraries for JupyterLab
Python 3.7.9 Libraries for Workbench
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for JupyterLab
Python 3.8 Libraries for PBJ Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8 Libraries for Workbench
Python 3.8.6 Libraries for JupyterLab
Python 3.8.6 Libraries for Workbench
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for JupyterLab
Python 3.9 Libraries for PBJ Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9 Libraries for Workbench
Python 3.9.6 Libraries for JupyterLab
Python 3.9.6 Libraries for JupyterLab
Python 3.9.6 Libraries for JupyterLab
Python 3.9.6 Libraries for JupyterLab
Querying the engine type
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 3.6 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.0 Libraries
R 4.1 Libraries
R 4.1 Libraries
R 4.1 Libraries
R 4.1 Libraries
R 4.1 Libraries
R 4.1 Libraries
R 4.3 Libraries
R 4.3 Libraries
R 4.3 Libraries
R 4.3 Libraries
R 4.4 Libraries
R 4.4 Libraries
R 4.4 Libraries
R 4.4 Libraries
R 4.4 Libraries
RAPIDS Runtime PIP Python 3.7.8 Libraries for JupyterLab
RAPIDS Runtime PIP Python 3.7.8 Libraries for Workbench
RAPIDS Runtime PIP Python 3.8.6 Libraries for JupyterLab
RAPIDS Runtime PIP Python 3.8.6 Libraries for Workbench
Recommended troubleshooting workflow
Register an ONNX model to Cloudera AI Registry
Registered Models
Registering a model using MLflow SDK
Registering a model using the AI Registry user interface
Registering and deploying models with Cloudera AI Registry
Registering training data lineage using a linking file
Release Notes
Removing Cloudera AI Workbenches
Request/Response Formats (JSON)
Restarting a failed AMP setup
Restore a Cloudera AI Workbench
Restoring old data to a new workbench
Restoring to a different environment
Restrictions for upgrading R and Python packages
Role-based authorization
Role-based authorization in Model Hub
Roll back the registry upgrade
Run Code
Run workloads using a service account
Running an Experiment using MLflow
Running Models on GPU
Runtimes
Saved Images
Scala
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.11 Libraries for Workbench
Scala 2.12 Libraries for Workbench
Scala 2.12 Libraries for Workbench
Securing Applications
Securing Models
Security
September 23, 2019
September 26, 2024
September 27, 2022
Service Accounts
Set up a custom repository location
Set up a Hive or Impala data connection manually
Set up minimum permissions
Setting Permissions for an Experiment
Setting the workload password
Setting up a Custom Data Connection
Setting up a data connection to Cloudera Data Hub
Setting up a Spark data connection
Setting up access for Cloudera AI Registry in a non-RAZ-enabled environment
Setting up access for Cloudera AI Registry in a RAZ-enabled environment
Setting up Amazon S3 data connection
Setting up an HTTP Proxy for Spark 2
Setting up Cloudera AI Registry
Setting up Cloudera AI Workbenches for high volume Workloads
Setting up VS Code
Setup Data Lake Access
Sharing Job and Session Console Outputs
Simple Plots
Site Administration
Spark
Spark configuration files
Spark Log4j Configuration
Spark on Cloudera AI
Spark on Kubernetes
Spark web UIs
SSH Keys
Starting Data Discovery and Visualization
Starting sessions and creating SSH endpoints
Stop a Session
Supported Model Artifact Formats
Suspend and resume Cloudera AI Workbenches
Sync machine users from the Synced team
Synchronizing Cloudera AI Registry with a workbench
Tagging disks to avoid garbage collection
Team key
Technical metrics for Models
Terminology
Test LDAP Configuration
Testing a browser-based IDE in a Session
Testing calls to a Model
Testing GPU Setup
Testing ML Runtime GPU Setup
Testing ML Runtime GPU Setup
Third-party Editors
Top Tasks
Top Tasks for Cloudera AI
Tracked User events
Tracking metrics for deployed models
Tracking model metrics without deploying a model
Training the Model
Troubleshooting
Troubleshooting Cloudera AI Workbenches on AWS
Troubleshooting Cloudera AI Workbenches on Azure
Troubleshooting issues with Cloudera AI Registry API
Troubleshooting Issues with Workloads
Troubleshooting Kerberos issues
Troubleshooting Spark issues
Troubleshooting: 401 Unauthorized
Troubleshooting: 401 Unauthorized when accessing Hive
Troubleshooting: Empty data page
Troubleshooting: Existing connection name
Troubleshooting: Some connections not shown
Tuning auto-scaling sensitivity using the API
Turning off ML Runtimes Addons
Updating Cloudera AI Registry configuration
Upgrade Cloudera AI Registry
Upgrading Cloudera AI Workbenches
Upload and work with local files
Usage guidelines for deploying models with Cloudera AI
Use a non-transparent proxy with Cloudera AI on AWS environments
Use Direct Reader Mode with PySpark
Use Direct Reader Mode with SparklyR
Use JDBC Connection with PySpark
User Defined Routing (UDR)
User Roles
User Roles
User Roles and Team Accounts
Using an MLflow Model Artifact in a Model REST API
Using Cloudera AI Inference service
Using Cloudera AI Registry
Using Cloudera Copilot
Using Conda Runtime
Using Conda to manage dependencies
Using data connection snippets
Using Editors for ML Runtimes
Using GPUs for Cloudera AI projects
Using Heuristic-based metrics
Using JupyterLab with ML Runtimes
Using LLM-as-a-Judge metrics
Using ML Runtimes Addons
Using ML Runtimes with cdswctl
Using MLflow SDK to register customized models
Using Model Hub
Using Registered Models
Using Runtime Catalog
Using Spark 2 from Python
Using Spark 2 from Scala
Using Spark 3 from R
Using the REST Client
View replica logs for a model
Viewing details for Cloudera AI Registries
Viewing Details for Cloudera AI Registry
Viewing details of a Cloudera AI Inference service instances using the UI
Viewing details of a Model Endpoint using UI
Viewing details of a registered model
Viewing Job History
Viewing lineage for a model deployment in Atlas
Viewing registered model information
Visualizations
Visualizing Experiment Results
Web session timeouts
What's New
What's New in ML Runtimes older releases
What's New in ML Runtimes version 2024.02.1
What's New in ML Runtimes version 2024.05.1
What's New in ML Runtimes version 2024.05.2
What's New in ML Runtimes version 2024.10.1
What's new in ML Runtimes version 2025.01.1
What's new in ML Runtimes version 2025.01.2
Workbench backup and restore prerequisites
Workbench editor file types
Worker Network Communication
Workers API
Workflows for active Models
Working with Data Discovery and Visualization
Managing node groups using the UI
You can add node groups to, modify node groups in, or delete node groups from your Cloudera AI Inference service instance.
Adding a node group to an existing instance using the UI
You can add one or more node groups to your cluster when the existing cluster configuration lacks the worker node hardware that a workload requires (for example, the GPU models needed for a new workload).
Modifying a node group in an existing instance using the UI
You can reconfigure an existing node group in a Cloudera AI Inference service instance, including its autoscaling range.
Deleting a node group from an existing Cloudera AI Inference service instance using the UI
You can delete a node group from an existing Cloudera AI Inference service instance.
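Each of the operations above acts on the same underlying idea: a node group is a named pool of worker nodes with an instance type and an autoscaling range. The sketch below illustrates that shape in Python; the field names and the instance type are hypothetical placeholders for illustration, not the service's actual API schema.

```python
# Illustrative node group descriptor. Field names ("name", "instanceType",
# "autoscaling", "minNodes", "maxNodes") and the instance type are
# hypothetical, not the actual Cloudera AI Inference service API schema.

def make_node_group(name, instance_type, min_nodes, max_nodes):
    """Build a node group descriptor with a validated autoscaling range."""
    if min_nodes < 0 or max_nodes < min_nodes:
        raise ValueError("autoscaling range must satisfy 0 <= min <= max")
    return {
        "name": name,
        "instanceType": instance_type,
        "autoscaling": {"minNodes": min_nodes, "maxNodes": max_nodes},
    }

# Adding: a GPU node group for a new workload, scaling between 0 and 4 nodes.
gpu_group = make_node_group("gpu-workers", "g5.2xlarge", 0, 4)

# Modifying: widen the autoscaling range of the existing group.
gpu_group["autoscaling"]["maxNodes"] = 8
```

Deleting a node group then simply removes the descriptor (and its nodes) from the instance's configuration.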
Parent topic:
Managing Cloudera AI Inference service using the UI