Prerequisites for upgrading Cloudera AI Inference service
Before upgrading Cloudera AI Inference service, you must fulfill all prerequisites.
Check the following roles, permissions, and requirements before upgrading Cloudera AI Inference service:
Roles and permissions
You must have the following roles:
- MLAdmin role
- EnvironmentAdmin role
Platform requirements
You must meet the following platform requirements:
- For OpenShift Container Platform-based deployments:
- The OpenShift Container Platform cluster must be upgraded before upgrading the Cloudera AI Inference service.
- The OpenShift Container Platform must be version 4.19 or later.
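As a pre-flight check, the version requirement above can be verified programmatically. The following is an illustrative sketch, not a Cloudera or OpenShift API: the `meets_minimum` helper and the assumption that the version string comes from `oc version` output are hypothetical.

```python
# Hypothetical helper to confirm the OpenShift Container Platform version
# meets the 4.19 minimum before starting the upgrade. The version string
# would typically be obtained from the output of `oc version`.

MIN_VERSION = (4, 19)

def meets_minimum(version: str, minimum: tuple[int, int] = MIN_VERSION) -> bool:
    """Return True if a 'major.minor[.patch]' version is at or above the minimum."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= minimum
```

For example, `meets_minimum("4.18.12")` returns `False`, so the cluster itself would need to be upgraded first.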
Service state
Check the statuses of the following services:
- The Cloudera AI Inference service must be in the Ready state.
- The Upgrade option is available only when a higher Cloudera AI Inference service version is supported.
Infrastructure requirements
You must meet the following infrastructure requirements:
- The Kubernetes cluster is healthy and reachable.
- Network connectivity between the on-premises cluster and the configured custom Docker Registry (if any) is stable.
- The Kubeconfig credentials with appropriate permissions are configured and up to date.
- Sufficient CPU, GPU, and memory resources are available in the cluster.
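The resource check in the list above can be sketched as a simple comparison of required versus available capacity. This is an illustrative example only, not a Cloudera API; the function name and the example figures are assumptions, and real values would come from `kubectl describe nodes` or the cluster metrics API.

```python
# Illustrative sketch (not a Cloudera API): compare the resources the upgrade
# needs against what the Kubernetes cluster reports as allocatable.

def has_sufficient_resources(available: dict, required: dict) -> bool:
    """Return True if every required resource (cpu, gpu, memory_gib) is available."""
    return all(available.get(key, 0) >= needed for key, needed in required.items())

# Assumed example figures for illustration only.
required = {"cpu": 16, "gpu": 1, "memory_gib": 64}
available = {"cpu": 32, "gpu": 2, "memory_gib": 128}
```

With these example figures, `has_sufficient_resources(available, required)` returns `True`; a cluster short on any one resource fails the check.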
Storage requirements
You must meet the following storage requirements:
- The configured storage backend (NFS, S3-compatible object storage, or equivalent) is accessible.
- Storage credentials are valid.
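A lightweight way to catch an incomplete storage configuration before the upgrade touches the backend is to check that the expected credential fields are present. The sketch below is hypothetical, not a Cloudera API; the field names shown are typical for an S3-compatible backend but are assumptions.

```python
# Illustrative pre-flight check (not a Cloudera API): report which credential
# fields expected for an S3-compatible storage backend are empty or absent.

REQUIRED_S3_FIELDS = ("endpoint", "access_key", "secret_key", "bucket")

def missing_credential_fields(config: dict) -> list[str]:
    """Return the names of required S3-style credential fields that are empty or missing."""
    return [field for field in REQUIRED_S3_FIELDS if not config.get(field)]
```

An empty return value means all expected fields are populated; it does not prove the credentials are valid, so an actual connectivity test against the backend is still needed.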
