Known issues

There are some known issues you might run into while using the Cloudera AI Inference service.

  • The following compute instance types are not supported by Cloudera AI Inference service:
  • Updating a model's description after the model has been added to a model endpoint leads to a mismatch in the UI between the models listed by the model builder and the models that are deployed.
  • When you create a model endpoint from the Create Endpoint page, the instance type selection is presented as optional, but creation fails if no instance type is selected.
  • DSE-39626: If no worker node can be found within 10 minutes to schedule a model endpoint that is newly created or scaling up from zero, the system stops trying to create and schedule the replica. This is commonly caused by insufficient cloud quota or by capacity constraints on the cloud service provider's side. To work around it, request an increased quota or use an instance type that is more readily available. A diagnostic sketch follows this list.
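
The sketch below is one possible way to check whether endpoint replicas are stuck waiting for a node before the 10-minute window expires. It assumes the endpoint replicas run as pods on the workload Kubernetes cluster, that you have read access to that cluster, and that the namespace name used here is a placeholder rather than an official value; it uses the standard Kubernetes Python client, not a Cloudera-specific API.

```python
# Hedged diagnostic sketch for DSE-39626: list pending replicas and their
# scheduling failures. Namespace name is a placeholder; substitute your own.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster
v1 = client.CoreV1Api()

NAMESPACE = "serving-default"  # placeholder, not an official namespace name

# Replicas that have not yet been placed on a worker node show up as Pending pods.
pending = v1.list_namespaced_pod(NAMESPACE, field_selector="status.phase=Pending")
for pod in pending.items:
    print(f"Pending replica: {pod.metadata.name}")

# FailedScheduling events usually state the reason, such as insufficient
# GPU capacity or exhausted cloud quota, which points to the workaround above.
events = v1.list_namespaced_event(NAMESPACE, field_selector="reason=FailedScheduling")
for ev in events.items:
    print(f"{ev.involved_object.name}: {ev.message}")
```

If the event messages point to insufficient capacity or quota, request an increased quota or switch to a more readily available instance type, as described in the issue above.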