
Prerequisites for Cloudera AI Registry standalone API

To set up the Cloudera AI Registry standalone API, configure the Cloudera AI Inference service and import pretrained models.

Cloudera AI Registry is a prerequisite for the Cloudera AI Inference service, because the Cloudera AI Inference service deploys the models that are stored in the Cloudera AI Registry.

  • To use the Cloudera AI Inference service, the latest Cloudera AI Registry must be present in the same Cloudera environment before the Cloudera AI Inference service is created.
  • If there is an older Cloudera AI Registry in the environment that is created before May 14, 2024, follow the Upgrade Cloudera AI Registry instructions to upgrade the Cloudera AI Registry to the latest version before you create the Cloudera AI Inference service.
  • If the Cloudera AI Registry is recreated, upgraded, or cert-renewed while the Cloudera AI Inference service is present, then follow the steps listed in the Manually updating Cloudera AI Registry configuration topic to ensure that the configuration of Cloudera AI Registry and Cloudera AI Inference service are synchronized.

You must allow the following URLs in your firewall rules.

NVIDIA GPU Cloud (NGC)

Allow the following URLs in the firewall rules:

  • prod.otel.kaizen.nvidia.com (NVIDIA open telemetry)
  • api.ngc.nvidia.com
  • files.ngc.nvidia.com

Hugging Face

Allow the following URLs in the firewall rules:
  • huggingface.co
  • cdn-lfs.huggingface.co
  • *.cloudfront.net (CDN)
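After updating the firewall rules, you can verify outbound HTTPS access to the endpoints listed above. The following is a minimal sketch, assuming `curl` is available on a host inside the firewalled network; the wildcard `*.cloudfront.net` entry cannot be probed directly and is omitted. A non-success HTTP status still indicates that egress is allowed; only a connection timeout or failure suggests a missing rule.

```shell
# Endpoints taken from the NGC and Hugging Face allowlists above.
ENDPOINTS="
prod.otel.kaizen.nvidia.com
api.ngc.nvidia.com
files.ngc.nvidia.com
huggingface.co
cdn-lfs.huggingface.co
"

for host in $ENDPOINTS; do
  # --connect-timeout keeps the loop fast when a firewall rule is missing.
  if curl --silent --head --connect-timeout 5 "https://${host}" > /dev/null; then
    echo "OK   ${host}"
  else
    echo "FAIL ${host} (check firewall allowlist)"
  fi
done
```

This check only confirms TCP/TLS reachability; it does not validate NGC or Hugging Face credentials.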
