Prerequisites for Model Registry Standalone API

To set up the Cloudera Machine Learning (CML) Registry standalone API, ensure that the prerequisites for Cloudera AI Inference and for importing pretrained models are met.

Prerequisites for Cloudera AI Inference

The CML Registry is a prerequisite for Cloudera AI Inference because the inference service deploys models that are stored in the registry.

  • To use the Cloudera AI Inference service, the latest CML Registry must be present in the same CDP environment before the Cloudera AI Inference service is created.
  • If the environment contains an older CML Registry that was created before May 14, 2024, follow the Upgrade Model Registry instructions to upgrade it to the latest version before you create the Cloudera AI Inference service.
  • If the CML Registry is re-created, upgraded, or has its certificate renewed while Cloudera AI Inference is present, follow the steps in the Manually Updating Model Registry Configuration topic to keep the CML Registry and Cloudera AI Inference configurations synchronized.

Prerequisites to import pretrained models

To import pretrained models, the following URLs must be allowed in your firewall (egress) rules.

NVIDIA GPU Cloud (NGC)

Allow the following URLs in your firewall rules.

  • prod.otel.kaizen.nvidia.com (NVIDIA OpenTelemetry)
  • api.ngc.nvidia.com
  • files.ngc.nvidia.com

Hugging Face

Allow the following URLs in your firewall rules.
  • huggingface.co
  • cdn-lfs.huggingface.co
  • *.cloudfront.net (CDN)
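
The same kind of check can be applied to the Hugging Face endpoints. Note that *.cloudfront.net is a wildcard: the concrete CDN hostnames are only resolved at download time, so a per-host probe can cover only the non-wildcard entries. A brief sketch, reusing the check_https helper from the NGC example above:

# Reuses check_https() from the NGC connectivity sketch above.
HF_HOSTS = [
    "huggingface.co",
    "cdn-lfs.huggingface.co",
    # "*.cloudfront.net" is a wildcard (CDN); its concrete hostnames are
    # only known at download time and cannot be probed up front.
]

for host in HF_HOSTS:
    print(f"{host}: {'reachable' if check_https(host) else 'BLOCKED'}")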