Terminology
Lists the terminology used by the Cloudera AI Inference service and how each term is used in this document.
- CML Serving App: This is the term used by the Cloudera Data Platform CLI to refer to a specific instance of Cloudera AI Inference service.
- Model Endpoint: A deployed model exposed through a URL endpoint that is accessible over the network.
- Model Artifacts: Files stored in the AI Registry that are required to deploy an instance of the model, such as model weights and metadata.
- API standard: The protocol exposed by a Model Endpoint. This is either the OpenAI API (for NVIDIA NIM models) or the Open Inference Protocol (for predictive models).
- Cloudera Data Platform Workload Authentication Token: The bearer token used for authentication and authorization when accessing the Cloudera AI Inference service API and model endpoints. Throughout this document, this token is referred to as "CDP_TOKEN".
- Model ID: The ID assigned to a model when it is registered in the AI Registry.
- Model Version: The version of a registered model in the AI Registry.
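
To illustrate how the CDP_TOKEN and the API standard fit together, the sketch below constructs an OpenAI-style chat completion request for a Model Endpoint. The endpoint URL and model name are placeholders for illustration only; the actual values come from your Cloudera AI Inference service deployment.

```python
import json
import os

# Placeholder endpoint URL and model name; substitute the values from
# your own Model Endpoint.
MODEL_ENDPOINT = "https://<your-domain>/v1/chat/completions"
MODEL_NAME = "<your-model-name>"

# The CDP_TOKEN is obtained out of band (for example, from the CDP CLI)
# and passed as a bearer token in the Authorization header.
CDP_TOKEN = os.environ.get("CDP_TOKEN", "<your-workload-auth-token>")

headers = {
    "Authorization": f"Bearer {CDP_TOKEN}",
    "Content-Type": "application/json",
}

# An OpenAI-style chat completion request body, as exposed by
# NVIDIA NIM model endpoints.
body = {
    "model": MODEL_NAME,
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(headers["Authorization"])
print(json.dumps(body))
```

The request itself can then be sent with any HTTP client by POSTing `body` to `MODEL_ENDPOINT` with `headers` attached; predictive models served over the Open Inference Protocol use a different URL path and payload shape.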