What's New

Major features and updates for the Cloudera AI data service.

November 21, 2024

Release notes and fixed issues for version 2.0.46-b238.

New Features / Improvements

  • Model Hub Enhancement: The model size is now shown in a user-friendly format in both the Model Hub UI and the AI Registry UI.
  • Cloudera AI Inference service Enhancement: A new AI Inference Services menu item has been added to the left navigation pane of the Cloudera AI UI so that you can manage the lifecycle of the Cloudera AI Inference service from the UI. For more information, see Managing Cloudera AI Inference service using the UI.
  • Added the Spark 3.5 ML Runtime Addon.
  • Product and feature renames:
    • Cloudera Machine Learning (CML) is renamed to Cloudera AI.
    • Cloudera Machine Learning Model Registry is renamed to Cloudera AI Registry.
    • Cloudera Machine Learning Workspaces is renamed to Cloudera AI Workbenches.
    • Cloudera Applied Machine Learning Prototypes and Accelerators for ML Projects is renamed to Cloudera Accelerators for Machine Learning Projects.

Fixed Issues

  • CVE fixes - This release includes numerous security fixes for critical and high Common Vulnerabilities and Exposures (CVEs).
  • Previously, the public and private settings did not carry forward after the AI Registry upgrade. This issue is now resolved. (DSE-36799)
  • Enhanced the error message that was displayed when importing a model from Model Hub to Registered Models. (DSE-39897)
  • Generic (vLLM) NIM profile deployment was returning an empty GPU list in the UI. This issue is now resolved. (DSE-39913)
  • Previously, the public cloud CDP CLI did not show the GPU count for instance types. This issue is now resolved. (DSE-39539)
  • Previously, an application deployed using the Cloudera AI v2 API did not inherit user-level and site-level environment variables. This issue is now resolved: an application created using APIv2 now inherits not only project-level environment variables but also user-level and site-level environment variables, as illustrated in the sketch after this list. (DSE-37611)
  • Previously, scheduled jobs skipped job runs without specifying the error. Skipped job runs now have an improved exit code that distinguishes them from failed jobs. (DSE-39976)
  • Previously, the Next buttons on the Site Administration page did not work. This issue is now resolved. (DSE-34133)
  • Previously, an application restarted using the Cloudera AI v2 API did not inherit account- and application-level environment variables. This issue is now resolved. (DSE-39894)
  • Users can now view the existing applications in the Cloudera AI UI even if the creation of a new application is disabled. (DSE-39980)
  • Previously, Python logging did not work with PBJ Runtimes. This issue is now resolved. (DSE-39929)
  • Previously, reloading the session page could result in an incorrect state where the PBJ session's editor cell appeared green even if it was in a processing state (executing commands). With this fix, an accurate representation of the processing state is displayed even after a refresh. (DSE-40049)
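
The environment-variable inheritance fix (DSE-37611) means that an application created through APIv2 now sees the same merged environment as sessions and jobs. The following minimal sketch could serve as an application script; the variable names SITE_PROXY, USER_TOKEN, and PROJECT_GREETING are hypothetical examples standing in for values defined at the site, user, and project level, respectively.

    # Hypothetical application script: confirms that environment variables
    # defined at the site, user, and project level are all visible to an
    # application created through the Cloudera AI v2 API.
    import os

    for name in ("SITE_PROXY", "USER_TOKEN", "PROJECT_GREETING"):
        # Each value is inherited from its scope; unset names fall back to a marker.
        print(f"{name} = {os.environ.get(name, '<not set>')}")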

October 10, 2024

Release notes and fixed issues for version 2.0.46-b210.

New Features / Improvements

  • Model Hub: Model Hub is now a fully supported feature. Model Hub is a catalog of top-performing LLM and generative AI models. You can now easily import the models listed in Model Hub into the AI Registry and then deploy them using the Cloudera AI Inference service.

    For more information, see Using Model Hub.

  • Cloudera AI Inference service Enhancements:
    • Added support for NVIDIA NIM profiles that require L40S GPU models.
    • Made the auto-scale configuration rendered in the UI during model endpoint creation more user-friendly. (DSE-38845)
    • Optimized the Cloudera AI UI service to be more responsive.
    • User-actionable error messages are now rendered in the Cloudera AI service UI.

      For more information, see Using Cloudera AI Inference service.

Fixed Issues

  • Addressed scaling issues with web services to support high active user concurrency. (DSE-39597)
  • CVE fixes - This release includes numerous security fixes for critical and high Common Vulnerabilities and Exposures (CVEs).
  • Fixed a CORS issue to ensure that DELETE and PATCH V1 API calls can be made from within a workbench. (DSE-39357)
  • Made the NGC service key used to download NVIDIA's optimized models more restrictive. (DSE-39475)
  • Previously, users were unable to copy the model ID from the Cloudera AI UI. This issue is now resolved. (DSE-38889)
  • Authorization issues related to the listing of Cloudera AI applications have been addressed. (DSE-39386)
  • Fixed an issue to ensure that instance type validation is correctly carried out during the creation of a new model endpoint. (DSE-39634)
  • Added required validation rules for the creation of a new model endpoint. (DSE-38412)
  • Addressed an issue where an empty model list was displayed when navigating from registry models to model deployment. (DSE-39634)

October 8, 2024

Release notes and fixed issues for Cloudera AI Inference service version 1.2.0-b73.

New Features / Improvements

  • Cloudera AI Inference service: Cloudera AI Inference service is now a fully supported data service. Cloudera AI Inference service is a production-grade serving environment for traditional machine learning models, generative AI models, and Large Language Models. It is designed to handle the challenges of production deployments, such as high availability, fault tolerance, and scalability. The service can now carry out inference on the following categories of models:
    • Optimized open-source Large Language Models.
    • Traditional machine learning models, such as classification and regression models. Models must be imported into the AI Registry before they can be served using the Cloudera AI Inference service.

    For more information, see Using Cloudera AI Inference service.