Provisioning a Cloudera AI Workbench

In Cloudera AI on Private Cloud, the Cloudera AI Workbench provides a space for data scientists to work. After your administrator has created an environment or given you access to one, you can set up a workbench.

The first user to access the Cloudera AI Workbench after it is created must have the EnvironmentAdmin role assigned.
  1. Log in to the Cloudera Private Cloud web interface using your corporate credentials or other credentials that you received from your Cloudera administrator.
  2. Click Cloudera AI Workbench.
  3. Click Provision Workbench. The Provision Workbench panel displays.
  4. In Provision Workbench, fill out the following fields.
    1. Workbench Name - Give the Cloudera AI Workbench a name. For example, test-cml. Do not use capital letters in the workbench name.
    2. Select Environment - From the dropdown, select the environment where the Cloudera AI Workbench must be provisioned. If you do not have any environments available to you in the dropdown, contact your Cloudera administrator to gain access.
    3. Namespace - Enter the namespace to use for the Cloudera AI Workbench.
    4. NFS Server - Select Internal to use an NFS server that is integrated into the Kubernetes cluster. This is the recommended selection at this time.
      The path to the internal NFS server is already set in the environment.
  5. In Production Cloudera AI, enable the following features.
    1. Enable Governance - Enables advanced lineage and governance features.
      Governance Principal Name - If Enable Governance is selected, you can use the default value of mlgov, or enter an alternative name. The alternative name must be present in your environment and be granted permissions in Ranger so that the Cloudera AI Governance service can deliver events to Atlas.
    2. Enable Model Metrics - Enables exporting metrics for models to a PostgreSQL database.
  6. In Other Settings, enable the following features.
    1. Enable TLS - Select this option to enable HTTPS access to the workbench.
      To enable TLS, follow the guidelines in Deploying a Cloudera AI Workbench with support for TLS.
    2. Enable Monitoring - Administrators (users with the EnvironmentAdmin role) can use a Grafana dashboard to monitor resource usage in the provisioned workbench.
    3. Cloudera AI Static Subdomain - This is a custom name for the workbench endpoint, and it is also used in the URLs of models, applications, and experiments. Only one running workbench can use a given subdomain endpoint name at a time. You can create a wildcard certificate for this endpoint in advance (see the certificate sketch after this procedure). The workbench endpoint has this format: <static subdomain name>.<application domain>
  7. Click Provision Workbench. The new workbench provisioning process takes several minutes.
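
If you plan to create a wildcard certificate for the Cloudera AI Static Subdomain endpoint in advance, the following is a minimal sketch of one way to generate a private key and a wildcard certificate signing request. It is an illustration only, not part of the product: it assumes the third-party Python cryptography package, and the wildcard name *.apps.example.com is a hypothetical placeholder that you must adjust to match your own application domain and workbench endpoint.

  from cryptography import x509
  from cryptography.hazmat.primitives import hashes, serialization
  from cryptography.hazmat.primitives.asymmetric import rsa
  from cryptography.x509.oid import NameOID

  # Hypothetical wildcard name; adjust it so the certificate covers your workbench endpoint.
  wildcard_name = "*.apps.example.com"

  # Generate a 2048-bit RSA private key for the certificate.
  key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

  # Build a certificate signing request that carries the wildcard name in both
  # the subject common name and the subject alternative name extension.
  csr = (
      x509.CertificateSigningRequestBuilder()
      .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, wildcard_name)]))
      .add_extension(
          x509.SubjectAlternativeName([x509.DNSName(wildcard_name)]),
          critical=False,
      )
      .sign(key, hashes.SHA256())
  )

  # Write the private key and the CSR in PEM format.
  with open("wildcard.key", "wb") as key_file:
      key_file.write(
          key.private_bytes(
              encoding=serialization.Encoding.PEM,
              format=serialization.PrivateFormat.TraditionalOpenSSL,
              encryption_algorithm=serialization.NoEncryption(),
          )
      )
  with open("wildcard.csr", "wb") as csr_file:
      csr_file.write(csr.public_bytes(serialization.Encoding.PEM))

Submit the resulting CSR to your certificate authority and keep the private key secure. The exact host names the certificate must cover depend on your application domain, so confirm the wildcard name against your environment before requesting the certificate.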

After the workbench is provisioned, you can log in by clicking the workbench name on the Cloudera AI Workbenches page. The first user to log in must be the administrator.
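
If you enabled TLS, you can optionally confirm that the workbench endpoint resolves and presents a certificate before the first login. The sketch below uses only the Python standard library; the host name test-cml.apps.example.com is a hypothetical placeholder for your own <static subdomain name>.<application domain> value.

  import socket
  import ssl

  # Hypothetical workbench endpoint; replace with <static subdomain name>.<application domain>.
  host = "test-cml.apps.example.com"

  # The default context verifies the certificate against the system trust store.
  # If the certificate is issued by an internal CA, add its bundle with
  # context.load_verify_locations(cafile="/path/to/ca.pem").
  context = ssl.create_default_context()

  with socket.create_connection((host, 443), timeout=10) as sock:
      with context.wrap_socket(sock, server_hostname=host) as tls:
          cert = tls.getpeercert()
          print("TLS version:", tls.version())
          print("Certificate subject:", cert.get("subject"))
          print("Certificate expires:", cert.get("notAfter"))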

Test backing up the Cloudera AI Workbench. Ensure that the backup completes successfully, and then put a process in place to back up the workbench at regular intervals.
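
The sketch below illustrates one way to monitor backup freshness. It is only an example built on assumptions: it presumes your backup process writes archive files to a directory, and both the path /backups/cloudera-ai-workbench and the .tar.gz naming pattern are hypothetical placeholders to adapt to however you back up the workbench.

  from datetime import datetime, timedelta
  from pathlib import Path

  # Hypothetical backup location and naming pattern; adapt both to your own backup process.
  backup_dir = Path("/backups/cloudera-ai-workbench")
  max_age = timedelta(days=7)  # alert if the newest backup is older than the chosen interval

  backups = sorted(backup_dir.glob("*.tar.gz"), key=lambda path: path.stat().st_mtime)
  if not backups:
      raise SystemExit(f"No workbench backups found in {backup_dir}")

  newest = backups[-1]
  age = datetime.now() - datetime.fromtimestamp(newest.stat().st_mtime)
  if age > max_age:
      raise SystemExit(f"Newest backup {newest.name} is {age.days} days old; expected under {max_age.days} days.")
  print(f"OK: newest backup {newest.name} is {age.days} day(s) old.")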