Launching Fine Tuning Studio within a project

You can launch Fine Tuning Studio on the Cloudera AI Platform to manage the life cycle of LLMs, from training and fine-tuning to evaluation.

  • A GPU must be available in the cluster. For optimal performance with newer LLMs, GPUs such as the NVIDIA A100 or H100 are recommended.
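
    If you want to confirm that a GPU is visible from a project session before launching the studio, a quick check such as the following can help. This is an illustrative sketch that assumes a Python runtime with PyTorch installed; running nvidia-smi in a terminal is an alternative.

      # Illustrative GPU check; assumes PyTorch is available in the runtime.
      import torch

      if torch.cuda.is_available():
          print("GPU detected: " + torch.cuda.get_device_name(0))
      else:
          print("No GPU detected; use a GPU-enabled resource profile for the session.")
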
  1. In the Cloudera console, click the Cloudera AI tile.

    The Cloudera AI Workbenches page displays.

  2. Click the name of the workbench.

    The workbench Home page displays.

  3. Click Projects, and then click New Project to create a new project.

    In the left navigation pane, the new AI Studios option is displayed.

  4. Click AI Studios.
  5. Click the Launch button in the Fine Tuning Studio box.

    The Configure Studio: Fine Tuning page is displayed.

  6. Set the environment variables for the Fine Tuning Studio (see the verification sketch after these steps).
  7. Select the Runtime version.
  8. Click Launch AI Studio.

    The Fine Tuning Studio page is displayed.

    After launching, you can view the list of tasks being executed as part of the AI studio deployment.
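
To confirm that the environment variables set in step 6 are visible inside the project, you can run a small check from a project session. The variable name below is only an illustration, not a variable the studio requires; replace it with the names you actually configured.

      import os

      # Placeholder name for illustration; replace with the variables you set in step 6.
      required = ["HF_TOKEN"]
      missing = [name for name in required if not os.environ.get(name)]
      if missing:
          print("Missing environment variables: " + ", ".join(missing))
      else:
          print("All expected environment variables are set.")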

After the configuration, Fine Tuning Studio is displayed in the left navigation pane under AI Studios.

You can train, manage, and evaluate large language models on this page.