You can launch Fine Tuning Studio on the Cloudera AI Platform to manage the life cycle of LLMs, from training and fine-tuning to evaluation.
- A GPU must be available in the cluster. For optimal performance with newer LLMs, GPUs such as the NVIDIA A100 or H100 are recommended.
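To confirm that a GPU is actually visible from a workbench session before launching the studio, you can run a quick check such as the following sketch. It assumes the session image includes the NVIDIA driver and the nvidia-smi utility; these are assumptions about your environment, not requirements documented here.

    import shutil
    import subprocess

    # Minimal GPU visibility check (assumes the NVIDIA driver and the
    # nvidia-smi utility are available in the session image).
    if shutil.which("nvidia-smi"):
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
            capture_output=True,
            text=True,
        )
        print(result.stdout or result.stderr)
    else:
        print("nvidia-smi not found: no NVIDIA GPU driver is visible in this session.")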
- In the Cloudera console, click the Cloudera AI tile.
  The Cloudera AI Workbenches page displays.
- Click the name of the workbench.
  The workbench Home page displays.
- Click Projects, and then click New Project to create a new project.
  In the left navigation pane, the new AI Studios option is displayed.
- Click AI Studios.
- Click the Launch button in the Fine Tuning Studio box.
  The Configure Studio: Fine Tuning page is displayed.
- Set the environment variables for the Fine Tuning Studio.
- Select the Runtime version.
- Click Launch AI Studio.
  The Fine Tuning Studio page is displayed. After launching, you can view the list of tasks being executed as part of the AI Studio deployment.
After the configuration, Fine Tuning Studio is displayed in the left navigation pane under AI Studios. On this page, you can train, manage, and evaluate large language models.