Launching Fine Tuning Studio within a project
You can launch Fine Tuning Studio on the Cloudera AI Platform to manage the life cycle of large language models (LLMs), from training and fine-tuning to evaluation.
- A GPU must be available in the cluster. For optimal performance with newer LLMs, GPUs such as the NVIDIA A100 or H100 are recommended.
- Host names: For air-gapped installations that use a proxy, you must whitelist the required URLs in your firewall rules. For the list of host names to whitelist, see Host names and endpoints required for AI Studios.
After configuration, Fine Tuning Studio is displayed in the left navigation pane under AI Studios.
You can train, manage, and evaluate large language models on this page.
