Downloading model repositories for an air-gapped environment
To use models from NVIDIA NGC and Hugging Face in an air-gapped environment, the administrator must first download the model artifacts from these sources on a host that has network access to them.
Downloading a Hugging Face model
- Download the Llama-3.1-Nemotron-70B-Instruct-HF model from Hugging Face to your local file system with the following command:

  python3 import_to_airgap.py -do -rt hf -t <your-hf-token> -p $PWD/models -ri nvidia/Llama-3.1-Nemotron-70B-Instruct-HF

  The download places all model files, along with their metadata, in the specified destination directory.
- Download a different Hugging Face model to your local file system with the following command:

  python3 import_to_airgap.py -do -rt hf -t <your-hf-token> -p $PWD/models -ri meta-llama/Llama-2-70b-chat-hf
  You can obtain the ri argument for a Hugging Face model as follows:

  - Open the Hugging Face site at https://huggingface.co/.
  - Search for the required model. The model's page displays.
  - Click the copy icon next to the model name and copy the model ID; this is the value to use for the ri argument.

  The download places all model files, along with their metadata, in the specified destination directory.
Downloading an NGC model
- Download the NVIDIA NGC model to your local file system with the following command:

  python3 import_to_airgap.py -do -rt ngc -p $PWD/models -ri nim/meta/llama-3_1-70b-instruct:0.11.1+14957bf8-h100x4-fp8-throughput.1.2.18099809 -ns ngc_spec.yaml

  The download places all model files, along with their metadata, in the specified destination directory.
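For NGC, the ri value bundles a repository path and a version tag. A small sketch for splitting the two, assuming the value follows the org/team/name:tag convention shown in the command above (the helper name is hypothetical):

```python
def split_ngc_ref(ref: str) -> tuple[str, str]:
    """Split an NGC ri value into its repository path and version tag.

    Assumption: the value follows the org/team/name:tag convention used
    in the documented example.
    """
    path, sep, tag = ref.partition(":")
    if not sep or not tag:
        raise ValueError(f"Expected 'path:tag' form, got: {ref}")
    return path, tag


path, tag = split_ngc_ref(
    "nim/meta/llama-3_1-70b-instruct:"
    "0.11.1+14957bf8-h100x4-fp8-throughput.1.2.18099809"
)
print(path)  # nim/meta/llama-3_1-70b-instruct
```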
