In VMware Private AI Foundation with NVIDIA, as a DevOps engineer, you use the Kubernetes API to provision a TKG cluster that uses NVIDIA GPUs. Then, you can deploy containerized AI workloads from the NVIDIA NGC catalog.
You use kubectl to deploy the TKG cluster in the vSphere namespace configured by the cloud administrator.
Prerequisites
Verify with the cloud administrator that the following prerequisites are in place for the AI-ready infrastructure.
- VMware Private AI Foundation with NVIDIA is configured. See Preparing VMware Cloud Foundation for Private AI Workload Deployment.
- In a disconnected environment, a content library with Ubuntu TKr images is added to the vSphere namespace for AI workloads. See Configure a Content Library with Ubuntu TKr for a Disconnected VMware Private AI Foundation with NVIDIA Environment.
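With these prerequisites in place, you can describe the TKG cluster declaratively and deploy it with kubectl. The following is a minimal sketch of a Cluster manifest using the v1beta1 ClusterClass API; the cluster name, namespace, TKr version, VM class names, and storage policy are placeholder assumptions and must match the values that the cloud administrator configured on the vSphere namespace. In particular, the GPU node pool must reference a vGPU-enabled VM class, and the `resolve-os-image` annotation requests an Ubuntu TKr, which is required for GPU workloads.

```yaml
apiVersion: cluster.x-k8s.io/v1beta1
kind: Cluster
metadata:
  name: gpu-cluster                   # assumption: your cluster name
  namespace: ai-workloads-ns          # assumption: vSphere namespace from the cloud administrator
  annotations:
    # Request an Ubuntu node image instead of the default Photon OS image.
    run.tanzu.vmware.com/resolve-os-image: os-name=ubuntu
spec:
  clusterNetwork:
    services:
      cidrBlocks: ["198.51.100.0/12"]
    pods:
      cidrBlocks: ["192.0.2.0/16"]
    serviceDomain: cluster.local
  topology:
    class: tanzukubernetescluster
    version: v1.26.5                  # assumption: a TKr version available in your content library
    controlPlane:
      replicas: 1
    workers:
      machineDeployments:
        - class: node-pool
          name: gpu-node-pool
          replicas: 2
          variables:
            overrides:
              # assumption: name of a vGPU-enabled VM class assigned to the namespace
              - name: vmClass
                value: vm-class-a100
    variables:
      - name: vmClass
        value: guaranteed-medium      # assumption: VM class for the control plane
      - name: storageClass
        value: ai-storage-policy      # assumption: storage policy assigned to the namespace
```

After logging in to the Supervisor with the kubectl vsphere plugin and switching to the namespace context, you would apply the manifest with, for example, `kubectl apply -f gpu-cluster.yaml` and watch provisioning with `kubectl get cluster gpu-cluster -n ai-workloads-ns`.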