@@ -99,13 +99,46 @@ In this repository you find following examples or blueprints that you can adapt
-[transformers](transformers): ...
-[pytorch](nvidia_pytorch): ...
- ...
-[tensorflow](): ...
...
## Official base containers
You will probably want to build your own container with your own custom software stack and environment inside.
We recommend, however, that you always build on top of certain *base containers*, depending on your application and the kind of GPUs you want to use.
Nvidia and AMD both provide containers for common AI frameworks that come with all the necessary software to optimally use their hardware.
Below is a list of links to these containers sorted by vendor and application.
To use one of these in your custom container, specify its path and tag in the [Apptainer definition file]().
For example, to build on top of Nvidia's PyTorch container, your definition file should start like this (see also the [build section]() below):
```
BootStrap: docker
From: nvcr.io/nvidia/pytorch:25.04-py3
...
```
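To give a fuller picture, a definition file building on this base might look like the following sketch; the extra package (`lightning`) and the section contents are illustrative assumptions, not requirements:

```
BootStrap: docker
From: nvcr.io/nvidia/pytorch:25.04-py3

%post
    # Illustrative: add your own packages on top of the base image
    pip install --no-cache-dir lightning

%environment
    # Avoid accidentally picking up packages from the host's ~/.local
    export PYTHONNOUSERSITE=1

%runscript
    exec python "$@"
```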
> [!TIP]
> Use the [release notes](https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/index.html) to match Nvidia's container tag to the actual PyTorch version installed in the container.

> [!WARNING]
> Most of these base containers are quite large, so downloading and building them can take a while.
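Building the image itself is then a single command; a sketch, assuming the definition above is saved as `my_pytorch.def` (a placeholder name):

```
# Build a SIF image from the definition file (placeholder filenames).
# The first build downloads the multi-gigabyte base image, so expect a wait.
apptainer build my_pytorch.sif my_pytorch.def
```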