Apptainer (Singularity) container management for HPC workloads. Build SIF images, run containers with GPU passthrough. Use when users need HPC-compatible containerization or need to pull/run Apptainer images.
Manages Apptainer containers for HPC workloads, building SIF images and running with GPU passthrough.
/plugin marketplace add atrawog/overthink-plugins
/plugin install overthink@overthink-plugins
This skill inherits all available tools. When active, it can use any tool Claude has access to.
The apptainer command manages Apptainer (formerly Singularity) containers for HPC-compatible workloads. It provides SIF image management with automatic GPU detection.
Key Concept: Apptainer is the HPC standard. Unlike Docker/Podman, containers run as the user (no root). SIF files are single-file images.
| Action | Command | Description |
|---|---|---|
| Build | ujust apptainer build DEF | Build SIF from definition file |
| Cache | ujust apptainer cache [list\|clean] | Manage Apptainer cache |
| Exec | ujust apptainer exec IMAGE CMD | Execute specific command in container |
| GPU | ujust apptainer gpu | Detect and test GPU support |
| Inspect | ujust apptainer inspect IMAGE | Show SIF file metadata |
| Pull | ujust apptainer pull IMAGE | Download container image to SIF file |
| Run | ujust apptainer run IMAGE | Run container with default command |
| Shell | ujust apptainer shell [-- CMD] | Open interactive shell in container |
| Parameter | Long Flag | Short | Default | Description |
|---|---|---|---|---|
| action | (positional) | - | required | Action: pull, run, shell, exec, build, inspect, gpu, cache |
| image | --image | -i | "" | SIF file path, image name, or DEF file |
| tag | --tag | -t | "" | Image tag, output file, or cache subaction |
| cmd | (variadic) | - | "" | Command to execute (use -- separator) |
# Pull nvidia-python (long form)
ujust apptainer pull --image=nvidia-python
# Pull with tag (long form)
ujust apptainer pull --image=nvidia-python --tag=testing
# Pull nvidia-python (short form)
ujust apptainer pull -i nvidia-python
# Pull with tag (short form)
ujust apptainer pull -i nvidia-python -t testing
# Pull jupyter
ujust apptainer pull --image=jupyter --tag=stable
# Docker Hub
ujust apptainer pull --image=docker://ubuntu:22.04
# NVIDIA NGC
ujust apptainer pull --image=docker://nvcr.io/nvidia/pytorch:latest
# Sylabs Cloud
ujust apptainer pull --image=library://sylabsed/examples/lolcow
Images are saved as SIF files:
~/.local/share/apptainer/overthink-pod-nvidia-python.sif
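The overthink-pod- prefix suggests a simple name-to-path convention. A hypothetical sketch of that mapping (sif_path is illustrative only, not part of ujust):

```shell
#!/bin/sh
# Hypothetical helper mapping a short image name to its SIF path,
# following the naming convention shown above (overthink-pod-NAME.sif).
sif_path() {
    echo "$HOME/.local/share/apptainer/overthink-pod-$1.sif"
}

sif_path nvidia-python   # expands to .../overthink-pod-nvidia-python.sif
```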
# Run nvidia-python (long form)
ujust apptainer run --image=nvidia-python
# Run nvidia-python (short form)
ujust apptainer run -i nvidia-python
# Run specific SIF file
ujust apptainer run --image=./my-container.sif
# Run Python in container (use -- separator for commands)
ujust apptainer run --image=nvidia-python -- python
# Run script
ujust apptainer run --image=nvidia-python -- python script.py
# Short form
ujust apptainer run -i nvidia-python -- python train.py
GPU flags are auto-detected: --nv is added for NVIDIA GPUs and --rocm for AMD.
# GPU is automatically enabled
ujust apptainer run --image=nvidia-python -- python -c "import torch; print(torch.cuda.is_available())"
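The wrapper's actual detection logic isn't shown here, but a minimal sketch of how such auto-detection can work (detect_gpu_flag is a hypothetical helper, not part of ujust):

```shell
#!/bin/sh
# Hypothetical sketch of GPU flag auto-detection (not the actual ujust code).
# Prints the Apptainer flag for the first GPU toolchain found on PATH.
detect_gpu_flag() {
    if command -v nvidia-smi >/dev/null 2>&1; then
        echo "--nv"    # NVIDIA driver tools present
    elif command -v rocm-smi >/dev/null 2>&1; then
        echo "--rocm"  # AMD ROCm tools present
    fi                 # otherwise print nothing: run on CPU
}

# Usage: pass the detected flag straight through to apptainer, e.g.
#   apptainer run $(detect_gpu_flag) mycontainer.sif nvidia-smi
detect_gpu_flag
```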
# Shell into container (long form)
ujust apptainer shell --image=nvidia-python
# Shell into container (short form)
ujust apptainer shell -i nvidia-python
# Now inside container
python --version
nvidia-smi
exit
# Execute single command (use -- separator)
ujust apptainer exec --image=nvidia-python -- pip list
# Execute Python one-liner
ujust apptainer exec -i nvidia-python -- python -c 'print(1+1)'
A minimal definition file (mydef.def) used by the build examples below:
Bootstrap: docker
From: ubuntu:22.04
%post
apt-get update
apt-get install -y python3 python3-pip
%runscript
python3 "$@"
# Build SIF from definition (image=DEF, tag=OUTPUT)
ujust apptainer build --image=mydef.def --tag=myimage.sif
# Build to default location
ujust apptainer build --image=mydef.def
# Short form
ujust apptainer build -i mydef.def -t myimage.sif
# Detect and test GPU
ujust apptainer gpu
| GPU | Flag | Auto-Detection |
|---|---|---|
| NVIDIA | --nv | Yes |
| AMD | --rocm | Yes |
| Intel | (none yet) | No |
# Direct apptainer command with GPU
apptainer run --nv nvidia-python.sif nvidia-smi
# Long form
ujust apptainer cache --tag=list
# Or
ujust apptainer cache list
# Long form
ujust apptainer cache --tag=clean
# Or
ujust apptainer cache clean
Cache is stored in ~/.apptainer/cache/.
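On clusters with tight $HOME quotas, the cache can be redirected with Apptainer's standard APPTAINER_CACHEDIR environment variable (the scratch path below is only an example):

```shell
#!/bin/sh
# Redirect the Apptainer cache to scratch space (example path).
export APPTAINER_CACHEDIR="$HOME/scratch/apptainer-cache"
mkdir -p "$APPTAINER_CACHEDIR"
echo "$APPTAINER_CACHEDIR"
```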
# Pull HPC-ready image
ujust apptainer pull --image=nvidia-python
# Test GPU
ujust apptainer gpu
# Development shell
ujust apptainer shell --image=nvidia-python
# Run production workload
ujust apptainer run --image=nvidia-python -- python train.py
# Pull NVIDIA PyTorch
ujust apptainer pull --image=docker://nvcr.io/nvidia/pytorch:23.10-py3
# Run training
ujust apptainer run --image=pytorch_23.10-py3.sif -- python train.py
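On a Slurm cluster the same SIF is usually submitted through the scheduler. A sketch of a batch script, assuming GPU nodes and the SIF pulled above (all #SBATCH values are placeholders):

```shell
#!/bin/bash
#SBATCH --job-name=train     # placeholder job name
#SBATCH --gpus=1             # request one GPU
#SBATCH --time=04:00:00      # placeholder walltime

# --nv assumes an NVIDIA node; use --rocm on AMD nodes.
apptainer run --nv pytorch_23.10-py3.sif python train.py
```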
# Create definition file
cat > myenv.def << 'EOF'
Bootstrap: docker
From: python:3.11
%post
pip install numpy pandas scikit-learn
%runscript
python "$@"
EOF
# Build
ujust apptainer build --image=myenv.def --tag=myenv.sif
# Test
ujust apptainer run --image=myenv.sif -- python -c "import numpy; print(numpy.__version__)"
| Feature | Apptainer | Docker/Podman |
|---|---|---|
| Root required | No | Sometimes |
| Single file | Yes (SIF) | No (layers) |
| HPC compatible | Yes | Limited |
| GPU support | --nv, --rocm | nvidia-docker |
| Security model | User namespace | Container namespace |
Use Apptainer when:
- Running on HPC clusters where root access is unavailable
- You need a single-file, portable image (SIF)
- You need GPU passthrough as an unprivileged user
If image pulls fail, check:
# Test network
curl -I https://ghcr.io
# Check registry auth
apptainer remote list
Fix:
# Login to registry
apptainer remote login docker://ghcr.io
If no GPU is detected in the container, check:
ujust apptainer gpu
nvidia-smi # or rocm-smi
Fix:
# Ensure drivers installed
# For NVIDIA:
nvidia-smi
# For AMD:
rocm-smi
If a SIF file is corrupted or won't run, fix:
# Remove the cached SIFs (this deletes all pulled images) and re-pull
rm ~/.local/share/apptainer/*.sif
ujust apptainer pull --image=nvidia-python
If the cache is using too much disk space, check:
du -sh ~/.apptainer/cache/
Fix:
ujust apptainer cache --tag=clean
Related: pod (build OCI images), jupyter (uses containers). For GPU driver setup, see ujust config gpu setup.