Docker Containerization Guidelines#

Important

Once the Docker environment is ready to use (details are below), continue to the quickstart section Checking the build of scikit-plots.

πŸš€ Docker Containerization#

πŸ’‘ Works with Docker Desktop or GitHub Codespaces

Here’s how containerization works:

  • Isolation: Containers run independently of each other and the host system, ensuring that they don’t interfere with other applications or containers.

  • Portability: Since containers include everything the application needs to run, they can be moved between different environments (like from development to production) without any compatibility issues.

  • Efficiency: Containers are more lightweight than virtual machines (VMs) because they share the host OS’s kernel rather than running their own separate operating system. This makes them faster and more resource-efficient.

  • Consistency: The application inside the container runs the same way regardless of where it’s deployed, ensuring consistency across environments.

GitHub Codespaces Guide#

(Connect through an IDE interface: VS Code or Jupyter Notebook)

For best practice, choose the (recommended) option rather than the (default) one.
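If you prefer the command line to the Codespaces web UI, the GitHub CLI (gh) can create and open a codespace for your fork; this is just an optional sketch, and YOUR-USER-NAME is a placeholder for your own account:

## Optional: create and open a codespace from the GitHub CLI
gh codespace create --repo YOUR-USER-NAME/scikit-plots
gh codespace code   # open the codespace in your local VS Code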



Docker Desktop Guide#

## Upstream repo to fork: https://github.com/scikit-plots/scikit-plots.git
git clone https://github.com/YOUR-USER-NAME/scikit-plots.git
cd scikit-plots/docker

## Use a terminal, or open VS Code, to run ``docker compose``
code .
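Before building anything, it can help to confirm that the Docker CLI and the Compose v2 plugin are installed and that the daemon is reachable (a quick sanity check; exact output varies by installation):

docker --version          # Docker CLI is installed
docker compose version    # Compose v2 plugin is available
docker info               # daemon is running and reachable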

Docker Environment Setup for an IDE (VS Code/Jupyter) and/or the NVIDIA GPU driver

This repository contains Docker & Docker Compose configurations for running Jupyter Notebooks with optional NVIDIA GPU support.

You can run containers with either host-installed CUDA or pre-installed CUDA inside the container.
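To verify that containers can reach the host GPU driver at all (this assumes the NVIDIA Container Toolkit is already installed on the host), a quick throwaway check like the one below should print the same table as running nvidia-smi on the host; the CUDA image tag matches the one used by the services described later:

## Sanity check: expose the host GPU to a throwaway CUDA container
docker run --rm --gpus all nvidia/cuda:12.6.3-cudnn-runtime-ubuntu24.04 nvidia-smi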

🏷️ Docker Compose Quickstart Guide#

πŸ’‘ The easiest way to launch the environment is with Docker Compose.#

▢️ Run Docker Env Jupyter Notebook (CPU only)

docker compose up --build notebook_cpu

▢️ Run Docker Env Jupyter Notebook (With NVIDIA Host GPU)

docker compose up --build app_nvidia_host_gpu_driver

▢️ Run Docker Env Jupyter Notebook (With NVIDIA Internal CUDA GPU)

docker compose up --build app_nvidia_internal_gpu_driver
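Whichever service you choose, you can also start it in the background and follow its logs from another terminal; a typical pattern (shown here for notebook_cpu, but it works the same for the GPU services):

docker compose up -d --build notebook_cpu   # start in detached mode
docker compose ps                           # confirm the service is running
docker compose logs -f notebook_cpu         # follow logs, e.g. to find the Jupyter URL/token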

▢️ Run Docker Env Jupyter Notebook with VS Code#

▢️ Connect to a Docker Container, Especially When a Docker GUI Is Not Available#

# docker compose up --build notebook_cpu

docker ps  # check running containers
docker logs CONTAINER_ID_OR_NAME  # find the Jupyter http address (with token), 127.0....
docker exec -it CONTAINER_ID_OR_NAME bash  # connect an interactive terminal
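If scrolling through the logs is inconvenient, the running Jupyter server can report its own URL and token; assuming the image ships the jupyter-server CLI (the Jupyter Docker Stacks images do), something along these lines works:

## Ask the Jupyter server inside the container for its URL and token
docker exec -it CONTAINER_ID_OR_NAME jupyter server list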

▢️ Run setup_vscode_ext.sh#

## (Optional) Install common VS Code extensions
##βœ… C/C++/Python and Jupyter Notebook
##βœ… Linter and formatter
bash docker/script/setup_vscode_ext.sh  # (not needed every time)
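For reference, the script roughly corresponds to installing extensions with the code CLI yourself; the exact list lives in docker/script/setup_vscode_ext.sh, and the extension IDs below are only illustrative:

## Manual equivalent (illustrative extension IDs)
code --install-extension ms-python.python     # Python
code --install-extension ms-toolsai.jupyter   # Jupyter Notebook
code --list-extensions                        # confirm what is installed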

▢️ Run post_create_commands.sh#

##πŸ‘‰ (recommended) Only installed by the `Codespaces default` option
##βœ… mark directories as safe
##βœ… fetch submodules
##βœ… add the upstream remote
##βœ… fetch tags from upstream
##βœ… create a new environment with Python 3.11
##βœ… install required packages
##βœ… install pre-commit hooks
##βœ… install the development version of scikit-plots
# bash .devcontainer/script/post_create_commands.sh  # (not needed every time)
bash docker/script/post_create_commands.sh  # (not needed every time)
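If you cannot run the script, its steps can be reproduced by hand; the sketch below mirrors the checklist above, but flags and package choices may differ, so treat docker/script/post_create_commands.sh as the source of truth:

## Manual equivalents of the checklist above (approximate)
git config --global --add safe.directory "$(pwd)"   # mark the repo directory as safe
git submodule update --init --recursive             # fetch submodules
git remote add upstream https://github.com/scikit-plots/scikit-plots.git
git fetch upstream --tags                           # fetch tags from upstream
## create/activate a Python 3.11 environment and install the required packages, then:
pre-commit install                                  # install pre-commit hooks
pip install -e .                                    # development install (exact flags per the script)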

🚯 Stop Containers#

docker compose down
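A couple of optional flags can be useful at teardown time (both are standard Compose options):

docker compose down --remove-orphans   # also remove containers for services no longer defined
docker compose down -v                 # additionally remove declared volumes (destructive)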

🐳 Docker Compose Configuration#

This project is based on Docker Compose and includes multiple services:

πŸ”Ή notebook_cpu (CPU-Only)

  • Runs Jupyter Notebook using jupyter/tensorflow-notebook:latest

  • No CUDA support; best for lightweight tasks

  • Mounts the local folder scikit-plots to /home/jovyan/work

  • Runs on port 8888

πŸ”Ή app_nvidia_host_gpu_driver (Uses Host CUDA)

  • Runs Jupyter Notebook using jupyter/tensorflow-notebook:latest

  • Uses host-installed CUDA for GPU acceleration

  • Requires the NVIDIA runtime to be enabled (--runtime=nvidia)

  • Runs on port 8889

πŸ”Ή app_nvidia_internal_gpu_driver (CUDA Inside Container)

  • Runs nvidia/cuda:12.6.3-cudnn-runtime-ubuntu24.04 with pre-installed CUDA

  • Includes NVIDIA GPU support without needing host CUDA

  • Requires the NVIDIA runtime (--runtime=nvidia)

  • Runs on port 8890
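To double-check how these services resolve on your machine, Compose can list the defined services and report which host port is bound to a container port (the service must be running for the port lookup):

docker compose config --services        # list the services defined in docker-compose.yml
docker compose port notebook_cpu 8888   # host address:port mapped to the container's 8888
## the same works for the GPU services; use the container-side port from the compose file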

πŸ› οΈ Custom Docker Commands#

If you need more control, you can use Docker CLI commands.

▢️ Build & Run the Container Manually

docker build -t my-custom-container -f docker/Dockerfile .
docker run -it --rm -p 8888:8888 my-custom-container
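When running manually, you will usually also want to mount your working copy into the container, mirroring what the Compose services do; the /home/jovyan/work path matches the mount used by notebook_cpu, so adjust it if your image uses a different user (run this from the repository root):

## Same as above, but with the repository mounted into the container
docker run -it --rm -p 8888:8888 \
  -v "$(pwd)":/home/jovyan/work \
  my-custom-container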

▢️ Check GPU Availability Inside Container

docker exec -it <container_id> nvidia-smi
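Beyond nvidia-smi, you can confirm that the framework inside the image actually sees the GPU; for the jupyter/tensorflow-notebook based services, a TensorFlow check like this should list at least one GPU device (an empty list means the container has no GPU access):

docker exec -it <container_id> python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"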

πŸ“‚ Folder Structure#

docker/
β”œβ”€β”€ docker-compose.yml              # Primary Docker Compose file
β”œβ”€β”€ docker-compose.override.yml     # Optional override file (auto-included if present)
β”œβ”€β”€ Dockerfile                      # Custom Dockerfile
└── script/
    β”œβ”€β”€ install_gpu_nvidia_cuda.sh  # GPU setup script
    β”œβ”€β”€ setup_vscode_ext.sh         # VS Code extensions setup script
    └── post_create_commands.sh     # Post-create environment setup script

πŸ–₯️ Useful References#

πŸ“š Jupyter Docker Stacks: Read the Docs

πŸ“š Docker Compose: Official Docs

πŸ“š Dockerfile Best Practices

πŸ“š LocalStack Installation with Docker Compose

πŸ“š NVIDIA CUDA in Containers: NVIDIA Docs

πŸ“š NVIDIA Docker diagram: https://developer-blogs.nvidia.com/wp-content/uploads/2016/06/nvidia-docker.png

πŸš€ Now you’re ready to run Jupyter notebooks in Docker! 😊