
Docker Logs Too Large — How to Fix Disk Space Issues


No space left on device
/var/lib/docker/containers/xxx/xxx-json.log  (10GB+)

Docker's default json-file log driver writes each container's stdout and stderr to a JSON file on disk, and out of the box no rotation is configured, so those files grow without limit.
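
To confirm which driver is actually in use, both the daemon-wide default and a single container's setting can be inspected (container-name is a placeholder):

# Daemon-wide default log driver
docker info --format '{{.LoggingDriver}}'

# Driver and options for one container
docker inspect --format '{{.HostConfig.LogConfig}}' container-name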

Fix 1: Check Log Sizes

# Find large log files
sudo du -sh /var/lib/docker/containers/*/*-json.log | sort -rh | head -10

# Check specific container
docker inspect --format='{{.LogPath}}' container-name
sudo du -sh $(docker inspect --format='{{.LogPath}}' container-name)
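
The du output only shows file paths; to map each log back to its container, docker inspect accepts multiple IDs, so every name and log path can be listed in one pass:

# List each container's name next to its log path
docker ps -aq | xargs docker inspect --format '{{.Name}} {{.LogPath}}'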

Fix 2: Truncate Existing Logs

# Clear a specific container's logs
sudo truncate -s 0 $(docker inspect --format='{{.LogPath}}' container-name)

# Clear all container logs
sudo sh -c 'truncate -s 0 /var/lib/docker/containers/*/*-json.log'
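
Truncating is a one-off fix; the files start growing again immediately. As a stopgap on hosts where the daemon config can't be changed, logrotate with copytruncate also works, since Docker keeps the log file open and can't reopen a renamed one. A minimal sketch (the filename docker-containers is arbitrary, and docker logs won't show rotated-out lines):

# /etc/logrotate.d/docker-containers
/var/lib/docker/containers/*/*-json.log {
    daily
    rotate 3
    maxsize 50M
    missingok
    copytruncate
    compress
    notifempty
}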

Fix 3: Set Log Limits Per Container

# docker-compose.yml
services:
  app:
    logging:
      driver: json-file
      options:
        max-size: "10m"    # Max 10MB per log file
        max-file: "3"      # Keep 3 rotated files
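
The same options work with plain docker run via --log-opt flags (nginx is just an example image):

# Equivalent one-off container with rotation limits
docker run -d --log-driver json-file \
  --log-opt max-size=10m --log-opt max-file=3 nginx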

Fix 4: Set Global Default

# /etc/docker/daemon.json
{
    "log-driver": "json-file",
    "log-opts": {
        "max-size": "10m",
        "max-file": "3"
    }
}
# Restart Docker to apply
sudo systemctl restart docker
# Note: only applies to NEW containers
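
Existing containers keep the log configuration they were created with; they have to be recreated to pick up the new default. With Compose, something like:

# Recreate containers so they inherit the new daemon defaults
docker compose up -d --force-recreate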

Fix 5: Use a Different Log Driver

# Send logs to syslog instead of files
services:
  app:
    logging:
      driver: syslog
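
The syslog driver also accepts options for a remote endpoint and a log tag; a sketch, assuming a collector at 192.168.0.42 (the address and tag format are placeholders):

services:
  app:
    logging:
      driver: syslog
      options:
        syslog-address: "udp://192.168.0.42:514"
        tag: "{{.Name}}"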

Fix 6: Docker System Prune

# Clean up stopped containers, unused networks, all unused images, and build cache
docker system prune -a

# Check disk usage
docker system df
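
Note that prune removes unused Docker objects but does not shrink the json logs of running containers (see Fix 2 for that). If unused volumes should go too:

# Also reclaim unused volumes (deletes their data permanently)
docker system prune -a --volumes

# Per-object breakdown of what is using space
docker system df -v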