```
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
```
Node.js ran out of memory. V8 caps the heap by default (historically around 1.5GB on 64-bit systems; newer Node versions scale the limit with available RAM). Large builds, data processing, or memory leaks can all hit that ceiling.
## Fix 1: Increase the Memory Limit
```shell
# Run with more memory (4GB)
node --max-old-space-size=4096 script.js

# For npm scripts
NODE_OPTIONS="--max-old-space-size=4096" npm run build

# Add to .bashrc/.zshrc for a permanent fix
export NODE_OPTIONS="--max-old-space-size=4096"
```
Common values:
- `2048` = 2GB
- `4096` = 4GB
- `8192` = 8GB
## Fix 2: Fix Your Build Tool
This often happens during builds with webpack, Vite, or Next.js.
Next.js:

```javascript
// next.config.js: reduce memory usage
module.exports = {
  swcMinify: true, // uses less memory than Terser
};
```
Webpack:

```shell
# Raise the heap limit for the production build
NODE_OPTIONS="--max-old-space-size=4096" npx webpack --mode production
```
TypeScript:

```shell
# tsc can be memory-hungry on large projects
NODE_OPTIONS="--max-old-space-size=4096" npx tsc
```
## Fix 3: Find the Memory Leak
If increasing memory just delays the crash, you have a leak.
```javascript
// Check memory usage in your code
console.log(process.memoryUsage());
// {
//   rss: 50MB,       <- total memory for the process
//   heapUsed: 30MB,  <- actual usage
//   heapTotal: 40MB  <- allocated heap
// }
```
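A single reading doesn't tell you much; what matters is the trend. A small sketch of periodic logging that makes a leak's steady climb visible (the interval and format here are arbitrary choices):

```javascript
// Log heap usage every few seconds. heapUsed climbing steadily across
// many GC cycles, instead of sawtoothing, usually indicates a leak.
const toMB = (bytes) => (bytes / 1024 / 1024).toFixed(1);

function logMemory() {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  console.log(`rss=${toMB(rss)}MB heapUsed=${toMB(heapUsed)}MB heapTotal=${toMB(heapTotal)}MB`);
}

// unref() so the timer doesn't keep the process alive on its own
setInterval(logMemory, 5000).unref();
```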
Common leak causes:
- Growing arrays/objects that are never cleared
- Event listeners that are never removed
- Closures holding references to large objects
- Caching without size limits
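As a concrete example of the last cause, here is the unbounded-cache pattern with a minimal fix: cap the size and evict the oldest entry. This is a sketch (names like `cacheSet` and the 1000-entry cap are illustrative); production code might reach for an LRU library instead.

```javascript
// Leak pattern: `cache.set(key, value)` forever, never evicting.
// Minimal fix: cap the size. Map preserves insertion order, so the
// first key returned by keys() is always the oldest entry.
const MAX_ENTRIES = 1000;
const cache = new Map();

function cacheSet(key, value) {
  if (cache.size >= MAX_ENTRIES) {
    const oldestKey = cache.keys().next().value;
    cache.delete(oldestKey); // evict before inserting
  }
  cache.set(key, value);
}
```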
Profile it:
```shell
# Generate a heap snapshot
node --inspect script.js
# Then open chrome://inspect in Chrome and take heap snapshots
```
## Fix 4: Process Data in Streams
If you're loading large files into memory:
```javascript
const fs = require('fs');

// ❌ Loads the entire file into memory
const data = fs.readFileSync('huge-file.json', 'utf8');
const parsed = JSON.parse(data);

// ✅ Stream it
const stream = fs.createReadStream('huge-file.json');
stream.on('data', (chunk) => {
  // Process chunk by chunk
});
```
## Fix 5: Docker / CI Environments
Containers often have less memory than your local machine.
```dockerfile
# Dockerfile
ENV NODE_OPTIONS="--max-old-space-size=2048"
```

```yaml
# GitHub Actions
env:
  NODE_OPTIONS: "--max-old-space-size=4096"
```

```yaml
# docker-compose.yml: give the container more memory
services:
  app:
    mem_limit: 4g
```
## Quick Fixes by Scenario
| Scenario | Fix |
|---|---|
| `npm run build` crashes | `NODE_OPTIONS="--max-old-space-size=4096" npm run build` |
| `tsc` crashes | Same `NODE_OPTIONS` fix |
| Script processing large data | Use streams instead of loading everything into memory |
| Docker build crashes | Add `ENV NODE_OPTIONS` to the Dockerfile |
| Keeps crashing even with more memory | You have a memory leak; profile it |