```text
# Node.js
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

# Java
java.lang.OutOfMemoryError: Java heap space

# Python
MemoryError
```
Your application ran out of available memory. Either you’re processing too much data at once, or you have a memory leak.
Fix 1: Increase Memory Limit
```shell
# Node.js
node --max-old-space-size=4096 app.js   # 4 GB heap

# Java
java -Xmx4g -Xms2g MyApp                # 4 GB max heap, 2 GB initial

# Python — no direct heap limit flag; check available system RAM
```
Fix 2: Process Data in Chunks
```javascript
// ❌ Loading the entire file into memory
const data = fs.readFileSync('huge-file.csv', 'utf8');

// ✅ Stream it
const stream = fs.createReadStream('huge-file.csv');
stream.on('data', (chunk) => {
  processChunk(chunk);
});
```
```python
# ❌ Loading all rows at once
rows = cursor.fetchall()  # Millions of rows

# ✅ Fetch in batches
while True:
    rows = cursor.fetchmany(1000)
    if not rows:
        break
    process(rows)
```
Fix 3: Memory Leak — Event Listeners
```javascript
// ❌ Adding listeners without ever removing them
setInterval(() => {
  emitter.on('data', handler); // Leak: a new listener every second
}, 1000);

// ✅ Register once, and remove the listener when done
emitter.on('data', handler);
// Later:
emitter.removeListener('data', handler); // or emitter.off('data', handler)
```
Fix 4: Memory Leak — Growing Arrays
```javascript
// ❌ Array grows forever
const cache = [];
app.get('/data', (req, res) => {
  cache.push(req.body); // Never cleared
  res.json({ ok: true });
});

// ✅ Use a bounded cache (lru-cache v6 API shown;
// v7+ uses: const { LRUCache } = require('lru-cache'))
const LRU = require('lru-cache');
const cache = new LRU({ max: 500 });
```
Fix 5: Large JSON Parsing
```javascript
// ❌ Parsing a 2GB JSON file in one shot
const data = JSON.parse(fs.readFileSync('huge.json'));

// ✅ Use a streaming JSON parser
const JSONStream = require('JSONStream');
fs.createReadStream('huge.json')
  .pipe(JSONStream.parse('*'))
  .on('data', (item) => process(item));
```
Fix 6: Docker Memory Limits
```yaml
# docker-compose.yml — container is killed when it hits its memory limit
services:
  app:
    deploy:
      resources:
        limits:
          memory: 2G  # Increase if needed
```
Debugging
```shell
# Node.js — track memory usage
node --inspect app.js
# Open chrome://inspect and use the Memory tab

# Java — heap dump
jmap -dump:format=b,file=heap.hprof <pid>

# Linux — check system memory
free -h
```
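Before reaching for the inspector, a quick first pass in Node is to log `process.memoryUsage()` over time: a heap that grows without ever plateauing is the classic leak signature. A minimal sketch:

```javascript
// Log heap and RSS figures so you can watch for unbounded growth.
function logHeap(label = '') {
  const { heapUsed, heapTotal, rss } = process.memoryUsage();
  const mb = (n) => (n / (1024 * 1024)).toFixed(1);
  console.log(
    `${label} heapUsed=${mb(heapUsed)}MB heapTotal=${mb(heapTotal)}MB rss=${mb(rss)}MB`
  );
}

logHeap('start');
const big = new Array(1e6).fill('x'); // allocate something to see the numbers move
logHeap('after allocation');
// In a server, call logHeap on an interval and compare readings over minutes.
```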