Anas Anjaria

Posted on • Originally published at Medium

Diagnose Memory Leaks in Scala with VisualVM



Where do you even begin when your Java/Scala application is eating up memory?

Recently, I had to fix a memory leak in a Scala service running in production. I was new to debugging such issues and couldn’t find a comprehensive guide—so I decided to write one myself.

This post focuses on concepts, tooling, and hands-on steps to investigate and understand memory leaks in live Scala applications.

✅ If you're looking for more real-world examples, check out Troubleshooting Scala Memory Leaks


🔍 How Do You Know If There's a Memory Leak?

If your app has a memory leak, memory usage will gradually increase over time—maybe over hours or days—until it eventually crashes or gets restarted.

We actively monitor our production system and trigger alerts when memory usage spikes. That's what confirmed the issue.

🧪 Tip: Always confirm the leak before jumping into analysis.
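One lightweight way to confirm the trend before attaching a profiler is to watch GC statistics from a shell on the host. A minimal sketch, assuming a JDK is installed there and `<PID>` is your JVM's process id (placeholder, fill in your own):

```shell
# Prints heap/GC utilization every 5 seconds. If the old-gen column (O)
# keeps climbing even as the full-GC count (FGC) increases, a leak is likely.
jstat -gcutil <PID> 5000
```

If old gen drops back down after each full GC, you are probably looking at normal allocation churn, not a leak.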


🛠️ Step-by-Step: Using VisualVM to Find Memory Leaks

The first step is to remotely connect to your live application experiencing a memory leak and attach a profiler—like VisualVM—to begin your investigation.

Attaching VisualVM to a live system experiencing a memory leak

Step 1 — Start Your App With JVM Options

Add the following parameters to your JVM startup:

```
-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=9010
-Dcom.sun.management.jmxremote.rmi.port=9010
-Dcom.sun.management.jmxremote.ssl=false
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.local.only=false
-Djava.rmi.server.hostname=localhost
```

👉 Helpful StackOverflow answer
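If the service runs in Docker, one way to pass these flags is through an environment variable that your image's entrypoint already honors. A sketch, assuming (hypothetically) the image reads JAVA_OPTS and is called my-scala-service — adapt to however your image actually receives JVM options:

```shell
docker run -d \
  -e JAVA_OPTS="-Dcom.sun.management.jmxremote \
-Dcom.sun.management.jmxremote.port=9010 \
-Dcom.sun.management.jmxremote.rmi.port=9010 \
-Dcom.sun.management.jmxremote.ssl=false \
-Dcom.sun.management.jmxremote.authenticate=false \
-Dcom.sun.management.jmxremote.local.only=false \
-Djava.rmi.server.hostname=localhost" \
  my-scala-service:latest
```

Note that with the SSH-tunnel approach below, the port does not need to be published with `-p`; it only has to be reachable from the EC2 host.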

Step 2 — Create an SSH Tunnel

  • SSH into your EC2 instance
  • Find the container’s internal IP
```shell
docker inspect \
  -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' \
  <container_name>
```
  • Close the SSH session, open a new local terminal, and create the tunnel:
```shell
ssh -L 9010:<container-ip>:9010 \
  -i ~/.ssh/your-key.pem \
  ec2-user@<ec2-host>
```

This forwards your local port 9010 to the container's port 9010 through the EC2 host.
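With the tunnel up, a quick sanity check from your laptop (assuming netcat is installed; this only succeeds while the tunnel from the previous step is open):

```shell
nc -z localhost 9010 && echo "tunnel open"
```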

Step 3 — Attach VisualVM

  1. Open VisualVM
  2. Right-click Local → Add JMX Connection… and enter localhost:9010

Adding a JMX connection

VisualVM should now detect the remote app.

For a high-level overview, check the distribution of Heap, Stack (thread count), and Metaspace to identify where memory is being used most.

In most cases, Heap will dominate memory usage—as shown below:

High memory utilization — Heap utilized almost all the memory

Step 4 — Analyze Memory with Sampler

Start memory sampling in VisualVM.

Collecting memory samples via VisualVM

It gives you:

  1. Heap histogram
  2. Per-thread allocations

Use the heap histogram, focus on your own packages (e.g., com.mycompany.api), and sort by live objects.

Sorting memory samples by live objects

🔎 Object counts that keep climbing across samples, and never drop after a GC, are a strong hint of a leak.

Step 5 — Think Critically and Hypothesize

You now have visibility. Combine it with your understanding of the code.

  • Review recent PRs
  • Look for: unclosed resources, large structures, excessive allocations

🧪 What If VisualVM Doesn’t Help? Use Heap Dumps

If runtime analysis fails, take a heap dump:

```shell
# Inside the container
jmap -dump:live,format=b,file=/tmp/dump.hprof <PID>
```

Copy the dump to your machine and analyze using Eclipse MAT.

🧠 Tip: Heap dumps can be huge—ensure enough disk space.
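Heap dumps also compress well before being copied off the host. A small, self-contained illustration (the dump path is an example, and a stand-in file is created here just so the commands run end to end):

```shell
dump=/tmp/dump.hprof
# Stand-in for a real heap dump, only to make the example runnable
head -c 1048576 /dev/zero > "$dump"
gzip -f "$dump"          # produces /tmp/dump.hprof.gz, removes the original
ls -lh "${dump}.gz"
# then, from your laptop: scp ec2-user@<ec2-host>:/tmp/dump.hprof.gz .
```

Real dumps are mostly object data rather than zeros, so expect less dramatic ratios, but the savings on transfer time are usually still significant.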

You can also create dumps automatically on OOM:

```
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/path/to/dump
```

✅ Things to Check Before You Dig Deeper

  1. Investigate only when usage is genuinely high (say, above 60%), so the problem is actually visible in the data.
  2. Set the same min and max heap values (-Xms = -Xmx) so the heap size stays stable while you measure (why?)
  3. Don’t over-provision heap memory. For instance, if a machine has only 1 GB of RAM, don’t set the max heap to 1 GB; the JVM needs memory beyond the heap, so that configuration can itself lead to OOM kills. Choose reasonable values and leave enough room for other processes running on the same machine.
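Putting points 2 and 3 together, a hedged example for a 1 GB machine (the values are illustrative, not a recommendation for your workload):

```shell
# Keep min and max equal so the heap size is stable while you measure,
# and leave headroom for metaspace, thread stacks, GC overhead,
# and other processes on the machine.
JAVA_OPTS="-Xms512m -Xmx512m"
echo "$JAVA_OPTS"
```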

📝 What You’ve Learned

  • How to confirm a memory leak with metrics
  • How to attach VisualVM to a live app
  • How to analyze live memory with histograms
  • When and how to use heap dumps
  • What JVM flags help automate this process

📚 Related Reading

Troubleshooting Scala Memory Leaks


📘 I write about PostgreSQL, DevOps, backend engineering, and real-world performance tuning.

🔗 Find more of my work, connect on LinkedIn, or explore upcoming content: all-in-one
