Locating Memory Leaks in Node.js Applications

April 12, 2018


A reference to an object, if not properly managed, can remain assigned even though the object is no longer used, which keeps the garbage collector from ever reclaiming it. This usually happens at the application logic level, but it can also be an issue inside an imported package.
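For illustration, here is a minimal, hypothetical example of this pattern: a module-level cache that only ever grows, so every cached object stays reachable and is never reclaimed by the garbage collector.

```js
// Hypothetical leaky handler: the module-level cache keeps a reference to
// every payload it has ever seen, so none of them can be garbage collected.
const cache = [];

function handleRequest(url) {
  const payload = { url, data: Buffer.alloc(1024 * 1024) }; // ~1 MB per call
  cache.push(payload); // reference is never released -> memory leak
  return payload;
}

// In a long-running process this grows without bound.
setInterval(() => handleRequest('/api/items'), 100);
```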

Memory leaks usually manifest themselves in long-running production applications. Reproducing and fixing them in development or staging environments is often very hard, primarily because the production environment exhibits different and more complex behavior.

What is needed to locate memory leaks in production?

Once a memory leak is detected, the first thing we need to know in order to locate it is where in the code memory is allocated but never reclaimed by the garbage collector.

V8 includes a low-overhead heap allocation sampler that shows where and how much memory is allocated. Instana’s production Node.js profiler relies on V8’s allocation sampler to report continuous memory allocation profiles from any environment, including high-load production environments. The result is a historical view of the memory allocation rate per stack trace, as seen in the following screenshot.

Production Node.js Profiler from Instana
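For context, the underlying mechanism is exposed by Node.js itself: the same V8 sampling heap profiler can be driven directly through the built-in inspector module (available since Node.js 8). The following is only a sketch of that raw API, which profiles like the one above are built on, not Instana’s implementation.

```js
const fs = require('fs');
const inspector = require('inspector');

const session = new inspector.Session();
session.connect();

// Start V8's sampling heap profiler (samplingInterval is in bytes).
session.post('HeapProfiler.startSampling', { samplingInterval: 32768 }, (err) => {
  if (err) throw err;

  // ...let the application run under load for a while...
  setTimeout(() => {
    session.post('HeapProfiler.stopSampling', (err, { profile }) => {
      if (err) throw err;
      // `profile` is a tree of stack frames annotated with allocation sizes.
      fs.writeFileSync('heap-profile.json', JSON.stringify(profile));
      session.disconnect();
    });
  }, 60 * 1000);
});
```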

Using these allocation profiles, sorted by stack frames with the highest allocation rates, it is now much easier to go back to the source code and make sure that the objects created at those locations are properly managed.

Instana’s always-on production Node.js profiler also reports runtime metrics, including memory and garbage collection, which are critical for detecting the memory leak in the first place.
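As a rough sketch of what those signals look like with nothing but Node.js built-ins (not how Instana collects them): a heap that keeps growing between samples, combined with frequent garbage collection passes, is a strong hint that a leak is present.

```js
const { PerformanceObserver } = require('perf_hooks');

// Log the duration of every garbage collection pass.
const gcObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`GC pass took ${entry.duration.toFixed(1)} ms`);
  }
});
gcObserver.observe({ entryTypes: ['gc'] });

// Sample heap usage once a minute; a heap that never shrinks is a leak suspect.
setInterval(() => {
  const { heapUsed, heapTotal } = process.memoryUsage();
  console.log(`heapUsed=${(heapUsed / 1048576).toFixed(1)} MB, heapTotal=${(heapTotal / 1048576).toFixed(1)} MB`);
}, 60 * 1000);
```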

If the memory leak is reproducible in the development environment, the profiler can also be used in manual mode for longer and more focused sampling sessions.

See the Node.js profiling documentation for detailed setup instructions.
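As a rough sketch of what that setup looks like, the collector is loaded as the very first statement of the application. The package name and option below are assumptions that may differ between versions, so treat the linked documentation as authoritative.

```js
// Assumed package name and option; check the profiling documentation for
// the exact values for your collector version.
require('@instana/collector')({
  autoProfile: true // enable continuous production profiling
});

// ...rest of the application code...
```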

That’s it! After restarting or redeploying the application, you can access all production profiles (which are captured automatically) through the Instana Unbounded Analytics Data Engine or from individual traces.

Similar profile history is automatically available for:

  • CPU usage
  • Async calls

In addition, exceptions and runtime metrics are also available in the Dashboard.

Play with Instana’s APM Observability Sandbox

Start your FREE TRIAL today!

As the leading provider of Automatic Application Performance Monitoring (APM) solutions for microservices, Instana has developed the automatic monitoring and AI-based analysis DevOps needs to manage the performance of modern applications. Instana is the only APM solution that automatically discovers, maps and visualizes microservice applications without continuous additional engineering. Customers using Instana achieve operational excellence and deliver better software faster. Visit https://www.instana.com to learn more.