
Avoiding memory leaks with Spring Boot WebClient

If you’re performing web requests with Spring Boot’s WebClient, you perhaps, just like us, read that defining the URL of your request should be done using a URI builder (e.g. Spring 5 WebClient):
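A minimal sketch of that builder-based pattern, assuming a webClient bean; the /v2/products/{productId} endpoint comes up later in this post, while ProductClient, getProduct and the base URL are illustrative:

    import org.springframework.web.reactive.function.client.WebClient;
    import reactor.core.publisher.Mono;

    class ProductClient {
        private final WebClient webClient = WebClient.create("https://example.org");

        // URI-builder variant: WebClient does not record the URI template,
        // so metrics end up tagged with the fully expanded URI per product id.
        Mono<String> getProduct(long productId) {
            return webClient
                    .get()
                    .uri(uriBuilder -> uriBuilder
                            .path("/v2/products/{productId}")
                            .build(productId))
                    .retrieve()
                    .bodyToMono(String.class);
        }
    }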

If that’s the case, we recommend that you ignore what you read (unless hunting hard-to-find memory leaks is your hobby) and use the following for constructing a URI instead:
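The same sketch, now using the template-string overload instead:

    import org.springframework.web.reactive.function.client.WebClient;
    import reactor.core.publisher.Mono;

    class ProductClient {
        private final WebClient webClient = WebClient.create("https://example.org");

        // Template-string variant: WebClient records "/v2/products/{productId}"
        // as the URI template, so all calls share a single metrics tag.
        Mono<String> getProduct(long productId) {
            return webClient
                    .get()
                    .uri("/v2/products/{productId}", productId)
                    .retrieve()
                    .bodyToMono(String.class);
        }
    }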

In this blog post we’ll explain how to avoid memory leaks with Spring Boot WebClient and why it’s better to avoid the former pattern, using our personal experience as motivation.

How did we discover this memory leak?

A while back we upgraded our application to use the latest version of the Axle framework. Axle is our framework for building Java applications, like (REST) services and frontend applications. It heavily relies on Spring Boot, and this upgrade also involved updating from Spring Boot version 2.3.12 to version 2.4.11.

When running our scheduled performance tests, everything looked fine. Most of our application’s endpoints still provided response times of below 5 milliseconds. However, as the performance test progressed, we noticed our application’s response times increasing up to 20 milliseconds, and after a long-running load test over the weekend, things got a lot worse. The response times skyrocketed to seconds – not good.

Before Axle upgrade: upper 90th percentile response times of one of our endpoints
After Axle upgrade: upper 90th percentile response times of the same endpoint

After a long staring contest with our Grafana dashboards, which provide insights into our application’s CPU, thread, and memory usage, this memory usage pattern caught our eye:

This graph shows the JVM heap size before, during, and after a performance test that ran from 21:00 to 0:00. During the performance test, the application created threads and objects to handle all incoming requests. So, the capricious line showing the memory usage during this period is exactly what we would expect. However, when the dust from the performance test settles down, we would expect the memory to also settle back down to the same level as before, but it is actually higher. Does anybody else smell a memory leak?

Time to call in MAT (the Eclipse Memory Analyzer Tool) to find out what causes this memory leak.

What caused this memory leak?

To troubleshoot this memory leak we:

  • Restarted the application.
  • Performed a heap dump (a snapshot of all the objects in memory in the JVM at a certain moment).
  • Triggered a performance test.
  • Performed another heap dump once the test finished.

This allowed us to use MAT’s advanced features to detect the leak suspects by comparing two heap dumps taken some time apart. But we didn’t need to go that far, since the heap dump taken after the test was enough for MAT to find something suspicious:

Here MAT tells us that one instance of Spring Boot’s AutoConfiguredCompositeMeterRegistry occupies almost 500MB, which is 74% of the total used heap size. It also tells us that a (concurrent) hashmap is responsible for this. We’re almost there!

With MAT’s dominator tree feature, we can list the largest objects and see what they keep alive – that sounds useful, so let’s use it to have a peek at what’s inside this humongous hashmap:

Using the dominator tree we were able to easily browse through the hashmap’s contents. In the picture above we opened two hashmap nodes. Here we see lots of Micrometer timers tagged with “v2/products/…” and a product id. Hmm, where have we seen that before?

What does WebClient have to do with this?

So, it’s Spring Boot’s metrics that are responsible for this memory leak, but what does WebClient have to do with this? To find that out, you really need to know what causes Spring’s metrics to store all these timers.

Inspecting the implementation of AutoConfiguredCompositeMeterRegistry, we see that it stores the metrics in a hashmap named meterMap. So, let’s put a well-placed breakpoint on the spot where new entries are added and trigger the suspicious call our WebClient performs to the “v2/products/{productId}” endpoint.

We run the application again and … gotcha! For every call the WebClient makes to the “v2/products/{productId}” endpoint, we saw Spring create a new Timer for each unique product identifier. Each such timer is then stored in the AutoConfiguredCompositeMeterRegistry bean. That explains why we see so many timers with tags like these:

/v2/products/9200000109074941
/v2/products/9200000099621587

How can you fix this memory leak?

Before we identify when this memory leak could affect you, let’s first explain how one would fix it. We mentioned in the introduction that you can avoid this memory leak by simply not using a URI builder to construct WebClient URLs. Now we will explain why that works.

After a little online research, we came across this post by Philip Riecks, in which he explains:

“As we usually want the templated URI string like “/todos/{id}” for reporting and not multiple metrics e.g. “/todos/1337” or “/todos/42”. The WebClient offers several ways to construct the URI […], which you can all use, except one.”

And that method is using the URI builder, coincidentally the one we were using:
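That is, the variant from our first sketch above:

    // The overload to avoid: no URI template is recorded for metrics.
    webClient
            .get()
            .uri(uriBuilder -> uriBuilder
                    .path("/v2/products/{productId}")
                    .build(productId))
            .retrieve()
            .bodyToMono(String.class);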

Indeed, when we construct the URI with the template string instead, the memory leak disappears. Also, the response times are back to normal again.

When could the memory leak affect you? – a simple answer

Do you need to worry about this memory leak? Well, let’s start with the most obvious case. If your application exposes its HTTP client metrics, and uses a method that takes a URI builder to set a templated URI onto a WebClient, you should definitely be worried.

You can easily check whether your application exposes HTTP client metrics in two different ways:

  1. Inspect the “/actuator/metrics/http.client.requests” endpoint of your Spring Boot application after your application has made at least one external call. A 404 means your application doesn’t expose them.
  2. Check whether the value of the application property management.metrics.enable.http.client.metrics is set to true, in which case your application does expose them.

However, this doesn’t mean that you’re safe if you’re not exposing the HTTP client metrics. We had been passing templated URIs to the WebClient using a builder for ages, and we had never exposed our HTTP client metrics. Yet, out of nowhere this memory leak reared its ugly head after an application upgrade.

So, could this memory leak affect you then? Just don’t use URI builders with your WebClient and you should be protected against this potential memory leak. That would be the simple answer. You don’t take simple answers? Fair enough, read on to find out what really caused this for us.

When could the memory leak affect you? – a more complete answer

So, how did a simple application upgrade cause this memory leak to rear its ugly head? Evidently, the addition of a transitive Prometheus dependency – an open source monitoring and alerting framework – caused the memory leak in our particular case. To understand why, let’s go back to the situation before we added Prometheus.

Before we dragged in the Prometheus library, we pushed our metrics to statsd – a network daemon that listens for and aggregates application metrics sent over UDP or TCP. The StatsdMeterRegistry that is part of the Spring framework is responsible for pushing metrics to statsd. The StatsdMeterRegistry only pushes metrics that aren’t filtered out by a MeterFilter. The management.metrics.enable.http.client.metrics property is an example of such a MeterFilter. In other words, if management.metrics.enable.http.client.metrics = false, the StatsdMeterRegistry won’t push any HTTP client metrics to statsd, and won’t store these metrics in memory either. So far, so good.
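Conceptually, such a property boils down to a deny MeterFilter. A hand-rolled equivalent would look roughly like the sketch below (not Spring Boot’s actual auto-configuration code; the class and bean names are made up):

    import io.micrometer.core.instrument.config.MeterFilter;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    class MetricsFilterConfig {

        // Roughly what management.metrics.enable.http.client.metrics=false amounts to:
        // deny every meter whose name starts with the given prefix. Spring Boot
        // applies MeterFilter beans to the individual meter registries it creates.
        @Bean
        MeterFilter denyHttpClientMetrics() {
            return MeterFilter.denyNameStartsWith("http.client");
        }
    }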

By adding the transitive Prometheus dependency, we added yet another meter registry to our application, the PrometheusMeterRegistry. When there is more than one meter registry to expose metrics to, Spring instantiates a CompositeMeterRegistry bean. This bean keeps track of all individual meter registries, collects all metrics, and forwards them to all the delegates it holds. It’s the addition of this bean that caused the issue.

The issue is that MeterFilter instances aren’t applied to the CompositeMeterRegistry, but only to the MeterRegistry instances inside the CompositeMeterRegistry (see this commit for more info). That explains why the AutoConfiguredCompositeMeterRegistry accumulates all the HTTP client metrics in memory, even when we explicitly set management.metrics.enable.http.client.metrics to false.
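A minimal standalone sketch of that behavior, using plain Micrometer (the meter name matches Spring’s HTTP client metrics; everything else is illustrative):

    import io.micrometer.core.instrument.Timer;
    import io.micrometer.core.instrument.composite.CompositeMeterRegistry;
    import io.micrometer.core.instrument.config.MeterFilter;
    import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

    public class CompositeFilterDemo {
        public static void main(String[] args) {
            // A child registry with a deny filter, like our statsd registry had.
            SimpleMeterRegistry child = new SimpleMeterRegistry();
            child.config().meterFilter(MeterFilter.denyNameStartsWith("http.client"));

            CompositeMeterRegistry composite = new CompositeMeterRegistry();
            composite.add(child);

            // One timer per expanded URI, as the URI-builder pattern produces.
            for (long id = 9200000109074941L; id <= 9200000109074943L; id++) {
                Timer.builder("http.client.requests")
                     .tag("uri", "/v2/products/" + id)
                     .register(composite);
            }

            System.out.println(child.getMeters().size());     // 0 - the filter applies here
            System.out.println(composite.getMeters().size()); // 3 - but not here
        }
    }

The composite keeps a meter of its own for every unique name/tag combination, so a deny filter on a delegate does nothing to stop the composite’s internal map from growing.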

Still confused? No worries, just don’t use URI builders with your WebClient and you should be protected against this memory leak.


In this blog post we explained that this method of defining the URLs of your requests with Spring Boot’s WebClient is best avoided:
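That is, the URI-builder pattern, sketched once more with the same illustrative names as before:

    // Avoid: each expanded URI becomes a separate metrics tag.
    webClient
            .get()
            .uri(uriBuilder -> uriBuilder
                    .path("/v2/products/{productId}")
                    .build(productId))
            .retrieve()
            .bodyToMono(String.class);

Prefer the template-string overloads such as uri("/v2/products/{productId}", productId), so that all requests share a single URI template tag and the number of timers stays bounded.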


