If you are performing web requests with Spring Boot's WebClient, you may, just like us, have read that the URL of your request should be defined using a URI builder (e.g. Spring 5 WebClient):
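A minimal sketch of that pattern (the base URL, endpoint and class names here are made up for illustration):

```java
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

class ProductClient {

    private final WebClient webClient = WebClient.create("https://example.org"); // hypothetical base URL

    Mono<String> getProduct(String productId) {
        return webClient.get()
                // URI built via a UriBuilder: the request metric ends up tagged with the
                // fully expanded path (e.g. "/v2/products/12345") instead of the template.
                .uri(uriBuilder -> uriBuilder
                        .path("/v2/products/{productId}")
                        .build(productId))
                .retrieve()
                .bodyToMono(String.class);
    }
}
```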
If that's the case, we suggest you ignore what you read (unless hunting down hard-to-find memory leaks is your hobby) and construct the URI as follows instead:
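Again a minimal sketch, with the same made-up names:

```java
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

class ProductClient {

    private final WebClient webClient = WebClient.create("https://example.org"); // hypothetical base URL

    Mono<String> getProduct(String productId) {
        return webClient.get()
                // Templated URI string plus variables: the request metric is tagged with the
                // template "/v2/products/{productId}", not with every expanded value.
                .uri("/v2/products/{productId}", productId)
                .retrieve()
                .bodyToMono(String.class);
    }
}
```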
In this blog post we will explain how to avoid memory leaks with Spring Boot WebClient and why it is better to avoid the former pattern, using our personal experience as motivation.
How did we discover this memory leak?
A while back we upgraded our application to use the latest version of the Axle framework. Axle is the bol.com framework for building Java applications, like (REST) services and frontend applications. It relies heavily on Spring Boot, and this upgrade also involved updating from Spring Boot version 2.3.12 to version 2.4.11.
When running our scheduled performance tests, everything looked fine. Most of our application's endpoints still provided response times of below 5 milliseconds. However, as the performance test progressed, we noticed our application's response times increasing up to 20 milliseconds, and after a long-running load test over the weekend, things got a lot worse. The response times skyrocketed to seconds – not good.
Before the Axle upgrade: upper 90th percentile response times of one of our endpoints
After the Axle upgrade: upper 90th percentile response times of the same endpoint
After a long staring contest with our Grafana dashboards, which give insight into our application's CPU, thread and memory usage, this memory usage pattern caught our eye:
This graph shows the JVM heap size before, during, and after a performance test that ran from 21:00 to 0:00. During the performance test, the application created threads and objects to handle all incoming requests. So, the capricious line showing the memory usage during this period is exactly what we would expect. However, when the dust from the performance test settles down, we would expect the memory to also settle down to the same level as before, but it is actually higher. Does anybody else smell a memory leak?
Time to call in MAT (the Eclipse Memory Analyzer Tool) to find out what causes this memory leak.
What caused this memory leak?
To troubleshoot this memory leak we:
- Restarted the application.
- Performed a heap dump (a snapshot of all the objects that are in memory in the JVM at a certain moment); see the sketch after this list for one way to take one programmatically.
- Triggered a performance test.
- Performed another heap dump once the test finished.
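A heap dump can be taken with your favourite tooling; as a sketch, this is one programmatic way to do it on a HotSpot JVM (not necessarily how we did it):

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.io.IOException;
import java.lang.management.ManagementFactory;

public class HeapDumper {

    // Writes a heap dump of the current JVM to the given file (e.g. "before.hprof").
    // The boolean limits the dump to live (reachable) objects only.
    public static void dump(String filePath) throws IOException {
        HotSpotDiagnosticMXBean diagnostics =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        diagnostics.dumpHeap(filePath, true);
    }
}
```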
This allowed us to use MAT's advanced feature of detecting leak suspects by comparing two heap dumps taken some time apart. However, we didn't need to go that far, since the heap dump from after the test was enough for MAT to find something suspicious:
Here MAT tells us that one instance of Spring Boot's AutoConfiguredCompositeMeterRegistry occupies almost 500MB, which is 74% of the total used heap size. It also tells us that it has a (concurrent) hashmap that is responsible for this. We are almost there!
With MAT's dominator tree feature, we can list the biggest objects and see what they keep alive – that sounds useful, so let's use it to have a peek at what is inside this humongous hashmap:
Using the dominator tree we were able to easily browse through the hashmap's contents. In the above picture we opened two hashmap nodes. Here we see lots of Micrometer timers tagged with "v2/products/…" and a product id. Hmm, where have we seen that before?
What does WebClient have to do with this?
So, it is Spring Boot's metrics that are responsible for this memory leak, but what does WebClient have to do with this? To find that out, you really have to understand what causes Spring's metrics to store all these timers.
Inspecting the implementation of AutoConfiguredCompositeMeterRegistry, we see that it stores the metrics in a hashmap named meterMap. So, let's put a well-placed breakpoint on the spot where new entries are added and trigger the suspicious call our WebClient performs to the "v2/product/{productId}" endpoint.
We run the application again and … gotcha! For each call the WebClient makes to the "v2/product/{productId}" endpoint, we saw Spring creating a new Timer for each unique product identifier. Each such timer is then stored in the AutoConfiguredCompositeMeterRegistry bean. That explains why we see so many timers with tags like these:
/v2/products/9200000109074941
/v2/products/9200000099621587
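To illustrate why each unique URI value turns into its own timer, here is a minimal Micrometer sketch (not our application code): a meter is identified by its name plus its tags, so every distinct uri tag value creates, and keeps alive, a separate Timer in the registry.

```java
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class TimerCardinalityDemo {

    public static void main(String[] args) {
        MeterRegistry registry = new SimpleMeterRegistry();

        // Same meter name, but a different "uri" tag value per call:
        // each combination is a separate Timer held by the registry.
        registry.timer("http.client.requests", "uri", "/v2/products/9200000109074941");
        registry.timer("http.client.requests", "uri", "/v2/products/9200000099621587");

        System.out.println("Registered meters: " + registry.getMeters().size()); // prints 2
    }
}
```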
How can you fix this memory leak?
Before we identify when this memory leak might affect you, let's first explain how one would fix it. We already mentioned in the introduction that simply not using a URI builder to construct WebClient URLs avoids this memory leak. Now we will explain why that works.
After a little online research we came across this post (https://rieckpil.de/expose-metrics-of-spring-webclient-using-spring-boot-actuator/) by Philip Riecks, in which he explains:
"As we usually want the templated URI string like "/todos/{id}" for reporting and not multiple metrics e.g. "/todos/1337" or "/todos/42". The WebClient offers several ways to construct the URI […], which you can all use, except one."
And that method is the one using the URI builder, coincidentally the one we were using:
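Paraphrasing the relevant uri overloads of Spring 5's WebClient.UriSpec (the sketch name and the comments are ours, not Spring's):

```java
import java.net.URI;
import java.util.Map;
import java.util.function.Function;

import org.springframework.web.reactive.function.client.WebClient.RequestHeadersSpec;
import org.springframework.web.util.UriBuilder;

// Not a verbatim copy of Spring's interface, just the shape of the overloads.
interface UriSpecSketch<S extends RequestHeadersSpec<?>> {

    S uri(String uriTemplate, Object... uriVariables);                // template is kept for the metric tag

    S uri(String uriTemplate, Map<String, ?> uriVariables);           // template is kept for the metric tag

    S uri(String uriTemplate, Function<UriBuilder, URI> uriFunction); // template is kept for the metric tag

    S uri(Function<UriBuilder, URI> uriFunction);                     // the culprit: only the expanded URI is known
}
```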
Indeed, when we construct the URI from a template string instead, the memory leak disappears. Also, the response times are back to normal again.
When might the memory leak affect you? – a simple answer
Do you need to worry about this memory leak? Well, let's start with the most obvious case. If your application exposes its HTTP client metrics, and uses a method that takes a URI builder to set a templated URI onto a WebClient, you should definitely be worried.
You can easily check whether your application exposes HTTP client metrics in two different ways:
- Inspecting the "/actuator/metrics/http.client.requests" endpoint of your Spring Boot application after your application has made at least one external call. A 404 means your application does not expose them.
- Checking whether the value of the application property management.metrics.enable.http.client.metrics is set to true, in which case your application does expose them.
However, this does not mean that you are safe if you do not expose the HTTP client metrics. We had been passing templated URIs to the WebClient using a builder for ages, and we have never exposed our HTTP client metrics. Yet, suddenly this memory leak reared its ugly head after an application upgrade.
So, might this memory leak affect you then? Just don't use URI builders with your WebClient and you should be safe from this potential memory leak. That would be the simple answer. You don't take simple answers? Fair enough, read on to find out what really caused this for us.
When might the memory leak affect you? – a more complete answer
So, how did a simple application upgrade cause this memory leak to rear its ugly head? As it turns out, the addition of a transitive Prometheus (https://prometheus.io/) dependency – an open source monitoring and alerting framework – caused the memory leak in our particular case. To understand why, let's go back to the situation before we added Prometheus.
Before we dragged in the Prometheus library, we pushed our metrics to statsd (https://github.com/statsd/statsd) – a network daemon that listens for and aggregates application metrics sent over UDP or TCP. The StatsdMeterRegistry that is part of the Spring framework is responsible for pushing metrics to statsd. The StatsdMeterRegistry only pushes metrics that are not filtered out by a MeterFilter. The management.metrics.enable.http.client.metrics property is an example of such a MeterFilter. In other words, if management.metrics.enable.http.client.metrics = false, the StatsdMeterRegistry won't push any HTTP client metric to statsd and won't keep these metrics in memory either. So far, so good.
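For illustration, such a property effectively boils down to a deny MeterFilter; a hand-written equivalent (a sketch, assuming the "http.client" meter name prefix) could look like this:

```java
import io.micrometer.core.instrument.config.MeterFilter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MetricsFilterConfig {

    // Roughly what the management.metrics.enable.* property amounts to under the hood:
    // a MeterFilter that denies every meter whose name starts with the given prefix.
    @Bean
    public MeterFilter denyHttpClientMetrics() {
        return MeterFilter.denyNameStartsWith("http.client");
    }
}
```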
By adding the transitive Prometheus dependency, we added yet another meter registry to our application, the PrometheusMeterRegistry. When there is more than one meter registry to expose metrics to, Spring instantiates a CompositeMeterRegistry bean. This bean keeps track of all the individual meter registries, collects all metrics and forwards them to all the delegates it holds. It is the addition of this bean that caused the issue.
The issue is that MeterFilter instances are not applied to the CompositeMeterRegistry, but only to the MeterRegistry instances inside the CompositeMeterRegistry (see this commit for more information). That explains why the AutoConfiguredCompositeMeterRegistry accumulates all the HTTP client metrics in memory, even when we explicitly set management.metrics.enable.http.client.metrics to false.
Still confused? No worries, just don't use URI builders with your WebClient and you should be safe from this memory leak.
Conclusion
In this blog post we explained that this approach of defining the URLs of your request with Spring Boot's WebClient is best avoided:
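In short, reusing the made-up names from the sketch in the introduction (where webClient and productId are defined):

```java
// Avoid: building the URI with a UriBuilder function; every expanded URI
// becomes a separate timer in the meter registry.
webClient.get()
        .uri(uriBuilder -> uriBuilder.path("/v2/products/{productId}").build(productId))
        .retrieve()
        .bodyToMono(String.class);

// Prefer: passing the URI template and the variables separately; the metric is
// tagged with the template only.
webClient.get()
        .uri("/v2/products/{productId}", productId)
        .retrieve()
        .bodyToMono(String.class);
```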