QPR Metrics Server Memory Consumption

Hi,

We have QPR Metrics 2015.1 up and running on an HP DL380 server with 64 GB of memory, running Windows Server 2012 R2. SQL Server 2014 is also running on the same machine as the database for QPR Metrics.

In QPR Configuration Manager, I set the memory usage limit to 30 GB so that SQL Server can use the rest of the memory. However, after I start the QPR Metrics service, it takes about 63 GB of memory and the system starts lagging. It seems the memory usage limit option in the settings does not work properly.
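
As a side note, besides Task Manager we can double-check the figure with a small script like the sketch below. It assumes the psutil package is available and that the service process name contains "QPRMetricsServer"; that name is only a guess and may differ on an actual installation.

```python
# Minimal sketch: report the resident memory of the QPR Metrics service process.
# Assumes the 'psutil' package; the process name below is a guess, check
# Task Manager / Services for the real executable name on your machine.
import psutil

PROCESS_NAME = "QPRMetricsServer"  # hypothetical name for illustration

for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if PROCESS_NAME.lower() in name.lower():
        rss_gb = proc.info["memory_info"].rss / (1024 ** 3)
        print(f"{name} (pid {proc.pid}): {rss_gb:.1f} GB resident")
```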

Please let me know if there is any solution. Thanks.

Hi,

A memory limit of 30 GB is quite large, and in a normal case it should not be exceeded.

Could you send an email to customercare@qpr.com about this situation?

We can continue the investigation there.

Hello,

It is highly unusual for the Metrics server to consume such an amount of memory. I suspect that there is something particular in the data content that is causing this. If we could take a look at the database, or at least the full trace server logs, we might spot the reason. For that purpose, filing a customer care ticket might be in order.

The effect of the memory usage limit is that after the limit is exceeded, the Metrics server tries to drop from memory any data that is not immediately needed. In that sense it is not a "hard limit", as much of the data is continually needed in memory in order for e.g. measure value calculation to function. Another way to decrease memory consumption is to archive in Metrics Client any models that do not need to be accessed from the Portal (backup models and the like).
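
To illustrate the idea only (this is not the actual server implementation), a soft limit of this kind behaves roughly like the sketch below: when usage crosses the configured threshold, only cached data that is not continually needed gets evicted, so total usage can still end up above the limit.

```python
# Conceptual sketch only -- not QPR's implementation. It illustrates why a
# "soft" memory limit can be exceeded: pinned data (needed e.g. for measure
# value calculation) is never dropped, only optional cached data is.

class SoftLimitCache:
    def __init__(self, limit_bytes):
        self.limit_bytes = limit_bytes
        self.pinned = {}     # data that must stay in memory
        self.evictable = {}  # data that can be dropped when over the limit

    def _total(self):
        return (sum(len(v) for v in self.pinned.values())
                + sum(len(v) for v in self.evictable.values()))

    def put(self, key, value, pinned=False):
        (self.pinned if pinned else self.evictable)[key] = value
        self._enforce_limit()

    def _enforce_limit(self):
        # Drop evictable entries until usage is under the limit or none remain.
        while self._total() > self.limit_bytes and self.evictable:
            self.evictable.pop(next(iter(self.evictable)))
        # If the pinned data alone exceeds the limit, usage stays above it.
```

If most of the model data falls into the "pinned" category, archiving unused models as described above is what actually reduces the footprint.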

I have started a support ticket to solve this problem. Here are some images below: