RAM usage overflow when loading "Visits Over Time" the first time #15789
Just checking: can you confirm you have disabled browser archiving and configured a cron job to do the archiving? And is this for the regular "All visits" segment? Do you know what the filter limit is in the evolution graph on the right (meaning the value in the bottom right of the screenshot below)? And do you know if this happens for the day period as well as for the week and month periods? This could be related to #15726
@tsteur In the general settings I have "Archive reports when viewed from the browser" set to yes, and I don't have a cron job, since it should not be necessary for a small personal website, as written in the Matomo documentation ("For medium to high traffic websites, it is recommended to disable Matomo archiving to trigger from the browser").
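For reference, the auto-archiving setup the linked documentation describes boils down to a single cron entry. A minimal sketch, assuming a Debian-style layout; the user, PHP binary, Matomo path, and URL below are placeholders, not values from this thread:

```shell
# Hypothetical /etc/cron.d/matomo-archive entry: pre-compute reports hourly
# so the browser never has to trigger archiving on page view.
# Adjust the user (www-data), PHP binary, Matomo path, and --url to your setup.
5 * * * * www-data /usr/bin/php /var/www/website/piwik/console core:archive --url=https://example.org/piwik/ > /var/log/matomo-archive.log 2>&1
```

With something like this in place, browser-triggered archiving can be switched off, so no single web request has to do the heavy aggregation.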
The problem is then likely the browser archiving being enabled: it might be archiving heaps of data in a single request. Depending on what your dump included, this error might go away in the future, as Matomo only needs to archive data that far back once. Can you still reproduce the issue after reloading and trying it several more times?
@tsteur If the data was already displayed once, I cannot reproduce the issue (I updated my report yesterday to mention this), but if I select another month in a previous year, or another year that was not already displayed, the issue happens again.

Update: a workaround is to use auto-archiving with a cron job and set "Archive reports when viewed from the browser" to "No". When I run the archiving command I have some questions. Maybe you can also update the documentation at https://matomo.org/docs/setup-auto-archiving/, which says "--url=http://example.org/matomo/ is the only required parameter in the script": in my case I did not need it, so I think it is not actually required.
This would be expected since it would aggregate many reports.
I reckon there are usually no issues because this archiving normally happens step by step. In your case it causes memory issues because it needs to archive a lot of missing data at once; on a daily/regular schedule it would only do a tiny fraction of that, instead of many months or years of data in one go. I'd say you'll very soon see no memory issues in the archiver anymore, or they might already be gone.

Your memory limit is 128MB, right? We usually recommend at least 256MB, and we likely won't do much here considering the limit is fairly low, it had to archive a lot of data at once, and the issue should disappear afterwards. I agree 128MB already sounds like a lot, but these days memory is usually quite cheap and this is maybe a rather rare edge case. It would be great to know if you see these errors go away.
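Raising the limit is a one-line change to php.ini. A sketch, assuming Debian's default php.ini path for PHP 7.3 with mod_php; the demo below edits a throwaway copy so it is safe to run anywhere:

```shell
#!/bin/sh
# Sketch: raise PHP's memory_limit from 128M to the recommended 256M (or more).
# The real file on Debian 10 + mod_php is /etc/php/7.3/apache2/php.ini;
# after editing it, restart Apache: systemctl restart apache2.
# Demonstrated on a temp copy so this script has no side effects.
ini=$(mktemp)
printf 'memory_limit = 128M\n' > "$ini"

sed -i 's/^memory_limit = .*/memory_limit = 256M/' "$ini"

grep '^memory_limit' "$ini"   # prints: memory_limit = 256M
rm -f "$ini"
```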
Yes, that's very much expected, since we aggregate the data for each day, week, month, and year and have to store this aggregated data in the database.
@tsteur With auto-archiving there is no more error. However, if I select several years in Visitors Overview (for example 2014 to 2020), I have to raise the PHP limits. Before displaying several years, the RAM usage was 300 MB, and it can increase up to 1.40 GB if I reload the page several times, which is also very slow and could produce a RAM overflow error in the future if I want to display additional years. Do you also have an answer to my other question?
@baptx When you go to the data-deletion settings, you may want to keep basic metrics, as they don't need much storage, or delete all reports. Note: it may mean that if/when you request this data again, it will be re-archived (not sure if that's the case when "keep basic metrics" is enabled). You can also use this command / API: https://matomo.org/faq/how-to/faq_155/ and these invalidated reports would then be deleted at some point.
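The FAQ linked above maps to a console command. A sketch of how that invocation typically looks; the option names are taken from Matomo 3.x, so verify them with `./console help core:invalidate-report-data` on your own install before running anything:

```shell
# Run from the Matomo root directory, as the web-server user.
# Invalidate the reports for site 1 over the affected date range...
./console core:invalidate-report-data --dates=2014-01-01,2020-12-31 --sites=1
# ...then re-archive so the invalidated periods are rebuilt:
./console core:archive --url=https://example.org/piwik/
```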
@tsteur If I check "Regularly delete old reports from the database", uncheck "Keep basic metrics", and click "Purge DB Now", it looks like all day/week statistics (older than the number of months entered) are permanently lost, since I cannot see them again when selecting a day or week in "Visits Over Time", even if I archive the data again with the console command.
If "keep basic metrics" is disabled, and all data was deleted, I think it would be expected to archive again. If the checkbox is enabled, I would assume it wouldn't archive again. I think you want to use The other questions, are you referring to this?
It's hard to say; these things take quite a long time to replicate and investigate. Not sure what you mean by "closing the page, it stays at more than 900MB". You mean you close the browser tab? Do you know if the process is still running in the background? Because once the request is finished, PHP should definitely free the memory from that particular request.
@tsteur If "Keep basic metrics" was unchecked before the click on "Purge DB Now", I cannot get the old data back in "Visits Over Time", even when archiving again with By closing the page, I mean just closing the tab or closing the web browser but it should not matter since the Linux server RAM usage should decrease directly after the data is loaded and displayed. The processes I can see on the server are from Apache and MariaDB so they are always running. Could it be that the RAM does not decrease because the web server or database is caching data? I think it is easy to replicate the issue on any Matomo installation having several years of data, we just need to select several years in "Visitors Overview" (tested from 2014 to 2020), which results in a lot of RAM usage. I was also referring to this question:
It should not be needed 👍
No, that can't really be it. Is it possible that in the background there is a cron job running, doing e.g. archiving? You could check the process list to see.
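The exact commands suggested here were lost in the export of this page; common equivalents for checking whether an archiver is still running in the background would be:

```shell
# Any console archiver still running? (bracket trick stops grep matching itself)
ps aux | grep '[c]ore:archive'
# Long-running PHP processes, and a snapshot of the biggest memory consumers:
ps aux | grep '[p]hp'
top -b -n 1 | head -n 20
```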
@tsteur no, I cannot see another process with your commands.
Sorry, I might not be of much help here. When a request finishes, PHP should release all of its memory for sure. I suppose there is no other application running on the same server? It's a bit hard for me to say what could be happening, as we can't reproduce this. Not sure if it's an issue with that particular PHP version, or what request exactly is running there. Are you maybe a bit familiar with Apache, so you could see what is happening there?
@tsteur No other app is running; I just have WordPress and Matomo installed. I am using the default Apache configuration with mod_php from Debian 10 (on an OVH SSD 1 VPS with 2GB RAM), so I don't know what else I could do if there is a memory leak. I noticed that the memory is freed in a bit less than 24 hours, probably because of a message that is displayed at midnight in the logs. About my previous comment, do you have an idea how to get the old data back?
And is it possible to display data for several years in "Visits Over Time" faster and without using so much RAM? For example, by displaying a summary of the data instead of loading all of it.
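One mechanism consistent with the observation above (memory released only after many hours, even though PHP frees it per request): with mod_php under the prefork MPM, each Apache child process keeps its peak heap until the child itself is recycled. Capping requests per child returns that memory to the OS sooner. A sketch with illustrative values, assuming Debian's config layout:

```apache
# /etc/apache2/mods-available/mpm_prefork.conf (Debian layout; values illustrative)
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers       50
    # Recycle each child after this many connections, so memory ballooned by
    # one heavy archiving request is not held indefinitely by that worker.
    MaxConnectionsPerChild 500
</IfModule>
```

Reload Apache afterwards (`systemctl reload apache2`) for the change to take effect.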
Matomo should already load only the data needed to display that data. This loads in a few ms here when browser archiving is disabled. Unless you maybe want to view the entry point for every day within the last few years then it might take indeed up to 30 seconds (when loading the data for eg 700 different days)
If you have access to the database, then you could simply delete all the tables starting with the archive prefix.
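For anyone trying this: Matomo's per-month archive tables conventionally follow an `archive_numeric_YYYY_MM` / `archive_blob_YYYY_MM` naming scheme (the prefix can vary per installation, so check `SHOW TABLES` on your own database first). A sketch that only prints the DROP statements, so they can be reviewed before being piped into mysql:

```shell
#!/bin/sh
# Generate DROP statements for Matomo archive tables over a few years.
# Table naming is the usual Matomo convention; verify against SHOW TABLES
# on your own database before running the output, e.g.:
#   sh drop_archives.sh | mysql -u matomo -p matomo_db
for y in 2014 2015 2016; do
  for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    echo "DROP TABLE IF EXISTS archive_numeric_${y}_${m};"
    echo "DROP TABLE IF EXISTS archive_blob_${y}_${m};"
  done
done
```

Matomo recreates the archive tables it needs on the next archiving run.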
@tsteur Thanks, deleting the archive tables worked, and I re-archived the data with one console command per period type (day, week, month, year):
As you can see, it is not very convenient to execute all these commands instead of just one. But there is probably a way to improve the archiving script by freeing memory as soon as each year is archived (or each month, week, or even day, for very large websites with a lot of data), instead of freeing the memory for several years at once only after each period type (day, week, month, year) completes, as it does currently:
-> RAM was freed only after each of the four per-period commands finished (checked with htop)
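Until something like that exists in core, the memory growth can be bounded from the outside by running one archiver process per year, since the OS reclaims everything when each process exits. A dry-run sketch that only echoes the commands; remove the `echo` to execute them, and check `./console help core:archive` first, as the flag names here are taken from Matomo 3.x and the path is this thread's example install:

```shell
#!/bin/sh
# Dry run: print one core:archive invocation per year, so each year is archived
# by a separate, short-lived PHP process whose memory is fully returned on exit.
CONSOLE=/var/www/website/piwik/console   # adjust to your Matomo path
for year in 2014 2015 2016 2017 2018 2019 2020; do
  echo "php $CONSOLE core:archive --force-periods=year --force-date-range=${year}-01-01,${year}-12-31"
done
```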
If a website has collected data with Matomo for several years without archiving, it will probably hit an out-of-memory error unless the first archiving run is done from the command line with the console script.
Thanks for contributing to this issue. As it has been a few months since the last activity and we believe this is likely not an issue anymore, we will now close this. If that's not the case, please do feel free to either reopen this issue or open a new one. We will gladly take a look again!
I have a backup of 20 MB made with mysqldump (Ver 10.13 Distrib 5.5.62) using MySQL (Ver 14.14 Distrib 5.5.62 on Debian 8) on Matomo / Piwik version 3.10.0 that I had to import on a new server.
This backup was imported in command line with MariaDB (Ver 15.1 Distrib 10.3.22-MariaDB on Debian 10) and used with the latest Matomo version 3.13.4.
To access Matomo administration area, I had to do an upgrade: "Matomo database will be upgraded from version 3.10.0 to the new version 3.13.4.".
When logging in to Matomo, I clicked on "Visitors" -> "Overview" and selected the previous month or year.
Using `htop` on the server, I saw that 300 MB of RAM were used at the beginning, and usage kept increasing to more than 1.50 GB while displaying "Loading Visits Over Time...". It failed with the error "Oops… there was a problem during the request. Maybe the server had a temporary issue, or maybe you requested a report with too much data. Please try again. If this error occurs repeatedly please contact your Matomo administrator for assistance.".
In "/var/log/apache2/error.log", I saw "PHP Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/website/piwik/core/DataTable/Manager.php on line 86" as well as in "/var/www/website/piwik/libs/Zend/Db/Statement/Pdo.php on line 290".
Sometimes the error is even worse on my 2 GB RAM VPS from OVH (SSD 2018): "PHP Fatal error: Out of memory (allocated 127926272) (tried to allocate 131072 bytes) in /var/www/website/piwik/core/DataAccess/ArchiveWriter.php".
If the data was successfully loaded, the next time it is displayed directly. It looks like the issue happens only the first time the data is loaded. However, once the data is loaded, a database export can be double the original size.
Do you have an idea where the problem comes from, and is it possible to fix it? I think the PHP parameter `max_execution_time = 30` in `/etc/php/7.3/apache2/php.ini` is enough, and the problem comes from Matomo. I have already set `memory_limit = 512M` instead of 128M, as recommended (https://matomo.org/docs/setup-auto-archiving/#increase-php-memory-limit). It is also very slow; this issue may be related: #9532