@tsteur opened this Issue on November 11th 2021 Member

This is a follow up from https://github.com/matomo-org/matomo/pull/17817 where we only expand one data table at a time.

We've noticed, though, that we're still sometimes seeing memory issues, e.g. for a yearly archive where memory usage exceeds 8GB. This archive has subtables nested 9 levels deep and a lot of rows overall. One thing that could help with memory would be to not load the data of all data tables into memory at once:


Each data table is then later expanded one at a time.

The only problem with loading the needed data tables on demand would be performance, as we'd need to do a lot more SELECT queries to get each "subchildren" data table. Maybe that means we might not want to do this, or maybe we can find a compromise somehow.
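The on-demand idea could be sketched roughly as below. This is Python pseudocode, not Matomo's actual PHP implementation; all names (`DataTable`, `load_blob`, `expand_depth_first`, etc.) are hypothetical. The point is the trade-off described above: instead of materializing every serialized blob up front (as `$dataTableBlobs` does today), each subtable blob is fetched with its own query when first expanded and released afterwards, so only one branch of the hierarchy is in memory at a time.

```python
# Hedged sketch: trade memory for extra SELECT queries by loading
# each subtable blob only when it is first expanded.
from typing import Callable, Dict, Optional


class DataTable:
    def __init__(self, table_id: int, load_blob: Callable[[int], bytes]):
        self.table_id = table_id
        self._load_blob = load_blob          # stands in for one SELECT per table
        self._rows: Optional[list] = None    # not loaded until needed

    def rows(self) -> list:
        # Lazy load: the blob is fetched and deserialized on first access.
        if self._rows is None:
            self._rows = self._deserialize(self._load_blob(self.table_id))
        return self._rows

    def release(self) -> None:
        # Free the expanded rows once this table has been processed,
        # so only one branch of the deep subtable hierarchy stays in memory.
        self._rows = None

    @staticmethod
    def _deserialize(blob: bytes) -> list:
        # Stand-in for the real unserialization of an archive blob.
        return blob.decode().split(",") if blob else []


def expand_depth_first(table: DataTable, children: Dict[int, list],
                       load_blob: Callable[[int], bytes], visit) -> None:
    """Expand one subtable at a time instead of holding all blobs at once."""
    for row in table.rows():
        visit(table.table_id, row)
    for child_id in children.get(table.table_id, []):
        child = DataTable(child_id, load_blob)
        expand_depth_first(child, children, load_blob, visit)
        child.release()  # drop the child's rows before moving to the next one
    table.release()
```

For example, with `blobs = {1: b"a,b", 2: b"c", 3: b"d,e"}` and `children = {1: [2, 3]}`, walking from table 1 visits rows in order `(1, "a"), (1, "b"), (2, "c"), (3, "d"), (3, "e")`, while never holding more than one root-to-leaf chain of tables in memory. The cost is exactly the concern raised above: one query per subtable instead of one bulk load.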

refs L3-126

@tsteur commented on November 11th 2021 Member

I checked, and in the case where we have a memory issue, the $dataTableBlobs variable alone consumes 3.1GB of memory. This is before any data table is expanded.

@gijshendriksen commented on November 16th 2021

I was running into memory issues in my Matomo installation, which I also discussed in this forum post. It sounds like solving this issue would also resolve my problems with the archiver. Do you also think this is the case, or do you think there is a separate issue going on there?

@tsteur commented on November 16th 2021 Member

It might, @gijshendriksen; it's hard to say without knowing all the details.

Generally, what might also help is to lower the number of actions in the report, see https://matomo.org/faq/how-to/faq_54/

eg setting something like

[General]
datatable_archiving_maximum_rows_actions = 500
datatable_archiving_maximum_rows_subtable_actions = 100

Sometimes this alone does not fix it, and if it happens eg for a yearly period, then the archives for each month in the year would need to be invalidated, see https://matomo.org/faq/how-to/faq_155/
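For reference, the invalidation described in that FAQ can be done via Matomo's console command `core:invalidate-report-data` (adjust the site ID and date range to your setup; run it from the Matomo installation directory):

```shell
# Invalidate the monthly archives for site 1 across the year, so they
# are re-archived with the lowered row limits applied.
./console core:invalidate-report-data --sites=1 \
    --dates=2021-01-01,2021-12-31 --periods=month
```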

Have you maybe already lowered the number of rows in the reports before?

@gijshendriksen commented on November 22nd 2021

@tsteur thanks for your help! I wasn't aware these configuration options would help with the memory usage, but after applying your suggested changes the archiver now seems to be working again. Thanks!

@tsteur commented on October 2nd 2022 Member

We're having this issue daily quite a few times with various customers.

@tsteur commented on October 17th 2022 Member

We again have an issue with a customer where archiving stopped working because of this issue.

@tsteur commented on October 23rd 2022 Member

Seeing this issue happening again for a URL like this:

/index.php?date=2021-06-01,2022-12-31&filter_limit=300&flat=1&format=CSV&format_metrics=1&idSite=1&language=en&method=Actions.getPageUrls&module=API&period=day&segment=&token_auth=ANONYMOUS&force_session_api=1&translateColumnNames=1&convertToUnicode=0&1

Not fully sure if it is this specific problem, but I assume so, and having this logic should reduce memory quite a bit.

@tsteur commented on October 26th 2022 Member

Again issues today

@tsteur commented on October 30th 2022 Member

Again happened few times.

@tsteur commented on November 3rd 2022 Member

We're having issues pretty much daily with this one

Powered by GitHub Issue Mirror