@tsteur opened this Issue on November 11th 2021 Member

This is a follow up from https://github.com/matomo-org/matomo/pull/17817 where we only expand one data table at a time.

We've noticed, though, that we still sometimes see memory issues, e.g. for a yearly archive where usage exceeds 8GB. This archive has subtables nested 9 levels deep and a lot of rows overall. One thing that could help with memory would be to not load the data of all data tables into memory at once.


Each data table is then later expanded one at a time.

The only problem with loading the needed data tables on demand would be performance, as we'd need to run many more select queries to fetch each "subchildren" data table. Maybe that means we might not want to do this, or maybe we can find a compromise somehow.
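To make the trade-off concrete, here is a minimal Python sketch (not Matomo's actual PHP code; the names `DataTable`, `fetch_blob`, and the blob-store layout are hypothetical) of expanding a subtable tree on demand instead of holding every blob in memory up front:

```python
from typing import Callable, Iterator, Optional


class DataTable:
    """Minimal stand-in for an archived report table (hypothetical API)."""

    def __init__(self, table_id: int, rows: list,
                 fetch_blob: Callable[[int], "DataTable"]):
        self.table_id = table_id
        self.rows = rows                # each row may reference a subtable by id
        self._fetch_blob = fetch_blob   # callback that loads one blob on demand

    def subtable(self, subtable_id: int) -> "DataTable":
        # Instead of keeping every blob in memory up front, fetch only the
        # subtable that is being expanded right now.
        return self._fetch_blob(subtable_id)


def expand_depth_first(table: DataTable) -> Iterator[int]:
    """Walk the subtable tree one table at a time, yielding table ids.

    Only the tables on the current path are alive at once, so peak memory
    is bounded by tree depth (9 levels in the report above) rather than by
    the total size of all blobs.
    """
    yield table.table_id
    for row in table.rows:
        sub_id = row.get("subtable_id")
        if sub_id is not None:
            yield from expand_depth_first(table.subtable(sub_id))


# Fake in-memory blob store standing in for the archive blob tables.
BLOBS = {
    1: [{"label": "a", "subtable_id": 2}, {"label": "b", "subtable_id": None}],
    2: [{"label": "a/x", "subtable_id": 3}],
    3: [{"label": "a/x/y", "subtable_id": None}],
}


def fetch_blob(table_id: int) -> DataTable:
    # In a real implementation this would be one SELECT per subtable,
    # which is exactly the performance concern mentioned above.
    return DataTable(table_id, BLOBS[table_id], fetch_blob)


root = fetch_blob(1)
print(list(expand_depth_first(root)))  # [1, 2, 3]
```

A possible compromise between the two extremes would be to batch the lookups, e.g. fetching all subtables of one level (or of one parent table) in a single query, trading some memory for far fewer round trips.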

refs L3-126

@tsteur commented on November 11th 2021 Member

I checked, and in the case where we have a memory issue, the $dataTableBlobs variable alone consumes 3.1GB of memory. This is before any data table is expanded.
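For reference, the same kind of measurement (how much memory one eagerly loaded blob collection accounts for) can be sketched in Python with the standard-library `tracemalloc` module; the `load_all_blobs` helper here is a made-up stand-in for loading every serialized subtable at once:

```python
import tracemalloc


def load_all_blobs() -> dict:
    # Stand-in for loading every serialized subtable blob in one go,
    # the pattern that produced the 3.1GB $dataTableBlobs in this issue.
    return {i: b"x" * 10_000 for i in range(1_000)}


tracemalloc.start()
blobs = load_all_blobs()
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# Roughly 10MB for this toy store; the real archive blobs were ~3.1GB.
print(f"current={current} bytes, peak={peak} bytes")
```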

@gijshendriksen commented on November 16th 2021

I was running into memory issues in my Matomo installation, which I also discussed in this forum post. It sounds like solving this issue would also resolve my problems with the archiver. Do you also think this is the case, or do you think there is a separate issue going on there?

@tsteur commented on November 16th 2021 Member

It might, @gijshendriksen; it's hard to say without knowing all the details.

Generally, what might also help is to lower the number of actions in the report; see https://matomo.org/faq/how-to/faq_54/

eg setting something like

datatable_archiving_maximum_rows_actions = 500
datatable_archiving_maximum_rows_subtable_actions = 100
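Per the FAQ linked above, these settings go into the `[General]` section of `config/config.ini.php`; the numbers are the ones suggested above, but the right values are site-specific:

```ini
; config/config.ini.php
[General]
datatable_archiving_maximum_rows_actions = 500
datatable_archiving_maximum_rows_subtable_actions = 100
```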

Sometimes this alone does not fix it, and if it happens e.g. for a yearly period, then the archives for each month in the year would need to be invalidated; see https://matomo.org/faq/how-to/faq_155/

Have you maybe already lowered the number of rows in the reports before?

@gijshendriksen commented on November 22nd 2021

@tsteur thanks for your help! I wasn't aware these configuration options would help with the memory usage, but after applying your suggested changes the archiver now seems to be working again. Thanks!
