@anonymous-matomo-user opened this Issue on April 4th 2012

This bug relates to forum thread: [http://forum.piwik.org/read.php?2,83377]

When an API call is made to the CustomVariables plugin with a period of Year or Date Range and a segment filter, and the archives have to be built from scratch (i.e. the call cannot be satisfied from pre-built archives), an exception is thrown.

You can reproduce this on the demo site using the following URL:
[http://demo.piwik.org/index.php?module=Widgetize&action=iframe&filter_limit=10&moduleToWidgetize=CustomVariables&actionToWidgetize=getCustomVariables&idSite=7&period=year&date=2012-04-03&disableLink=1&widget=1&segment=customVariableName3==Forum%20status;customVariableValue3==LoggedIn%20user]

However, to reproduce it, the call must trigger archive recreation rather than just read from prepared archives. If you lower the "Reports for today (or any other Date Range including today) will be processed at most every" setting to X seconds, you will be able to reproduce this bug every X seconds.
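
For reference, that UI setting appears to be backed by a config option, so the TTL can also be lowered in the config file; the option name below is my assumption, so verify it against config/global.ini.php before relying on it:

```ini
; config/config.ini.php -- assumed to be the option behind the
; "Reports for today ... will be processed at most every" setting
[General]
; re-archive "today" (and any period including today) at most every 10 seconds
time_before_today_archive_considered_outdated = 10
```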

I reproduced this the first time I executed the query, but it has been loading fine since. If I wait until it has to rebuild the archives for the API call, it fails again.

@mattab commented on April 4th 2012 Member

Thanks for the report, owen. Are you familiar with the code? Any idea where the bug might be in the code?

@anonymous-matomo-user commented on April 5th 2012

There seems to be a problem with the archiveDataTable() function in ArchiveProcessing/Period.php.

It calls getRecordDataTableSum() in the same file, which starts building a table for each month in the year. It then comes across a month that doesn't exist in the archive, discovered via the call to $archive->preFetchBlob(). This triggers the CustomVariables plugin's archivePeriod() function to be called again, which in turn re-enters archiveDataTable() in ArchiveProcessing/Period.php.

The problem with this re-entry into archiveDataTable() is that the function calls deleteAll() on the DataManager before it returns. Thus, when the recursion unwinds back to the original getRecordDataTableSum() call, the tables it had created (specifically those referred to by their subtable id) have already been deleted, and accessing them throws an Exception.
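
To make the failure mode concrete, here is a heavily simplified sketch of that call path. The class and method names are hypothetical stand-ins for the real Piwik ones, and the bodies are illustrative only, not the actual implementation:

```php
<?php
// Illustrative sketch only: hypothetical stand-ins for the real Piwik
// classes, showing why the inner call's deleteAll() breaks the outer call.

class TableManager // stand-in for the in-memory DataTable manager
{
    private static $tables = array();

    public static function add($id, $rows)
    {
        self::$tables[$id] = $rows;
    }

    public static function get($id)
    {
        if (!isset(self::$tables[$id])) {
            // This is the exception the bug report describes.
            throw new Exception("Subtable '$id' was deleted before it was summed");
        }
        return self::$tables[$id];
    }

    public static function deleteAll()
    {
        self::$tables = array();
    }
}

// Stand-in for archiveDataTable() + getRecordDataTableSum(): build and
// sum one table per month of the year being archived.
function archiveDataTable(array $archivedMonths, $depth = 0)
{
    $built = array();

    foreach (array('2012-01', '2012-02', '2012-03') as $month) {
        if ($depth === 0 && !in_array($month, $archivedMonths)) {
            // preFetchBlob() finds this month missing, so archivePeriod()
            // runs for it and re-enters archiveDataTable()...
            archiveDataTable(array($month), $depth + 1);
            // ...and that inner call has now wiped every in-memory table,
            // including the subtables created earlier in THIS loop.
        }

        TableManager::add($month . '-sub', array('visits' => 1));
        $built[] = $month;
    }

    // Sum the subtables built above: the ones created before the re-entry
    // no longer exist, so get() throws.
    foreach ($built as $month) {
        TableManager::get($month . '-sub');
    }

    // The real code deletes all in-memory tables before returning; done by
    // the inner (re-entrant) call, this destroys the outer call's tables.
    TableManager::deleteAll();
}

try {
    // Outer pass: months 01 and 03 are archived, 02 is missing.
    archiveDataTable(array('2012-01', '2012-03'));
} catch (Exception $e) {
    echo $e->getMessage(), "\n";  // Subtable '2012-01-sub' was deleted ...
}
```

Running the sketch throws for '2012-01-sub': the inner pass's deleteAll() has discarded the subtable the outer pass built before the re-entry.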

I don't understand the architecture well enough (or have enough time at the moment) to fix this. Perhaps you need a reference count in the DataManager (see the sketch below), or maybe the above problem is symptomatic of some other issue and shouldn't normally occur.
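
A purely hypothetical sketch of that reference-count idea (not existing Piwik code): deleteAll() only takes effect once the outermost archiving pass releases the manager, so an inner re-entrant pass can no longer destroy tables the outer pass still needs.

```php
<?php
// Hypothetical sketch of the reference-count idea above (not existing
// Piwik code): the tables are only cleared once the outermost archiving
// pass has released the manager.

class RefCountedTableManager
{
    private static $tables = array();
    private static $refCount = 0;

    public static function acquire()
    {
        self::$refCount++;
    }

    // Replaces an unconditional deleteAll(): only the last release() frees.
    public static function release()
    {
        if (--self::$refCount <= 0) {
            self::$refCount = 0;
            self::$tables = array();
        }
    }

    public static function add($id, $rows)
    {
        self::$tables[$id] = $rows;
    }

    public static function get($id)
    {
        if (!isset(self::$tables[$id])) {
            throw new Exception("Subtable '$id' no longer exists");
        }
        return self::$tables[$id];
    }
}

// Each archiveDataTable() pass, re-entrant or not, would then do:
//   RefCountedTableManager::acquire();
//   ... build and sum the period's tables ...
//   RefCountedTableManager::release();   // instead of deleteAll()
```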

Hope the analysis helps.

@mattab commented on April 18th 2012 Member

Thanks owen, it definitely helps!

@mattab commented on July 31st 2012 Member

(In [6606]) Refs #3084 This should fix it! But I'll wait for @owen's confirmation.
Refs #3207 Now calling destroy() prior to setTableDeleted as expected

@mattab commented on October 19th 2012 Member

Fixed 3 months ago :)

This Issue was closed on October 19th 2012