don't try to insert unrealistically high values for time_transfer and other performance metrics #17035
I'm getting a similar error on the field visitor_seconds_since_order. That'd be more than 65 years? Since I'm importing log files, I don't quite understand where the value would come from. No campaigns as such. Hmmm. Perhaps this should be a report at https://github.com/matomo-org/matomo-log-analytics/ ?
@poetaster I reckon it would be best to report this in the log analytics issue tracker, as there might be a problem there. It would be great to mention which parameters you use in your log analytics import script call.
Ahoy. I'll do so. Although it arrives as a haproxy log, it's a CLF log file (postprocessed):

```shell
python3 /var/www/.../htdocs/misc/log-analytics/import_logs.py --auth-user=piwik --auth-password=.... --url=https://....org --idsite=1 --recorders=1 --log-hostname=https://.....org --log-format-name=ncsa_extended --log-hostname=https://.....org /path/piwik-logs/${DATE}.haproxy.log.done
```
To try and reproduce this, send an HTTP Matomo Tracking API request (see https://developer.matomo.org/api-reference/tracking-api) with unrealistic page performance info. The tracking code that generates such API requests is described in https://developer.matomo.org/guides/tracking-javascript-guide#finding-the-piwik-tracking-code. Since it doesn't seem possible to easily reproduce this particular issue, we should simply ignore any unrealistic value (any value > 16,000,000, i.e. 16 million).
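As a rough sketch of what such a reproduction request could look like: the snippet below builds a Tracking API URL with a deliberately absurd transfer-time value. The `pf_tfr` parameter name is taken from the Tracking API docs linked above; the base URL, site id, and page URL are placeholders, not values from this issue.

```python
from urllib.parse import urlencode

def build_tracking_url(base="https://example.org/matomo.php"):
    """Build a Matomo Tracking API request with an unrealistic transfer time."""
    params = {
        "idsite": 1,          # placeholder site id
        "rec": 1,             # required for the request to be recorded
        "url": "https://example.org/page",
        # transfer time in ms; anything above 16,000,000 should be discarded
        "pf_tfr": 2_100_000_000,
    }
    return base + "?" + urlencode(params)

print(build_tracking_url())
```

Sending a request like this (e.g. via curl) should let one observe how the tracker handles a value far above the 16 million cutoff.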
Hello, example request:

NOTES

This produces the error log in matomo.log
Finally I managed to import (replay) the tracker access logs by filtering out the outliers.
Another convenience for the import script would be a switch to skip past an erroneous batch rather than aborting, since one outlier can cause a big import to fail.
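The filtering described above could be sketched like this: drop any access-log line whose performance parameters exceed the 16,000,000 ms threshold discussed in this thread before handing the file to import_logs.py. The `pf_*` parameter names and the log line shapes are assumptions for illustration, not the poster's actual filter.

```python
import re

MAX_MS = 16_000_000  # threshold mentioned above
PF_PARAM = re.compile(r"pf_(?:net|srv|tfr|dm1|dm2|onl)=(\d+)")

def is_sane(log_line: str) -> bool:
    """Return False if any performance value in the line is unrealistic."""
    return all(int(v) <= MAX_MS for v in PF_PARAM.findall(log_line))

lines = [
    'GET /matomo.php?idsite=1&pf_tfr=1200 HTTP/1.1',
    'GET /matomo.php?idsite=1&pf_tfr=2100000000 HTTP/1.1',
]
print([is_sane(l) for l in lines])  # [True, False]
```

One would run the replayed log through a filter like this and write only the sane lines to the file passed to the import script.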
@sgiehl is this one fixed already maybe?
Those values should be discarded with #17574. Not sure if there was another issue to identify why those high numbers occur in the first place.
I think we still don't know where those invalid values come from, Matomo just doesn't fail anymore with #17574 when it encounters them.
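The discard rule mentioned in the last two comments can be illustrated as follows. Matomo itself is PHP, so this is only a sketch of the behaviour, not the actual #17574 patch; the function name and cap constant are made up for illustration.

```python
MAX_REALISTIC_MS = 16_000_000  # 16 million ms cap discussed in this thread

def sanitize_performance_value(value):
    """Discard unrealistic performance metrics instead of failing the insert.

    Returning None stands for "treat as not tracked", so one outlier no
    longer aborts the whole insert.
    """
    if value is None or value < 0 or value > MAX_REALISTIC_MS:
        return None
    return value

print(sanitize_performance_value(1200))           # 1200
print(sanitize_performance_value(2_100_000_000))  # None
```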
@Findus23 that would ideally be a different issue maybe? It seems to only happen with an old iOS version.
It would be good for someone to have a quick look at whether we can reproduce this issue using iOS 9.x (BrowserStack seems to have only an iPhone with that iOS version, not a tablet). As there are very few devices running this OS, we could otherwise ignore it. However, if we can reproduce it there, we may want to add a check in the tracker (matomo.js) to exclude values that are far too high.
@tsteur Tried to reproduce that with various iOS emulations and simulations on BrowserStack, but wasn't able to. Some older versions of Safari do not support the performance timing API, but then they simply don't track it. For all others the transmitted values looked correct.
I'll close this for now. We can create a new issue if this is still a problem (maybe it was indirectly fixed in the meantime).
This issue has been mentioned on Matomo forums. There might be relevant details there: https://forum.matomo.org/t/error-on-archiving-track-sqlstate-22003/54806/1 |
see https://forum.matomo.org/t/512m-of-memory-limit-is-it-too-little/40047/5
No page takes 50 years to load, so I guess Matomo should not try to insert invalid values.
Then again it would be interesting to know where they come from.