@samjf opened this Issue on April 22nd 2022 Contributor


Oftentimes we come across a tracking bug, or bots, that continually cause tracking requests to be sent. This can flood the server and slow down the database. I would expect some sort of realistic upper limit in the client JS that stops a single visitor from generating too many requests per page view.

It may be wise to implement an upper limit in the client JS so that such bugs/bots can't cause this issue so easily. For example, a cap of 5000 requests per virtual page view ("virtual page view" meaning the counter resets as soon as someone calls trackPageView).
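A minimal sketch of what such a cap could look like, assuming a wrapper around the send path (the `createTracker` factory, `sendFn` callback, and limit constant are illustrative, not part of the actual Matomo JS tracker API):

```javascript
// Hypothetical per-pageview request cap — names and structure are
// illustrative, not the real Matomo tracker internals.
const MAX_REQUESTS_PER_PAGEVIEW = 5000;

function createTracker(sendFn) {
  let requestCount = 0;
  return {
    track(url) {
      // Once the per-pageview budget is exhausted, drop the request
      // client-side so floods from bugs/bots never reach the server.
      if (requestCount >= MAX_REQUESTS_PER_PAGEVIEW) {
        return false;
      }
      requestCount += 1;
      sendFn(url);
      return true;
    },
    trackPageView(url) {
      // A (virtual) page view resets the budget, so SPAs that track
      // virtual pageviews keep working.
      requestCount = 0;
      return this.track(url);
    },
  };
}
```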

Your Environment

  • Matomo Version: 4.8
  • PHP Version: 8.1
  • Server Operating System: Mac
  • Additionally installed plugins: *
@MichaelRoosz commented on April 23rd 2022 Contributor

To protect your SQL database and insert tracking data into it at a constant rate, you can use https://plugins.matomo.org/QueuedTracking (requires a Redis server).

To block bots and set some limits, you can use https://plugins.matomo.org/TrackingSpamPrevention

@sgiehl commented on April 26th 2022 Member

@MichaelRoosz I guess the issue is more about the client side: the JavaScript tracker would count the requests already sent on a given page, and once a limit is reached, further tracking requests would be dropped. That way they would never even reach the server.

@samjf commented on April 27th 2022 Contributor

@sgiehl Yes, this was what I was imagining.

An example of what I'm trying to avoid is something like 7000 form submissions in a single page view by a single visitor. That doesn't seem like realistic, valid tracking usage.

I wonder if implementing this could cause trouble for SPA sites? Though I would suspect that they track virtual pageviews, which could reset the limit.
