It seems that updating a token's usage timestamp can cause issues when using log import.
I'm not sure we need to update the usage at all in this case. If we do, we could update it only every 10 minutes or so, but that would require at least an additional read request to check.
It seems this doesn't fix the issue? It also shouldn't really be a problem, I think, since we would likely see the same issues when using the API concurrently. Generally, though, we could change the WHERE clause, like:
UPDATE `matomo_user_token_auth` SET `last_used` = ? WHERE `idusertokenauth` = ? AND `last_used` < ?
It likely wouldn't make a difference, though. We basically only want to update it when time() is higher, and when the value doesn't change, MySQL should be smart enough not to write the row.
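To illustrate the guarded UPDATE idea, here is a minimal sketch using Python's sqlite3 as a stand-in for MySQL. The table and column names follow the statement above; the `touch_token` helper and the seed data are assumptions for the example, not Matomo code.

```python
import sqlite3

# Minimal stand-in schema for matomo_user_token_auth (assumption for the sketch)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE matomo_user_token_auth ("
    "idusertokenauth INTEGER PRIMARY KEY, last_used INTEGER)"
)
conn.execute("INSERT INTO matomo_user_token_auth VALUES (1, 1000)")

def touch_token(token_id: int, now: int) -> int:
    """Write last_used only when the new timestamp is strictly newer,
    so concurrent requests carrying an older time() become no-ops."""
    cur = conn.execute(
        "UPDATE matomo_user_token_auth SET last_used = ? "
        "WHERE idusertokenauth = ? AND last_used < ?",
        (now, token_id, now),
    )
    return cur.rowcount  # number of rows the guarded WHERE actually updated

print(touch_token(1, 2000))  # newer timestamp: 1 row updated
print(touch_token(1, 1500))  # older timestamp: guard skips the write, 0 rows
```

The guard makes stale writes no-ops at the SQL level rather than relying on the storage engine to detect an unchanged value.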
Maybe that's only one part of the issue, but firing thousands of updates at the same row isn't good in any case.
I'll have a closer look at the code again later. Maybe there's an easy way to update it only every 10 minutes or so.
It would indeed be great to update it only every 10 minutes or so (in the tracker and in general). We probably can't use the general tracker cache, as that would invalidate the cache every 10 minutes even though we use a much longer cache time on cloud (1hr+).
@tsteur I updated the PR. There will now always be a SELECT to check whether the last usage date for the token is older than 10 minutes. I'm not sure whether that might cause problems if multiple requests are executed at the same time.
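One way to avoid the concurrency concern with a separate SELECT is to fold the 10-minute check into the UPDATE itself, so the check and the write happen in one atomic statement. This is only a sketch of that idea, again using sqlite3 as a stand-in; the `maybe_touch` helper, the interval constant, and the schema are assumptions, not the PR's actual code.

```python
import sqlite3

TEN_MINUTES = 600  # throttle interval in seconds (assumption for the sketch)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE matomo_user_token_auth ("
    "idusertokenauth INTEGER PRIMARY KEY, last_used INTEGER)"
)
conn.execute("INSERT INTO matomo_user_token_auth VALUES (1, 0)")

def maybe_touch(token_id: int, now: int) -> bool:
    """Single atomic statement: writes only when the stored timestamp is
    more than 10 minutes old, so no prior SELECT (and no read/write race)
    is needed. Returns True when the row was actually written."""
    cur = conn.execute(
        "UPDATE matomo_user_token_auth SET last_used = ? "
        "WHERE idusertokenauth = ? AND last_used < ?",
        (now, token_id, now - TEN_MINUTES),
    )
    return cur.rowcount == 1

print(maybe_touch(1, 1000))  # stored value is stale: True, row written
print(maybe_touch(1, 1300))  # only 5 minutes later: False, write skipped
```

Under concurrent requests, at most one of them matches the WHERE condition and performs the write; the others see zero affected rows, which sidesteps the check-then-act race the separate SELECT introduces.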