Performance data will be tracked for each page view and will contain the following metrics:
Those metrics can be summed up to get the full page load time.
Action reports (at least for page URLs) will show the full page load time as a new column,
and an additional row action will show the page load evolution:
There is also a new report page that shows the overall page load evolution:
The reports for page performance are almost ready, but there are some more adjustments needed for the tracking.
Currently all the timing data would be sent with the pageview, but that doesn't actually work, as some of the performance timings might not yet be available when the pageview is tracked. I'm wondering how best to solve that. Since sending the page view event only after the onload event has finished won't be a usable solution, I currently see two possibilities for sending the timing data later:
I actually haven't tested that through yet. But to have all metrics available in the Timing API, we would need to listen to the window load event and add the tracking with a timeout of 0, so it runs once onload has finished. Depending on how the Matomo tracker is included, I would say at least the onload metric won't be available, and maybe also DOM Interactive to Complete, as Matomo might be executed within this process.
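To make the timing idea above concrete, here is a minimal sketch (not the actual piwik.js code; the metric names are illustrative, not Matomo's final parameter names) of deriving the individual metrics from a Navigation Timing `timing` object:

```javascript
// Sketch only: compute individual page performance metrics from a
// Navigation Timing "timing" object (window.performance.timing).
function computePerformanceMetrics(t) {
  return {
    network: t.connectEnd - t.fetchStart,
    server: t.responseStart - t.requestStart,
    transfer: t.responseEnd - t.responseStart,
    domProcessing: t.domInteractive - t.responseEnd,
    domCompletion: t.domComplete - t.domInteractive,
    onload: t.loadEventEnd - t.loadEventStart
  };
}

// In the browser this would run after the load event, deferred one tick
// with setTimeout(…, 0) so that loadEventEnd is already populated:
//
//   window.addEventListener('load', function () {
//     setTimeout(function () {
//       sendToTracker(computePerformanceMetrics(window.performance.timing));
//     }, 0);
//   });
```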
We could send them with ping events, or with other tracking requests if needed. We could even temporarily store the data and track it on the next page view for the previous one, should the visitor open another page, but that gets too complicated. I would say we simply document that in some cases the value won't be recorded: when the onload event hadn't happened yet at tracking time and no ping event or other tracking request follows. We certainly do not want to cause an extra tracking request just for that information.
Something to think about: would we need to do anything for single page applications that track multiple page views within one page? I suppose only the first page would then get the metrics, which would be fine for now. It would be good not to make this feature too complicated.
As for the update: I suppose either a log table or
log_link_visit_action would be possible, and technically both would allow the same features as long as there is an idVisit in there etc. It would be great to reuse
log_link_visit_action though, I would say, maybe for simplicity. I suppose the update would be something like
`update log_link_visit_action left join log_action set metric = xyz where idvisit = ? and idpageview = ? and log_action.type = pageview (1 or ...)`?
That might be OK as long as we only execute this query if the metric is not recorded yet. Meaning: the tracker would need to make sure to send the metric only once, and the server would only issue this query once, and only if the action type of the request is not a page view. When the metric is set in the request and the action type is a page view, it can be recorded as part of the initial insert.
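On the tracker side, the "send only once" part could look roughly like this hypothetical sketch (function and names are made up for illustration): wrap the sender so the deferred metrics can be attached to at most one later tracking request (ping or next request).

```javascript
// Hypothetical sketch of the "send only once" guard described above.
// `send` is whatever function attaches the metrics to the next
// tracking request; the guard makes sure it fires at most once.
function createOnceSender(send) {
  var sent = false;
  return function (metrics) {
    if (sent || !metrics) {
      return false; // already sent, or nothing to send yet
    }
    sent = true;
    send(metrics);
    return true;
  };
}
```

Applied per pageview, so a later trackPageView call in a single page application would create a fresh guard.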
We'll need to see what impact this update has on disk I/O once in production. We may later want a feature to disable recording metrics after the page view was tracked, to avoid this update statement; users who still want these metrics would then have to listen to the onload event before tracking the page view.
@tsteur ok, changed the tracking accordingly. If the metrics are already available when the pageview is tracked, they will be sent with the pageview; otherwise they will be sent with the next tracking request.
Should be ready for a first review. Some tests of the submodules are still failing, but the expected system files will only need an update later...
Wonder if we would want an "All tables column visualisation" that shows all page-specific performance metrics?
Many people will never find this, though. On the other hand, we maybe don't want to show all of them all the time, so this way they would at least be able to get them should they want to.
In this screen, should we explain that some metrics might not always be tracked, and link to an FAQ or so?
Ideally we would remove the new performance menu item under Behaviour and instead show the screen below as part of Visitors -> Overview (below the visits overview, not as part of the visits overview sparklines).
Haven't looked at the code otherwise or tested too much yet, but it's looking good! The row evolution screen is quite cool!
added a new table visualization to show the supported page reports with page performance metrics only:
moved the page performance reports below the visits overview
There are still some weird failures in the UI tests; I'm still investigating why. But a code review can already be done in the meantime...
Quick feedback on the UI/UX: considering the metrics are technical and hard to understand, could we
- show the definition on hover in tooltips,
- and also show them in the in-app help section? like on Visitors Overview:
> show the definition on hover in tooltips.

That's already done for tables containing those metrics.

> and also show them in the in-app help section? like on Visitors Overview:

Already added that:
Also, ideally these help texts would explain both what a metric means and how to improve it, e.g. "Avg latency is X. To make latency better, you can sometimes do X and Y. On the other hand, Z will make latency worse." or similar/better.
Giving advice on what to do and what to avoid is not done yet, and that might also be hard to do. If we really want to give users good advice on what they should do, we could create a simple plugin that fetches some useful data from Google's PageSpeed API.
The failing UI tests seem to fail because puppeteer doesn't render the pages table report correctly. The additional column seems to make the report wider than the window, causing puppeteer to render it like this:
Updated puppeteer locally, which seems to fix that issue. But it actually changes almost all screenshots slightly, so I'll do that in another PR, as it would be a bit too much to put in here...
@tsteur the metrics are now split up as discussed. But it's still a bit unclear to me how we should handle the old page generation metric. As I tried to describe before, piwik.js has a method `setGenerationTimeMs`, which allows setting a custom value for the generation time. The new metrics are automatically filled based on the performance timing, with no possibility to set them manually. So users that used to set the value manually can't do that with the new metrics anymore.
So how shall we handle that? Shall we remove the possibility to set a custom value completely, e.g. remove the `setGenerationTimeMs` method, or log a warning when the method is called instead? Also, should the `gt_ms` param still be processed on the Matomo side if the value is set in a request, or shall we remove that as well?
> So how shall we handle that? shall we remove that possibility to set a custom value completely? e.g. remove the method `setGenerationTimeMs` or log a warning when the method is called instead.

I would say it should be fine.

> Also should the `gt_ms` param be processed on Matomo side if the value is set in a request or shall we remove that, as well?

I'd say we can remove it then as well.
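For the "log a warning" variant discussed above, a minimal sketch could look like this (hypothetical: the method is kept as a no-op so old integrations don't throw, and the warning text here is made up):

```javascript
// Hypothetical deprecation stub: keep setGenerationTimeMs around so
// existing integrations don't break, but make it a no-op that warns.
function setGenerationTimeMs() {
  if (typeof console !== 'undefined' && console.warn) {
    console.warn('setGenerationTimeMs is deprecated and has no effect; ' +
      'page generation time was replaced by the new performance metrics.');
  }
}
```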
@mattab fyi... we're removing generation time, as it was never really useful anyway, and it'll be replaced by the new metrics, see https://github.com/matomo-org/matomo/pull/15736#discussion_r402599498. Existing data will still be shown for older reports.
@tsteur I've updated the PR and all related PRs so it's no longer possible to set the page generation time. I guess this and all related PRs should be ready for a final review...
Cheers team! It's a very good geek improvement but, isn't it a bit of a tricky report for a target audience of non-techy editors? Most of all, they don't have any clue about the DOM... :thinking:
@tassoman thanks for the comment. There are inline help texts that explain those metrics for those who don't understand them directly. Also, the default reports will only show the full page load time, which should be clear to everyone.