"Inverse" page tracking - to find bad pages #10519
a bad "page" could be:
As long as it's possible to remove/disable the feature. If you "only" have 500 pages it's not too bad, I presume, but I have 11,000 different web pages and maybe 4,000 PDFs, so yes, we have many "unviewed" or low-visited pages over time...
Of course it should only be optional - maybe even as a plugin.
As a first approach, crawling could easily be done by using a standard (Google) sitemap.
For this, one only needs a setting for its URL (most times it's http://www.example.net/sitemap.xml)
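The sitemap-based approach above could be sketched roughly as follows. This is only an illustration, not code from the project: the function names are hypothetical, and the sitemap URL is the placeholder from the comment above.

```python
# Hypothetical sketch: fetch a standard sitemap and collect its page URLs.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Namespace used by the standard sitemap protocol (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_bytes):
    """Return the list of <loc> URLs from a standard sitemap document."""
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def sitemap_urls(sitemap_url):
    """Fetch the configured sitemap URL and return its page URLs."""
    with urlopen(sitemap_url) as resp:
        return parse_sitemap(resp.read())
```

A single sitemap-URL setting would then be enough to drive the crawl, e.g. `sitemap_urls("http://www.example.net/sitemap.xml")`.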
Since the comparison should only be done with very low frequency (e.g. once a month) or on request (e.g. via a button click that schedules it for the next night), there shouldn't be a big performance problem at all...
In addition, a setting for the comparison period is needed:
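The comparison step itself could look something like the following sketch. It assumes the sitemap URLs and a per-URL visit count for the chosen comparison period are already available; the function name and the `min_visits` threshold are assumptions for illustration, not part of the proposal.

```python
# Hypothetical sketch: compare sitemap pages against visit counts for the
# configured comparison period and report "bad" pages.
def find_bad_pages(sitemap_urls, visits, min_visits=5):
    """Return (never_visited, rarely_visited) page lists for the period.

    sitemap_urls: all URLs found in the sitemap.
    visits: mapping of URL -> visit count within the comparison period.
    min_visits: below this count a page counts as "rarely visited".
    """
    never_visited = [url for url in sitemap_urls
                     if visits.get(url, 0) == 0]
    rarely_visited = [url for url in sitemap_urls
                      if 0 < visits.get(url, 0) < min_visits]
    return never_visited, rarely_visited
```

Run monthly (or on request), this would produce exactly the two lists the feature is about: pages with no views and pages with very few views in the period.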
To make your website better, it would be interesting to know not only which pages are visited, but also the other side: which
pages are never or only very rarely visited.
This may have several reasons:
To find these pages it would be very helpful to have the possibility to find & show them automatically via "inverse page tracking"
one way may be
the next step of this would be
"Invers" Custom Event tracking - to find bad elements/functions #5186