Challenge: make archiving faster when there are hundreds of websites
This should probably be implemented in two places.
- In `core:archive` we would also be smart and automatically skip websites (and all their segments) whenever there were no new visits.
Would this cause problems for plugins that force archiving even when there are no visits?
When plugins force archiving in this way, they presumably have good reasons to do so, so their forcing would override any logic we add around skipping websites/segments/etc.
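To illustrate the precedence described above, here is a minimal sketch (function and parameter names are hypothetical, not Matomo's actual API): skip a site/segment when there are no new visits, unless a plugin explicitly forces archiving, in which case the forcing wins.

```python
def should_archive(has_new_visits: bool, plugin_forces_archiving: bool) -> bool:
    """Return True if an archiving request should be issued (illustrative only)."""
    if plugin_forces_archiving:
        # A plugin that forces archiving presumably has a good reason,
        # so its request overrides the skip optimization.
        return True
    # Otherwise, only archive when there is something new to process.
    return has_new_visits

print(should_archive(has_new_visits=False, plugin_forces_archiving=False))  # False (skipped)
print(should_archive(has_new_visits=False, plugin_forces_archiving=True))   # True (forced)
print(should_archive(has_new_visits=True, plugin_forces_archiving=False))   # True (normal)
```

The point of the sketch is just that the force check comes first, so any skip logic we add never suppresses a forced archive.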
@mattab I think the first bullet point actually solves that as well, doesn't it? I.e., if the archiver sends requests for segments, we'll see there are no visits and avoid archiving. Adding this logic to CronArchive.php while keeping some logic in PluginsArchiver.php for plugins that force archiving would be rather difficult, I think.
The first bullet point (core archiver) fixes part of the problem, which may be enough for now, but I thought there would be a lot of improvements hidden in the second bullet point (`core:archive` / CronArchive). For example, imagine a Matomo with 1,000 sites and 10 global segments, i.e. 10,000 site/segment combinations. If we only do the first bullet point, we still need to send 10,000 requests * 5 periods = 50K requests, which would take a long time, possibly hours. What do you think (not sure if my numbers are correct)?
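The estimate above can be checked with quick back-of-the-envelope arithmetic (the per-request latency below is an assumed value for illustration, not a measurement):

```python
# Numbers from the comment above.
sites = 1_000
global_segments = 10
periods = 5                 # e.g. day, week, month, year, range
seconds_per_request = 0.5   # assumed average latency per archiving request

combinations = sites * global_segments  # 10,000 site/segment combinations
requests = combinations * periods       # 50,000 requests total
hours = requests * seconds_per_request / 3600

print(requests)             # 50000
print(round(hours, 1))      # 6.9 -- roughly 7 hours at 0.5 s/request
```

Even at half a second per request, a sequential run lands in the "possibly hours" range, which is why skipping whole sites/segments at the CronArchive level matters.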
The numbers seem right, but since we don't know whether a plugin will force archiving, we can't really skip those requests... not without taking the forcing out of the core archiving logic and putting it in CronArchive. I think that would be rather non-trivial.