
Add testdox to phpunit.xml #17462

Merged — 1 commit merged into 4.x-dev on May 17, 2021
Conversation

flamisz
Contributor

@flamisz flamisz commented Apr 15, 2021

Description:

I had a failing test in another PR and it was hard to understand why and where it was happening. Modifying phpunit.xml makes it easier to investigate failing tests.

Instead of only dots, we get some information about what's going on, and if a build fails we can see which test ran last.

An example:

Api Get Report Metadata (Piwik\Tests\System\ApiGetReportMetadata)
 ✔ Api with data set #0 [548.94 ms]
 ✔ Api with data set #1 [26.24 ms]
 ✔ Api with data set #2 [28.44 ms]
 ✔ Api with data set #3 [29.22 ms]
 ✔ Api with data set #4 [7.41 ms]
 ✔ Api with data set #5 [3.97 ms]
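The change itself is a single attribute on the root element of the dist config. A minimal sketch, assuming the PHPUnit 9.x XML schema (the `bootstrap` path here is hypothetical; the project's real value would stay unchanged):

```xml
<!-- phpunit.xml.dist (excerpt): enable TestDox output for every run -->
<phpunit testdox="true"
         colors="true"
         bootstrap="tests/bootstrap.php">
    <!-- existing testsuites, filters, listeners, etc. stay as they are -->
</phpunit>
```

The `testdox="true"` attribute is the config-file equivalent of passing `--testdox` on the command line.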

Review

  • Functional review done
  • Potential edge cases thought about (behavior of the code with strange input, with strange internal state or possible interactions with other Matomo subsystems)
  • Usability review done (is anything unclear, or could anything cause people to reach out to support?)
  • Security review done see checklist
  • Code review done
  • Tests were added if useful/possible
  • Reviewed for breaking changes
  • Developer changelog updated if needed
  • Documentation added if needed
  • Existing documentation updated if needed

@flamisz flamisz added the Needs Review PRs that need a code review label Apr 15, 2021
@flamisz flamisz self-assigned this Apr 15, 2021
@sgiehl
Member

sgiehl commented Apr 15, 2021

Personally I wouldn't enable that by default in phpunit.xml.dist. If you want that locally, you can simply create a phpunit.xml that contains it. Imho seeing which tests are running is only useful information while debugging. In all other cases you only need to know which tests are failing, and that is also available without testdox. Having it enabled also makes the output very long, which makes it slower to load the Travis output in the browser.
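The local-override approach mentioned here works because PHPUnit prefers a phpunit.xml over phpunit.xml.dist when both are present. A minimal sketch, assuming the dist file's other sections are copied across and phpunit.xml is gitignored:

```xml
<!-- phpunit.xml (local only, not committed): dist config plus testdox -->
<phpunit testdox="true">
    <!-- copy the testsuites/filter sections from phpunit.xml.dist here;
         PHPUnit does not merge the two files, so everything needed must
         be duplicated in this file -->
</phpunit>
```

Alternatively, `--testdox` can be passed on the phpunit command line for a one-off run without touching any config file.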

@flamisz
Contributor Author

flamisz commented Apr 15, 2021

> Personally I wouldn't enable that by default in phpunit.xml.dist. If you want that locally, you can simply create a phpunit.xml that contains it. Imho seeing which tests are running is only a useful information while debugging. In all other cases you only need to know which tests are failing - and that is also available without testdox. Having it enabled also makes the output very very long, which also makes it slower to load the travis output in the browser.

In my case it was different: I wanted to know where it failed on Travis, because it failed differently there. But we can just commit it when needed (probably, or hopefully, very rarely) and revert it once the test is fixed. That's what I did. I just wanted to know what the team thinks about it :)

@tsteur
Member

tsteur commented Apr 15, 2021

Personally I find it quite useful because it'll be there when you need it, and doesn't really cause much harm when it's not needed. For me the output loads just as fast; the scrolling to the end takes a second, but that's it. It's also quite good to see how long each test took, etc., although the timings aren't information you need every time. Other people might not know how and where to set this flag when tests aren't passing, so having it work instantly could be quite good, especially for community contributors. Luckily we don't really have many problems anymore with builds timing out, otherwise it would have been even more useful. I can also see that it adds noise, though, and we could enable it on demand instead, but then ideally there'd be some output in the Travis run like this (could even include a link to an FAQ where we explain how to do it):

(screenshot: example message suggested for the Travis log)

Overall no big preference and not seeing much harm in adding it, but ok either way if we at least add such a message.

@diosmosis any thoughts?

@diosmosis
Member

Not really, either way is fine for me.

@flamisz
Contributor Author

flamisz commented Apr 15, 2021

@tsteur I don't think we need the "trigger another test run by doing..." message. If you modify phpunit.xml.dist to set testdox to true, that will trigger a new test run anyway. Or am I missing something?

And for me it wasn't an issue either that the output was longer on Travis.

@flamisz
Contributor Author

flamisz commented May 12, 2021

@tsteur @sgiehl @diosmosis
Do we want to merge this, or should I close it?

@tsteur
Member

tsteur commented May 12, 2021

@flamisz @sgiehl @diosmosis I would say let's just merge it, and disable it again (and instead print a message on how to activate this feature for troubleshooting when needed) should we find it not useful. It'd be good to simply give it a try for a while and change it if we find it a problem. In UI tests we've always done this and it was never an issue. @sgiehl @diosmosis @flamisz it'd be important to mention if/when it becomes annoying (eg with heaps of other test failures) so we can change it again. It'd also be useful to mention, eg in a Slack channel, if it ever helps you: "Hey, this just helped me save time or find xyz".

@diosmosis
Member

It makes sense to try it first.

@sgiehl
Member

sgiehl commented May 17, 2021

Let's simply merge it then. We can remove it again later if it gets annoying.

@sgiehl sgiehl merged commit 62728e9 into 4.x-dev May 17, 2021
@sgiehl sgiehl deleted the add-testdox-to-phpunit branch May 17, 2021 13:31