
CLI: Show full test suite when testing 1 spec file at a time (like Jest does) #3266

Closed · ghiscoding opened this issue Apr 27, 2023 · 5 comments · Fixed by #3543
Labels: enhancement (New feature or request), pr welcome

ghiscoding commented Apr 27, 2023

Clear and concise description of the problem

Coming from Jest, I find this feature missing in Vitest, and I do like to see the full list of tests (from a single spec file) when I'm working on a specific test suite. The way it works in Jest: if you are testing all test suites (all spec files), it shows you only the title of each test suite (Vitest does that too, great!). However, if you then add/update some test(s) under a single test suite or spec file (or if you use "p" for a pattern that returns a single spec file), Jest will show the entire list of tests that belong to that spec file's test suite. Vitest does not do that; it only shows the test suite title in every case, i.e. the same output for a single spec file or multiple spec files.

For example, when Jest tests a single spec file's test suite, it shows all the tests under it (in this case 12 tests). It's quite handy to see the full list; also, there are sometimes skipped tests that I can see in this list but cannot see in Vitest.

[screenshot: Jest run of a single spec file listing all 12 tests, including skipped ones]

On the other hand, Vitest only ever shows the title of the describe test suite. There are 4 tests under that spec file, but it never shows them, and I would really like to get the same view that Jest provides, which is helpful when troubleshooting a single spec file.

[screenshot: Vitest run of the same spec file showing only the suite title]

Suggested solution

Provide the same view as Jest does (show the full list of tests under the spec file's test suite); see the sketch below.
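
A rough sketch of the requested behavior (hypothetical reporter-side logic, not Vitest's actual code; all helper names here are illustrative stand-ins):

// Hypothetical sketch only -- none of these names are real Vitest internals.
interface SpecFile { name: string }

declare function renderVerboseTree(file: SpecFile): void // full test tree, like Jest
declare function renderFileSummary(file: SpecFile): void // current one-line summary

function reportFiles(files: SpecFile[]): void {
  if (files.length === 1) {
    // single spec file: print every test under it, the way Jest does
    renderVerboseTree(files[0])
  }
  else {
    // multiple files: keep the compact one-line-per-file output
    files.forEach(renderFileSummary)
  }
}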

Alternative

It's simply missing in Vitest; there is no alternative that I know of.

Additional context

There is more to note about the screenshots I provided above. I said that we do not see the full list of tests under the test suite in Vitest; that is not entirely true when a test fails, because then we do see a summary similar to Jest's. But even then, I still prefer Jest's UI in this case: the colors it chose and the alignment of the text are simply easier to read. Also, in Jest the spec filename is not shown on the same line as the describe test suite title, but in Vitest it's all on the same line (and it most often overflows), which I also find harder to read when trying to figure out which spec file and tests actually failed.

When a test fails in Jest, it's quite easy to read and find which test failed:
[screenshot: Jest failure output with the filename and failed test on separate lines]

When a test fails in Vitest, it's not as easy to read, and finding the failed test in the result window seems much harder. I think it could be improved (for example, separating the spec filename and the test describe title onto separate lines would help, as would using better or fewer colors):
[screenshot: Vitest failure output with the filename and suite title on one overflowing line]

Update

The comment below about improving colors was fixed and merged in PR #3349. The main enhancement requested in this issue, showing the full report when running a single spec file, is still unaddressed.


sheremet-va added the enhancement (New feature or request) and pr welcome labels on Apr 27, 2023

AriPerkkio commented May 6, 2023

On the other hand, Vitest is only ever showing the title of the describe test suite. There are 4 tests under that spec file but it never shows them and I would really like to get the same view that Jest provides me

Does the reporters: 'verbose' provide this experience already?
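
For reference, this is what enabling the built-in verbose reporter looks like in the config (defineConfig and the reporters option are real Vitest APIs; the rest is a minimal example):

// vitest.config.ts
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    // print every individual test instead of one line per file
    reporters: ['verbose'],
  },
})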

ghiscoding commented

Oh, I didn't realize we could simply change the reporter. It does help, but only partially: if I enable the verbose reporter, then every test suite shows its entire list of tests, which is way too much data. I still want the logic Jest uses by default, which is to show all tests only when testing one file at a time (i.e. via a file pattern); when I'm not running one file at a time, I still want to see just the single test suite description. It would also be nice to get the other UI improvements I mentioned in the screenshots.
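
As a partial workaround in the meantime, the verbose reporter can also be enabled per run from the CLI rather than the config file (--reporter is a real Vitest flag; the spec path is just an example):

npx vitest run --reporter=verbose src/my-feature.spec.ts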


ghiscoding commented May 8, 2023

More screenshot comparisons of Jest vs Vitest

Looking at possible color improvements when running the same test in Jest and Vitest: Jest colors the expectations in green, and not just the value but also the test title (it turns out this is because the string is syntax-highlighted in green, the same color as the expectation), whereas in Vitest that entire portion (not just the title) is yellow. I prefer Jest's UI a bit more; I find it easier to read, and it's a bit quicker to spot the expectation errors.

[screenshot: Jest failure output with the expectation and matching string both in green]

In Vitest, the entire test is shown in yellow, but in Jest that portion uses language highlighting (JS or TS):
[screenshot: Vitest failure output with the whole code frame in yellow]

I experimented a little by changing some of Vitest's internal code, and I find it a bit better with the following (note that this is for dark mode only).

In error.ts, I changed these two sections:

else {
  printStack(ctx, stacks, nearest, errorProperties, (s) => {
    if (showCodeFrame && s === nearest && nearest) {
      const sourceCode = readFileSync(nearest.file, 'utf-8')
-     ctx.logger.error(c.yellow(generateCodeFrame(sourceCode, 4, s.line, s.column)))
+     ctx.logger.error(c.white(generateCodeFrame(sourceCode, 4, s.line, s.column)))
    }
  })
}

// ...
function printStack(
  ctx: Vitest,
  stack: ParsedStack[],
  highlight: ParsedStack | undefined,
  errorProperties: Record<string, unknown>,
  onStack?: ((stack: ParsedStack) => void),
) {
  const logger = ctx.logger

  for (const frame of stack) {
-   const color = frame === highlight ? c.yellow : c.gray
+   const color = frame === highlight ? c.cyan : c.gray
    // ...the rest of the loop is unchanged
[screenshot: modified Vitest output with a cyan file link and white code frame]

The quick change was to remove the yellow: I changed the filename link to cyan and the text to white (obviously this is only good for dark mode). Ideally, the tested code should be highlighted per language, and the expectation results should be either red or green (for example, the expectation 'helloa world.' should be all red, but the extra "a" and the last "." should stay bgRed as they are now). Currently these expectations follow the descriptor theme via getConcordanceTheme() (i.e. strings are blue, numbers are yellow), but I don't think expectation results should be colored this way; I would prefer to see them as red (-) or green (+). A different approach would be to use the theme inside the tested code (instead of yellow, or the white I changed it to), but I don't know how to make this change.
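
A minimal sketch of the red/green idea, assuming c is picocolors (which the snippets above appear to use; the formatDiffLine name is mine, not Vitest's):

import c from 'picocolors'

// Color removed diff lines red and added lines green; leave context lines as-is.
function formatDiffLine(line: string): string {
  if (line.startsWith('-'))
    return c.red(line)
  if (line.startsWith('+'))
    return c.green(line)
  return line
}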


sheremet-va commented May 9, 2023

Currently these expectations follow the descriptor theme via getConcordanceTheme() (ie, strings are blue, numbers are yellow) but I don't think these expectation results should be colored this way,

It works as expected. If they were different, you would see the diff in red. I am fine with other changes (we can also print expected/received before the diff).
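
For illustration, a minimal sketch of printing expected/received before the diff, similar to Jest's output (a hypothetical helper, again assuming picocolors as c):

import c from 'picocolors'

// Hypothetical: show the raw values first, then the existing diff.
function printAssertionError(expected: unknown, received: unknown, diff: string): void {
  console.log(`${c.green('Expected:')} ${JSON.stringify(expected)}`)
  console.log(`${c.red('Received:')} ${JSON.stringify(received)}`)
  console.log(diff)
}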

the tested code should be highlighted per language

I don't think it justifies bringing in a library for this, to be honest 🤔

sheremet-va commented

The implemented solution has a bug that we need to resolve before we can properly close this issue: it shows all tests even when the suite has not finished running, which makes it very hard to see console.log messages, as they always appear on top.

sheremet-va modified the milestone: 0.32.3 (Jun 20, 2023)
github-actions bot locked and limited conversation to collaborators on Jul 20, 2023