
Add Coveralls configuration #1277

Merged: 6 commits from simov:test-coverage into request:master on Dec 8, 2014

Conversation

@simov (Member) commented Nov 19, 2014

@nylen you just need to enable Coveralls for the request project on https://coveralls.io; other than that it should technically work. I tried it on one of my projects with mocha instead of tape and it seems fine.

You can also try this command:

./node_modules/.bin/istanbul cover ./node_modules/tape/bin/tape tests/test-*.js

This generates a nice HTML report at coverage/lcov-report/index.html, so you can check the output locally as well.
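For convenience, the same command could also live in an npm script; a minimal sketch (the test-cov name is just an example, it is not in the actual package.json — npm puts node_modules/.bin on the PATH, so plain istanbul works here):

"scripts": {
    "test-cov": "istanbul cover ./node_modules/tape/bin/tape tests/test-*.js"
  }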

@simov (Member Author) commented Nov 19, 2014

The failure in this build is caused by the coveralls.js script returning an error.

@nylen (Member) commented Nov 20, 2014

It's not clear to me what this error is, so what's the next step here? Also, our code coverage isn't really that bad, is it? According to what you posted, we're at around 75%.

@simov (Member Author) commented Nov 20, 2014

You need to log in to coveralls.io with your GitHub account, find the request project in the list of request org projects, and click the button to enable it. You need to be an owner of that org.

The Coveralls token requests only read access to your email, so it can't modify anything on GitHub on your behalf the way Travis-CI can.

Once you enable the request project on coveralls.io, the script in .travis.yml will send the lcov data from our tests successfully (or at least it works that way for me on my personal projects).

Currently the test coverage is around 80% for the entire request.js file.

@nylen (Member) commented Nov 20, 2014

Done, and I restarted the Travis build.

@nylen (Member) commented Nov 20, 2014

Oh, and as it stands the new build script disables the lint check on Travis, which I think we want to keep.

@simov (Member Author) commented Nov 20, 2014

The build failed again for some reason. I don't even see the test output.

@simov (Member Author) commented Nov 20, 2014

This is what I have in my project:

language: node_js
node_js:
  - 0.8
  - 0.10
script: ./node_modules/.bin/istanbul cover ./node_modules/mocha/bin/_mocha test/core --report lcovonly -- -R spec && cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js --verbose

and

"scripts": {
    "test": "node_modules/mocha/bin/mocha test/core --require should --reporter spec --timeout 10000"
  }

npm test is executed first, then the .travis.yml script. For some reason npm test is not executed for request.

@nylen (Member) commented Nov 20, 2014

The Travis docs say that script overrides the default: http://docs.travis-ci.com/user/build-configuration/#script

after_script does what you had in mind.
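A minimal .travis.yml sketch of that layout, mirroring the coveralls pipe from the mocha example above but assuming tape stays the runner (node versions and paths are illustrative, not the final config):

language: node_js
node_js:
  - 0.10
script: npm test
after_script: ./node_modules/.bin/istanbul cover ./node_modules/tape/bin/tape tests/test-*.js --report lcovonly && cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js --verbose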

@simov (Member Author) commented Nov 20, 2014

Makes sense; it seems to work by magic for my project 😀 Now let's see how it goes.

@simov (Member Author) commented Nov 20, 2014

Oh wait, it's exactly the same for my project too; I just use the same parameters for my tests, so I'm seeing identical output.

@simov (Member Author) commented Nov 20, 2014

@nylen I tried generating the lcov data through taper, since we're using it for the tests:

istanbul cover ./node_modules/.bin/taper tests/test-*.js

All tests pass, but the output is: "No coverage information was collected, exit without writing coverage information".

Then I tried using tape directly, which does generate the lcov data, but not all of the tests pass:

istanbul cover ./node_modules/.bin/tape tests/test-*.js

So I'm not entirely sure what is going on. In .travis.yml I'm trying tape, which generates the lcov data, but it seems to fail somewhere.

@nylen (Member) commented Nov 20, 2014

It's probably because taper launches test files in subprocesses, so istanbul's instrumentation (which hooks into the process it launches) never sees the code those subprocesses run.

@simov (Member Author) commented Nov 21, 2014

Yep, that seems to be the case; I found a similar issue solved in this module: https://github.com/KenPowers/gulp-spawn-mocha#code-coverage. Any idea how this could be easily fixed in taper?

@nylen (Member) commented Nov 21, 2014

That's by design, to be able to detect timeouts, which are a problem with tape because it's easy to forget to call t.end() or to pass the wrong number to t.plan(X).

You should be able to execute all the test files without using a runner though. And I can't think of a reason why tape wouldn't work.

I don't think it would be that difficult to add an option not to use subprocesses, but I probably won't have any time to look at this for at least a week (until I get back from vacation).

@simov (Member Author) commented Nov 21, 2014

Running each test one by one with tape is fine. Running all of them at once throws errors in test-pipes (at least), which is strange, because the same test works when I run it separately 😀

@nylen (Member) commented Nov 21, 2014

Here, maybe this will help: http://www.gamefudge.com/bash-the-computer

More seriously though, I wonder if some test file is "finishing" before one of its servers closes? That would explain why it works as a separate process, since the process wouldn't end until all the servers close.

@simov (Member Author) commented Nov 21, 2014

Thanks for the heads up!

It appears to be a process.exit call used in the pipes and timeout tests.

So removing both of them gives me coverage output (though of course that's not representative of the actual test coverage).

So either taper should be fixed to emit the lcov data, or these two tests should be.

Nice game, I should do that more often.. or take the harder route. No idea which one 😀

@nylen (Member) commented Nov 21, 2014

Ah, yeah that makes sense.

Does return work in Node.js modules? If so, that's a better way to do test-timeout.js. Otherwise you could wrap the whole file in an if block.

For test-pipes.js, you could set an environment variable like COVERAGE=true and just not call process.exit if that variable is set.
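A minimal sketch of that guard, assuming the exit call runs after the tests complete (the placement is illustrative, not the actual test file):

// skip the hard exit while collecting coverage, so istanbul gets a
// chance to write its lcov report before the process dies
if (!process.env.COVERAGE) {
  process.exit(0)
}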

@coveralls commented

Coverage Status: Changes Unknown when pulling 37e2268 on simov:test-coverage into request:master.

@simov (Member Author) commented Nov 22, 2014

return should work, but lint throws an error, so I guess it doesn't allow return in the top-level scope of the module (which is in fact wrapped in a function as well), but who knows ... 😀

So I ended up wrapping the whole thing in an if/else statement.

One of the pipe tests was missing res.end(), and the stderr in the node-debug test got polluted by the console.error in the timeouts test.

In short, the coverage above includes all of the tests except the timeout one.

🎉 https://coveralls.io/r/request/request

@rlidwka commented Nov 22, 2014

> Changes Unknown when pulling 37e2268 on simov:test-coverage into request:master.

Will this message be displayed for every change, even if you insert a missing comma in the README? Because, as I've seen in other projects, @coveralls can be extremely annoying.

@simov (Member Author) commented Nov 22, 2014

[screenshot: Coveralls settings]

Yeah, I can understand that; still, it's a good idea to have some form of automatic notification. They have a bunch of other notification options as well:

[screenshot: Coveralls notification options]

So it might be a good idea to check those out too. The reason for all of this is that I missed one of the conditional branches in one of my previous PRs, and that resulted in an uncaught bug.

@simov (Member Author) commented Dec 7, 2014

@nylen I think this is good to be merged in. It contains a few bugfixes for the tests, and the Coveralls notifications can probably be set to write to Gitter, for example (in case the GitHub comments are too annoying).

@nylen (Member) commented Dec 7, 2014

@simov thanks for the reminder, can you look into the merge conflict?

@coveralls commented

Coverage Status: Changes Unknown when pulling 9c5fbd3 on simov:test-coverage into request:master.

@simov (Member Author) commented Dec 7, 2014

@nylen resolved. The Coveralls comments can indeed get annoying. I think writing to Gitter via a webhook is a good option.

nylen added a commit that referenced this pull request on Dec 8, 2014

nylen merged commit 998831d into request:master on Dec 8, 2014
@nylen (Member) commented Dec 8, 2014

Thanks @simov, this is going to be nice to have. Let's see how PR comments go for a while, then address that later if they get too annoying.

@FredKSchott (Contributor) commented

@simov +1 on moving coveralls out of the PR comments themselves and into Gitter or the Travis output. I've seen a few complaints about how noisy it is, while the comments never seem to add much value to the PR.

@simov (Member Author) commented Jan 7, 2015

The build would just have to fail if the coverage goes down; I'm not sure if that's possible though.

nylen added a commit to nylen/request that referenced this pull request Jan 7, 2015
Per discussion on Gitter and in request#1277 we want to disable Coveralls PR comments. Add a Coveralls badge so that we can still watch coverage stats.

While we're at it, add a Travis badge, and make all badges the same style.
@nylen (Member) commented Jan 7, 2015

Okay, Coveralls comments on PRs should be disabled now. I think getting the Coveralls output somewhere else would be a bit of work.

I've submitted #1343 to add a coverage badge to the readme. If @coveralls doesn't show up in there, then we should be good to go.

@nylen (Member) commented Jan 7, 2015

OK now they're really disabled.
