Apply consistent CI workflow (#160)
DEFRA/water-abstraction-team#4

As a team, we are currently maintaining 11 repos and that's not including documentation, support and deployment-related stuff! What we inherited was not consistent; different CI tools were used for the same purpose and different CI steps were implemented.

An effort was made to get this under control with [water-abstraction-orchestration](https://github.com/DEFRA/water-abstraction-orchestration). But to get a handle on it in a way the current team can manage, we're simplifying still further.

This change updates `.github/workflows/ci.yml` to use a template we're applying across all the repos, amended to this project's requirements. In the future, we hope to return to the lessons **water-abstraction-orchestration** taught us, if only to remove the duplication across the projects. But for now, the template hopefully simplifies our build process and makes it consistent.

**Notes**

- Add .nvmrc

We use it locally and in the CI to determine which version of Node to use. This repo didn't have one, so we've added it and set the version to match what WRLS uses for Node.js.
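
For local use, a minimal sketch of how the file typically gets picked up, assuming nvm is the version manager being used (the CI reads the file directly, as the workflow below shows):

```bash
# With no version argument, nvm reads the version from .nvmrc
nvm install   # installs the version listed in .nvmrc
nvm use       # switches the current shell to that version
```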

- Apply ci.yml template

This is very much based on https://github.com/DEFRA/sroc-charging-module-api/blob/main/.github/workflows/ci.yml

There is now one process, which means you'll see just the one item listed in the checks on GitHub. The steps are ordered in an effort to fail fast and, where needed, to respect dependencies; for example, DB migrations run before the tests, and SonarCloud analysis runs after the code coverage data has been generated.

Specifically for this project, it also moves us away from using travis-ci.org, which was the first CI provider we integrated many, _many_ years ago. Defra has since migrated all its repos to travis-ci.com, then finally GitHub Actions. Clearly, this one was overlooked.

- Update .labrc

This is based on https://github.com/DEFRA/sroc-charging-module-api/blob/main/.labrc.js. We use the config file to ensure we generate the coverage output in a format needed by SonarCloud.

- Add SonarCloud properties file

Previously, the CI specified the values SonarCloud needs in its steps, plus some were set in the web UI. But there are a bunch of other things we can tell SonarCloud about, which makes managing it much easier.

So, we now use a `sonar-project.properties` file to set what the SonarCloud analysis step needs and what's useful in its UI.
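
As an illustration, a minimal sketch of the kind of file this refers to; the project key matches the badge URLs in the README below, but the organisation key and the source/test paths are assumptions rather than the file as committed:

```properties
sonar.organization=defra
sonar.projectKey=DEFRA_hapi-pg-rest-api

# Where the code and tests live
sonar.sources=src
sonar.tests=test

# Point SonarCloud at the lcov output lab generates (see .labrc.js)
sonar.javascript.lcov.reportPaths=coverage/lcov.info
```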

- Update README with new badges

It also displays the badges we want in a layout that is consistent with most open source repos.

- Switch to standardjs

We had a problem trying to get the project to build again. We found [this issue](hapijs/lab#1017), which includes a comment from someone who solved it by removing eslint as a dependency. As we should be using standardjs anyway, we removed eslint and switched to standardjs as part of this change. Normally, we would have done this as part of a separate PR.
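
For reference, this is roughly what the lint step now does; the exact `lint` script entry in `package.json` is an assumption, but standardjs itself is run like this:

```bash
# assumes a package.json script along the lines of: "lint": "standard"
npx standard        # check the code against the standardjs rules
npx standard --fix  # optionally auto-fix what it can
```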

- Remove setting table owners in migration code

Setting table owners breaks running the migrations locally and in our CI pipeline. The create migrations appear to have been generated directly from pgAdmin, so we also removed other statements which are not needed.
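
To illustrate, the kind of pgAdmin-generated statement removed looks like the following; the exact table and role names in this repo's SQL files may differ:

```sql
-- Owner assignments like this fail when the role doesn't exist locally or in CI
ALTER TABLE some_table OWNER TO some_role;
```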

- Update to hapi lab and code dependencies

Hapi was listed twice as a dev dependency: once as the new `@hapi/hapi`, and a second time under the old `hapi` namespace. Also, the old `lab` and `code` packages were listed as the test dependencies.

The old Hapi looks like it was bringing in a version of catbox-memory that depended on `big-time`, but that wasn't actually getting installed. We removed it, but to be on the safe side we have also updated Lab and Code to the versions published under the `@hapi` namespace.
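
A quick way to sanity-check the result, assuming nothing else still pulls in the old packages:

```bash
npm ls hapi lab code big-time            # should come back empty for each of these
npm ls @hapi/hapi @hapi/lab @hapi/code   # should list the @hapi-scoped versions
```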
Cruikshanks committed Sep 1, 2022
1 parent 9dddedc commit 4b85c5b
Showing 48 changed files with 7,142 additions and 5,826 deletions.
8 changes: 0 additions & 8 deletions .codeclimate.json

This file was deleted.

6 changes: 0 additions & 6 deletions .eslintrc

This file was deleted.

98 changes: 98 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,98 @@
name: CI

on: push

jobs:
  build:
    runs-on: ubuntu-latest
    env:
      DATABASE_URL: postgres://water_user:password@localhost:5432/wabs_test

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      postgres:
        # Docker Hub image
        image: postgres:12-alpine
        # Provide the password for postgres
        env:
          POSTGRES_USER: water_user
          POSTGRES_PASSWORD: password
          POSTGRES_DB: wabs_test
        # Maps tcp port 5432 on service container to the host
        ports:
          - 5432:5432
        # Set health checks to wait until postgres has started. You must have this so the runner knows to wait till
        # postgres is up and running before proceeding
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    steps:
      # Downloads a copy of the code in your repository before running CI tests
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of sonarcloud analysis

      # Before we do anything, check we haven't accidentally left any `experiment.only()` or `test.only(` statements in
      # the tests
      #
      # Reworking of https://stackoverflow.com/a/21788642/6117745
      - name: Temporary tag check
        run: |
          ! grep -R 'experiment.only(\|test.only(' test

      # Our projects use .nvmrc files to specify the node version to use. We can read and then output it as the result
      # of this step. Subsequent steps can then access the value
      - name: Read Node version
        run: echo "##[set-output name=NVMRC;]$(cat .nvmrc)"
        # Give the step an ID to make it easier to refer to
        id: nvm

      # Gets the version to use by referring to the previous step
      - name: Install Node
        uses: actions/setup-node@v3
        with:
          node-version: "${{ steps.nvm.outputs.NVMRC }}"

      # Speeds up workflows by reading the node modules from cache. Obviously you need to run it at least once, and the
      # cache will be updated should the package-lock.json file change
      - name: Cache Node modules
        uses: actions/cache@v3
        with:
          # npm cache files are stored in `~/.npm` on Linux/macOS
          path: ~/.npm
          key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.OS }}-node-
            ${{ runner.OS }}-

      # Performs a clean installation of all dependencies in the `package.json` file
      # For more information, see https://docs.npmjs.com/cli/ci.html
      - name: Install dependencies
        run: npm ci

      # Run linting first. No point running the tests if there is a linting issue
      - name: Run lint check
        run: |
          npm run lint

      - name: Database migrations
        run: |
          npm run migrate

      # This includes an extra run step. The sonarcloud analysis will be run in a docker container with the current
      # folder mounted as `/github/workspace`. The problem is when the lcov.info file is generated it will reference the
      # code in the current folder. So to enable sonarcloud to match up code coverage with the files we use sed to update
      # the references in lcov.info
      # https://community.sonarsource.com/t/code-coverage-doesnt-work-with-github-action/16747/6
      - name: Run unit tests
        run: |
          npm test
          sed -i 's@'$GITHUB_WORKSPACE'@/github/workspace/@g' coverage/lcov.info

      - name: Analyze with SonarCloud
        if: github.actor != 'dependabot[bot]'
        uses: sonarsource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This is provided automatically by GitHub
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} # This needs to be set in your repo; settings -> secrets
14 changes: 9 additions & 5 deletions .labrc.js
@@ -1,11 +1,15 @@
'use strict'
// default settings for lab test runs.
//
// This is overridden if arguments are passed to lab via the command line.
module.exports = {
globals: 'globalThis',
verbose: true,
'coverage-exclude': [
'node_modules',
'test'
]
coverage: true,
// Means when we use *.only() in our tests we just get the output for what we've flagged rather than all output but
// greyed out to show it was skipped
'silent-skips': true,
// lcov reporter required for SonarCloud
reporter: ['console', 'html', 'lcov'],
output: ['stdout', 'coverage/coverage.html', 'coverage/lcov.info'],
globals: ['globalThis'].join(',')
};
1 change: 1 addition & 0 deletions .nvmrc
@@ -0,0 +1 @@
v14.19.1
33 changes: 0 additions & 33 deletions .travis.yml

This file was deleted.

16 changes: 8 additions & 8 deletions readme.md → README.md
@@ -1,10 +1,10 @@
[![Build Status](https://travis-ci.org/DEFRA/hapi-pg-rest-api.svg?branch=master)](https://travis-ci.org/DEFRA/hapi-pg-rest-api)
[![Test Coverage](https://codecov.io/gh/DEFRA/hapi-pg-rest-api/branch/master/graphs/badge.svg)](https://codecov.io/gh/DEFRA/hapi-pg-rest-api)
[![Known Vulnerabilities](https://snyk.io/test/github/defra/hapi-pg-rest-api/badge.svg?targetFile=package.json)](https://snyk.io/test/github/defra/hapi-pg-rest-api?targetFile=package.json)
[![Maintainability](https://api.codeclimate.com/v1/badges/f09ab6459489426d9e88/maintainability)](https://codeclimate.com/github/DEFRA/hapi-pg-rest-api/maintainability)
[![Test Coverage](https://api.codeclimate.com/v1/badges/f09ab6459489426d9e88/test_coverage)](https://codeclimate.com/github/DEFRA/hapi-pg-rest-api/test_coverage)
# Hapi PG Rest API

# HAPI REST API
![Build Status](https://github.com/DEFRA/hapi-pg-rest-api/workflows/CI/badge.svg?branch=master)
[![Maintainability Rating](https://sonarcloud.io/api/project_badges/measure?project=DEFRA_hapi-pg-rest-api&metric=sqale_rating)](https://sonarcloud.io/dashboard?id=DEFRA_hapi-pg-rest-api)
[![Coverage](https://sonarcloud.io/api/project_badges/measure?project=DEFRA_hapi-pg-rest-api&metric=coverage)](https://sonarcloud.io/dashboard?id=DEFRA_hapi-pg-rest-api)
[![Known Vulnerabilities](https://snyk.io/test/github/DEFRA/hapi-pg-rest-api/badge.svg)](https://snyk.io/test/github/DEFRA/hapi-pg-rest-api)
[![Licence](https://img.shields.io/badge/Licence-OGLv3-blue.svg)](http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3)

A module to create a simple REST API in a HAPI 20 application connected to a particular Postgres DB table.

@@ -123,7 +123,7 @@ server.route([
* `validation` : an object containing Joi validation for the entity (required)
* `preUpdate` : a function which can filter the data object being updated
* `preInsert` : a function which can filter the data object being inserted
* `preQuery` : a function which can modify the data, filter and sort after a HAPI request has been interpreted
* `preQuery` : a function which can modify the data, filter and sort after a HAPI request has been interpreted
* `postSelect` : a function which can modify data retrieved by select query
* `upsert` : an object containing arrays `fields` and `set` - adds an on conflict clause to an insert
* `primaryKeyAuto` : whether primary key field is auto-generated by the DB (default false)
@@ -511,7 +511,7 @@ const pagination = {page : 1, perPage : 5};
// Single record
var {data, error} = await client.create(data);
var {data, error} = await client.findOne('guid');
var {data, rowCount, error} = await client.updateOne('guid', data);
var {data, rowCount, error} = await client.updateOne('guid', data);
await client.delete('guid');
// Batch
8 changes: 4 additions & 4 deletions auto-pk-api.js
@@ -1,6 +1,6 @@
const Joi = require('joi');
const HAPIRestAPI = require('./src/rest-api');
const pool = require('./db');
const Joi = require('joi')
const HAPIRestAPI = require('./src/rest-api')
const pool = require('./db')

module.exports = new HAPIRestAPI({
table: 'autopk_test',
@@ -13,4 +13,4 @@ module.exports = new HAPIRestAPI({
id: Joi.number(),
name: Joi.string()
}
});
})
4 changes: 2 additions & 2 deletions config.js
@@ -1,4 +1,4 @@
require('dotenv').config();
require('dotenv').config()

module.exports = {

@@ -9,4 +9,4 @@ module.exports = {
connectionTimeoutMillis: 2000
}

};
}
9 changes: 4 additions & 5 deletions db.js
@@ -1,6 +1,5 @@
const { Pool } = require('pg');
const { pg } = require('./config');
const uuidV4 = require('uuid/v4');
const pool = new Pool(pg);
const { Pool } = require('pg')
const { pg } = require('./config')
const pool = new Pool(pg)

module.exports = pool;
module.exports = pool
10 changes: 5 additions & 5 deletions index.js
@@ -1,9 +1,9 @@
const { throwIfError } = require('./src/helpers');
const { throwIfError } = require('./src/helpers')

module.exports = require('./src/rest-api.js');
module.exports = require('./src/rest-api.js')

module.exports.APIClient = require('./src/api-client.js');
module.exports.APIClient = require('./src/api-client.js')

module.exports.manager = require('./src/manager.js');
module.exports.manager = require('./src/manager.js')

module.exports.throwIfError = throwIfError;
module.exports.throwIfError = throwIfError
72 changes: 33 additions & 39 deletions migrations/20180111093803-create-table.js
@@ -1,53 +1,47 @@
'use strict';
'use strict'

var dbm;
var type;
var seed;
var fs = require('fs');
var path = require('path');
var Promise;
const fs = require('fs')
const path = require('path')
let Promise

/**
* We receive the dbmigrate dependency from dbmigrate initially.
* This enables us to not have to rely on NODE_PATH.
*/
exports.setup = function(options, seedLink) {
dbm = options.dbmigrate;
type = dbm.dataType;
seed = seedLink;
Promise = options.Promise;
};
exports.setup = function (options) {
Promise = options.Promise
}

exports.up = function(db) {
var filePath = path.join(__dirname, 'sqls', '20180111093803-create-table-up.sql');
return new Promise( function( resolve, reject ) {
fs.readFile(filePath, {encoding: 'utf-8'}, function(err,data){
if (err) return reject(err);
console.log('received data: ' + data);
exports.up = function (db) {
const filePath = path.join(__dirname, 'sqls', '20180111093803-create-table-up.sql')
return new Promise(function (resolve, reject) {
fs.readFile(filePath, { encoding: 'utf-8' }, function (err, data) {
if (err) return reject(err)
console.log('received data: ' + data)

resolve(data);
});
resolve(data)
})
})
.then(function(data) {
return db.runSql(data);
});
};
.then(function (data) {
return db.runSql(data)
})
}

exports.down = function(db) {
var filePath = path.join(__dirname, 'sqls', '20180111093803-create-table-down.sql');
return new Promise( function( resolve, reject ) {
fs.readFile(filePath, {encoding: 'utf-8'}, function(err,data){
if (err) return reject(err);
console.log('received data: ' + data);
exports.down = function (db) {
const filePath = path.join(__dirname, 'sqls', '20180111093803-create-table-down.sql')
return new Promise(function (resolve, reject) {
fs.readFile(filePath, { encoding: 'utf-8' }, function (err, data) {
if (err) return reject(err)
console.log('received data: ' + data)

resolve(data);
});
resolve(data)
})
})
.then(function(data) {
return db.runSql(data);
});
};
.then(function (data) {
return db.runSql(data)
})
}

exports._meta = {
"version": 1
};
version: 1
}
