Release Test Plan(s)

Manuel Martín edited this page Jun 24, 2022 · 33 revisions

Test Scenarios

The acceptance test "install" suite is used to test installation in different scenarios:

| Name | Client Platform (GitHub Actions Runner) | Binary used | Server Platform (Kubernetes Cluster Provider) | Install Test (Operator) | Install Options | Push Test (Developer) | Push Options |
|---|---|---|---|---|---|---|---|
| Config_test | Linux | epinio-linux-amd64 | K3d | Configfile Flag | Magic DNS | Sample App | Create, then push |
| Scenario1 | Linux | epinio-linux-amd64 | GKE | - | Default epinio CA, Custom DNS | Sample App | Zero Instances |
| Scenario2 | Linux | epinio-linux-amd64 | GKE | Split Install for Cert Manager | Letsencrypt Staging CA, Custom DNS | Sample App | Zero Instances |
| Scenario3 | self-hosted Linux | epinio-linux-amd64 | RKE | MetalLB, Split Install for Cert Manager | Private CA, External Registry, Magic DNS | Sample App | Create with Configuration + External Registry (Docker Hub) |
| Scenario4 | macOS (Intel x86_64) | epinio-darwin-amd64 | EKS | Nginx ingress | Default epinio CA, Custom DNS, S3 storage | Sample App | Env Vars |
| Scenario5 | Windows (Intel x86_64) | epinio-windows-amd64.exe | AKS | Split Install for Cert Manager | Letsencrypt Staging CA, Custom DNS | Sample App | - |
| Scenario6 | Windows (Intel x86_64) | epinio-windows-amd64.exe | AKS | - | Default epinio CA, Custom DNS | Sample App | - |
| ⚠️ Scenario7 | Mac and/or Windows and/or Linux | ? | Rancher Desktop | ? | ? | Sample App | ? |

Note: perhaps we could discuss auto-triggering these tests as a condition for release. Open ticket for discussion here

Manual tests

  • Automating the installation and running of Epinio on Rancher Desktop is not possible at the moment. So for each (minor) release we should install Epinio on Rancher Desktop once and push the sample app.
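A possible checklist for that manual run, sketched below; the command names assume the `epinio` CLI used elsewhere on this page, and exact flags may differ between releases:

```shell
# On a machine running Rancher Desktop (k3s):
epinio install          # install Epinio into the local cluster
epinio push sample-app  # push the sample application from its source directory
epinio app list         # verify the app reports a running instance
```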

Draft

Aspects to test

These are not the plans themselves, just notes on the different things the plans should make sure are covered.

Installation

  • Installation process on multiple CSPs:
    • Amazon EKS
    • Azure AKS
    • Google GKE
  • Installation process on Rancher K8s distros:
    • RKE
    • Rancher Desktop (which is k3s)
    • Minikube
    • K3d (already well tested through CI)
  • Two step installation process (epinio install-traefik, put IP in DNS, epinio install)
  • Installation with --loadbalancer-ip (if added to beta), instead of two step install where possible
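The two-step installation above can be sketched as follows. This is a hedged sketch: `epinio install-traefik` and `epinio install` come from the list above, but the service lookup and the `--system-domain` flag are assumptions that may differ between releases:

```shell
# Step 1: install only Traefik, so the load balancer gets an external IP.
epinio install-traefik

# Find the external IP of Traefik's load balancer service
# (namespace and service name vary; adjust for your cluster).
kubectl get svc --all-namespaces | grep -i traefik

# Manual step: create a wildcard DNS record, e.g. *.epinio.example.com,
# pointing at that IP.

# Step 2: complete the installation against the prepared domain
# (--system-domain is an assumed flag name).
epinio install --system-domain epinio.example.com
```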

Languages to explicitly test

Developer Workflows

  • Push without create
  • Create then push
  • Create with configuration
  • Add configuration to application without instances
  • Add environment variables to application without instances
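As a rough sketch, the workflows above map onto CLI invocations like these (command and flag names are assumptions based on the epinio v1 CLI and may differ between releases):

```shell
# Push without create: a single push creates and deploys the app.
epinio push myapp

# Create, then push.
epinio app create myapp
epinio push myapp

# Create with configuration: bind a configuration at creation time.
epinio configuration create mycfg user admin
epinio app create myapp --bind mycfg
epinio push myapp

# Add a configuration and environment variables to an app
# that currently has zero instances.
epinio app update myapp --instances 0
epinio configuration bind mycfg myapp
epinio app env set myapp DEBUG 1
```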

To be continued (I ran out of time), feel free to add

Notes on Matrix

We can keep the test matrix small by not testing every language and service on every platform. If a language feature works on one platform, it should work on all others; if it does not, that is most likely a bug in the language's buildpack.

So we could test the developer workflows on pairs like this:

    • Amazon EKS, two-step install, Java, buildpack environment variables
    • Azure AKS, JavaScript, custom configuration
    • Google GKE, Golang
    • RKE, specified IP, Nginx
    • Rancher Desktop (which is k3s), PHP, MySQL from minibroker
    • Minikube, Python with self-built custom builder

This is implemented by the test scenarios.