The talk can be found here.
The slide deck for this talk can be found here.
The workflow controller ConfigMap configuration we used is in this repo. More information on setting up an S3 artifact repository with Argo Workflows can be found in the Argo Workflows documentation.
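For reference, a minimal sketch of what such a ConfigMap entry can look like, assuming a MinIO service and placeholder bucket and secret names (these are not the exact values from this repo):

```yaml
# Sketch of an S3-compatible artifact repository entry in the
# workflow-controller-configmap. Endpoint, bucket, and secret names
# are placeholders.
apiVersion: v1
kind: ConfigMap
metadata:
  name: workflow-controller-configmap
  namespace: argo
data:
  artifactRepository: |
    s3:
      endpoint: minio.minio.svc.cluster.local:9000
      bucket: workflow-artifacts
      insecure: true
      accessKeySecret:
        name: minio-creds
        key: accesskey
      secretKeySecret:
        name: minio-creds
        key: secretkey
```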
Visit this CI example repo to get a fully local installation of Argo Workflows that will run the two example workflows in this directory. Pull down the repo, copy the workflow files into that directory, and run them.
You can run the whole installation locally in a k3d cluster.
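If you are creating the cluster yourself, a minimal k3d config file might look like the sketch below (the cluster name and node counts are arbitrary); pass it with `k3d cluster create --config k3d.yaml`:

```yaml
# Minimal k3d cluster config; name and node counts are illustrative.
apiVersion: k3d.io/v1alpha5
kind: Simple
metadata:
  name: argo-workflows-ci
servers: 1
agents: 2
```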
This example workflow shows how S3 artifact processing can be parallelized with Argo Workflows using a fan-out approach.
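As a rough illustration of the pattern (the template and parameter names here are made up, not the ones used in the workflow), a fan-out in Argo Workflows typically uses `withParam` to run one step per item of a JSON list produced by an earlier step:

```yaml
# Hedged sketch of a fan-out: "split" emits a JSON array of S3 object
# keys, and "map" runs once per key, in parallel.
- name: main
  steps:
    - - name: split
        template: list-keys                 # prints a JSON array of keys
    - - name: map
        template: process-key               # processes a single object
        arguments:
          parameters:
            - name: key
              value: "{{item}}"
        withParam: "{{steps.split.outputs.result}}"
```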
It also includes further artifact configurations (sketched below):
- an `artifactGC` strategy for all artifacts, plus an override for the final output step that includes `{{workflow.uid}}` in the artifact key
- a `podSpecPatch` for increased resources for the `reduce` step
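A hedged sketch of how those settings can fit together in a Workflow spec; the GC strategy values, resource sizes, image, and artifact names are assumptions, not the repo's exact values:

```yaml
spec:
  artifactGC:
    strategy: OnWorkflowDeletion          # default GC for all artifacts
  templates:
    - name: reduce
      podSpecPatch: |                     # bump resources for the reduce step
        containers:
          - name: main
            resources:
              requests:
                cpu: "1"
                memory: 1Gi
      container:
        image: alpine:3.20
        command: [sh, -c, "do-reduce > /tmp/output"]
      outputs:
        artifacts:
          - name: final-output
            path: /tmp/output
            s3:
              key: "{{workflow.uid}}/final-output.tgz"
            artifactGC:
              strategy: Never             # override: keep the final output
```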
This example workflow uses MinIO as the artifact repository to pass data between steps for a CI workflow that builds and deploys an app to Argo CD. It references WorkflowTemplates located in the CI example repo, so it must be run as part of that installation.
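Referencing a shared template looks roughly like this; the template and step names below are placeholders for whatever the CI example repo actually defines:

```yaml
# Hedged sketch of calling a WorkflowTemplate from the CI example repo.
- name: ci
  steps:
    - - name: build-image
        templateRef:
          name: ci-workflowtemplate       # assumed WorkflowTemplate name
          template: build                 # assumed template within it
```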
This workflow also includes further artifact configurations:
- an `artifactGC` strategy for all artifacts
Pipekit is the control plane for Argo Workflows. Platform teams use Pipekit to manage data & CI pipelines at scale, while giving developers self-serve access to Argo. Pipekit's unified logging view, enterprise-grade RBAC, and multi-cluster management capabilities lower maintenance costs for platform teams while delivering a superior devex for Argo users. Sign up for a 30-day free trial at pipekit.io/signup.
Learn more about Pipekit's professional support for companies already using Argo at pipekit.io/services.