I can't think of a better way to express this: all existing CI/CD pipelines suck.

I've been looking at a number of CI pipelines lately, after having to deal with a large Jenkins installation a few years ago. That was back when Jenkins was the only option and it didn't work very well with git (tags, merges). Does it work any better now? I dunno; I've been looking at others like GitLab CI and Concourse instead.

Just, wow. I cannot find any CI pipeline that will let me do what I consider an extremely simple build:

1. Checkout code
2. Install dependencies
3. Minify UI
4. Test
5. Scan
6. Package
7. Deliver

Every CI tool I've used (except Jenkins) seems to think it's a good idea to erase all the work from each step and start the next step with a clean slate. How is that a pipeline? Why do I have to write 8 or 9 commands in one bash file and make that one "step" in a build "pipeline"?
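Just to illustrate, here is roughly what that one giant "step" ends up looking like for my project. The individual commands are placeholders I'm assuming for this sketch (yarn for the UI, composer for PHP dependencies, a couple of helper scripts), not anything any CI tool prescribes:

```sh
#!/usr/bin/env bash
# build.sh - seven logical pipeline stages crammed into one CI "step".
# The specific commands are placeholders for this sketch.
set -euo pipefail

git clone "$REPO_URL" workspace && cd workspace   # checkout code
yarn install                                      # install UI dependencies
composer install                                  # install PHP dependencies
yarn build                                        # minify UI (placeholder script)
composer run-script test                          # test (placeholder script)
./run-security-scan.sh                            # scan (placeholder script)
tar czf /tmp/app.tar.gz .                         # package
./deliver.sh /tmp/app.tar.gz                      # deliver (placeholder script)
```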

I always thought that, in the context of programming, a pipeline meant passing something through a series of functions or mutations. But the Concourse CI docs specifically mention how each step gets a clean checkout of the starting git hash and claim that that's the "power of pipelines".

I would like to deliver the exact same binary (or package) that passes my tests. I do not want to re-build from the same git hash after the tests pass. Is there any CI besides Jenkins (which doesn't seem to handle merges at all) that can provide the basic functionality of passing the output of one step to the next step?

In Concourse, the only thing that can be the input to a "job" is a "resource", which is specifically external to the pipeline. I was thinking about making a pipe resource to pass the result of one job to the next one in the pipeline when it struck me just how ridiculous it is that I was going to write my own pipe implementation to fit inside a build pipeline tool.

Here's a comment from three years ago where the user said they were just going to give up on Concourse because Jenkins is the only CI that lets you work in the same workspace across multiple stages: https://github.com/concourse/concourse/issues/230#issuecomment-247557514

A solution I came up with is to continually tar up the entire workspace and push it to a local S3 bucket provided by MinIO. This is how a build pipeline should behave by default. I will probably just switch to Jenkins, since I'm setting this up for a customer and starting off with hacks down to the core is not a solution I want to deliver. But the last time I used Jenkins, it couldn't understand git merges and the devs didn't think that was a problem ... :/
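For reference, the tar-and-push hack looks something like this. I'm assuming an mc alias named local pointing at the MinIO server, a bucket named ci-artifacts, and a BUILD_ID variable coming from the pipeline; all of those names are made up for the sketch:

```sh
# End of one stage: snapshot the whole workspace and push it to MinIO.
tar czf /tmp/workspace.tar.gz .
mc cp /tmp/workspace.tar.gz local/ci-artifacts/build-"${BUILD_ID}".tar.gz

# Start of the next stage: pull the snapshot back and unpack it,
# emulating the shared workspace the pipeline refuses to give me.
mc cp local/ci-artifacts/build-"${BUILD_ID}".tar.gz /tmp/workspace.tar.gz
tar xzf /tmp/workspace.tar.gz
```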

GoCD

I have talked about Concourse and the limitations I hit when trying to use it, but GoCD seemed like it would be exactly what I was looking for. Unfortunately, I cannot get the Ubuntu 18.04 based agent to behave like Ubuntu 18.04: I cannot run any Docker containers on the agent, because dockerd will not start when there is no iptables and no modprobe.

I've been looking at CI/CD pipelines for 4 days now. It's time to just build my own, because I haven't found one that can be configured to simply run yarn install, then composer install, then tar everything up and give me access to the result.
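To be concrete, this is the entire ask, written as the plain commands I want to be able to split across steps while keeping the same working directory:

```sh
# The whole build I need: each line ought to be its own pipeline step,
# all operating on one shared workspace.
yarn install
composer install
tar czf /tmp/workspace.tar.gz .
# ...and then hand /tmp/workspace.tar.gz to me (or to the next step).
```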