docker-compose (v1.29.2) to remote host with ssh fails

I have a personal project that is docker-compose based, which I’ve deployed to remote servers in the past (a few years ago, using the steps here). Recently, attempting to redeploy it from a newer self-hosted GitLab pipeline on Ubuntu 24.04, I get this error:

docker.errors.DockerException: Install paramiko package to enable ssh:// support

This issue is exactly as described on this ticket. The issue also seems to be both OS specific and docker-compose version specific – I have docker-compose 1.29.2 on macOS Sequoia and it works fine, but 1.29.2 on Ubuntu 24.04 or 22.04 fails with the above error.

The workaround described in multiple comments on the ticket is not to use the version installed by apt-get, but to install a specific older, working version with pip instead:

pip3 install docker-compose==1.28.2
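Since the error message itself points at a missing paramiko, a quick way to confirm which side of the problem you’re on is to check whether the Python environment running docker-compose can actually import it (a sketch, assuming python3 is on the PATH):

```shell
# Check whether paramiko (the dependency behind the ssh:// error)
# is importable by python3 on this machine.
if python3 -c "import paramiko" 2>/dev/null; then
  status="paramiko available"
else
  status="paramiko missing - ssh:// support disabled"
fi
echo "$status"
```

If it reports paramiko as missing on the machine where docker-compose fails but present where it works, that matches the behaviour described in the ticket.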

Deploying a container to Google Cloud Run via gcloud cli

If you don’t already have one, create an Artifact Registry repository:

gcloud artifacts repositories create your-repo-name \
--repository-format=docker \
--location=europe-west2 \
--description="your-repo-description" \
--immutable-tags \
--async

Authorize gcloud cli access to the registry in your region:

gcloud auth configure-docker europe-west2-docker.pkg.dev

This adds config to $HOME/.docker/config.json; you can look in this file to see which GCP registries you have already authenticated with.
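The entry gcloud adds is a credential-helper mapping for the registry host. It looks roughly like this (a sketch – the exact contents depend on which registries you’ve configured):

```json
{
  "credHelpers": {
    "europe-west2-docker.pkg.dev": "gcloud"
  }
}
```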

The image you’re deploying needs to listen on port 8080, and needs to be built for linux/amd64. If you’re building on an Apple Silicon Mac, build your image with:

docker build . --platform linux/amd64 -t image-tag-name 

Tag the image ready to push to your registry:

docker tag SOURCE-IMAGE LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:TAG

where:

LOCATION = GCP region, e.g. europe-west2

PROJECT-ID = your GCP project ID

REPOSITORY = the repository name you created above

IMAGE:TAG = the image name and tag you want to push
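Putting the pieces together, the full target path can be built up like this (hypothetical example values – substitute your own project, repo, and image names):

```shell
# Hypothetical example values - replace with your own.
LOCATION=europe-west2
PROJECT_ID=my-gcp-project
REPOSITORY=your-repo-name
IMAGE=gcp-nginx-test
TAG=v1

TARGET="${LOCATION}-docker.pkg.dev/${PROJECT_ID}/${REPOSITORY}/${IMAGE}:${TAG}"
echo "$TARGET"
# docker tag gcp-nginx-test "$TARGET"
# docker push "$TARGET"
```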

Authenticate your local Docker with your GCP Artifact Registry:

gcloud auth configure-docker LOCATION-docker.pkg.dev

Push your image to the Artifact Registry with:

docker push LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:TAG

After pushing you can browse your Artifact Registry in the Console and see your image there.

To deploy a new service using the image you just pushed:

gcloud run deploy gcp-nginx-test --project your-project-name --image LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:TAG

These steps are a summary of the Artifact Registry docs here, and the Cloud Run docs here.

GitLab Runner unable to run Docker commands

I have a GitLab Runner using a Shell Executor that needs to build a Docker container. When it executes the first Docker command it gets this error:

docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', PermissionError(13, 'Permission denied'))

If I log on as the gitlab-runner user and try to execute docker commands manually I get this error:

$ docker ps
permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.47/containers/json": dial unix /var/run/docker.sock: connect: permission denied

A quick Google shows I need to add the gitlab-runner user to the docker group to grant it permission to execute Docker commands:

sudo usermod -a -G docker $USER
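Note the group change only takes effect for new login sessions, so restart the runner (or log out and back in) after running usermod. A quick way to check membership afterwards (a sketch; `gitlab-runner` is the runner’s service user here):

```shell
# Check whether a given user is already in the docker group.
user=gitlab-runner
if id -nG "$user" 2>/dev/null | tr ' ' '\n' | grep -qx docker; then
  result="$user is in the docker group"
else
  result="$user is not in the docker group (or the user does not exist)"
fi
echo "$result"
```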

GitLab – no runners for project

Setting up a new self-hosted GitLab, the pipeline for my project is stuck with this error:

… which looks like there are no runners available for the project. I know I have a runner available because I set one up yesterday, so I took a closer look.

In the CI/CD settings for my project, I think I see my shared runner:

Looking in the admin settings, it looks like when I set it up I used the tag ‘shared’:

The error says ‘no runners match all of the job’s tags: docker-test’, so I think what I need to do is change the tags on my runner to match. I edited the tags to remove ‘shared’, replaced it with ‘docker-test’, and now the job starts running! On to the next errors!