Site update: Migrating hosting providers – automating deployment with Terraform, Ansible and GitLab CI Pipelines

Over the past couple of years I’ve been working on and off on a personal project to migrate and update the GitLab CI pipeline on my self-hosted GitLab that builds and deploys this site. Unfortunately my self-hosted GitLab used to run on an e-waste HP DL380 G7 rack server that I no longer have after moving house, so I’ve gone back to using my old 2008 Mac Pro 3,1 as a Proxmox server, where I now run GitLab (which, oddly, is what I first used this Mac for several years ago).

As part of the update, I wanted to achieve a few goals:

  • update the GitLab pipeline to deploy to a staging server for testing, and then deploy to the live server
  • template any deployment files that are server/domain specific
  • update my Docker images for WordPress, updating the plugins and anything else that needs to be in the image to support the runtime, e.g. nginx, the PHP packages used by nginx, etc.
  • move to a new cloud provider that would allow me to provision VMs with Terraform
  • automate updating SSL certs with Let’s Encrypt certbot

I won’t share my completed pipeline because I don’t want to share specifics about how my WordPress site is configured, but I’ll give an overview of what I used to automate the various parts of it.

While I’ve ended up with a working solution that meets my goals (I can run the pipeline to deploy to my test server, or deploy the latest to my new live server), I still have a few areas I could improve:

  • GitLab CI Environments and parameterization – I don’t feel I’ve taken enough advantage of these yet. The jobs that deploy to my test server run automatically, but the deploy to my live site uses the same set of jobs, run manually and configured to point at a different server. I feel there’s more I can parameterize here and need to do some more experimentation in this area (see the sketch below).
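
As an illustration only (this is not my actual pipeline, and the job names, variables and deploy playbook are hypothetical), one way to parameterize this is a hidden template job that both deploy jobs extend, with rules controlling which runs automatically and which is manual:

.deploy-template:
  stage: deploy
  script:
    # DEPLOY_HOST is set per-environment by the jobs below
    - ansible-playbook -i $DEPLOY_HOST.yml deploy-playbook.yml
  tags:
    - docker-test

deploy-test:
  extends: .deploy-template
  environment: test2
  variables:
    DEPLOY_HOST: test2
  rules:
    - when: on_success   # deploys to the test server automatically

deploy-live:
  extends: .deploy-template
  environment: production
  variables:
    DEPLOY_HOST: live
  rules:
    - when: manual       # live deploy only runs when triggered by hand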

Although this effort was spread over a couple of years before I got to a point of completion, it was a great opportunity to gain some more experience across all these tools.

Preserving generated files as artifacts in GitLab CI Pipelines

Today I learned, after spending a while trying to debug why a later job in my pipeline couldn’t see a file created by a previous job, that GitLab does not preserve files on the filesystem between jobs or stages. I guess this makes sense, as each job runs against what is currently in your repo, not against untracked files that earlier jobs in your pipeline have created.

If you are generating new files (for example, Ansible generating files from templates) in one job and expect to use them in a later job in the pipeline, you need to tell GitLab that the files are ‘artifacts’ so they are preserved.

In the case of generated files, they will be untracked files in git. Tell GitLab to publish them as artifacts with the following config:

generate-nginx-config test2:
  stage: generate-templates
  environment: test2
  script:
    - cd iac/ansible
    - ansible-playbook -i test2.yml nginx-playbook.yml
  # keep the file generated from ansible template, which is now
  # untracked, so it can be used in following jobs
  artifacts:
    untracked: true
    paths:
      - nginx/config/etc/nginx/sites-available
  tags:
    - docker-test

This is a job in my pipeline for generating my nginx config based on the environment I’m deploying to. Note the untracked: true which tells GitLab to preserve the untracked files as artifacts.
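
A later job in the pipeline can then pick those artifacts up. A minimal sketch of such a job, assuming the generate job above (the deploy step and target host here are hypothetical):

deploy-nginx-config test2:
  stage: deploy
  environment: test2
  # download the artifacts from the generate job so the templated nginx
  # config is available on this job's filesystem
  needs:
    - job: "generate-nginx-config test2"
      artifacts: true
  script:
    # hypothetical step: copy the generated config to the target server
    - scp -r nginx/config/etc/nginx/sites-available user@test2.example.com:/etc/nginx/
  tags:
    - docker-test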

gh-pages deploy fails when run from GitHub Action: Author identity unknown

I’m calling gh-pages from a GitHub Action, and at the point when gh-pages is called by the Action, it fails with this error:

> gh-pages -d build

Author identity unknown

*** Please tell me who you are.

Following this recommendation on a similar posted issue, I updated the ‘deploy’ script in my package.json (run via ‘npm run deploy’) to pass the -u option with the github-actions-bot userid:

    "deploy": "gh-pages -d build -u 'github-actions-bot <support+actions@github.com>'",

After adding this and re-running, I now get a different error:

> gh-pages -d build -u 'github-actions-bot <support+actions@github.com>'

fatal: could not read Username for 'https://github.com': No such device or address

Apparently, to allow the Action to use actions/checkout to access your repo you must use a Personal Access Token, per additional instructions here.

To create a new access token, go to your account settings, then Developer Settings.

To add the token value as a secret to your project, add a new secret via Settings on the repo that your Action is accessing.

After paying more attention to the Action log, I saw that the checkout step was actually working and completing as expected; it was the ‘npm run deploy’ step that was failing with the same error shown in the linked post above. Following the same advice to use the access token to resolve the ‘could not read Username’ error, I updated ci.yml again to add a reference to the token as part of setting the remote repo url:

- name: Deploy
  env:
    MY_EMAIL: kevin.hooke@gmail.com
    MY_NAME: kevinhooke
  run: |
    git config --global user.email $MY_EMAIL
    git config --global user.name $MY_NAME
    git remote set-url origin https://$MY_NAME:${{ secrets.GH_SECRET }}@github.com/kevinhooke/my-example-project.git
    npm run deploy
  • the git config steps set the git user’s email and name properties within the context of the GitHub Action
  • the ‘git remote set-url’ step specifies the repo url, including my userid and the Personal Access Token retrieved from the GitHub Secret

Problem solved: now the Action works as expected and publishes this project’s GitHub Pages on every commit!

Running GitLab in a Docker container on a different port

GitLab by default runs on port 80. GitLab in a Docker container runs the same as when natively installed, but to change the port you need to change both the GitLab config and the exposed ports on the container.

First, per the steps here, start the container with:

docker run --detach \
--hostname gitlab.example.com \
--publish host-https-port:container-https-port \
--publish host-http-port:container-http-port \
--publish 22:22 \
--name gitlab \
--restart always \
--volume /srv/gitlab/config:/etc/gitlab \
--volume /srv/gitlab/logs:/var/log/gitlab \
--volume /srv/gitlab/data:/var/opt/gitlab \
gitlab/gitlab-ce:latest

By default host-http-port and container-http-port are 80, and host-https-port and container-https-port are 443. Change these to whatever ports you want to run on, but keep each pair the same (e.g. host-http-port and container-http-port both set to 8090).
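
For example, to serve the HTTP interface on port 8090, the http publish option becomes:

--publish 8090:8090 \

This pairs with the external_url change described below, since GitLab’s bundled nginx listens on whatever port is set in external_url.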

When the container is up, start a shell into the running container:

docker exec -it containerid sh

and then edit the config file:

vi /etc/gitlab/gitlab.rb

and add this line (at the top is fine):

external_url 'http://localhost:your-new-port-here'

replacing your-new-port-here with the new port.
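
For example, to match the port 8090 mapping used earlier:

external_url 'http://localhost:8090'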

Reconfigure and restart the server:

gitlab-ctl reconfigure
gitlab-ctl restart

Done!