Simple Docker-Based Deploy Solution

    We all love to automate things, right? So, you remember that lonely pet project of yours that you’ve been working on for a while now? It’s too small to justify a proper solution like k8s, but you still want to deploy it somewhere and have it running. And you want to do it with a simple script, because you’re a lazy person, just like me. Well, I have a solution for you!


    Let’s assume you already have a Dockerfile for the image you want to deploy. You will also need a docker-compose.yml config with the complete setup of the service. I personally prefer to keep a separate docker-compose.production.yml file for production settings.

    # docker-compose.production.yml
    services:
      redis:
        image: 'redis:latest'
        restart: always
        expose:
          - 6379
        volumes:
          - redis-data:/data
      service:  # name it after your app
        image: awesome/service:latest
        restart: always
        depends_on:
          - redis
    volumes:
      redis-data:
    Next, set up SSH access from the CI runner to the target host:

    1. Create a pair of private and public keys. You can use ssh-keygen for that. They will be used to connect to the target host (where we want our service to be deployed).
    2. Create the desired user on the target host. Avoid using root for security reasons.
    3. Install the public key into that user’s authorized_keys on the target host.
    4. Define the following environment variables in the GitLab CI/CD settings:
      • SSH_PRIVATE_KEY: The private key used to connect to the target host
      • SSH_HOST_KEY: The target host’s fingerprints (the known_hosts entries)
      • SSH_LOGIN_HOST: The login & host of the target host, example: user@target-host.com
      • DEPLOY_LOCATION: The location to deploy files to, example: /home/user/awesome-service
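    The key material for these steps can be produced with the standard OpenSSH tools. A minimal sketch — the user@target-host.com login matches the placeholder from the variable examples above, and the file names are just illustrative:

    ```shell
    # Generate a dedicated deploy key pair with no passphrase,
    # so the CI job can load it non-interactively (step 1)
    ssh-keygen -t ed25519 -N '' -f ./deploy_key -C 'gitlab-deploy'

    # Install the public key into the deploy user's authorized_keys (step 3)
    ssh-copy-id -i ./deploy_key.pub user@target-host.com

    # Collect the host fingerprints for the SSH_HOST_KEY variable
    ssh-keyscan target-host.com > host_key.txt
    ```

    The contents of deploy_key then go into SSH_PRIVATE_KEY, and host_key.txt into SSH_HOST_KEY.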
    # .gitlab-ci.yml
    image: docker:24.0.5

    # this is required to work with docker in docker (dind)
    services:
      - docker:24.0.5-dind

    stages:
      - deploy

    deploy:
      stage: deploy
      script:
        # Install dependencies, add private key and fingerprints to the agent
        - >
          apk update && apk add openssh-client bash &&
          eval $(ssh-agent -s) &&
          bash -c 'ssh-add <(echo "${SSH_PRIVATE_KEY}")' &&
          mkdir -p ~/.ssh &&
          echo "${SSH_HOST_KEY}" > ~/.ssh/known_hosts
        # Build a docker image from the specified Dockerfile
        - >
          docker build
          -f './bot/Dockerfile'
          -t 'swinobot/swinobot:latest'
          '.'
        # Save the docker image straight to the remote host through SSH
        - >
          docker save 'swinobot/swinobot:latest' |
          ssh "${SSH_LOGIN_HOST}" 'docker load'
        # Copy the docker-compose.production.yml file to the remote host
        - >
          cat './docker-compose.production.yml' |
          ssh "${SSH_LOGIN_HOST}"
          "cat > ${DEPLOY_LOCATION}/docker-compose.production.yml"
        # Start the service from docker-compose.production.yml on the remote host
        - >
          ssh "${SSH_LOGIN_HOST}"
          "docker-compose -f ${DEPLOY_LOCATION}/docker-compose.production.yml up -d"

    That’s it. Now you can push your changes to the repository and watch the magic happen. The CI/CD pipeline will build the image, save it directly to the target host, copy the docker-compose.production.yml file, and start the updated service.


The one in the picture is the pinnacle of evolution; the other one is me: inspired developer, geek culture lover, sport and coffee addict