5 changes: 5 additions & 0 deletions .dockerignore
@@ -0,0 +1,5 @@
.git
.github
ansible
__pycache__
*.pyc
14 changes: 13 additions & 1 deletion .github/workflows/README.md
@@ -1 +1,13 @@
Here you place the github actions automation
GitHub Actions workflows live here.

Expected workflow behavior:

- run `ansible-lint`, `shellcheck`, and playbook syntax checks on pull requests
- auto-deploy to the `staging` GitHub Environment on pushes to `main`
- allow manual deploys to `staging` or `production` through `workflow_dispatch`

Required GitHub Environment secrets for each environment:

- `DEPLOY_HOST`
- `DEPLOY_SSH_PRIVATE_KEY`
- `DEPLOY_KNOWN_HOSTS`
132 changes: 132 additions & 0 deletions .github/workflows/ci-cd.yml
@@ -0,0 +1,132 @@
name: CI and Deploy

on:
pull_request:
push:
branches:
- main
workflow_dispatch:
inputs:
target_environment:
description: Target environment
required: true
default: staging
type: choice
options:
- staging
- production

jobs:
lint:
name: Lint and Syntax Check
runs-on: ubuntu-latest

steps:
- name: Check out repository
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.13"

- name: Install shellcheck
run: |
sudo apt-get update
sudo apt-get install -y shellcheck

- name: Install Ansible tooling
run: |
python -m pip install --upgrade pip
pip install "ansible>=11,<12" "ansible-lint>=25,<26"

- name: Install Ansible collections
working-directory: ansible
run: ansible-galaxy collection install -r requirements.yml

- name: Write mock staging vault file for CI checks
run: |
mkdir -p ansible/inventory/group_vars/staging
cat > ansible/inventory/group_vars/staging/vault.yml <<'EOF'
---
vault_postgres_app_password: lint-only
vault_postgres_readonly_password: lint-only
vault_postgres_superuser_password: lint-only
vault_s3_access_key: lint-only
vault_s3_secret_key: lint-only
vault_ci_deploy_ssh_private_key: |
lint-only
EOF

- name: Run ansible-lint
working-directory: ansible
run: ansible-lint .

- name: Run shellcheck on tracked shell templates
run: |
shellcheck -s bash ansible/roles/systemd_reload/templates/app-reload.sh.j2
shellcheck -s bash ansible/roles/backup/templates/app-db-backup.sh.j2

- name: Syntax check Ansible playbooks
working-directory: ansible
run: |
ansible-playbook bootstrap.yml --syntax-check
ansible-playbook site.yml --syntax-check
ansible-playbook deploy.yml --syntax-check

deploy:
name: Deploy
needs: lint
if: github.event_name == 'push' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
environment: ${{ github.event_name == 'workflow_dispatch' && github.event.inputs.target_environment || 'staging' }}

env:
TARGET_ENVIRONMENT: ${{ github.event_name == 'workflow_dispatch' && github.event.inputs.target_environment || 'staging' }}

steps:
- name: Check out repository
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.13"

- name: Install Ansible tooling
run: |
python -m pip install --upgrade pip
pip install "ansible>=11,<12"

- name: Install Ansible collections
working-directory: ansible
run: ansible-galaxy collection install -r requirements.yml

- name: Configure SSH key
run: |
mkdir -p ~/.ssh
printf '%s\n' "${{ secrets.DEPLOY_SSH_PRIVATE_KEY }}" > ~/.ssh/deploy_key
chmod 600 ~/.ssh/deploy_key

- name: Configure known_hosts
run: |
mkdir -p ~/.ssh
printf '%s\n' "${{ secrets.DEPLOY_KNOWN_HOSTS }}" > ~/.ssh/known_hosts
chmod 644 ~/.ssh/known_hosts

- name: Create temporary CI inventory
run: |
cat > ansible/inventory/ci.ini <<EOF
[${TARGET_ENVIRONMENT}]
target ansible_host=${{ secrets.DEPLOY_HOST }}

[${TARGET_ENVIRONMENT}:vars]
ansible_user=deploy
ansible_python_interpreter=/usr/bin/python3
ansible_ssh_private_key_file=$HOME/.ssh/deploy_key
ansible_ssh_common_args=-o IdentitiesOnly=yes
EOF

- name: Run deploy playbook
working-directory: ansible
run: ansible-playbook -i inventory/ci.ini deploy.yml --limit "${TARGET_ENVIRONMENT}"
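The "Create temporary CI inventory" step above can be reproduced locally as a plain shell sketch. `TARGET_ENVIRONMENT` and `DEPLOY_HOST` are stand-ins for the workflow environment variable and the `DEPLOY_HOST` secret; the 203.0.113.10 address is a documentation placeholder, not a real host:

```shell
#!/usr/bin/env sh
# Local sketch of the temporary CI inventory step; values are placeholders.
set -eu

TARGET_ENVIRONMENT=staging
DEPLOY_HOST=203.0.113.10
workdir=$(mktemp -d)

# Unquoted EOF so shell variables expand into the generated inventory,
# exactly as in the workflow step.
cat > "${workdir}/ci.ini" <<EOF
[${TARGET_ENVIRONMENT}]
target ansible_host=${DEPLOY_HOST}

[${TARGET_ENVIRONMENT}:vars]
ansible_user=deploy
ansible_python_interpreter=/usr/bin/python3
ansible_ssh_private_key_file=$HOME/.ssh/deploy_key
ansible_ssh_common_args=-o IdentitiesOnly=yes
EOF

echo "inventory written to ${workdir}/ci.ini"
```

Because the heredoc delimiter is unquoted, every `${...}` expands at write time; quote it (`<<'EOF'`) only if you want literal placeholders in the file instead.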
3 changes: 3 additions & 0 deletions .gitignore
@@ -0,0 +1,3 @@
ansible/inventory/group_vars/vault.yml
ansible/inventory/group_vars/*/vault.yml
/.ansible/
13 changes: 13 additions & 0 deletions Dockerfile
@@ -0,0 +1,13 @@
FROM python:3.13-slim

ENV PYTHONUNBUFFERED=1

WORKDIR /app

COPY app /app/app

EXPOSE 8000

HEALTHCHECK --interval=30s --timeout=5s --retries=5 CMD python -c "import urllib.request; urllib.request.urlopen('http://127.0.0.1:8000')"

CMD ["python", "/app/app/server.py"]
1 change: 0 additions & 1 deletion ansible/README.md

This file was deleted.

9 changes: 9 additions & 0 deletions ansible/ansible.cfg
@@ -0,0 +1,9 @@
[defaults]
inventory = ./inventory/hosts
roles_path = ./roles
interpreter_python = auto_silent
retry_files_enabled = False
host_key_checking = True

[ssh_connection]
pipelining = True
9 changes: 9 additions & 0 deletions ansible/bootstrap.yml
@@ -0,0 +1,9 @@
---
- name: Bootstrap Debian application hosts
hosts: app
become: true  # escalate to root via sudo for tasks that require it

roles:
- users
- sshd
- firewall
56 changes: 56 additions & 0 deletions ansible/deploy.yml
@@ -0,0 +1,56 @@
---
- name: Deploy application updates
hosts: app
remote_user: deploy
become: true

vars:
app_deploy_only: true

# Current local VM learning flow:
# - sync the local working tree from the Mac to the VM
# - keep the migration step as a placeholder
# - trigger the custom systemd reload service to rebuild/restart the app
#
# If you later switch to git-on-VM deployment:
# - keep this playbook structure
# - keep `app_deploy_only: true`
# - keep the migration hook here
# - keep the reload + healthcheck verification here
# - do NOT add extra deploy tasks here just for source copying
# - instead, change the source-delivery part inside `roles/app/tasks/main.yml`:
# - uncomment the git checkout task there
# - comment out the three local-copy tasks there
#
# So the future change belongs mostly in the app role, not in this playbook.

roles:
- app
- systemd_reload

post_tasks:
- name: TODO framework migration hook
ansible.builtin.debug:
msg: >-
Replace this placeholder with the migration command your application
framework needs. In the current flow it belongs after the app files
are synced and before app-reload.service is started.

- name: Trigger application reload service
ansible.builtin.systemd_service:
name: "{{ app_reload_service_name }}"
state: reloaded

- name: Wait for application container healthcheck to pass
ansible.builtin.command:
cmd: "docker inspect --format {{ '{{.State.Health.Status}}' }} {{ app_container_name }}"
register: app_container_health
changed_when: false
retries: 20
delay: 3
until: app_container_health.stdout == "healthy"

- name: Verify application responds on localhost
ansible.builtin.uri:
url: "http://127.0.0.1:{{ app_bind_port }}"
status_code: 200
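The `retries`/`delay`/`until` pattern in the healthcheck task above is equivalent to a poll loop. A minimal shell sketch, with `container_health` as a stub standing in for `docker inspect --format '{{.State.Health.Status}}' <container>` (the stub and its turnover point are invented for illustration):

```shell
#!/usr/bin/env sh
# Sketch of the Ansible retries/until healthcheck loop from deploy.yml.
set -eu

attempts=0
container_health() {
  # Stub: report "starting" for the first few polls, then "healthy",
  # simulating a container whose healthcheck needs time to pass.
  if [ "$attempts" -lt 3 ]; then echo starting; else echo healthy; fi
}

retries=20
delay=0   # deploy.yml uses delay: 3; zero here so the sketch finishes instantly
status=unknown
i=0
while [ "$i" -lt "$retries" ]; do
  status=$(container_health)
  [ "$status" = healthy ] && break
  attempts=$((attempts + 1))
  i=$((i + 1))
  sleep "$delay"
done
echo "final status: $status"
```

As in the playbook, the loop gives up after `retries` polls; a production script would exit non-zero in that case so the deploy fails loudly.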
46 changes: 46 additions & 0 deletions ansible/inventory/group_vars/all.yml
@@ -0,0 +1,46 @@
---
app_name: cloudops-test
app_user: deploy
app_group: deploy

app_root: /opt/app
app_storage_root: /opt/storage
app_repo_url: https://github.com/avidity/cloudops-test.git
app_image_repository: app
app_image_tag: latest
app_bind_host: 127.0.0.1
app_bind_port: 8000
app_domains:
- example.com

docker_version: "29"

# These versions now drive the Compose-managed service images inside the `app`
# stack. The old host-level `postgres` and `redis` roles are no longer used.
postgres_version: "18"
postgres_database: app
postgres_app_user: app_rw
postgres_readonly_user: app_ro
redis_version: "8"
nginx_version: "1.29"

ssh_allowed_public_keys_dir: files/ssh_keys
backup_retention_months: 6

# Real S3-compatible bucket settings.
# These can stay as placeholders while local-only backup testing is used.
# Once a real bucket exists, replace them with the real bucket name and
# endpoint, then comment out the local-only override further below.
backup_s3_bucket: replace-me
backup_s3_endpoint: https://s3.example.com

# Current local-only learning mode:
# - backups are still created on the VM under `/opt/storage/backups`
# - the upload step is skipped even though the role already supports it
# - this is useful until real S3-compatible credentials and bucket details exist
#
# When switching to real S3 upload later:
# - comment out the line below
# - set real `backup_s3_bucket` and `backup_s3_endpoint` values above
# - keep `vault_s3_access_key` and `vault_s3_secret_key` in the encrypted vault
backup_upload_enabled: false
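The `backup_upload_enabled` toggle above implies a simple gate in the backup script. A hedged sketch of that gate; the variable names and the `aws s3 cp` call are assumptions, since the real logic lives in `roles/backup/templates/app-db-backup.sh.j2`:

```shell
#!/usr/bin/env sh
# Sketch of the upload gate implied by backup_upload_enabled.
# Variable names and the aws CLI invocation are illustrative assumptions.
set -eu

backup_upload_enabled=false        # mirrors the group_vars default
backup_file="/opt/storage/backups/app-$(date +%Y%m%d).sql.gz"
backup_s3_bucket=replace-me
backup_s3_endpoint=https://s3.example.com

if [ "$backup_upload_enabled" = "true" ]; then
  # Real mode: ship the dump to the S3-compatible bucket.
  aws s3 cp "$backup_file" "s3://${backup_s3_bucket}/" \
    --endpoint-url "$backup_s3_endpoint"
  upload_result=uploaded
else
  # Local-only learning mode: keep the dump on the VM, skip the upload.
  upload_result=skipped
fi
echo "upload: $upload_result"
```

With the default `false`, backups stay under `/opt/storage/backups` on the VM; flipping the flag only makes sense once real bucket credentials exist in the vault.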
15 changes: 15 additions & 0 deletions ansible/inventory/group_vars/production.yml
@@ -0,0 +1,15 @@
---
# Current state:
# - this file is only a production scaffold
# - the `[production]` inventory group is still empty in `inventory/hosts`
#
# To switch production on later:
# - add real production hosts under `[production]`
# - replace `app_domains` with the real production FQDN(s)
# - optionally set `app_image_tag` to a specific release tag instead of `latest`
# - add the real encrypted production vault file with the database, backup, and
# deploy credentials
environment_name: production
app_image_tag: latest
app_domains:
- app.example.com
5 changes: 5 additions & 0 deletions ansible/inventory/group_vars/staging/staging.yml
@@ -0,0 +1,5 @@
---
environment_name: staging
app_image_tag: latest
app_domains:
- staging.example.com
18 changes: 18 additions & 0 deletions ansible/inventory/group_vars/staging/vault.example.yml
@@ -0,0 +1,18 @@
---
# Copy this file to:
# inventory/group_vars/staging/vault.yml
#
# Encrypt the copied file with:
# ansible-vault encrypt inventory/group_vars/staging/vault.yml

vault_postgres_app_password: change-me
vault_postgres_readonly_password: change-me

# Used by the backup role when real S3-compatible upload is enabled.
# In the current local-only VM flow, uploads are disabled, so these can stay as
# dummy values until a real bucket is wired in.
vault_s3_access_key: change-me
vault_s3_secret_key: change-me
vault_ci_deploy_ssh_private_key: |
REPLACE_WITH_CI_DEPLOY_KEY
vault_postgres_superuser_password: change-me
2 changes: 2 additions & 0 deletions ansible/inventory/host_vars/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,2 @@
Add per-host overrides here when a specific machine needs values that differ
from the environment-wide defaults in `group_vars/`.
4 changes: 4 additions & 0 deletions ansible/inventory/host_vars/debian_test.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,4 @@
---
sshd_allow_users:
- deploy
- john
27 changes: 27 additions & 0 deletions ansible/inventory/hosts
Original file line number Diff line number Diff line change
@@ -0,0 +1,27 @@
[staging]
# Replace with the real staging host or VM address before use.
debian_test ansible_host=203.0.113.10

[staging:vars]
ansible_user=deploy
ansible_python_interpreter=/usr/bin/python3
# Replace with the real private key path used to access the staging host.
ansible_ssh_private_key_file=/path/to/your/deploy/private_key
ansible_ssh_common_args='-o IdentitiesOnly=yes'

[production]
# Current state:
# - production is scaffolded only
# - no production host is configured yet
#
# To enable real production later:
# - add one or more real production hosts here, for example:
# app_prod ansible_host=203.0.113.10
# - point `ansible_ssh_private_key_file` at the real production deploy key
# - add real production variables in `inventory/group_vars/production.yml`
# - add the matching encrypted production vault values before running
# `site.yml` or `deploy.yml`

[app:children]
staging
production
7 changes: 7 additions & 0 deletions ansible/requirements.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,7 @@
---
collections:
- name: ansible.posix
- name: community.crypto
- name: community.docker
- name: community.general
- name: community.postgresql