This repository was my attempt to be as DevOps as possible with my personal website.
The automations with GitHub Actions are what really make it, though: on a push or merge to main, the infrastructure is provisioned and the source is tested, built, pushed, and deployed to AWS.
I have plans to set up some more robust monitoring / telemetry in the future, but that's obviously overkill for a website that nobody sees.
I also want to mess around with hosting it 'securely' on my homelab PC.
Here's a map of everything included:
```
.
├── app/                    # FastAPI application with HTMX
│   ├── main.py             # Routes and blog loading logic
│   ├── templates/          # Jinja2 templates
│   └── static/             # CSS, images, etc.
├── content/                # Markdown blog posts (gitignored, lives in S3)
├── scripts/                # Utility scripts
│   └── sync_content.py     # Sync local content to S3
├── infrastructure/         # Terraform for AWS
│   ├── main.tf             # EC2, security groups, IAM, S3
│   ├── scripts/setup.sh    # Instance bootstrapping
│   └── ...
├── .github/workflows/      # CI/CD pipelines
└── Dockerfile              # App container image
```
I really wanted to use FastAPI, and I had seen a video about HTMX which got me thinking about a no-JavaScript site that still had some dynamism and server-side rendering. This is a hot take, but I think that deciding we can allow arbitrary remote code execution on the client with JavaScript is a major security antipattern.
- FastAPI - Core web framework; handles the API and routing
- HTMX - Enables dynamic content without JavaScript
- Jinja2 - Templating engine
- Markdown - Blog posts with frontmatter metadata.
- Docker - Containerized for easy tests and deploys.
- Nginx - Reverse proxy sitting in front of uvicorn.
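As a sketch of how the frontmatter metadata on those markdown posts might be parsed (a hypothetical helper, not the actual code in app/main.py, which may use a library instead):

```python
import re

def parse_frontmatter(text: str) -> tuple[dict, str]:
    """Split a markdown post into (metadata, body).

    Assumes simple `key: value` pairs between `---` fences,
    which is one common frontmatter convention.
    """
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not match:
        return {}, text  # no frontmatter; whole file is the body
    meta = {}
    for line in match.group(1).splitlines():
        key, _, value = line.partition(":")
        if key.strip():
            meta[key.strip()] = value.strip()
    return meta, match.group(2)

post = "---\ntitle: Hello\ndate: 2024-01-01\n---\n# First post\n"
meta, body = parse_frontmatter(post)
```

The route handler can then render `body` through the markdown converter and feed `meta` (title, date, etc.) to the Jinja2 template.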
Everything runs on AWS and the Terraform that defines it lives in infrastructure/.
I didn't go crazy with the organization or modules since the scope is so small.
- EC2 (Ubuntu 24.04) - Single instance running Docker Compose
- Elastic IP - Pointing my Squarespace domain DNS to this IP
- S3 - Content bucket for blog posts and assets
- IAM - Least-privilege roles for EC2 (SSM access) and GitHub Actions (deploy permissions)
- Security Groups - SSH from my laptop only, HTTP/HTTPS from anywhere
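To illustrate those security-group rules, here's roughly what they look like in Terraform (a sketch, not the real main.tf; the name and the laptop CIDR are placeholders):

```hcl
resource "aws_security_group" "web" {
  name = "site-web"

  # SSH only from my laptop (placeholder CIDR)
  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["203.0.113.10/32"]
  }

  # HTTP from anywhere
  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # HTTPS from anywhere
  ingress {
    from_port   = 443
    to_port     = 443
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```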
The EC2 instance bootstraps itself via a user data script that installs Docker and pulls the latest image. No SSH required for deployments.
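A minimal sketch of what that bootstrap script might look like (the real infrastructure/scripts/setup.sh may differ; the image name and compose path here are placeholders):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Install Docker and the compose plugin from Ubuntu's repos
apt-get update
apt-get install -y docker.io docker-compose-v2

# Pull the app image and start it (names are illustrative only)
docker pull example/personal-site:latest
docker compose -f /opt/site/docker-compose.yml up -d
```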
If money were no object, I would have preferred to deploy on something like ECS or EKS, simply because those opinionated container orchestrators are so nice for not having to manage VM configurations.
S3 also stores the Terraform state, but that's hidden via .gitignore, so please don't try to dox me.
This was honestly the whole point of the exercise. I really wanted to flesh out the 'Website as Code' concept with some easy and fluid automation for updating the site incrementally.
Three workflows chain together automatically:
- Test and Lint - Runs on push to main. Ruff for linting, pytest for tests. If this fails, nothing else runs. Style issues just raise an error.
- Build and Push - Triggers after tests pass. Builds the Docker image, pushes it to Docker Hub tagged with the commit SHA, and updates the `latest` tag.
- Deploy to EC2 - Triggers after the build succeeds. Uses AWS SSM to run `docker compose pull && docker compose up -d` on the instance. No SSH keys or networking to manage with GitHub Actions runners.
So with one push to main or a merged PR, our website is updated in a matter of minutes!
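The chaining is done with GitHub's `workflow_run` trigger. A sketch of the relevant piece (workflow names and steps are illustrative, not necessarily what's in .github/workflows/):

```yaml
# build.yml (sketch) - runs only after "Test and Lint"
# completes successfully on main
on:
  workflow_run:
    workflows: ["Test and Lint"]
    types: [completed]
    branches: [main]

jobs:
  build:
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # docker build/push steps go here
```

Note the `if:` guard - `workflow_run` fires on failure too, so each downstream workflow has to check that the upstream one actually succeeded.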
Blog posts live in S3, not in git. Edit locally and sync when ready:
```
# Write/edit content locally
vim content/blog/new-post.md

# Preview locally
CONTENT_SOURCE=local uvicorn app.main:app --reload

# See what would sync
python scripts/sync_content.py --dry-run

# Publish to S3
python scripts/sync_content.py
```
No redeploy needed - sync and it's live.
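Under the hood, a --dry-run mode typically just computes the upload plan without touching S3. A simplified sketch of that logic (a hypothetical helper; the real sync_content.py may compare ETags or timestamps instead of hashes):

```python
def plan_sync(local_files: dict[str, str], remote_hashes: dict[str, str]) -> list[str]:
    """Return the keys that would be uploaded.

    local_files maps S3 key -> content hash for local content;
    remote_hashes is the same for what's already in the bucket.
    A key is uploaded if it's new or its hash changed.
    """
    return sorted(
        key for key, digest in local_files.items()
        if remote_hashes.get(key) != digest
    )

local = {"blog/new-post.md": "abc123", "blog/old-post.md": "def456"}
remote = {"blog/old-post.md": "def456"}
uploads = plan_sync(local, remote)  # only the new post needs uploading
```

With --dry-run the script would just print `uploads`; without it, it would upload those keys and exit.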
I've been using uv to manage the packages and virtual environment for this project.
```
# Create virtual environment and install dependencies
uv sync

# Or to include dev dependencies (pytest, ruff, httpx)
uv sync --dev

# Activate the virtual environment
source .venv/bin/activate
```
```
make dev     # Run locally with uvicorn hot reload
make test    # Run pytest
make lint    # Run ruff linter and auto-format
make check   # Run lint + test
make sync    # Sync local content to S3
make freeze  # Update requirements.txt from pyproject.toml
```
I was inspired in part by my preferred developer experience these days.
Though I used to be a major dark-mode enjoyer, recently I've actually gravitated to the light side.
macOS Sequoia with the Silver Aerogel terminal theme and VSCode Nord Light certainly had an impact here.
I basically wanted to emulate that setup, down to the JetBrains Mono font!
Trippy right? I wanted it to look like my desktop.
I like how the content window turned out with a simple header kind of emulating a terminal with the current working directory. :)
The background image is a photo I took of Mt. Shuksan in the North Cascades.
I'm still playing around with the sidebar, kind of going for a KDE inspired simple design where it's almost invisible.
And the best part? NO JAVASCRIPT :D
