Whenever someone creates a blog, it feels almost like a requirement that they write about how they update and manage it, so let’s run with that idea.
For me, starting this blog acted as its own mini ongoing project - a practical exercise in working with a couple of technologies I wanted more hands-on experience with. The focus was much more on how I can use these technologies to deliver and maintain the site than on how I create the actual content for it. My main website is actually hosted using GitHub Pages, which, beyond having to point my domain at it, didn’t demand any action on my part regarding hosting options, approaches, or upkeep. I commit the files I want to serve, and they promptly appear publicly without me needing to do anything else. It just works.
Funnily enough, as I write this, it’s dawned on me that the final Workflow I now follow to update this blog is almost identical to how I update my main site, making it an accidental recreation of the services provided by GitHub Pages; just at a much smaller scale. It is just me, after all.
With that said, what were the key technologies I used to achieve this?
## Astro(Paper)
Starting with the most visible elements: all content for this blog is generated using Astro, which describes itself as a “web framework for building content-driven websites”. In practice, blog posts and their metadata (titles, dates, OG images, etc.) are all written and defined in Markdown, which is then templated into static web pages when the site is built.
That said, I did not build this blog from scratch; my practical knowledge of frontend development still has a few gaps to fill first. Instead, I opted to use Sat Naing’s AstroPaper theme, which still gave me the freedom to customize things to my liking, such as theming and typography, while eliminating other areas of concern entirely - like ensuring the site is mobile-friendly, automatically generating RSS feeds, and providing build and development scripts out of the box via NPM. Once the site is built, all of its assets are placed into a single dist/ folder that I can upload and host wherever I please, which brings me onto the next piece…
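For anyone wanting to try the same setup, the local loop is just the standard npm scripts a typical AstroPaper checkout ships with (the repository URL is AstroPaper's public repo; the directory name is a placeholder):

```shell
# Grab the AstroPaper theme as a starting point and install its dependencies
git clone https://github.com/satnaing/astro-paper.git my-blog
cd my-blog
npm install

# Run a local dev server with hot reload while customizing and writing
npm run dev

# Produce the static site in ./dist, ready to upload anywhere
npm run build
```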
## Microsoft Azure
Finally, I get to put my cert to use!
Because the blog is static once built, its assets are stored in an Azure Blob Storage Container, which can be configured to allow hosting directly out of the Container. Provisioning a Container, uploading assets to it, and seeing the site live at a public URL only took a few minutes, and it is dirt cheap. I don’t need anything like server-side functionality or authentication, so this is perfect for my needs. Uploading an entire local directory’s worth of files can also be automated through the Azure CLI, which was a huge boon later on for creating workflows to update the blog:
```sh
az storage blob upload-batch -d '$web' -s [LOCAL_DIRECTORY_FILE_PATH] --account-name [AZURE_STORAGE_ACCOUNT_NAME] --subscription [AZURE_SUBSCRIPTION_ID] --overwrite
```
In order to serve the site over HTTPS, the Container is put behind an Azure CDN instance. Because I use an external registrar to manage my domains, I was also able to trivially point a subdomain to the CDN endpoint by creating a CNAME Record, all of which you’re guided through in the Azure Portal.
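For reference, the provisioning steps above can be sketched in the Azure CLI as well. This is a rough outline rather than my exact commands - all bracketed names are placeholders, and the storage account's real static-website hostname should be read back from the account rather than guessed:

```shell
# Create the storage account and enable static website hosting,
# which provisions the special $web container
az storage account create --name [STORAGE_ACCOUNT] --resource-group [RESOURCE_GROUP] --sku Standard_LRS
az storage blob service-properties update --account-name [STORAGE_ACCOUNT] --static-website --index-document index.html --404-document 404.html

# Look up the static website endpoint to use as the CDN origin
az storage account show --name [STORAGE_ACCOUNT] --query "primaryEndpoints.web"

# Front the static site with a CDN profile + endpoint to get HTTPS
az cdn profile create --name [CDN_PROFILE] --resource-group [RESOURCE_GROUP] --sku Standard_Microsoft
az cdn endpoint create --name [CDN_ENDPOINT] --profile-name [CDN_PROFILE] --resource-group [RESOURCE_GROUP] --origin [STATIC_WEBSITE_HOSTNAME]

# After creating the CNAME record at your registrar, attach the custom domain
az cdn custom-domain create --endpoint-name [CDN_ENDPOINT] --profile-name [CDN_PROFILE] --resource-group [RESOURCE_GROUP] --name blog-domain --hostname blog.example.com
```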
Because I chose to host this blog on a subdomain (blog.not-ed.com instead of just not-ed.com), using Azure CDN also comes with the added bonus of delegating the certificate management and renewal process to Microsoft, leaving me with one less thing to worry about. As a brief aside, if you were to follow a similar strategy for hosting a site in Azure at the root domain (e.g. using just not-ed.com on its own), be warned that this will not be done for you: you will still be responsible for provisioning and renewing certificates for your site if you want to serve it over HTTPS. Hosting at the root domain with this approach also requires you to configure an Azure DNS Zone and a Key Vault in addition to everything else during setup, which may or may not be something you want to take on depending on your complexity and budgetary requirements.
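To give a sense of that extra root-domain setup, the two additional resources can be sketched like so (bracketed names are placeholders; the certificate provisioning and renewal itself still happens on top of this):

```shell
# An Azure DNS Zone so Azure can answer for the apex (A/ALIAS) record,
# since registrars' CNAME records generally can't point an apex at a CDN
az network dns zone create --name example.com --resource-group [RESOURCE_GROUP]

# A Key Vault to hold the TLS certificate you bring and renew yourself
az keyvault create --name [KEY_VAULT_NAME] --resource-group [RESOURCE_GROUP]
```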
End-to-end, this solution is streamlined, costs me literal pennies to host each month, and only requires me to purge the CDN’s cache after updating any files, which can also be automated through the Azure CLI later on:
```sh
az cdn endpoint purge --content-paths '/*' --profile-name [AZURE_CDN_RESOURCE_NAME] --endpoint-name [AZURE_CDN_ENDPOINT_NAME] --resource-group [RESOURCE_GROUP_NAME]
```
With all that said, this solution is sadly not long-lasting. Given Microsoft’s recent advisory that Azure CDN will be retired in 2027, I will need to either migrate to a different CDN provider or move to other Azure products that incur upfront billing costs, like Azure Front Door. In the meantime, though, I’m happy to make this a problem for future me.
## GitHub Actions
Like all of my projects, the blog’s code and files are housed in a GitHub repository. For the final major piece of this workflow, GitHub Actions is used to automate the process of updating the site and publishing new posts.
If you have never heard of GitHub Actions before, to give a very simplified explanation: it allows you to provision a fresh machine (named a “Runner”) pre-loaded with various build tools for executing automated CI/CD Workflows on a Repository. These Workflows are defined using .yaml files, and the machine is torn down after it finishes, guaranteeing a fresh environment every time you run the Workflow. It’s a nifty little freebie that I think a lot of people don’t even realize they have access to - it’s free to use in public GitHub repos and (in my experience) provides a generous amount of free storage and runtime minutes in private ones.
For this blog, I have a GitHub Actions Workflow set up to provision an Ubuntu Runner whenever I make a commit to its Repository. When this happens, the Workflow automatically:
- Checks out the latest version of the blog’s Repository containing my changes.
- Authenticates to Azure on my behalf.
- Installs any necessary NPM Packages required to build the distributed version of the blog.
- Builds the distributed version of the blog and performs other post-build processes such as image compression.
- Clears the Azure Blob Storage Container which stores the blog’s distributed files, and uploads the newly built version in its place.
- Purges the caches of the Azure CDN Endpoint serving the blog, forcing it to retrieve the latest version from the Blob Storage Container which was just updated, and making it available to readers.
The full Workflow definition looks like this:

```yaml
name: Deploy Live Blog to Azure Blob Storage

on:
  push:
    branches:
      - main
  workflow_dispatch:

permissions:
  id-token: write
  contents: read

jobs:
  upload-job:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout main branch
        uses: actions/checkout@v3

      - name: Log in to Azure
        uses: azure/login@v2
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}

      - name: Install NPM packages
        run: npm install

      - name: Build site
        run: npm run build

      - name: Clear Site Azure Blob Storage Container
        env:
          storage-account: ${{ secrets.AZURE_STORAGE_ACCOUNT_NAME }}
        run: az storage blob delete-batch -s '$web' --pattern "*" --account-name ${{ env.storage-account }}

      - name: Upload built Site to Azure Blob Storage Container
        env:
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
          storage-account: ${{ secrets.AZURE_STORAGE_ACCOUNT_NAME }}
        working-directory: ./dist
        run: az storage blob upload-batch -d '$web' -s . --account-name ${{ env.storage-account }} --subscription ${{ env.subscription-id }} --overwrite

      - name: Purge Azure CDN Endpoint for site
        env:
          cdn-profile-name: ${{ secrets.CDN_PROFILE_NAME }}
          cdn-endpoint-name: ${{ secrets.CDN_ENDPOINT_NAME }}
          blog-resource-group: ${{ secrets.BLOG_RESOURCE_GROUP }}
        run: az cdn endpoint purge --content-paths '/*' --profile-name ${{ env.cdn-profile-name }} --endpoint-name ${{ env.cdn-endpoint-name }} --resource-group ${{ env.blog-resource-group }}
```
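The workflow_dispatch trigger in that Workflow also means I can redeploy without pushing a new commit - for example via the GitHub CLI (the workflow file name here is a placeholder for whatever yours is called):

```shell
# Manually kick off the deploy Workflow against main
gh workflow run deploy.yml --ref main

# Follow the run's progress live from the terminal
gh run watch
```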
## Workflow
Finally, when all of this comes together, the typical workflow for writing a new Post is no different to how I would manage a traditional programming project with Version Control.
To create a new post, I start by writing it in UpNote. In theory, any note-taking app which provides similar formatting capabilities to Markdown would be appropriate. I personally prefer UpNote because I can run it on my Windows, Linux, and Android devices, and because it offers a one-time lifetime purchase, which I find incredibly respectable in an age where everything is subscription-based. Because notes are synchronized automatically, it also allows me to adopt a more atomic approach to writing instead of setting dedicated time aside for it. For example, I could pull out my phone on the train, write for a few minutes, and know that when I get back home, I can pick up right where I left off to refine what I’ve written.
Once a new post has been written, I pull down the site’s files from GitHub, insert what I’ve written into a new Markdown file, apply any formatting that’s required, and provide any media and alt-text it needs:
```md
---
author: Edward Barton
pubDatetime: 2025-02-02T18:09:00Z
title: "Customary \"How I Update This Blog\" Post"
slug: customary-how-i-update-this-blog-post
featured: true
draft: false
ogImage: "@assets/images/2025/customary-how-i-update-this-blog-post/og.png"
tags:
  - Azure
  - GitHub
  - GitHub Actions
  - Cloud
  - Astro
  - Web
description: As is tradition.
---

Whenever someone creates a blog, it feels almost like a requirement that they write about how they update and manage it, so let's run with that idea.
```
Once I commit my changes, I don’t need to do anything else - the site is automatically rebuilt and published on my behalf. Besides maybe checking my billing in Azure every once in a while to make sure that my alerts are still working, everything else is practically hands-off, or was something I already needed to take care of anyway (such as managing my domain).
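Those billing alerts can themselves be set up from the CLI. I configured mine in the Portal, but a rough sketch of the equivalent command looks something like this (names, amounts, and dates are all placeholders):

```shell
# Create a small monthly cost budget scoped to the blog's resource group;
# alert notifications on top of it are easiest to configure in the Portal
az consumption budget create --budget-name blog-budget --amount 5 \
    --category cost --time-grain monthly \
    --start-date 2025-02-01 --end-date 2027-02-01 \
    --resource-group [RESOURCE_GROUP]
```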
Now all that’s left for me to do is to actually write at a half-decent pace.