Blog Setup

The original setup for this blog was extremely straightforward. The entire technology stack is:

Originally I was expecting to have to do the classic "S3 bucket serving static content" style of website, but while poking around I realized that, in setting up email forwarding for the domain, I had already done 90% of the difficult work of getting the website hosted.

The last-mile piece was interacting programmatically with Fastmail's WebDAV server, and because of the semantics of WebDAV (and the way Fastmail's folder-based hosting works) that meant writing a custom script to do the actual file wiping and uploading.
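
For anyone who hasn't touched WebDAV before, the parts this script needs boil down to a handful of HTTP verbs, all of which curl can speak directly. A quick sketch of the four operations, with placeholder URLs and credentials (not my real setup):

# List a directory's immediate children (Depth: 1)
curl -u "user:pass" -X PROPFIND --header "Depth: 1" "https://dav.example.com/site/"

# Create a directory (a "collection" in WebDAV terms)
curl -u "user:pass" -X MKCOL "https://dav.example.com/site/posts/"

# Upload a file (curl's -T flag issues a PUT)
curl -u "user:pass" -T ./public/index.html "https://dav.example.com/site/index.html"

# Delete a file or directory
curl -u "user:pass" -X DELETE "https://dav.example.com/site/old-page.html"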

Long term I'd prefer to replace this with something better, but I've included it here in case anyone is following in my footsteps.

#!/usr/bin/env bash
set -euo pipefail

LOCAL_PATH=$1   # Local directory to upload (e.g. the generated site output)
TARGET_DIR=$2   # Remote directory to publish into (path portion only)
REMOTE_URL=$3   # Base URL of the WebDAV server
USERNAME=$4
PASSWORD=$5

# Strip any trailing slash so the relative-path handling below stays consistent
LOCAL_PATH="${LOCAL_PATH%/}"

create_remote_dir() {
    local dir_url=$1
    echo "Creating directory: $dir_url"
    # MKCOL errors if the collection already exists, so ignore failures
    curl -s -o /dev/null -u "$USERNAME:$PASSWORD" -X MKCOL "$dir_url" || true
}

upload_file() {
    local file_path=$1
    local dest_url=$2
    echo "Uploading file: $file_path -> $dest_url"
    curl -s -o /dev/null -u "$USERNAME:$PASSWORD" -T "$file_path" "$dest_url"
}

# Get the list of files and directories in the target subdirectory (Depth: 1 returns only its immediate children)
echo "Fetching contents of ${REMOTE_URL}${TARGET_DIR}..."
RESPONSE=$(curl -s -u "$USERNAME:$PASSWORD" -X PROPFIND \
  --header "Depth: 1" \
  "${REMOTE_URL}${TARGET_DIR}")

# Extract hrefs (paths) from XML response
# Have to use perl here instead of a much simpler grep invocation, since I'm on a Mac with only the BSD tools preinstalled (yes, I could install GNU grep...)
PATHS=$(printf '%s' "$RESPONSE" \
  | perl -0777 -ne 'while (/<(?:[^:<>]+:)?href>(.*?)<\/(?:[^:<>]+:)?href>/gis) { print "$1\n" }')

# Delete everything the listing returned. Note: the unquoted $PATHS expansion word-splits, so this assumes remote paths contain no whitespace.
for ITEM in $PATHS; do
  # Skip the target directory itself (the server may return the href as a bare path or as a full URL)
  if [[ "$ITEM" == "${TARGET_DIR}" || "$ITEM" == "${REMOTE_URL}${TARGET_DIR}" ]]; then
    echo "Skipping $ITEM"
    continue
  fi

  # Full URL
  FULL_URL="${REMOTE_URL}/${ITEM}"

  echo "Deleting $FULL_URL"
  curl -s -u "$USERNAME:$PASSWORD" -X DELETE "$FULL_URL" -o /dev/null
done

echo "Cleanup complete."


find "$LOCAL_PATH" -type d | while read -r dir; do
    # Strip the local path prefix to get relative path
    rel_path=${dir#"$LOCAL_PATH"}
    remote_dir_url="$REMOTE_URL$TARGET_DIR$rel_path/"
    echo "Creating dir $rel_path at $remote_dir_url"
    create_remote_dir "$remote_dir_url"
done

find "$LOCAL_PATH" -type f | while read -r file; do
    rel_path=${file#"$LOCAL_PATH"}
    remote_file_url="$REMOTE_URL$TARGET_DIR$rel_path"
    echo "Uploading File $file to $remote_file_url"
    upload_file "$file" "$remote_file_url"
done

echo "Upload completed."