Posts from May 10, 2026

BlogMore v2.21.0

2 min read

After noticing a broken link in a post yesterday, I got to thinking that it would be useful to add a linter to BlogMore. So I've released v2.21.0 which adds linting support.

A number of things are checked, and the results are broken down into errors and warnings. Errors result from any of these checks:

  • Ensures all posts and pages have valid YAML frontmatter. If a file cannot be parsed, it is reported as an error.
  • Scans the generated HTML for links to non-existent internal paths (other posts, pages, categories, tags, archives, site features like search, or files in extras/).
  • Checks that all <img> sources resolve to valid internal paths or files in the extras/ directory.
  • Checks that the cover property in a post or page frontmatter points to a valid resource.
  • Verifies that all page slugs listed in sidebar_pages actually exist.
  • Checks that all internal-looking URLs in the links: and socials: configuration settings point to valid targets.
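The internal-link check can be sketched like this. To be clear, this is a minimal illustration and not BlogMore's actual code: the names `LinkCollector`, `find_broken_links`, and `known_paths` are all hypothetical, and it only looks at root-relative `href`/`src` targets.

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href/src targets from generated HTML."""

    def __init__(self):
        super().__init__()
        self.targets = []

    def handle_starttag(self, tag, attrs):
        # Only anchors and images are interesting here.
        wanted = {"a": "href", "img": "src"}.get(tag)
        if wanted:
            for name, value in attrs:
                if name == wanted and value:
                    self.targets.append(value)


def find_broken_links(html, known_paths):
    """Return internal targets that don't resolve to a known path.

    Anything that doesn't start with "/" (external URLs, fragments,
    relative paths) is ignored; only root-relative paths are treated
    as internal.
    """
    collector = LinkCollector()
    collector.feed(html)
    return [
        target
        for target in collector.targets
        if target.startswith("/") and target not in known_paths
    ]
```

Given a set of every path the site generates (posts, pages, tags, files in extras/ and so on), anything this returns is a dead internal link.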

On the other hand, the following just result in a warning:

  • Flags if a post is missing a title, category, tags, or a date.
  • Reports if a post's date or modified date is set in the future.
  • Notes if a post's modified date is earlier than its original publication date.
  • Identifies if two or more posts share the exact same title.
  • Flags inline images missing an alt attribute, or those with an empty/whitespace-only alt attribute.
  • If clean_urls is enabled, warns if internal links point explicitly to index.html.
  • Reports internal links using the full site_url (e.g., https://example.com/path/) instead of a root-relative path (/path/).
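The date-related warnings boil down to a couple of simple comparisons. Here's a hedged sketch of how such checks might look; `date_warnings` and its wording are my invention, not BlogMore's implementation:

```python
from datetime import datetime, timezone


def date_warnings(published, modified=None, now=None):
    """Yield warning strings for suspicious post dates."""
    now = now or datetime.now(timezone.utc)
    if published > now:
        yield "date is set in the future"
    if modified is not None:
        if modified > now:
            yield "modified date is set in the future"
        if modified < published:
            yield "modified date is earlier than the publication date"
```

Each warning is independent, so a post with a future publication date and an even-later modified date would get two of them.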

I feel these cover most of the things that are low-cost to detect while having a positive impact on the state of a blog's content.

One thing I've not done is any sort of checking of external links. That would be costly and could have unintended consequences that I don't want to be messing with (perhaps a tool to export the list of external links for checking could be useful, at some point).

Having run this against this blog, I did find some things that needed cleaning up, mostly absolute links that could be turned into root-relative links (always good for making the content portable).

I'm going to make this a standard part of my "I'm ready to publish" check for this blog, and it should also be helpful as I carry on migrating the images in the blog over to WebP.

An argument with Gemini

1 min read

At the moment I'm working on a linting command for BlogMore. Having given up on Copilot/Claude for this, I've been having quite a bit of success with Gemini CLI. But while doing so, I've noticed some odd things with it. It has a habit of cargo-culting some changes, or just rewriting code that doesn't need it.

For example, the tests for the new linting tool: it keeps adding import pytest near the top of the test file despite the fact that pytest isn't used anywhere in the code. Every time I remove it, and every time it adds more tests, it adds it back.

Another thing I've noticed is it seems to be obsessed with adding indentation to empty lines. So, if you've got a line of code indented 8 spaces, then an empty line, then another line of code indented 8 spaces, it'll add 8 spaces on that empty line. That sort of thing annoys the hell out of me¹.

But the worst thing I just ran into was this. It had written this bit of code:

def lint_site(site_config: SiteConfig) -> int:
    """Convenience function to run the linter.

    Args:
        site_config: The site configuration.

    Returns:
        0 if no errors, 1 if errors were found.
    """
    linter = Linter(site_config)
    return linter.lint()

On the surface this seems fine: a function that hides just a little bit of detail while providing a simple interface to a feature. But creating a variable only to "discard" it on the next line... nah. I dislike that sort of thing; the code can be just a little more elegant. So, seeing this, I edited it to be (removing the docstring for the purposes of this post):

def lint_site(site_config: SiteConfig) -> int:
    return Linter(site_config).lint()

Nice and tidy.

I then had Gemini work on something else in the linting code. What did I see towards the end of the diff? This!

[Image: A sneaky edit]

Sneaky little shit!

Now, sure, the idea is that you review all changes before you run with them, but knowing that any given change is likely to rewrite parts of the code unrelated to the problem at hand adds a lot more overhead, and I wonder how often people using these tools even bother.


  1. I've seen some IDEs do that on purpose too; I've got Emacs configured to strip that out on save.

More syncing GitHub to GitLab and Codeberg

1 min read

Following on from my first post about this, I've tweaked the script I'm using to backup a repo to GitLab and Codeberg:

#!/bin/sh

# Check if the current directory is a Git repository
if ! git rev-parse --is-inside-work-tree > /dev/null 2>&1; then
    echo "Error: This directory is not a Git repository."
    exit 1
fi

REPO_NAME="$1"

# If no repository name was provided, try to get it from the origin remote
if [ -z "$REPO_NAME" ]; then
    ORIGIN_URL=$(git remote get-url origin 2>/dev/null)
    if [ -n "$ORIGIN_URL" ]; then
        REPO_NAME=$(basename -s .git "$ORIGIN_URL")
    else
        echo "Error: No repository name provided and no 'origin' remote found."
        echo "Usage: $0 <repo-name>"
        exit 1
    fi
fi

echo "Configuring multi-forge backup sync for: $REPO_NAME"

# Set up the remote called backups. Anchor it to Codeberg.
git remote remove backups > /dev/null 2>&1
git remote add backups "ssh://git@codeberg.org/davep/${REPO_NAME}.git"

# Set up the push URLs.
git remote set-url --push backups "ssh://git@codeberg.org/davep/${REPO_NAME}.git"
git remote set-url --add --push backups "git@gitlab.com:davep/${REPO_NAME}.git"

# Only ever backup main.
git config remote.backups.push refs/heads/main:refs/heads/main

# Also backup all tags.
git config --add remote.backups.push 'refs/tags/*:refs/tags/*'

echo "----------------------------------------------------"
echo "Backups configured:"
git remote -v
echo "----------------------------------------------------"
echo "To perform the initial sync, run: git push backups"

### setup-forge-sync ends here

The changes from last time include:

  • The repo name now defaults to whatever is used for GitHub, so I don't have to copy/paste it or type it out.
  • It now backs up all the tags too.

I've been running with this for a couple of days now and it's proving really useful. Well, when Codeberg is available to push anything to...