Since I was in university, the application hosted on this domain – or, rather, its predecessor domains, because I’ve switched a few times (all of the previous ones do redirect here, though) – had been WordPress. But a short while ago I switched to the Jekyll static site generator.

I had been considering moving from WordPress to a static site for a number of reasons, but I ended up going with Jekyll specifically because:

  • Jekyll is the default static site generator for GitHub Pages. While I actually use GitLab Pages (which, clearly, also supports it) and not GitHub Pages, using Jekyll means I get double the relevant documentation and an easier time switching should I ever need to (I probably won’t ever, just because Microsoft, but having the option is nice).
  • Jekyll’s written in Ruby, and I like Ruby. It’s a fun language that sits nicely between Python (which I use quite a bit) and Perl (which I pretend I use a lot for the bit).

But if you’re accessing this site at any domain other than cblanke2.gitlab.io (like cblanke2.com or cblanke2.sdf.org), then it’s not the version hosted on GitLab Pages; you’re looking at one of the places I’m mirroring this site! And the ability to easily mirror my site across different hosts was one of the reasons I wanted to go static. While I could use some DNS magic or load balancing to serve the same domain from multiple hosts, having separate domains adds another layer of redundancy should one of them lapse, or get hijacked, or something like that. And nothing is really stopping me from load balancing on my primary domain while still keeping the domain-separate mirrors, so I could do both if I really wanted to. I just don’t want to right now.

I did think about what it would take to do something similar – domain-separate mirrors and/or redundant hosts – with my previous WordPress site, but that’d be overkill for my personal site. I really just want something simple here. Always have. Always will. Even when I ran WordPress I kept the number of plugins and themes to a minimum, mostly using standard pages and posts. But WordPress is inherently a little complex. All web applications are to some degree, but that complexity grows when you run separate copies and try to keep them in sync. Now, if I make a change to a page or write a new post or something, all I need to do is commit my changes to the repo and run a few rsyncs/sftps to the various hosts. I don’t need to set up MySQL replication, repair MySQL replication when something goes wrong, worry about virtual networking between the separate hosts, repair MySQL replication when something goes wrong, troubleshoot lsyncd, or repair MySQL replication when something goes wrong. And if I want to move/mirror my site elsewhere (soon, all will become Elvis mirrors), I don’t need to worry about database exports or anything like that; I just need to copy-paste/rsync/sftp some HTML, maybe adjust some DNS records, and then I’m done. The fewer moving parts of a static site also mean fewer security concerns and lower resource usage.
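To make that concrete, the whole “publish everywhere” step amounts to roughly a script like this. It’s a sketch only; the hostnames and paths are placeholders rather than my actual servers:

```ruby
#!/usr/bin/env ruby
# Sketch of the publish step: build the site, then rsync it to each mirror.
# The hosts and paths below are placeholders.

MIRRORS = [
  "me@shared-host.example.com:~/public_html/",
  "me@mirror.example.org:~/html/",
].freeze

# Build the static site into _site/.
system("bundle", "exec", "jekyll", "build") or abort("jekyll build failed")

# Push the generated HTML to every mirror.
MIRRORS.each do |target|
  system("rsync", "-az", "--delete", "_site/", target) or warn("rsync to #{target} failed")
end
```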

TL;DR: simplicity, flexibility, security, and efficiency. Those are the reasons I went static.

My previous site had quite a few posts that I wrote over the years. When I first started actively working to move things over to a static site (late 2024/early 2025) I thought about bringing them all over using one of the available WordPress-to-Jekyll migration tools. But the more I worked on the site (and the more I re-read many of those early posts), the less I actually wanted to do that. There are a few paragraphs here and there that I think are worth preserving, but I eventually decided that rather than do a bunch of post-migration cleanup it would be better to start fresh and hand-pick what I thought was valuable enough to bring over, either in whole or in new posts incorporating the salvaged fragments. The manual approach felt simpler. At least to me: a neurotic SysAdmin who is less than a decade clean from distrohopping but has the itch to start again. And in making that decision I was finally able to just do the damned thing and put the static site live.

Now, before I move on to where I totally go back on my stated goal of keeping things simple, I want to mention the other hosts I’m using besides GitLab Pages. Naturally I’m running the copy served by my primary domain on Reclaim Hosting’s Shared Hosting server because that’s where I work, but what about the copy served at cblanke2.sdf.org? Well, that’s hosted on the SDF Public Access UNIX System server(s). I’ve been a proper member for a few years now and, while I mostly just lurk (Rule 33), it’s a great community of people, and one that I highly recommend joining if you’re an IT professional, have an interest in retrocomputing, or just appreciate the novelty of it all.

Alright, so, if you know me in any capacity you should know that I’m a big proponent of digital privacy (I say without any trace of irony on a site plastered with my real name and picture and CV and links to my social media); Tor in particular. Yet another goal of mine in switching over to a static site was the ability to also make it accessible as a Tor hidden service. Y’know, a .onion. While I could make a WordPress site accessible in such a way, it would have just further complicated the already existing hosting dilemma. And, in my experience, things can get weird and complex running WordPress as a Tor hidden service. Yet again, going with a static site helped keep things simpler.

“But what’s the point of doing that when you’re not trying to actively hide where your site’s hosted in order to, for example, circumvent some unjust censorship?” you ask. “You’re not writing anything important enough to get you censored!” you say.

“Shut up,” I respond. “Let me have fun. Also, let’s circle back on that whole censorship thing in a year or so.”

Through my work as a web hosting-focused SysAdmin, I’ve become decently familiar with using Cloudflare. I’ve also become decently familiar with the pain of fighting off LLM bots scraping websites for the fuel needed to generate more environment-destroying artificially intelligent auto-responses. Cloudflare’s various methods of blocking/limiting LLM crawlers and other annoying bots already had my interest, but using Cloudflare on a static site still felt like yet another case of overkill. I mean, if any type of site can handle requests like all get out, it’d be a static site. So I was on the fence, until I saw that Cloudflare has built-in onion routing that allows sites run through Cloudflare to be accessible as Tor hidden services. If Cloudflare could handle making my site available as a hidden service automatically, then it just meant one less thing for me to do. And with that I decided to route my domain through Cloudflare to get both the security benefits and the onion routing.

However, I haven’t been able to get Cloudflare’s onion routing thing working as expected. And based on the Cloudflare Forums, neither have other folks. Or maybe I just have poor reading comprehension. There are a number of possibilities here. It did make me wonder if Cloudflare was still worth using for my site, but I decided to continue since I already had everything set up there. Also, pivoting away felt pointless, especially since I was thinking about using Cloudflare Pages as yet another mirror host.

Anyways, I just accepted that if I wanted a mirror of my site accessible as a hidden service in the exact way I expected, I would need to put in the work myself and set up a Tor server. And unless there’s some redirect weirdness when both the alt-svc header from Cloudflare and the onion-location meta tag in the site’s HTML are set, running such a Tor-enabled mirror while keeping onion routing enabled on Cloudflare shouldn’t be a problem. Probably.

So I set up a Tor server. Since this is just a mirror of a static site that likely won’t be accessed all that often by anyone but me, and not something super-essential, I went with a rather small VM on DigitalOcean running Debian. I did a bit of hardening and tacked on a few firewalls before anything else, because what kind of monster leaves a server unprotected on the internet?
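For what it’s worth, the host-level firewall part of that boils down to something like the following ufw rules. This is a rough sketch of the idea rather than my exact setup; the point is that SSH is the only thing that needs to be reachable from the outside, since the web server (as you’ll see) only listens on localhost and Tor only makes outbound connections:

```
# Default-deny inbound, allow outbound, and only expose SSH.
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh
sudo ufw enable
```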

I went with lighttpd for the web server software over something like Apache or Nginx because it’s super light (thus the name; far lighter than anything else I’ve encountered in the many years I’ve spent managing web servers) and RAM is gold on my tiny-as-hell VM. Since I intended to run it behind Tor rather than make it accessible on the clear web, I set server.bind to 127.0.0.1, and once I ensured the default page was visible via curl on localhost (and not anywhere else), I started on Tor itself.
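The relevant part of the lighttpd config ends up being only a few lines; roughly something like this (the document root here is illustrative, it’s just wherever the generated _site output gets synced to):

```
# /etc/lighttpd/lighttpd.conf (excerpt)
server.document-root = "/var/www/html"
server.port          = 80
server.bind          = "127.0.0.1"   # only listen on localhost; Tor handles the rest
```

(The localhost curl check is then just curl -I http://127.0.0.1/ run from the VM itself.)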

The Tor Project has great instructions for installing Tor and setting up a hidden service. During this process I did find out that Tor does not like hyphens (or maybe it was periods) in the HiddenServiceDir value, but once I swapped them out for underscores everything worked as expected. Meaning, my personal site now has a Tor mirror! So if you’re accessing this site using the Tor Browser, you should be notified that a .onion is available due to the onion-location meta tag. Normally, this value would just send you to the site’s homepage no matter what page you were on, but with some Jekyll/Liquid magic (tacking page.url | relative_url onto the end of the .onion URL), each page is generated with its own page-specific onion-location value, ensuring proper redirects.
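For reference, the hidden service side of this is only a couple of lines in torrc, and the page-specific onion-location bit is a one-liner in the site’s head include. Both snippets below are sketches: the directory name and the .onion hostname are placeholders, not the real ones.

```
# /etc/tor/torrc (excerpt)
# Underscores in the directory name, since Tor complained about other characters.
HiddenServiceDir /var/lib/tor/site_mirror/
# Map the onion service's port 80 to the local lighttpd.
HiddenServicePort 80 127.0.0.1:80
```

And the meta tag in the Jekyll layout:

```html
<!-- In the layout's <head>; the onion hostname below is a placeholder. -->
<meta http-equiv="onion-location"
      content="http://youronionaddressgoeshere.onion{{ page.url | relative_url }}" />
```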

Now, if you’ve ever gone to a regular site that is also served as a Tor hidden service, you might have noticed that its hidden service URL isn’t as random as you’d expect. For example, the onion URL for Bellingcat actually starts with “bellingcat”! Also, the onion URL for my site starts with cblanke2, if you didn’t already notice. It turns out that there are tools you can use to generate a vanity address for your Tor hidden service. I went back and forth between using a vanity address and just keeping the random address for my Tor mirror, initially going with the latter before finally brute-forcing myself a vanity address (because I am neurotic and insane). Using a vanity address didn’t mean I had to outright abandon the original random one, though; I was able to just point both of them to the same web server in my torrc. So now I have two .onions. I love onions. I ate one like an apple once. Raw. I’d probably do it again.
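Anyway, if you’re curious how the two-address setup looks in practice, pointing both onions at the same server really is just a second HiddenServiceDir block in torrc. The directory names below are placeholders, and the vanity keys themselves come out of one of the usual brute-force tools (mkp224o being the common one for v3 addresses):

```
# /etc/tor/torrc (excerpt) - two onion services, one web server
# The original, random address.
HiddenServiceDir /var/lib/tor/site_mirror/
HiddenServicePort 80 127.0.0.1:80

# The vanity address; the generated keys get dropped into this directory.
HiddenServiceDir /var/lib/tor/site_mirror_vanity/
HiddenServicePort 80 127.0.0.1:80
```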