I’ve concluded I want comments, despite all the difficulties they pose. How do I add them to this static site when even relative dates are verboten? Let’s explore my options.

Reddit:

I could create a thread on Reddit or Hacker News for every post. It sounds tedious. I wouldn’t have any control over the comments. Nor would I own the data. Reddit could disable my account tomorrow and stop me from commenting or posting entirely, and I wouldn’t be able to do a thing about it.

Disqus:

I could use Disqus, but as James3 outlined in the Netlify forums:

  • They come with ads in their free plan
  • They collect user data (very unfriendly to your privacy)
  • They are breaking the GDPR law and sharing visitor data
  • Performance degradation (sends many 3rd-party requests)

On top of all that, it would require JavaScript to participate. You know how I feel about depending on JavaScript, even if it’s only for comments. And it would still leave me reliant on a third party to continue providing my data, just like above.

Isso:

I could use Isso, ‘a commenting server similar to Disqus’. However, this is another JavaScript-dependent solution.

Netlify Edge Handlers:

I could use Netlify Edge Handlers—if I were granted access to the beta—to rewrite my page as it was served and insert comments, in conjunction with their ‘Forms’ and their ‘Functions’. This is conceptually sound. I’d be responsible for accepting, processing, and storing comments, so I’d control every aspect of it. I could rewrite the page to put them in the HTML directly, instead of requiring JavaScript.

On the other hand, this would require me to build and integrate many different pieces of functionality, all of them specific to Netlify. And, more practically, there are logistical issues:

By default, Edge Handlers have limits for the amount of memory and execution time they can use:

  • memory limit: 256 megabytes
  • execution time limit: 50 milliseconds

So this doesn’t seem to be a viable option.

Jamstack Comments Engine:

From the JCE site:

[…] this approach uses traditional http form posts and continuous integration to automatically build your comments directly into your site.

The flow goes like this:

  1. A user submits a comment to the comments queue form on your page. That form posts to the form handling facility in Netlify where the site is hosted.
  2. The form submission triggers a call to a Lambda function which passes the details of the comment along to Slack where a site administrator can review the comment, and click a button to accept or reject the comment.
  3. Rejected comments get deleted from the comment queue.
  4. Accepted comments get posted into the approved comments form, which automatically triggers a build and deployment of the site. Accepted comments are also deleted from the queue.
  5. The site build pulls all the approved comments from the Netlify submissions API, and then generates all of the pages (complete with their comments) with a static site generator (the simple and elegant 11ty).
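The moderation step in that flow is the interesting one. As a sketch, here’s what the Lambda-style Netlify Function that forwards a submission to Slack might look like. The field names (`name`, `comment`, `path`), the `SLACK_WEBHOOK_URL` variable, and the use of Slack’s legacy attachment buttons are all my assumptions for illustration, not details taken from the Jamstack Comments Engine itself:

```typescript
// A hypothetical comment submitted through the Netlify comment-queue form.
interface CommentSubmission {
  name: string;
  comment: string;
  path: string; // the page the comment was left on
}

// Build a Slack message with approve/reject buttons (Slack's legacy
// attachment/action format, used here for brevity).
export function buildSlackMessage(c: CommentSubmission) {
  return {
    text: `New comment from ${c.name} on ${c.path}`,
    attachments: [
      {
        text: c.comment,
        callback_id: "comment-moderation",
        actions: [
          { name: "approve", text: "Approve", type: "button", value: c.path },
          { name: "reject", text: "Reject", type: "button", value: c.path },
        ],
      },
    ],
  };
}

// The function Netlify would invoke on a form submission event; the
// `payload.data` shape follows Netlify's submission-created event body.
export async function handler(event: { body: string }) {
  const submission = JSON.parse(event.body).payload.data as CommentSubmission;
  await fetch(process.env.SLACK_WEBHOOK_URL!, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSlackMessage(submission)),
  });
  return { statusCode: 200, body: "queued for review" };
}
```

The approve/reject buttons would in turn call back into another function that moves the comment between the two forms.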

Submitting forms to create pull requests intrigues me. The PRs should ideally be debounced: I don’t mind a delay of a minute before a comment shows up in the queue. I would probably keep the comments in a separate repository and include it as a submodule in my main repository, updating the submodule on rebuilds, and I would rebuild the blog every time I approved comments. GitLab even supports multi-project pipelines, so I could do that automatically.
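As a sketch of how that automation might look with GitLab’s multi-project pipelines (the project path and branch names here are purely illustrative), the comments repository could trigger a rebuild of the blog whenever an approved comment lands:

```yaml
# .gitlab-ci.yml in the hypothetical comments repository: on every push
# to main (i.e. an approved comment), trigger a pipeline in the blog
# project, which updates the submodule and redeploys.
stages:
  - trigger

rebuild-blog:
  stage: trigger
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
  trigger:
    project: shivjm/blog
    branch: main
```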

Streamlining builds

Speaking of which, rebuilding is a tad problematic. Deploying this site has been taking between four and four-and-a-half minutes on average. That means a maximum of 300[1] ÷ 4 = 75 builds a month in the best case, which includes all the builds that occur through my normal tinkering—both Deploy Previews and regular deployments. If I started re-deploying the site every time a comment was added, I’d have to either approve comments in large batches or reduce the velocity of my development. Neither is an appealing choice.

The build process spends most of its time installing Sharp and generating responsive images. Before I added those, it took less than a minute. I need to avoid this step, and I’m not about to abandon responsive images altogether. Let’s find another solution.

Taming responsive image generation

Netlify has an intriguing feature that lets you resize images on the fly. The hitch is that you have to use Netlify as your Git LFS provider, i.e. your actual images must live on Netlify’s servers and your Git repository must contain only pointers to them. It would make me unhappy to split my data between GitLab and Netlify and sacrifice control over caching. It would also tie me further to Netlify (not irreversibly—undoing the change later is not hard, it just requires care). Oh, and the free plan allows 2,500 ‘transformations’ per month, which means 2,500 ÷ 2[2] = a maximum of 1,250 distinct images… never mind, this may not be a problem for me just yet.
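For context, these transformations work through query parameters on the image URL (`nf_resize` plus `w`/`h`, per Netlify’s image transformation documentation), so generating a `srcset` is just string assembly. A minimal helper, assuming a hypothetical list of widths:

```typescript
// Build a srcset string using Netlify Large Media's on-the-fly image
// transformation query parameters (`nf_resize`, `w`). Each distinct
// width requested counts as one 'transformation' against the quota.
export function netlifySrcset(path: string, widths: number[]): string {
  return widths
    .map((w) => `${path}?nf_resize=fit&w=${w} ${w}w`)
    .join(", ");
}

// netlifySrcset("/images/cat.jpg", [640, 1280])
// → "/images/cat.jpg?nf_resize=fit&w=640 640w, /images/cat.jpg?nf_resize=fit&w=1280 1280w"
```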

Therefore, I’ve decided to have a separate image server, perhaps running at images.shivjm.blog. It can resize images on the fly before serving them. I thought I’d use imageflow, which I’ve been eyeing for years, and run the server on DigitalOcean’s App Platform at something like $5 per month. (Maybe I’ll even find a free option at some point. This has such low requirements. I’m even tempted to use Scaleway’s dirt-cheap Stardust instances.) However, the code looks, shall we say, immature, and I can’t find a way to make it produce ETag headers that doesn’t require me to fork the project and set up the entire build system. I would like to avoid adding hashes to the filenames at this point, so adding immutable cache headers to all the images as I did previously is not an option.

If not imageflow_server, I could use express-responsive-images piped through imagemin for optimization, though imagemin is where the build currently spends most of its time.

I whiled away an enjoyable few hours building a responsive image server using imageflow on Kubernetes, but I realized midway that I was about to both spend money and take on responsibility for many moving parts (a subdomain, a Kubernetes cluster, a Rust HTTP server, a Traefik ingress controller, and more) just to avoid using a readymade Netlify solution that was probably more efficient and mature than anything I could cobble together. What’s more, if I ever needed to leave Netlify, I could always build such a server then.

So I let my better sense prevail. The images are now powered by Netlify Large Media and my builds are down to a minute again. It was worth spending a night on. (…I think.)

That’s all for now. I’ll talk about what’s new on the site next time.


  1. The number of build minutes I get for free.
  2. How many sizes of images I generate.