Team blog: Developers

All forms now processed using Zod

As mentioned in our last blog post, all forms (e.g., the new review creation form, or the blog post form I’m typing into right now) now use Zod. Zod is a type-safe validation library that helps us catch errors in form data early, before it gets processed any further.

With that said, it’s entirely possible that there are new bugs. If a form doesn’t save correctly, or something else looks weirdly broken, please don’t hesitate to let us know.


More (or less) fun with TypeScript

We’ve been quiet, but we’ve not been idle. As mentioned in our previous blog post, we’ve focused on the following improvements:

  • Adding a code formatter and linter. We’re now using Biome, which is a joy to use: fast & powerful.

  • Strengthening type passthrough from our data access layer (DAL) to the application, so the TypeScript compiler can catch when the code is trying to access data incorrectly.

  • Eliminating the use of raw SQL in the application by building more helpers into the DAL that handle all data lookups.
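The value of typed passthrough is easiest to see in a tiny sketch. Everything below is illustrative — these are not the actual lib.reviews DAL types or helpers:

```typescript
// Hypothetical row type; the real DAL's shapes differ.
interface ReviewRow {
  id: string;
  starRating: number;
  createdAt: Date;
}

// Because the helper returns a typed row, call sites can't silently
// read a column that doesn't exist -- the compiler flags the typo.
function getReview(
  rows: Map<string, ReviewRow>,
  id: string,
): ReviewRow | undefined {
  return rows.get(id);
}

const rows = new Map<string, ReviewRow>([
  ["r1", { id: "r1", starRating: 5, createdAt: new Date() }],
]);

const review = getReview(rows, "r1");
// review?.starRaing  <-- would not compile: typo caught statically
console.log(review?.starRating);
```

With raw SQL, a misspelled column name only fails at runtime; with typed helpers, it fails before the code ever runs.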

TypeScript is a gift that can keep on giving forever, and the next major milestone would be to ramp up the strictness of our type checks, which will require lots of further code changes throughout the app.

For now, the immediate next focus area is an exploration of Zod, a pretty nifty validation library that may come in handy for replacing our bespoke form processing code. And then, finally, we can start adding some features again.


Hello TypeScript & ESM

The lib.reviews codebase is now fully in TypeScript. But notice how the title doesn’t say “Goodbye, JavaScript!”. That’s because TypeScript is, of course, transpiled to JavaScript on the client and the server. So, lib.reviews is still fundamentally a JavaScript codebase, just one that’s statically typed.

In a nutshell, TypeScript makes it easier to avoid common mistakes in programming — e.g., situations where you’re passing a garbage value to some function that’s expected to do important work. Instead of just letting you run the code (resulting in unpredictable behavior or errors), TypeScript will yell at you and say: “Wait, that value could be garbage. Are you sure that’s what you meant to do?”
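Here’s a tiny illustration of the kind of mistake this catches (the example is ours, not from the codebase):

```typescript
// With strict checking, TypeScript forces the caller to deal with
// the possibility that a lookup came back empty.
function getTitle(titles: Map<number, string>, id: number): string | undefined {
  return titles.get(id);
}

const titles = new Map([[1, "Some Review"]]);

// Map lookups may return undefined, so this would not compile:
//   const upper = getTitle(titles, 2).toUpperCase();
// TypeScript: "Object is possibly 'undefined'."

// The fix is to handle the garbage case explicitly:
const title = getTitle(titles, 2) ?? "(untitled)";
console.log(title.toUpperCase()); // safe: title is always a string here
```

In plain JavaScript, the commented-out line would crash at runtime with a `TypeError`; TypeScript refuses to build it in the first place.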

But all of that happens before you build the project. The version that the user (you) ends up running is just pure JavaScript again.

This was not quite as heavy a lift as the PostgreSQL migration, but it still had to be structured into two phases. First, we converted the codebase to the ECMAScript Module standard. Any application that’s more than a single file has to worry about how it loads the files it depends on; ESM is the modern method for doing that. It avoids many pitfalls of the older CommonJS standard, and it works both in browsers and in Node.js, which makes for some nice consistency.
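The difference in loading style looks roughly like this (a sketch; the module name is illustrative):

```typescript
// CommonJS, the older style the codebase moved away from:
//   const express = require("express");
//
// ESM, the standard used now:
//   import express from "express";

// ESM also adds dynamic import(), which returns a promise and works
// at top level in modern Node.js -- handy for loading code on demand:
const { sep } = await import("node:path");
console.log(sep); // platform path separator, e.g. "/" on Linux
```

Because static `import` declarations can be analyzed without running the code, bundlers and type checkers can reason about the whole dependency graph up front — one reason so much modern tooling assumes ESM.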

While TypeScript doesn’t strictly depend on ESM, a lot of the current tooling in the JavaScript ecosystem is built around it. So it made sense to take care of this migration first.

Then came the TypeScript migration itself — which was mostly a matter of, one file at a time, spelling out exactly how each function or class is supposed to behave.

Although the code is now fully TypeScript, there are still improvements to be made for it to pass strict validation. Here’s what’s next:

  • Adding a code formatter and linter (likely Biome, replacing historical eslint usage). This will help ensure the codebase is consistently formatted, and will catch other common coding issues.

  • Re-architecting the data access layer. This is what connects us to PostgreSQL. While the first version of the DAL works well enough, the bootstrap logic is needlessly complicated, and it is not designed with static typing in mind.

  • Auditing remaining TypeScript validation issues and ensuring we use correct types wherever possible.

Once all this modernization work is done, we’ll be in a good place to start adding features again. As always, please let us know about any new issues that may crop up.


Database dumps now in new format

Following up on the PostgreSQL migration, we’ve just published the first database dump as a .sql file for PostgreSQL. You can download it from the usual address.

The database dumps include all public information from lib.reviews. What’s not included:

  • passwords and email addresses

  • user preferences (all are normalized to be identical)

  • deleted revisions

  • session data

  • team “join requests” (which may be regarded as DMs to a team’s moderators)

We do not retroactively strip deleted revisions from dumps published before the deletion, so in principle they remain recoverable for a while. However, we delete older dumps to free up disk space, so eventually deleted revisions will disappear fully from all publicly available archives.

You can import the public database dump into a local development environment and look around (but you’ll need to create a new user account to do anything, as the existing ones are effectively locked).

Public archives are an important commitment to user freedom. They make it possible to fork websites, and to reclaim your own contributions for your own use. If you encounter any issue with the database dumps, please don’t hesitate to let us know!


Goodbye RethinkDB, hello PostgreSQL

lib.reviews is now powered by PostgreSQL, the veteran open source database that drives much of the open web (and a good share of the closed web, too).

When we started, we used RethinkDB, an interesting document-based database that seemed well-suited for a project like this one, where review subjects don’t necessarily conform to a fixed structure — books, movies, restaurants all need different kinds of metadata.

But RethinkDB ran out of money, and it has been on life support ever since. That doesn’t make it a future-proof backend for a website. PostgreSQL, meanwhile, nowadays supports flexible column types that can store JSON documents. We use this for our multilingual text fields, review metadata, and more.
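For example, a multilingual text field can live in a single JSON column. Here’s a rough TypeScript model of the idea — the shape is illustrative, not our actual schema:

```typescript
// A multilingual string as it might be stored in a JSONB column:
// a map from language code to text. Illustrative, not the real schema.
type MultilingualString = Record<string, string>;

// Resolve a string in the requested language, falling back to a
// default language if no translation exists.
function resolve(
  text: MultilingualString,
  lang: string,
  fallback = "en",
): string | undefined {
  return text[lang] ?? text[fallback];
}

const label: MultilingualString = { en: "Library", de: "Bibliothek" };
console.log(resolve(label, "de")); // "Bibliothek"
console.log(resolve(label, "fr")); // no French text, falls back to "Library"
```

This keeps the schema flexibility we liked about a document database, while the rest of the data lives in ordinary relational tables.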

Migrating was a major technical lift, as you can see in the 97 commits that comprise the pull request. However, it is now done, and we got a lot more test coverage out of it along the way.

We also ported the database dump script over to PostgreSQL, and will soon be providing sanitized database dumps in the new format.

We’ve tested this extensively, but there are bound to be bugs. If you find one, please don’t hesitate to open an issue!


New languages enabled

We’ve enabled the following new languages for UI and content: Arabic (experimental), Finnish, Hindi, Lithuanian, Slovak, Slovenian, Turkish, and Ukrainian. Huge thanks to all the volunteer translators who have made this possible. :)

To create a version of lib.reviews in your language, please see our FAQ.


Goodbye Grunt, hello Vite

In today’s modernization epic, we say goodbye to Grunt and Browserify, two very old-school ways of orchestrating the build process for JavaScript and CSS. Their place is taken by Vite, which is faster and comes with some nice development tooling.

As a developer, before this change, if I wanted to change some JS and see it live, I would have to stop the server and rebuild the code. Now, I can simply save the JS file, and Vite will take care of everything in the background.
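Conceptually, the setup looks like a fairly standard Vite configuration. This is a generic sketch, not our actual config file:

```typescript
// vite.config.ts -- a generic sketch, not the actual lib.reviews config.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    // Emit hashed bundles plus a manifest that server-side
    // templates can use to reference the right files.
    outDir: "static/build",
    manifest: true,
  },
  server: {
    // During development, Vite serves and hot-reloads frontend assets
    // while the Node backend keeps running untouched.
    port: 5173,
  },
});
```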

As part of this, we’ve started to modernize the frontend code using JavaScript modules, which Vite is much more comfortable with than the older require syntax.

The production site is powered by this new Vite tooling; if you notice any issues, let us know!


Goodbye greenlock, hello certbot

As part of modernizing the codebase, we had to phase out greenlock, a neat library for automatic HTTPS certificate management directly within Node. Unfortunately, it’s no longer maintained. Fortunately, certbot has come a long way over the years and makes HTTPS cert management quite painless!

lib.reviews now directly serves up HTTPS certs generated via certbot, which should be quietly auto-renewed without downtime when the time comes. Fingers crossed!

Oh, and pm2 is also gone. It’s a nice process manager for Node, but for our purposes, it just adds overhead. Our production setup is now just managed with systemd.
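For this kind of setup, a minimal unit file might look like the following — the paths, names, and user are hypothetical, not our actual configuration:

```ini
# /etc/systemd/system/lib-reviews.service -- hypothetical example
[Unit]
Description=lib.reviews web application
After=network.target

[Service]
Type=simple
User=libreviews
WorkingDirectory=/srv/lib.reviews
ExecStart=/usr/bin/node build/app.js
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

systemd then handles what pm2 did for us: starting the app at boot, restarting it on failure, and collecting its logs via the journal.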


Gradually modernizing the codebase

I’m working my way through modernizing the lib.reviews codebase, starting with older dependencies.

Today included:

  • Switched to a maintained bcrypt dependency for password hashing (compatible with previous hashes)
  • Phased out the deprecated request dependency
  • Vendored the unmaintained Thinky ORM, improved its accessibility, and removed its reliance on bluebird
  • Made test suite fixture setup more resilient against interruption or errors by using a test wrapper
  • Replaced the unmaintained node-webhooks with an internal webhook handler
  • Replaced the unmaintained remote-ac autocomplete library with a vendored & modernized version
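Of these, the internal webhook handler is the easiest to sketch: conceptually, it just POSTs a JSON payload to each registered URL and records which deliveries succeeded. The code below is an illustrative sketch, not the actual implementation:

```typescript
// Illustrative webhook dispatcher sketch -- not the real lib.reviews code.
type WebhookResult = { url: string; ok: boolean };

// POST the payload to every registered URL; a failed connection or a
// non-2xx response is reported as ok: false rather than thrown.
async function dispatch(
  urls: string[],
  payload: unknown,
): Promise<WebhookResult[]> {
  return Promise.all(
    urls.map(async (url) => {
      try {
        const res = await fetch(url, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(payload),
        });
        return { url, ok: res.ok };
      } catch {
        return { url, ok: false };
      }
    }),
  );
}
```

Collecting per-URL results instead of throwing means one unreachable endpoint can’t abort deliveries to the others.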

The changes ahead will include:

  • moving away from the unmaintained greenlock library for managing HTTPS certs; I’ll likely just use certbot and a trigger to re-generate certs as needed
  • likely dropping pm2 for managing production deploys in favor of just using systemd

And the biggest planned change is to move from RethinkDB (unmaintained DB backend) to PostgreSQL. That literally touches every aspect of the site, so it may need to proceed in a long-lived branch until we’re ready to flip the switch.


lib.reviews is part of Permacommons

I’ve transferred the lib.reviews repo to the Permacommons organization, which I also manage. The idea behind Permacommons is to build a permanent home for shared resources, managed with AI support (in the case of lib.reviews, for development) and automation in mind. If you’re interested, check out the other projects that are being built under the same GitHub org.

