More Thoughts on human.json

2026-04-01 11:05 AM

#human #smallweb #comment

I've thought some more about the human.json protocol, which is meant to make site authorship more transparent. Two things stand out to me:

  1. There is nothing to prevent AI-generated sites from simply including their own human.json file with a couple of links to well-known, individual-run sites. The Chrome/Firefox extension works well and is clear about who a given author vouches for. But there is nothing to stop anyone from linking to whomever they like. We're trusting individuals to be good neighbors and use the system as intended.

  2. Network reach is a limiting factor. I know a handful of folks whose sites I read and check regularly. Their blogs land in my reader and I follow and interact with them on a social platform here and there. My network is small. There are others who are much more widely connected, so they can provide a much larger list of sites to vouch for. But the labor falls on the individual. There isn't a way to automatically add to your list of known sites; I have to manually go in and update my vouches list. In principle, I'm maintaining a list of sites I'd stake my own reputation on. In reality, it's another thing I have to manage.

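To make the mechanics concrete, the file itself is just a JSON document listing the sites an author vouches for. The field names below are my own guess at the shape, not the actual spec:

```json
{
  "name": "Example Author",
  "url": "https://example.com",
  "vouches": [
    "https://friend.example",
    "https://another-blog.example"
  ]
}
```

Anyone, human or bot, can publish a file like this and point it at reputable sites, which is exactly the first problem above.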
There are some open issues on the repo about vouching for a site that lacks its own human.json file, a way to rubber-stamp a website whose owner hasn't gone through the trouble of publishing one. I also came across a small shell script someone wrote to add sites to the vouches property automatically, and I toyed with adding a small table to my own database that would update human.json on demand, much like my RSS feed is generated.
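The automation idea is simple either way. I haven't kept the shell script around, but a sketch of the same thing in Python would look something like this (the "vouches" list-of-URLs schema is my assumption, as above):

```python
import json
from pathlib import Path

def add_vouch(path, url):
    """Append a URL to the vouches list in a human.json file, skipping duplicates."""
    file = Path(path)
    data = json.loads(file.read_text())
    # Create the vouches list if the file doesn't have one yet (schema assumed).
    vouches = data.setdefault("vouches", [])
    if url not in vouches:
        vouches.append(url)
        file.write_text(json.dumps(data, indent=2) + "\n")
    return vouches
```

Hooking a function like this up to a database table or a CLI alias would make vouching a one-liner, though it doesn't solve the underlying trust problem.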

At the end of the day, verification is hard. We're relying on the goodwill of other humans on the internet to all play nice and make this kind of thing work. In reality, we've seen over and over that AI/LLM-focused businesses and their adjacents are unwilling to play nice. Their bots crawl sites without discretion, use content without attribution, and generally do whatever they want without participating in the norms that make the Internet useful or enjoyable.

I'm going to keep my human.json published and accessible, and I'll try to stay on top of adding sites when I find something that's truly human-created, but I'm not sure what kind of dent it will make in the long run.
