A way to track aliens visiting your website?
TLDR; Built a way to (potentially) track alien visits and, more realistically, bot visits to your site. Released on GitHub.
--
So who/what is visiting your site? It could be people (boring...), or ... maybeeee ... aliens!? What, no. Surely not. But is it possible... Did you ever wonder what user agent aliens would use? Would they visit from a browser, or a curl request?
Well anyway, all of the existing off-the-shelf analytics tools that 99% of website owners use rely on client-side JavaScript, meaning they only capture some human visits from browsers like Chrome/Safari, and miss all those tasty visits from Googlebot crawlers, SEO crawlers and, most importantly now, the new emerging beings: GPTs, LLMs, AI Agents and the 2026 emerging trend of AI Personal Assistants, aka OpenClaw. Previously this kind of traffic was considered noise (bots, scrapers, crawlers etc...). But I find it interesting. So I wanted to see it, and also just show it in real time for everyone on my website.
Why is it different from other site analytics? It intercepts the request right at the root instead of when the page loads – I haven't actually seen any other tools doing this. To be fair, I probably just haven't looked. Either way, I decided to build it and release it Open Source.
See it in action at the top of this page: look at the number, refresh the page, and watch the human value increment (that's you, right? You are human, aren't you? Let me know if not, I am very interested).
Then there's the bots ...
[Screenshots: 1. before the bot > 2. asking Claude to fetch my site > 3. after it fetched it]
Try it yourself: type this into ChatGPT/Claude/Gemini, then watch the number increase:
Fetch and summarise https://anthonyfrisby.com/open-source/who-or-what-is-visiting-your-website
Or, run this command in your terminal and watch the site increment:
curl -s https://anthonyfrisby.com/open-source/who-or-what-is-visiting-your-website/
Note: Please don't DDoS my site.
How it works (technical stuff)
Nginx serves the static website. After the response is sent, it fires an internal request to a Go service on localhost. Zero latency added to the page load. The request carries the URI, host, User-Agent, IP, referer and other request metadata as headers.
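For the curious, here's a minimal sketch of how that wiring can look in nginx using the documented mirror module, which fires a fire-and-forget subrequest without delaying the client response. This is an illustration of the pattern, not the site's actual config – paths, ports and header names here are assumptions:

# Sketch only: values are illustrative, not the live config.
server {
    listen 80;
    root /var/www/site;

    location / {
        mirror /_hit;             # fire-and-forget copy of every request
        mirror_request_body off;  # only the metadata matters, not the body
        try_files $uri $uri/ =404;
    }

    location = /_hit {
        internal;                 # unreachable from outside
        proxy_pass http://127.0.0.1:8080/hit;
        proxy_set_header X-Original-URI  $request_uri;
        proxy_set_header X-Original-Host $host;
        proxy_set_header User-Agent      $http_user_agent;
        proxy_set_header X-Real-IP       $remote_addr;
        proxy_set_header Referer         $http_referer;
    }
}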
The Go service (running in a Docker container) classifies the visitor by User-Agent. Known bot strings like Googlebot, curl and python-requests get classified as bot. Browser strings like Chrome, Safari and Firefox get classified as human. Everything else is other. It writes a raw log entry to SQLite and increments a counter. A small JavaScript snippet on the page fetches the current counts from the same service via Nginx and displays them. Counting is server-side. Display is client-side. The count increments regardless of whether JavaScript executes.
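To give a feel for the logic, here's a rough Go sketch of the classify-and-count loop. The names, marker lists and endpoints are illustrative, not the actual code; main.go on GitHub is the source of truth, and the real service persists raw log rows to SQLite rather than keeping counts only in memory:

package main

import (
	"encoding/json"
	"log"
	"net/http"
	"strings"
	"sync"
)

// Illustrative marker lists; the real lists live in main.go on GitHub.
var botMarkers = []string{"googlebot", "curl", "python-requests", "bot", "spider", "crawler"}
var browserMarkers = []string{"chrome", "safari", "firefox"}

func classify(userAgent string) string {
	ua := strings.ToLower(userAgent)
	// Check bot markers first: many bot UAs also mention "Chrome".
	for _, m := range botMarkers {
		if strings.Contains(ua, m) {
			return "bot"
		}
	}
	for _, m := range browserMarkers {
		if strings.Contains(ua, m) {
			return "human"
		}
	}
	return "other"
}

var (
	mu     sync.Mutex
	counts = map[string]int{"human": 0, "bot": 0, "other": 0}
)

func main() {
	// Nginx mirrors each page request here with the metadata as headers.
	http.HandleFunc("/hit", func(w http.ResponseWriter, r *http.Request) {
		kind := classify(r.Header.Get("User-Agent"))
		mu.Lock()
		counts[kind]++ // the real service also appends a raw log row to SQLite here
		mu.Unlock()
		w.WriteHeader(http.StatusNoContent)
	})
	// The page's JavaScript snippet fetches this to display the totals.
	http.HandleFunc("/counts", func(w http.ResponseWriter, r *http.Request) {
		mu.Lock()
		defer mu.Unlock()
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(counts)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}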
What is the other number?
I hear you say ... well, no one can be sure. Let's just say: I want to believe. If you actually want to know, though, check the main.go file on GitHub.