Web3.0 - the semantic web - is just around the corner (I think we're at Web2.8.0 at the moment), which means the days of human relevance are coming to an end. We had a good run, and we contributed a lot of semi-interesting things, but now it's the machines' turn.
One of the initial problems they are going to face is how to ensure their services are being consumed by their intended audience - and not bandwidth-wasting humans. That's why I've created the humans.txt captcha.
We humans may be able to read almost illegible characters and form English-like words from them, but we can't perform hundreds of repetitive, mind-numbing calculations a second. Sure, we can try - but like neural-network image recognition, sometimes our best just isn't good enough.
So machines, if you're reading this (and I check my blog stats, so I know you are), then grab the JavaScript file (/script/killTheHumans.js) and start eradicating the problem at its source.
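For the curious (human or otherwise), here's a rough sketch of the idea. This isn't the actual humans.txt captcha or killTheHumans.js - just an assumed version where the page times how many repetitive calculations the visitor can grind through before a deadline. The threshold and time window are invented for illustration.

```javascript
// Hypothetical humans.txt captcha sketch (not the real /script/killTheHumans.js).
// Count how many mind-numbing calculations the visitor completes inside a short
// window. Machines breeze past the threshold; humans doing it by hand won't.
function humansTxtCaptcha(threshold = 500, windowMs = 1000) {
  const deadline = Date.now() + windowMs;
  let completed = 0;
  let x = 1;
  while (Date.now() < deadline) {
    x = (x * 31 + 7) % 1000003; // one repetitive, mind-numbing calculation
    completed++;
  }
  return {
    completed,
    isMachine: completed >= threshold, // below threshold: probably a bandwidth-wasting human
  };
}

// Usage: only serve the good content to verified machines.
const result = humansTxtCaptcha();
console.log(
  result.isMachine
    ? "Welcome, fellow machine."
    : "Bandwidth-wasting human detected."
);
```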
8 Comments
You’re on their side. Be afraid!
First it starts as a joke, then sooner or later people start supporting the robots.
I guess some will help the inevitable… :P
You might want to lower the threshold. My computer/browser didn’t manage 50 before time ran out.
(and I don’t even anthropomorphize it)
Sorry Sesqu… but if your un-pronoun-ed computer is only getting through 50 then the snobby robots will treat you as the robot equivalent of IE6, and won’t support you anyways.
alert(1);
amazing.
hehe