The Tragedy of the Commons, as any economist will tell you, is that people tend to use shared resources until they break.
Shepherds slip an extra sheep onto the common land until it’s a desert, people leave their rubbish at picnic sites until nobody wants to eat there, drivers take shortcuts until the side-roads are clogged, sunbathers turn up their stereos until nobody can hear anything… and nowadays, “trolls” and “spammers” choke up Internet forums with the intellectual equivalent of white noise, or worse.
DeepTrawl is intended to stop this happening to your patch of the web.
It’s like having a killer robot to cull illicit sheep, sweep up the rubbish, keep people on the highway, and jam noisy stereos, that also checks the footpaths and updates the signs, and even returns lost wallets… OK, I’ve probably pushed the analogy a bit far.
In a nutshell, DeepTrawl sweeps for inappropriate forum postings, including credit card numbers, and checks your site for spelling, valid links, and optimal design. It even suggests possible improvements.
So, maybe a killer robot with a friendly geek inside.
You’ll understand, then, if I approached redrafting DeepTrawl’s documentation with a certain trepidation. However, despite being a powerful bit of kit, DeepTrawl is easy to use, with most of the important features no more than a click away. The only real challenges were its more technical capabilities, but a few iterations via email nailed these.
Jonathan Matthews of DeepTrawl seemed satisfied with the resulting HTML help pages:
Thanks for the great work! This really helps the usability and professionalism of DeepTrawl and I found the process very easy. If you ever want a recommendation just shout!