Arachnophobia – Spider, Spider go away, come again another day

added by wiredone
1/19/2012 1:47:59 PM


While working on soon-to-be-released projects, there is often a need to make a staging/testing website publicly accessible for client testing. This is a slippery slope: if search engine spiders get in and index your site before the rest of the world is meant to see it (which happens more often than you'd like to think), the damage is already done. If it happens to be a website for something the rest of the world shouldn't see yet, such as a product or service launch, having it leak too early can make or break you. They have a word for this very fear of spiders: it's called Arachnophobia.
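
Doing this by hand usually comes down to two things: a robots.txt that disallows everything and a noindex header on every response. The sketch below is only an illustration of that manual approach in ASP.NET, not Arachnophobia's own API; the BlockSpidersModule name is made up for the example.

```csharp
// Minimal sketch of blocking spiders by hand (not the Arachnophobia API):
// an IHttpModule that serves a disallow-all robots.txt and marks every
// response as noindex via the X-Robots-Tag header.
using System;
using System.Web;

public class BlockSpidersModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var context = ((HttpApplication)sender).Context;

        // Ask well-behaved crawlers not to index or follow anything on this host.
        context.Response.AddHeader("X-Robots-Tag", "noindex, nofollow");

        // Answer robots.txt requests with a blanket disallow.
        if (string.Equals(context.Request.Path, "/robots.txt", StringComparison.OrdinalIgnoreCase))
        {
            context.Response.ContentType = "text/plain";
            context.Response.Write("User-agent: *\r\nDisallow: /\r\n");
            context.Response.End();
        }
    }

    public void Dispose() { }
}
```

Register a module like this in web.config (under system.webServer/modules in IIS7 integrated mode) and the whole site is covered; the appeal of packaging it up is being able to add or remove exactly this kind of plumbing in one step.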


3 comments

dpeterson
1/19/2012 8:44:37 AM
Seems like this could be accomplished trivially without the use of a 3rd party library. What's the advantage over writing out the robots.txt yourself and setting your headers by hand?

wiredone
1/19/2012 1:41:27 PM
Your comment is fair.

The idea behind Arachnophobia is threefold:
- Make it dead easy to add and remove these things together as one package (via NuGet)
- Act as a project for more features like this to grow into (it's starting to collect spider user agents for a hard block later; a hypothetical sketch of that follows this list)
- Enable you to add this functionality at the server level for all sites, such as on a staging server (if you take a look at the source, there is a half-finished installer in the works for this very purpose)
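
Purely as a hypothetical sketch of what a user-agent hard block could look like (none of this is in the library yet; the module name and the sample agent list are made up):

```csharp
// Hypothetical only: a hard block that refuses requests from known crawler
// user agents with a 403, rather than relying on robots.txt politeness.
using System;
using System.Web;

public class HardBlockSpidersModule : IHttpModule
{
    // Sample list for illustration; a real implementation would keep a much
    // fuller, maintained set of crawler user-agent fragments.
    private static readonly string[] SpiderAgents = { "Googlebot", "bingbot", "Slurp", "Baiduspider" };

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var context = ((HttpApplication)sender).Context;
            var userAgent = context.Request.UserAgent ?? string.Empty;

            foreach (var agent in SpiderAgents)
            {
                if (userAgent.IndexOf(agent, StringComparison.OrdinalIgnoreCase) >= 0)
                {
                    // Deny the crawler outright instead of asking it nicely to go away.
                    context.Response.StatusCode = 403;
                    context.Response.End();
                    return;
                }
            }
        };
    }

    public void Dispose() { }
}
```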

dpeterson
1/19/2012 1:48:44 PM
Thanks for clarifying; I hadn't thought about it being used at the server level.