I'd like to hide an HTML page so its content can't be logged by passing "bots".
Can someone point me towards a solution, please?
My knowledge of PHP is terribly rudimentary but I don't expect you to write the script for me! I just need a few clues.
Am I being paranoid in not trusting robots.txt and 'NOINDEX'?
Hiding from the robots
Last edited by Martin Pickering on Mon Mar 30, 2009 11:48 am, edited 3 times in total.
I found this but I welcome other suggestions:
http://www.webmasterworld.com/forum24/788.htm
Presumably requires this:
http://web-professor.net/scripts/isbot/isbot.txt
I'm thinking "there's got to be an easier way than listing every bot!"
Code:
TOP OF page:
<?php
// include the cloaking function
require_once('/full/path/to/cloaking/script/IsSpider.php');
?>
FOR EACH LINK ON THE page(s):
<P> Visit our sponsor</P>
<P>
<?php
if (IsSpider()) {
    print "Lose Weight! Grow Hair!";
} else {
    print "<A href=\"http://www.fat-n-hairy.com/\">Lose Weight! Grow Hair!</a>";
}
?>
</P>
etc., etc....
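The linked isbot script isn't reproduced here, but a minimal IsSpider()-style check can be sketched as a case-insensitive substring match on the User-Agent header. This is only an illustrative sketch, not the code from that link: the signature list below is deliberately short, the helper name isSpiderUa is my own, and a hostile bot can spoof any User-Agent it likes.

```php
<?php
// Sketch of a drop-in IsSpider.php (assumed layout, not the linked script).
// Matches the User-Agent against a short, illustrative list of crawler
// signatures. Real bot lists are much longer, and spoofing defeats this.
function isSpiderUa($userAgent)
{
    $botSignatures = array('googlebot', 'slurp', 'msnbot', 'bingbot', 'crawler', 'spider');
    $ua = strtolower($userAgent);
    foreach ($botSignatures as $sig) {
        if (strpos($ua, $sig) !== false) {
            return true;  // looks like a known crawler
        }
    }
    return false;  // assume a human visitor
}

function IsSpider()
{
    // Fall back to an empty string if the client sent no User-Agent at all.
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    return isSpiderUa($ua);
}
```

So "listing every bot" really is the crux: a substring list like this only catches crawlers that identify themselves honestly.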
Hey Martin!
I believe robots.txt will do the work. And no... I don't think you're being paranoid.

Code:
User-agent: *
Disallow: /
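As for the 'NOINDEX' mentioned in the original question: that's the meta robots tag, placed in the page's <head>. Like robots.txt, it is purely advisory; well-behaved crawlers honor it, but nothing technically prevents an ill-behaved bot from fetching and logging the page anyway.

```html
<meta name="robots" content="noindex, nofollow">
```

So robots.txt and NOINDEX keep you out of legitimate search indexes, while a server-side check like the IsSpider() approach above is what you'd need against bots that ignore them.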