Google robot requests old links
What I’m wondering: on my company’s webserver I see a lot of requests from the Google and Scooter robots for *very* old pages, resulting in 404s, of course. Trying to reduce the number of 404 messages, I assumed search engines would automatically remove outdated links from their databases, at least after trying 100 (or, more accurately, thousands of) times. Is there something wrong? If so, please let me know, because I really don’t know how to ‘talk’ to Google and say something like “hey, stop requesting this outdated file on my server”.
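If robots.txt is the right way to ‘talk’ to the crawlers, then a Disallow rule like the one below is roughly what I have in mind. This is only a minimal sketch: the path /old/obsolete-page.html and the example.com host are placeholders for one of the dead URLs, and the Python part just uses the standard library’s robots.txt parser to double-check that such a rule would actually block a crawler calling itself “Googlebot”.

    # Sketch: verify that a robots.txt Disallow rule would block a crawler
    # from a dead URL. Path and host below are placeholders, not real URLs.
    import urllib.robotparser

    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /old/obsolete-page.html
    """

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    # can_fetch() reports whether a crawler identifying itself as 'Googlebot'
    # would be allowed to request the URL under these rules.
    print(rp.can_fetch("Googlebot", "http://www.example.com/old/obsolete-page.html"))  # False
    print(rp.can_fetch("Googlebot", "http://www.example.com/current-page.html"))       # True

Of course, that only asks well-behaved robots to stop fetching the path; it doesn’t remove the old link from their databases, which is the part I don’t understand.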