Re: 2 LogWatch questions

On Sun, 2005-10-09 at 09:24 -0700, Corey Head wrote:
> Also, I'm receiving this under httpd on almost a daily
> basis:
> 
> Requests with error response codes
> 404 Not Found
> /robots.txt: 12 Time(s)
> Is this something I should be worried about?  Robots
> sound like 'worms' to
> me.
> 
> Thanks!
> Corey
> 

The robots.txt file is used to tell web crawlers which parts of your site
they may access.  A web crawler is a program that search engines use to
catalog the content on the Internet; it has nothing to do with worms.  The
404 entries just mean that crawlers asked for the file and your server
doesn't have one, which is harmless.  Here's a link that may help explain
it better:

http://www.robotstxt.org/wc/robots.html
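
If you ever decide to add one, a minimal robots.txt at your document root
could look something like this (the directory name below is just a
placeholder, not anything from your actual setup):

    # Applies to all crawlers
    User-agent: *
    # Ask crawlers to skip this (hypothetical) directory
    Disallow: /private/
    # Everything else may be crawled

The 404s will stop once the file exists, even if it's empty.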


Shockwave

