
CrawlTrack and CrawlProtect support forum




#1 31-07-2009 21:27:39

Daerk
New member
Registered: 31-07-2009
Posts: 1

Automatic Scan Inclusions of .htaccess and robots.txt

It would be nice if CrawlTrack would scan .htaccess for denied IPs or IP blocks, denied user-agents, and denied referrers, scan robots.txt for the same, and then automatically add those entries to a "Blocked" index tracked separately from the "errors", "crawlers", and "visitors" statistics.
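Just to make that idea concrete, here is a rough PHP sketch of the kind of scan I have in mind. The function names, the table of results, and the exact .htaccess directives matched are only my assumptions, not CrawlTrack's actual internals:

<?php
// Rough sketch only: pull deny rules out of .htaccess and robots.txt so
// they could feed a separate "Blocked" index. Function names and the
// directives matched are my own guesses, not CrawlTrack code.

function scanHtaccessDenies($path)
{
    $blocked = array('ip' => array(), 'agent' => array(), 'referer' => array());
    foreach (file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $line = trim($line);
        if (preg_match('/^deny\s+from\s+(\S+)/i', $line, $m)) {
            // "Deny from 1.2.3.4" or "Deny from 1.2.3.0/24"
            $blocked['ip'][] = $m[1];
        } elseif (preg_match('/^SetEnvIfNoCase\s+User-Agent\s+"?([^"\s]+)/i', $line, $m)) {
            // e.g. SetEnvIfNoCase User-Agent "BadBot" bad_bot
            $blocked['agent'][] = $m[1];
        } elseif (preg_match('/^RewriteCond\s+%\{HTTP_REFERER\}\s+(\S+)/i', $line, $m)) {
            // e.g. RewriteCond %{HTTP_REFERER} spamdomain\.com [NC]
            $blocked['referer'][] = $m[1];
        }
    }
    return $blocked;
}

function scanRobotsDisallows($path)
{
    $rules = array();
    $agent = '*';
    foreach (file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $line = trim($line);
        if (preg_match('/^User-agent:\s*(.+)$/i', $line, $m)) {
            $agent = trim($m[1]);
        } elseif (preg_match('/^Disallow:\s*(\S+)/i', $line, $m)) {
            $rules[$agent][] = $m[1];
        }
    }
    return $rules;
}
?>

Whatever those functions return could then be written to its own table and counted apart from the normal crawler and visitor statistics.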

It would also be nice to port the user-agents.org XML list of known user-agents into CrawlTrack and be able to assign "label groups" to each user-agent, such as "Spider", "Bot", "Worm", "Browser", "Unknown", and so on, *and* to be able to add entire groups to the "Hacking Attempt" list; for instance, all user-agents in the "Worm" group (and their HTTP referer information) would be applied automatically to the Hacking Attempt statistics.
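Again only as a rough sketch: I am guessing that the user-agents.org XML has <user-agent> entries with <String> and <Type> children, and I am making up the type-to-group mapping and the filename, so the real schema would need to be checked:

<?php
// Rough sketch only: import a user-agents.org style XML list and map
// each entry to a label group. The schema and type codes below are
// assumptions on my part.

$groupMap = array(
    'B' => 'Browser',
    'R' => 'Spider',
    'S' => 'Worm',   // spam / malicious agents
);

$xml = simplexml_load_file('allagents.xml'); // assumed filename
if ($xml === false) {
    die('could not parse the user-agent list');
}

$hackingAttempts = array();
foreach ($xml->{'user-agent'} as $ua) {
    $string = (string) $ua->String;
    $type   = (string) $ua->Type;
    $group  = isset($groupMap[$type]) ? $groupMap[$type] : 'Unknown';
    if ($group === 'Worm') {
        // whole group auto-applied to the Hacking Attempt statistics
        $hackingAttempts[] = $string;
    }
}
?>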

Thank you for developing this excellent free tracker!

-- Daerk
How2SellCCTV.com


 
