First, thanks for this nice piece of software; it provides really useful information!
Now, a suggestion. user-agents.org has a pretty comprehensive database of user-agents used by bots, robots, spiders, crawlers, browsers, etc. The site also offers its database in XML format.
I think it would be really useful if one could import this list into CrawlTrack.
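Just to illustrate the idea, here is a minimal sketch of what parsing that XML might look like. The field names (`user-agent`, `String`, `Type`) and the type code `"R"` for robots are assumptions about the user-agents.org schema, and the sample data is made up; CrawlTrack itself is written in PHP, but Python is used here only to sketch the concept:

```python
import xml.etree.ElementTree as ET

# Sample data in an ASSUMED user-agents.org-style layout: a root
# <user-agents> element holding <user-agent> records, each with a
# <String> (the user-agent string), a <Description>, and a <Type>
# code (here "R" is assumed to mean robot/crawler, "B" browser).
SAMPLE_XML = """<?xml version="1.0"?>
<user-agents>
  <user-agent>
    <ID>id_1</ID>
    <String>ExampleBot/1.0 (+http://example.com/bot)</String>
    <Description>Hypothetical example crawler</Description>
    <Type>R</Type>
  </user-agent>
  <user-agent>
    <ID>id_2</ID>
    <String>ExampleBrowser/2.0</String>
    <Description>Hypothetical example browser</Description>
    <Type>B</Type>
  </user-agent>
</user-agents>
"""

def extract_robot_agents(xml_text: str) -> list[str]:
    """Return the user-agent strings of entries typed as robots."""
    root = ET.fromstring(xml_text)
    agents = []
    for entry in root.findall("user-agent"):
        type_code = entry.findtext("Type", default="")
        if "R" in type_code:  # keep only robot/crawler entries
            agents.append(entry.findtext("String", default="").strip())
    return agents

if __name__ == "__main__":
    for ua in extract_robot_agents(SAMPLE_XML):
        print(ua)
```

An importer along these lines could filter the list down to robot entries and merge the strings into CrawlTrack's existing bot database, skipping duplicates.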