Distil Networks, which focuses on detecting malicious bots, has released findings from a study of how top Web sites perform against Internet-based bots.
The ninth annual study, audited by the Online Trust Alliance (OTA), evaluated the top 1,000 Web sites in retail, banking, consumer services, government, news media, Internet service providers and OTA members.
The findings reveal that while an average of 16% of Web sites across all industries can fend off simple bot attacks, only 5% can properly protect against sophisticated attacks from bots that are often used by competitors, hackers and fraudsters to scrape data, take over accounts, commit online fraud, spread spam, and more.
In the retail sector, findings suggest that "crude" bots, defined as basic scripts that make rapid-fire requests from a single IP address, were detected and blocked on 64.8% of the sites tested. In the same category, crude bots were detected and blocked 85% of the time on banking sites.
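To make the "crude" category concrete: a bot of this kind can often be caught with nothing more than per-IP rate counting. The following is a minimal sketch of that idea in Python, not anything from the study itself; the window size and request threshold are illustrative assumptions that a real product would tune per site.

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds; real bot-mitigation products tune these per site.
WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 50

_recent = defaultdict(deque)  # IP address -> timestamps of recent requests


def looks_like_crude_bot(ip, now=None):
    """Flag an IP sending requests faster than a human plausibly could.

    Crude bots, as the study defines them, hammer a site rapidly from a
    single IP address, so a sliding-window counter is usually enough.
    """
    now = now if now is not None else time.time()
    window = _recent[ip]
    window.append(now)
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_REQUESTS_PER_WINDOW
```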
Elsewhere, crude bots were detected and blocked on nearly 70% of consumer sites, 77% of federal sites, 59% of news sites, and 75% of ISP host sites.
The findings show that while most industries tested can adequately protect against crude bots, they struggle to block simple, moderate, and sophisticated bots effectively. For example, federal Web sites block 22% of simple bots but protect against only 1% of sophisticated bots, a worse showing than any other industry tested.
Detection falls short across all categories for sophisticated bots, defined as "low and slow" attacks that arrive from dozens of IP addresses and use browser automation tools. The chances of detecting and stopping these bots drop to between 1% and 7%, depending on the category.
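The sketch below shows why these bots are so much harder to catch. It is a hypothetical illustration, not taken from the study: the proxy addresses, target URLs, and pacing are all assumptions. Because each request comes from a real browser engine (here driven by the Playwright automation library), exits through a different proxy IP, and arrives minutes apart, per-IP rate checks like the one sketched earlier never trip.

```python
import random
import time

from playwright.sync_api import sync_playwright

# Hypothetical proxy pool; a real operator would rotate dozens of exit IPs.
PROXIES = ["http://proxy-a.example:8080", "http://proxy-b.example:8080"]
TARGET_PAGES = ["https://shop.example/item/1", "https://shop.example/item/2"]

with sync_playwright() as p:
    for url in TARGET_PAGES:
        # Each request exits from a different IP via a rotating proxy...
        browser = p.chromium.launch(proxy={"server": random.choice(PROXIES)})
        page = browser.new_page()
        page.goto(url)
        print(page.title())  # scrape whatever data is of interest
        browser.close()
        # ...and requests are paced minutes apart, keeping traffic "low
        # and slow" under typical per-IP rate thresholds.
        time.sleep(random.uniform(60, 300))
```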
Despite the poor overall performance, this year's findings reveal a marked improvement from Distil's 2016 study, which found Web sites tested could protect against only 0.7% of sophisticated bots. The study attributes the gradual uptick in performance to greater awareness and adoption of more advanced bot detection and mitigation solutions.