We tend to think of the internet as a tool that people use to access information and communicate, but, in reality, the majority of internet traffic is machines talking to machines. There is no person in the loop.
Bot is the catch-all name for software that browses the web automatically, with no human at the keyboard. Google's web crawler is a bot; it follows links on web pages and indexes the content it finds. Web crawlers like Google's are considered good bots. Good bots follow the rules and are generally beneficial.
Bad bots do not follow the rules. They instead exploit online resources on behalf of malicious actors, often criminals.
These kinds of bots include brute force bots, which try to guess authentication credentials; web scrapers, which are used for copyright violation and to gather data; denial of service bots, which bombard web servers with more requests than they can handle; and ad fraud bots, which are programmed to click on web advertising to generate revenue and skew analytics data.
They generate a huge proportion of the web’s traffic. They visit your WordPress site regularly to probe for vulnerabilities, to guess passwords, to try out credentials leaked from other sites. Even if the bot doesn’t accomplish anything, it wastes bandwidth and CPU cycles that you have paid for.
It is impossible to block every bad bot, but you can block or confuse the worst.
Move the WordPress Login Page
Brute force bots repeatedly attempt to authenticate via a WordPress site's login page. Bots that make hundreds of login attempts a minute consume a lot of resources. Most of these bots are unsophisticated, and simply moving the login page is enough to confuse them. The Move Login plugin allows WordPress users to change the URL of the login page.
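Once the real login page lives elsewhere, requests to the default URL can simply be refused at the server level. As a rough illustration for an Apache 2.4 server (this is an assumption, not something the Move Login plugin requires; blocking wp-login.php outright is a site-specific decision):

```
# Hypothetical .htaccess fragment: deny the default login URL
# that brute force bots hammer, once the real login page has moved.
<Files "wp-login.php">
    Require all denied
</Files>
```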
Lay a Trap with Robots.txt
WordPress site owners can send messages to bots via the robots.txt file. Robots.txt contains rules that bots are supposed to obey, as specified in the robots exclusion standard. You can, for instance, instruct bots not to visit particular pages on your WordPress site.
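A robots.txt rule excluding a directory looks like this (the /private/ path is just an illustration):

```
User-agent: *
Disallow: /private/
```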
If a bot visits a page from which it is excluded by robots.txt, it is, by definition, a bad bot. The Blackhole for Bad Bots plugin takes advantage of this to identify and block bad bots. The plugin adds hidden links to pages on your WordPress site — people can’t see them, but bots can. You then add a pattern that matches these pages to your WordPress site’s robots.txt file. If the page is visited, the culprit has to be a bad bot, and its activity can be blocked.
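The trap logic itself is simple. Here is a minimal sketch in Python (the Blackhole plugin's actual code is PHP; the trap path and function names below are invented for illustration):

```python
# Sketch of a robots.txt honeypot: any client that requests the trap
# URL has ignored robots.txt, so we record its IP and block it.

TRAP_PATH = "/blackhole/"   # also listed under Disallow: in robots.txt
blocked_ips = set()         # a real plugin would persist this list

def handle_request(ip, path):
    """Return a response status for a request, blocking trapped IPs."""
    if ip in blocked_ips:
        return "403 Forbidden"
    if path.startswith(TRAP_PATH):
        blocked_ips.add(ip)  # only a rule-breaking bot reaches this URL
        return "403 Forbidden"
    return "200 OK"
```

A hidden link to the trap URL on every page, plus a matching Disallow line in robots.txt, means no polite visitor or well-behaved crawler ever triggers it.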
Use a Web Application Firewall
A Web Application Firewall (WAF) can be loaded with a blacklist of IP addresses known to host bad bots, automatically blocking requests from those addresses. The premium version of the Sucuri WordPress security plugin includes a WAF that is updated in real time. The Wordfence plugin has a similar feature.
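Conceptually, the blacklist check a WAF performs is straightforward. A minimal sketch using Python's standard ipaddress module (the example networks are reserved documentation ranges, not real bot sources):

```python
from ipaddress import ip_address, ip_network

# Hypothetical blacklist; a real WAF ships thousands of entries,
# updated continuously from threat intelligence feeds.
BLACKLIST = [
    ip_network("198.51.100.0/24"),
    ip_network("203.0.113.42/32"),
]

def is_blocked(client_ip: str) -> bool:
    """True if the client falls inside any blacklisted network."""
    addr = ip_address(client_ip)
    return any(addr in net for net in BLACKLIST)
```

Matching against whole networks rather than single addresses matters, because bad bots often rotate through every address in a compromised range.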
These bots threaten the security and steal the resources of WordPress sites. It is worth taking the time to implement these strategies, which will deflect the majority of bot attacks.
About Graeme Caldwell – Graeme is a writer and content marketer at Nexcess, a global provider of hosting services, who has a knack for making tech-heavy topics interesting and engaging to all readers. His articles have been featured on top publications across the net, from TechCrunch to TemplateMonster. For more content, visit the Nexcess blog and give them a follow at @nexcess.