Every science fiction fan will tell you that there are two kinds of robots in the world: good bots and bad bots. C-3PO? Good bot. Ultron? Bad bot. They’ll also tell you that, when confronted by a bot, it’s critical to your well-being to recognize and avoid the bad ones.
The same goes for the thousands of bots visiting your web applications every day. Many organizations don’t appreciate how much bot traffic is out there or its impact on their web applications. But the fact is, according to a report by Incapsula Labs, a large and growing portion of website visitors are automated clients – bots. Bots now make up over 60% of web traffic, up from just 50% a year ago. Your web apps are being visited by more bots than people!
Worse, roughly half of those bots are bad ones.
What are those bad bots doing during their visits? Almost a quarter of them are performing Distributed Denial of Service (DDoS) attacks – trying their nefarious best to put your web business out of business by overwhelming your site with traffic. Like the bad bots in the movies, they’re growing rapidly in number: we have seen a 240% increase in visits by DDoS bots in the last twelve months. And like those movie bots, they’re also getting a lot smarter. It used to be easy to identify a DDoS bot by tossing it something only a real browser would handle, like a cookie. Today, around 30% of DDoS bots can accept cookies, making them harder to detect and harder for many anti-DDoS solutions to defend against.
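To make the cookie-challenge idea concrete, here is a minimal, illustrative sketch of how such a check might work. This is a toy, not how any particular anti-DDoS product is implemented; the `SECRET` key and function names are invented for the example.

```python
import hashlib
import hmac

# Hypothetical per-deployment signing key (invented for this example).
SECRET = b"demo-key"

def make_challenge_cookie(client_ip):
    """Issue a signed token; a real browser will store it and echo it back."""
    sig = hmac.new(SECRET, client_ip.encode(), hashlib.sha256).hexdigest()
    return f"challenge={sig}"

def passes_challenge(client_ip, cookie):
    """A client that returns the correct cookie on its next request
    behaves like a browser; a primitive bot that ignores Set-Cookie fails."""
    if not cookie:
        return False
    expected = make_challenge_cookie(client_ip)
    return hmac.compare_digest(cookie, expected)
```

The first request from a client gets the challenge cookie and is not served; only a follow-up request that echoes the cookie passes. As the article notes, roughly 30% of DDoS bots now store cookies too, which is exactly why this technique alone is no longer sufficient.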
And the evil doesn’t stop there. Besides performing DDoS attacks that prevent legitimate users from accessing your site and doing constructive things – such as, well, buying your goods – these bots also hack sites directly to steal things like payment information and customer data. They steal proprietary information via techniques like scraping, so others can republish, for free, content you spent resources to generate. And they damage visitor experiences by spamming comment areas, along with a myriad of other malicious activities.
These unwanted bots can have a significant direct impact on your business’ bottom line.
Think about it: roughly 30% of the traffic visiting your website is unwanted, yet you still have to build or buy enough infrastructure to accommodate it. You are effectively spending cash to subsidize these bad bots on your network and website. That’s wasted money and, as mentioned earlier, it will only get worse next year as bot numbers and sophistication grow.
The good news is that Imperva recently introduced ThreatRadar Bot Protection Services as an add-on for its SecureSphere Web Application Firewall (WAF). The service’s client classification engine analyzes and classifies all incoming traffic to your website, quickly and accurately distinguishes between human and bot traffic, identifies “good” and “bad” bots, and more. SecureSphere then uses this granular information to drive policy enforcement decisions, such as how to handle bad bots. For example, you can have administrators receive an alert (e.g., for monitoring purposes) or have SecureSphere block the bot altogether before it gets anywhere near your web app. True bad bot protection.
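The classify-then-enforce flow described above can be sketched as a toy policy engine. This is purely illustrative and is not SecureSphere’s actual logic or API; the signature lists and category names are invented, and a real classification engine relies on far richer signals than the User-Agent string (IP reputation, behavioral analysis, browser challenges, and so on).

```python
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    ALERT = "alert"   # let the request through but notify administrators
    BLOCK = "block"   # stop the request before it reaches the web app

# Invented marker lists for illustration only.
GOOD_BOTS = {"Googlebot", "Bingbot"}
BAD_BOT_MARKERS = {"sqlmap", "masscan", "python-requests"}

def classify(user_agent):
    """Bucket a client into a coarse category from its User-Agent."""
    ua = user_agent or ""
    if any(g in ua for g in GOOD_BOTS):
        return "good_bot"
    if any(b in ua.lower() for b in BAD_BOT_MARKERS):
        return "bad_bot"
    return "human_or_unknown"

# Per-category policy: the operator chooses the action, as in the article.
POLICY = {
    "good_bot": Action.ALLOW,
    "bad_bot": Action.BLOCK,
    "human_or_unknown": Action.ALLOW,
}

def enforce(user_agent):
    """Map a request to the configured action for its classification."""
    return POLICY[classify(user_agent)]
```

Swapping `Action.BLOCK` for `Action.ALERT` in the policy table mirrors the monitor-only mode mentioned above, where administrators are notified instead of the bot being dropped.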
Ultron would not be happy. But your bottom line will be.
For more information, please visit us at ThreatRadar Bot Protection Services.