Today we released new research showing that for websites with fewer than 2,500 monthly visitors, up to 83 percent of traffic is non-human and 49 percent can pose serious security threats to small businesses. This follows our recent report, which found that 31 percent of traffic can harm websites with 50,000 to 100,000 monthly visitors.
When analyzing these smaller websites, it becomes clear that automated, non-human traffic makes up a significantly higher share of overall visitors than it does for medium and larger sites. This can have a massive performance and security impact. Most small businesses rely on web hosting companies that sell fixed-cost shared hosting with unlimited plans. However, the more websites the hosting provider packs onto a server — in an attempt to increase overall utilization and drive more revenue — the less capacity remains for serving real visitors, because bot traffic grows with every site added.
Most of this traffic is automated and entirely unrelated to the website’s real human traffic. Basically, each website spun up by a hosting provider suffers a baseline level of bot traffic regardless of how many real visitors it attracts. We like to compare this to an analogous phenomenon in aerodynamics known as parasitic drag, which occurs when a solid object moves through a gaseous medium – a common example is an airplane wing’s drag during flight.
When discussing the so-called “cost of a data breach,” many businesses tend to oversimplify the risk and treat malicious traffic as binary – you either get hacked or you don’t. This assumption is dangerous because it implies that hackers only cost you money when they’re successful, and that misses much of the true damage to your business. Your website often is your business. Bots seriously degrade its performance and user experience, and would-be customers abandon shopping carts or flee to a competitor when your site doesn’t perform.

So what can be done? The good news is that blocking bad automated traffic isn’t difficult. Automated hacking leaves distinct fingerprints, including:
- Inhuman click rates – people don’t usually click 10 times a second.
- Headers – automated attack software often gives away who it is in the request headers it sends.
- Irregular coverage – when people interact with a website, they usually visit only a subset of the available pages, roughly 20%. Machines, conversely, will visit 80% or more.
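To make these fingerprints concrete, here is a minimal sketch of how the three checks above might be applied to a request log. The log format, the `sqlmap` user agent, the thresholds, and the page count are all illustrative assumptions, not our actual detection logic.

```python
# Hypothetical request log: (ip, user_agent, path, timestamp_in_seconds)
REQUESTS = [
    ("10.0.0.1", "Mozilla/5.0", "/home", 0.0),
    ("10.0.0.1", "Mozilla/5.0", "/pricing", 4.2),
    ("10.0.0.2", "sqlmap/1.7", "/home", 0.00),
    ("10.0.0.2", "sqlmap/1.7", "/admin", 0.05),
    ("10.0.0.2", "sqlmap/1.7", "/login", 0.10),
    ("10.0.0.2", "sqlmap/1.7", "/wp-admin", 0.15),
]

KNOWN_BOT_AGENTS = ("sqlmap", "nikto", "python-requests", "curl")
SITE_PAGES = 10  # total pages on the site (assumed for the coverage check)

def flag_bots(requests, max_clicks_per_sec=5, coverage_threshold=0.8):
    """Return the set of IPs whose traffic matches any of the three fingerprints."""
    by_ip = {}
    for ip, agent, path, ts in requests:
        by_ip.setdefault(ip, []).append((agent, path, ts))

    flagged = set()
    for ip, hits in by_ip.items():
        agents = {agent for agent, _, _ in hits}
        paths = {path for _, path, _ in hits}
        times = sorted(ts for _, _, ts in hits)
        # Fingerprint 1: inhuman click rate (consecutive hits too close together)
        rate_hit = any(b - a < 1.0 / max_clicks_per_sec
                       for a, b in zip(times, times[1:]))
        # Fingerprint 2: self-identifying headers (User-Agent stands in for headers here)
        header_hit = any(bot in agent.lower()
                         for agent in agents for bot in KNOWN_BOT_AGENTS)
        # Fingerprint 3: irregular coverage – visiting most of the site's pages
        coverage_hit = len(paths) / SITE_PAGES >= coverage_threshold
        if rate_hit or header_hit or coverage_hit:
            flagged.add(ip)
    return flagged

print(flag_bots(REQUESTS))  # → {'10.0.0.2'}
```

In this toy log, the second visitor trips both the click-rate check (hits 0.05 s apart) and the header check (`sqlmap` identifies itself), while the human-looking visitor trips none.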
Today, some companies build home-grown capabilities to contain automated visitors, including blacklisting IP addresses, trying to identify abusive usage patterns, and looking for undesired browser types. However, to do this effectively, one needs data at a massive scale, powerful analytics capabilities, and highly specialized expertise.
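A home-grown filter typically combines exactly those three techniques in a per-request check. The sketch below is a simplified illustration of that pattern; the blacklist entries, agent strings, and rate limit are hypothetical placeholders.

```python
# A simplified home-grown request filter: IP blacklist, undesired browser
# types, and an abusive-usage-pattern check. All values are illustrative.
BLACKLISTED_IPS = {"203.0.113.7"}
UNDESIRED_AGENTS = ("python-requests", "curl", "libwww")

def allow_request(ip, user_agent, requests_last_minute, rate_limit=120):
    """Return True if the request should be served, False if it should be blocked."""
    if ip in BLACKLISTED_IPS:                # technique 1: IP blacklisting
        return False
    if any(bad in user_agent.lower() for bad in UNDESIRED_AGENTS):
        return False                         # technique 2: undesired browser types
    if requests_last_minute > rate_limit:    # technique 3: abusive usage patterns
        return False
    return True

print(allow_request("198.51.100.4", "Mozilla/5.0", 12))  # → True
print(allow_request("203.0.113.7", "Mozilla/5.0", 12))   # → False
```

The hard part is not the check itself but keeping the blacklist and agent signatures current across millions of ever-changing bots, which is where the scale of data and expertise comes in.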
So… Incapsula users: Sit back and relax, we’ll take care of those bots for you.