12 December 2013
If you are visiting this page, the chances are that you are not a human, at least according to research.
A study by Incapsula suggests 61.5% of all website traffic is now generated by bots. The security firm said that was a 21% rise on last year's figure of 51%.
Some of these automated software tools are malicious - stealing data or posting ads for scams in comment sections.
But the firm said the biggest growth in traffic was for "good" bots.
These are tools used by search engines to crawl websites in order to index their content, by analytics companies to provide feedback about how a site is performing, and by others to carry out more specialised tasks - such as helping the Internet Archive preserve content before it is deleted.
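To illustrate what a "good" bot does, the sketch below - a minimal, hypothetical crawler written in Python using only the standard library - announces itself with an honest user-agent string and checks a site's robots.txt file before fetching anything. The bot name and URLs are illustrative, not taken from the study.

    import urllib.robotparser
    import urllib.request

    USER_AGENT = "ExampleBot/1.0"  # hypothetical, honestly declared bot name
    SITE = "https://example.com"   # illustrative target site

    # A well-behaved crawler reads the site's robots.txt first
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()

    page = SITE + "/index.html"
    if rp.can_fetch(USER_AGENT, page):
        # ...and fetches only pages the site allows, identifying itself
        req = urllib.request.Request(page, headers={"User-Agent": USER_AGENT})
        with urllib.request.urlopen(req) as resp:
            html = resp.read()
        print("Fetched", page, "-", len(html), "bytes")
    else:
        print("robots.txt disallows fetching", page)

Malicious bots, by contrast, tend to ignore robots.txt and to lie in the User-Agent header - which is part of what makes the "impersonator" category discussed below hard to measure.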
To generate its report, Incapsula said it observed 1.45 billion bot visits over a 90-day period.
The information was sourced from 20,000 sites operated by its clients.
Dr Ian Brown, associate director at Oxford University's Cyber Security Centre - which was not involved in the study - said the figures were useful as an indication of the growth in non-human traffic, even if they were not accurate to the nearest decimal place.
"Their own customers may or may not be representative of the wider web," he told the BBC.
"There will also be some unavoidable fuzziness in their data, given that they are trying to measure malicious website visits where by definition the visitors are trying to disguise their origin."
Impersonator bots
Despite the overall growth in bot activity, the firm said that many of the traditional malicious uses of the tools had become less common.
It said there had been a 75% drop in the frequency with which spam links were being automatically posted. It suggested this was in part down to Google's efforts to make the practice harder to carry out.
It also said it had seen a 10% drop in hacking-tool bot activity, including the use of code to distribute malware, steal credit card details, and hijack and deface websites.
However, it noted that there had been an 8% rise in the use of "other impersonator bots" - a classification covering software that masquerades as a search engine crawler or other legitimate agent in order to fool security measures.
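A widely used defence against this kind of impersonation is to verify the claim rather than trust the User-Agent header. Google, for instance, recommends a reverse-then-forward DNS check to confirm that a visitor claiming to be Googlebot really comes from a Google-operated address. The Python sketch below shows the general idea; the function name and the sample IP address are illustrative.

    import socket

    def looks_like_real_googlebot(ip):
        # Reverse lookup: does the IP resolve to a Google-owned hostname?
        try:
            host, _, _ = socket.gethostbyaddr(ip)
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the reverse DNS record itself could be spoofed
        try:
            return socket.gethostbyname(host) == ip
        except socket.gaierror:
            return False

    # Usage - the address here is illustrative only
    print(looks_like_real_googlebot("66.249.66.1"))

An impersonator bot that merely copies Googlebot's User-Agent string fails this check, because its IP address does not resolve to a googlebot.com hostname.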
It said these bots tended to be custom-made to carry out a specific activity, such as a DDoS (distributed denial of service) attack - flooding a server with so much traffic that a website or service is forced offline - or to steal company secrets.
Activity by "good bots", it added, had grown by 55% over the year. It suggested this might be because the legitimate services were sampling the net more frequently. This might, for example, allow a search engine to add breaking news stories to its results more quickly.
Dr Brown noted that these extra visits were likely to put website operators under more strain, meaning they would have to buy more computer servers to handle the extra traffic. But he played down the risk.
"While the trend will increase the costs of website operators, I think that, at this scale, it's something they can cope with," he added.