Let's take an example. I have just read an article by Juniper Networks about the rise of mobile malware. The title is the usual link-baiting, scare-tactic fare: Mobile Malware Development Continues To Rise, Android Leads The Way. The article is here if you want to give it a quick once-over. I'm going to cover the title first. Take the word "malware" out of that title (Mobile Development Continues To Rise, Android Leads The Way). Now we can all probably surmise that's true as well. More and more development is happening in the mobile space every day. More and more companies and software developers are jumping on the mobile bandwagon. And, in fact, Android is leading the way there too. Android Market has more new apps being developed for it than any other mobile application store.
And this is my first point. If you say phenomenon X is on the increase, you first need to determine whether that's an anomaly relative to the trend in the area it operates in. I don't have figures for this (because, wouldn't you know it, the article didn't give any), but this discussion isn't about disproving the article; it's about the misleading way things are being reported.
For example, if all mobile application development is seeing an increase of 100%, and at the same time we are also seeing malware development increasing by the same 100%, that's not pointing to an unexpected increase in malware. All mobile apps are increasing by that amount. It's just part of the curve.
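To make that concrete, here is a small sketch of the base-rate check. All the numbers are invented for illustration (the article gives no real figures); the point is that if malware grows at the same rate as apps overall, the malware *rate* stays flat.

```python
# Hypothetical numbers for illustration only; the Juniper article gives no real figures.
apps_last_year, apps_this_year = 100_000, 200_000  # all mobile apps
malware_last_year, malware_this_year = 100, 200    # malware samples

app_growth = (apps_this_year - apps_last_year) / apps_last_year
malware_growth = (malware_this_year - malware_last_year) / malware_last_year

# The malware rate (share of all apps) is what tells you whether malware
# is outpacing the overall trend.
rate_last_year = malware_last_year / apps_last_year
rate_this_year = malware_this_year / apps_this_year

print(f"App growth:     {app_growth:.0%}")
print(f"Malware growth: {malware_growth:.0%}")
print(f"Malware rate:   {rate_last_year:.2%} -> {rate_this_year:.2%}")
```

With these made-up inputs both growth figures are 100%, yet the malware rate is 0.10% before and after. Only a rate rising faster than the overall curve would be newsworthy.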
What a lot of these articles also do is throw "data" at the reader that makes things look worse than they really are, and the "% increase from last year" figure is another popular tactic. A lot of readers don't really seem to understand how that works. Using the linked Juniper article as an example, they make the claim:
A 472% increase in Android malware samples since July 2011.

OK. Thanks. But what was the actual number in July 2011? If there were only 10 malware apps in July, that means now there are only 57.2. A tiny number compared to the vast number of non-malware applications. Why wouldn't they give you the real figures? I'm not privy to what they were thinking, but my first assumption is that the actual numbers are so low that they wouldn't make compelling reading, so they'd rather use the scarier 472%.
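The arithmetic behind that headline figure is worth spelling out. A quick sketch, using the article's 472% and a hypothetical base of 10 samples (Juniper publishes no base number):

```python
def after_increase(base, pct_increase):
    """Absolute count implied by a 'pct% increase' headline."""
    return base * (1 + pct_increase / 100)

# The article's 472% applied to a hypothetical base of 10 samples:
print(round(after_increase(10, 472), 1))    # ~57
# The identical headline from a base of 1,000 samples:
print(round(after_increase(1000, 472), 1))  # ~5720
```

The same "472% increase" covers both scenarios, which is exactly why the percentage alone tells you almost nothing without the base.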
And this practice continues throughout the article. All accurate data, just completely out of context and therefore ... useless!
But along with the use of out-of-context data, writers of these articles like to mix in other data which looks related but actually isn't. Here's an example from the piece:
October showed a 110% increase in malware sample collection over the previous month and a striking 171% increase from what had been collected up to July 2011.
So, the way they collect data about malware is by collecting the actual malware, much like anti-virus software on a computer does. What they don't mention in the article is that the increase in malware "collection" isn't only a factor of there actually being more malware; it's also a factor of improved methods of detecting and "collecting" it. Those are two very different trends, which the article bundles into one to help paint a scarier picture.
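The confound above can be sketched as a toy model: what you observe is actual malware multiplied by your detection rate, so better tooling alone inflates the "collected" number. All figures here are invented; this is not Juniper's methodology, just an illustration of the trap.

```python
# Toy model: observed "collected" samples = actual malware * detection rate.
# Every number below is hypothetical.
def collected(actual_malware, detection_rate):
    return actual_malware * detection_rate

july    = collected(1000, 0.10)  # 10% of malware detected -> 100 samples
october = collected(1200, 0.50)  # improved tooling: 50% detected -> 600 samples

collection_growth = (october - july) / july
malware_growth = (1200 - 1000) / 1000
print(f"Collection growth: {collection_growth:.0%}")
print(f"Actual malware growth: {malware_growth:.0%}")
```

In this made-up scenario, collections grow 500% while actual malware grows only 20%. Reporting the collection figure as if it measured malware prevalence conflates the two trends.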
Readers should (must) read articles far more critically, as there are innumerable tactics writers will use to make you click that link, become outraged at the content, and share it. When you eventually break them down, such articles are usually full of half-truths and data taken out of context.
They find a way to lie by telling you the truth...
(And for the record, with regard to the example article itself, I'm not saying malware isn't an issue on mobile devices; I just don't think it's anywhere near as bad as Juniper paints it.)