95% Of Readers Don’t Give a Hoot If There’s No Source Provided for Statistics in A Blog Post


After all, if a stat says "it's so", then it MUST be so. Right?

Photo by Mary Markevich via Freepik

If you read the stat in the headline and thought "Hang on a minute, I DO care about sources being provided! That stat sounds fishy," you're right (congratulations, your "that-sounds-fishy" radar is working beautifully!). But here's the thing, and it's the whole premise for this post: despite that stat being a total fabrication, there's a chance that someone who won't bother reading beyond the headline will quote it in their blog post, with or without attributing it to me. Then someone else will quote it in their blog post, and before long this bullshit statistic will be repeated again and again throughout the blogosphere until it's accepted as legitimate. As someone who takes great care to use verifiable data and statistics in my own blog posts, this phenomenon (which is annoyingly rife) irks me no end. The result is the rant you're about to read, so if you're a blogger too, I hope it resonates.

We all love data and snazzy stats that help reinforce a point we’re trying to make. After all, if the data says “it’s so”, then it MUST be so! Right?

Well yes, in theory. The problem is that so many of the statistics flying around the web have been reused so often, with the original credit dropped somewhere along the way, that it's now extremely difficult (if not impossible) to trace the original source and verify whether the statistic is credible.

While researching my own blog posts, I've found it increasingly frustrating to see that this phenomenon is alive and well, especially since "statistic listicle" posts are both highly read and highly shared, and many of the people who create them curate stats from other lists, citing someone else's statistic list post as the source rather than locating the original.

Too many writers don't seem to care that the stat they're using has never been attributed to an original study or piece of research. As long as it appeared in someone else's blog, or in many other blogs (including reputable ones), even without proper attribution, it's considered good enough to rehash, whether or not it's accurate and whether or not it's current.

I can't tell you how many times I've tried to track an uncredited statistic back to its original source, only to learn that it was based on research so old that it's almost certainly no longer relevant.

For example, I found the following stat in a list of “important blogging statistics every blogger should know in 2021” (I won’t say who wrote it because it’s not my intention to embarrass anyone):

Bad stat example #1: "Blog traffic can increase to about 2,000% through quality content."

It was credited to a website that didn't include a source, just the statistic. But after googling the crap out of it, I finally found it in a post titled "Overall Content Marketing Strategy Leads to 2,000% Lift in Blog Traffic, 40% Boost in Revenue" on MarketingSherpa, dated March 14, 2012. That's over 9 years ago (!!), which is like 157 years in 'tech years', especially in content-marketing terms. And while I personally agree that quality content increases blog traffic, because I've witnessed it myself and have seen up-to-date research that says so, the point is that statistics are being thrown around the blogosphere with no attribution other than some other site that also published them without attribution, and that's just wrong.

Another equally annoying practice, and one that seems to be widespread, is when writers do attribute a statistic to an original source, but the original source is so ancient that it almost certainly no longer represents a snapshot of our current reality. This defeats the whole purpose of using statistics in a current piece of content: if the idea is to position a particular insight as factually relevant, then using old statistics to support it is misleading. Many readers don't bother clicking through to the source of a statistic to check its veracity; they rely on the writer's credibility and commitment to accuracy to learn something relevant to their sphere of interest. If they're reading stats based on ancient data, they're going to draw inaccurate conclusions.

Here’s another example:

Bad stat example #2: 60% of marketers create at least one piece of content each day.

That's from a curated list of "Content Marketing Stats That Every Marketer Needs to Know," originally published in 2016 and last updated (as of writing this post) in February 2021, but the stat is attributed to an eMarketer piece from 2013 based on research from 2012. The implied takeaway for the reader is that a large proportion of marketers are creating at least one piece of content per day, and that if you're not among them, perhaps you're doing something wrong and should 'hop to it' by creating at least one piece of content per day too. But while producing at least one piece of content per day may have been a best practice in 2012, times have changed, and there is now research suggesting that quality trumps quantity. In 2021, it might be more appropriate to produce one piece of kick-ass content per week than a rushed, mediocre piece every day.

And that’s just another example of how using outdated statistics to reflect a current reality can be not only misleading to the trusting reader, but possibly even detrimental.

And here’s another:

Bad stat example #3: 77% of internet users read blogs.

I have found countless blog posts (google it and you’ll see for yourself) presenting this stat without any reference whatsoever. In some cases, its only attribution is to another post that also listed it, but again, without citing the original source. It’s a case of people regurgitating a punchy soundbite over and over again just because everyone else has listed it too, without ever bothering to ask “But wait, is it really 77%? Who says it’s 77%? How current is this statistic anyway?” Once again, I googled the crap out of this one and still haven’t found the original source, but I did find the very same stat quoted in a post all about the “latest media and internet statistics” — from 2009. Yes, 2009!

Could it be that bloggers are so lazy about verifying statistics that a stat about blog readership that's at least 12 years old has been quoted and requoted for over a decade without anyone bothering to check where it came from? Apparently so. And my guess is that there are plenty of other stats doing the rounds out there that are either equally (or even more) outdated, or simply based on pure B.S., but have been quoted so often by so many websites without ever being questioned or challenged that they're now regarded as gospel.

Even if the claim behind this particular stat rings true (it's certainly plausible that a huge chunk of internet users read blogs), the issue isn't specifically with this stat. I merely used it as another example of how freely marketing bloggers quote unverified statistics, because some of these ancient or unverifiable statistics can fuel misconceptions. Maybe some of those misconceptions are harmless, and maybe others shape marketing strategies backed by million-dollar budgets. Would YOU want to shape your own marketing strategy around a stat that isn't current or accurate, even though it's presented as such?

Here’s an example of precisely such a stat, repeated by practically every marketing blog on the planet at one time or another:

Bad (and probably most famous) stat example #4: Humans have a shorter attention span than a goldfish.

Oh yeah? Who says? No, really: WHO said it? It turns out I’m not the only statistic-attribution geek bothered by this dubious but snappy soundbite.

Well guess what? It’s baloney.

The claim is that the average human attention span has dropped from 12 seconds in the year 2000 to eight seconds now, allegedly less than the nine-second attention span of the average goldfish. It's based on a statistic attributed to Statistic Brain in a 2015 report by the Consumer Insights team of Microsoft Canada, who studied the brain activity of 112 people (how's that for a minuscule sample size?) as they carried out various tasks. Faris Yakob, co-founder of the nomadic creative consultancy Genius Steals, dug deeper in an effort to verify the statistic: "Upon visiting the site, it appears to be a research company. A chart with the fishy fact appears there. A reverse image search led me to the source of the claim, a software manual called Building Information Modeling and Construction Management. Here the chart is sourced to the National Center for Biotechnology Information and US Library of Medicine but when asked, both denied any knowledge of research that supports it."

In fact, other evidence suggests that our attention span isn’t shrinking at all, but rather, it’s evolving. While it’s true that there is far more distraction than ever before due to a plethora of content platforms and busy social media feeds, according to Dr. Gemma Briggs, a psychology lecturer at the Open University, “It’s very much task-dependent. How much attention we apply to a task will vary depending on what the task demand is.”

Furthermore, humans are capable of different types of attention, so marketers who heed the 'lesson' implied by the shorter-than-a-goldfish stat and limit themselves to content that can be consumed in under eight seconds may be missing the mark, because they underestimate our willingness to engage for longer with content that's of genuine interest to us. As Turtl's Natasha Keary explains: "As with Netflix series, long books, or popular podcasts, we're able to selectively focus on something that's interesting, relevant, and a good experience to consume. The solution to attention span marketing problems is creating high-quality, immersive content."

And as for the poor goldfish, it turns out their famously short memories are a fallacy. Culum Brown, an expert in fish cognition at Australia's Macquarie University, says that goldfish actually have much longer memories, spanning weeks, months and even years, and that there are thousands of studies showing fish have excellent memories: "We've known about the reasonably good memories of goldfish since the '50s and '60s. Despite what everybody thinks, they're actually really intelligent."

But none of these actual facts have prevented publications and marketing bloggers from running with the cute (but utterly false) goldfish stat, because most of them don’t do the requisite research when it comes to providing proper attribution to statistics. And often, if you do your own due diligence to track down the original source, you discover it’s so outdated, it’s most likely inaccurate to the point of being bogus.

Image by Noya Lizor

So, what should the shelf life of an ‘industry trend’ or ‘best practice’ statistic be? That depends on the industry and the statistic, but an element of common sense should also prevail when deciding which statistics to use to give credence to a particular point. In the world of content and marketing, for example, people’s content consumption preferences and online shopping habits have evolved dramatically over recent years due to new technologies, cultural influences and even a global pandemic that forced businesses and marketers to rethink pre-pandemic strategies and pivot quickly to stay relevant and profitable.

A statistic about the 'state of the industry' or a marketing best practice that applied over five years ago most likely wouldn't hold water today. That's why ongoing research is so important, as is the need to cite the most up-to-date findings when sharing current insights.

It's one thing to come across a post from five years ago that lists statistics from five years ago; that makes sense. But it's quite another to "refresh content" every year by updating the title to include the current year while leaving old stats listed as though they're still current. If those stats still make sense to include, that's fine; if not, it's the writer's responsibility to update them too. And for those writing a brand-new blog post who want to bolster their arguments with insightful data, it's only logical to include the most recent stats available rather than dated research or statistics whose origins are vague or unsubstantiated by a credible source.

My point is that professional bloggers who understand the importance of establishing trust and credibility should know better, and should hold themselves to a higher standard by only using stats whose actual source can be verified and properly credited. Otherwise, it diminishes the credibility of stats in general, not to mention it misleads a trusting reading public who might draw incorrect conclusions from poorly chosen, flimsy stats, to their own detriment. I'm 100% sure of that.


