Maybe you’ve experienced this: one article says that to optimize engagement on Facebook you should post once per day. Another says twice a day. A third says half a post per day. Conflicting social media research is often presented for readers to dissect and digest, leaving them to judge one study or opinion as better than another.
How can businesses make heads or tails of all the conflicting data out there? If we were scientists, we might start by checking whether the results are replicable and whether the sources are reliable.
Are the results repeatable?
What inspired this article was a study by Technorati stating that 50% of a representative sample of social media users used Google Plus. It also reported that 34% of social media users use Twitter. The problems?
- Google itself asserts that only 25% of the people signed up for its service are active on G+.
- If only 25% of Google Plus signups are active, it is hard to see how 50% of social media users could be using Google Plus regularly.
- Also, Pew recently released data suggesting that 16% of Internet users are using Twitter.
- And to add more conflicting data to the mix, in Nielsen’s most recent report Twitter has a third of the audience of Facebook (a number that probably falls squarely between Technorati and Pew).
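The arithmetic behind that implausibility is worth spelling out. Here is a back-of-envelope sketch with normalized, illustrative numbers (the variable names are mine, not from any of the studies):

```python
# Normalize the total social media population to 1.0 (illustrative only).
technorati_active_share = 0.50  # Technorati: 50% of social media users active on G+
google_active_rate = 0.25       # Google: only 25% of G+ signups are active

# How many signups would Google Plus need for Technorati's claim to hold?
required_signups = technorati_active_share / google_active_rate
print(required_signups)  # 2.0 -> twice the entire social media population
```

In other words, for both claims to be true at once, Google Plus would need twice as many signups as there are social media users in total — which is why the two numbers can’t both be right.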
How do you make sense of this? The first consideration is whether the social media research is repeatable. Can two (or more) independent studies come up with the same conclusions? Most studies of Twitter usage I’ve seen put it between 20% and 30%, so if Pew were correct I would expect subsequent studies by other agencies to substantiate its sub-20% number, and likewise for Technorati’s lofty 34%. There are a ton of one-off studies, but when the numbers become congruent across multiple studies there is a higher likelihood that they are reliable.
As another example, at the beginning of 2012 an infographic was circling the internets claiming that men were the dominant gender on Pinterest in the U.K. This made no sense to me, so I researched the source (Google’s DoubleClick Ad Planner) and found that it showed men to be the dominant gender for a host of female-dominated web properties. The results weren’t replicated anywhere, leading most to the conclusion that the ad planner data was wrong (though despite the contrary evidence, some people still insist to me that British men are fanatical Pinners).
This isn’t to say that any of the social media research these companies provide is intentionally inaccurate, but all of this research is imprecise by definition. Bigger (randomized) samples can increase accuracy, but anything short of surveying the entire population carries some margin of error.
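To put a rough number on that imprecision, the standard margin-of-error formula for a sample proportion shows how much wiggle room even a decent-sized survey has (the function name and the example sample size are my own; z = 1.96 is the usual 95%-confidence critical value):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion p
    measured on a random sample of n people."""
    return z * math.sqrt(p * (1 - p) / n)

# A survey of 1,000 people reporting that 50% use a platform is
# really saying "50%, give or take about 3 points":
print(round(margin_of_error(0.5, 1000), 3))  # 0.031
```

A three-point swing either way is more than enough to explain some of the gaps between competing studies — and smaller or non-random samples fare much worse.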
The source of social media research is often questionable
Here’s the other thing about social media research: it’s not analogous to the peer-reviewed double-blind placebo-controlled studies that you might read in JAMA.
Oftentimes, studies ask respondents to estimate how much time they spend, when it would be better to have some kind of time tracker in their browser. The danger of these types of studies was exhibited last year when Reuters released a study about decreased Facebook usage, only to have ComScore release data the next week showing that time-on-site for Facebook had never been higher (in this case, ComScore actually did have a time-tracking mechanism).
You also have to consider the agenda behind the social media research. A lot of these studies are developed as content to get attention for a product or service, so they have a definite (ahem) “point of view.” It also helps to dig into the source data for these studies, to vet whether the sample is representative of the general population or skews young or toward some other demographic. For instance, Pew says 6% of Internet users are on Tumblr, yet among 15-25 year olds it is as widely used as Facebook. 6% doesn’t tell the entire story.
These studies are never going to be entirely relevant to your business, no matter how precise or imprecise they may be. For example, Google dominates search, but in my old stomping grounds of Seattle, Bing has a much higher market share than anywhere else in the world. Why? Because Microsoft has branded sports teams with the Bing logo and is an integral economic driver of the community. Similarly, a higher percentage of African-Americans use Twitter than any other group, so a business with a large African-American clientele may want to focus more on Twitter outreach than another business would.
Each business has its own web and media properties, its own unique clients, and its own strategy. No social media research is going to provide the specific answers to your problem. The best way to understand this data is to judge it based on its repeatability and the strength of its sources, then use the best data to inform a specific social media strategy.
It’s incumbent on businesses to measure their own activities to find out what works and what doesn’t. If you’re relying on Pew to tell you which way the wind blows, you’re going to end up going in many different directions.
So how many Facebook posts should you post a day? I don’t know. What do you think? 🙂