

Two Minute Tips

Reporter’s tip sheet: How to assess a survey

January 9, 2017


Reporter beware: Many corporate surveys are created to amplify a marketing message, not to offer unbiased information. (Credit: Pixabay user andbreit)

Surveys and polls are regular tools of public relations teams. But it’s important to recognize that these studies are primarily created in a company’s own interest, not to enhance your stories or illuminate your readers.

Many surveys and studies provide nothing of real value. Instead, they offer numbers as a form of verisimilitude—the appearance of truth and relevance—to support the marketing messages a company wants to disseminate through the media. Which means you.

Even companies that are not trying to manipulate reporters can make mistakes in designing and running a study. Whether the flaws are intentional or not, here are some potentially misleading technicalities you should pay attention to.

Confidence level and margin of error

All polls are conducted at a confidence level, which tells you how often a repeated poll, run the same way, could be expected to land close to the true figure. Most reputable polling aims for a 95 percent confidence level, meaning that about 95 percent of the time the results would fall within the poll's margin of error of the real value.

Margin of error isn’t a promise of how closely the poll represents reality. Instead, it means that, at that confidence level, the reported answers would fall within plus or minus the margin of error of the true figure. For example, say you poll consumers on their preferences between Business A and Business B with a 4 percent margin of error at a 95 percent confidence level. If you ran the poll 100 times, in roughly 95 of them the results would land within 4 percentage points of the true level of support in the whole population. The other 5 percent of the time, all bets are off. Also keep in mind that a poll can have a tight margin of error and still be terribly deceptive because of other factors.
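
If you want to sanity-check a pollster's numbers yourself, the textbook margin of error for a simple random sample is z × √(p(1 − p)/n), where n is the number of respondents, p is the reported proportion and z is about 1.96 at a 95 percent confidence level. Here's a minimal Python sketch; the sample size of 600 is an assumed figure chosen purely for illustration:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample.

    n -- number of respondents
    p -- reported proportion (0.5 is the worst case, giving the widest margin)
    z -- z-score for the confidence level (1.96 is roughly 95 percent)
    """
    return z * math.sqrt(p * (1 - p) / n)

# Assumed example: a poll of 600 respondents
print(f"Margin of error: +/- {margin_of_error(600):.1%}")  # about +/- 4.0%
```

Keep in mind that real pollsters often have to widen this figure to account for weighting and survey design, so treat the formula as a floor, not a guarantee.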

Sampling method

Polling depends on random samples from a population, whether all adults in a country, all likely voters, people who use particular types of products, or left-handed geriatric former pharmacists. The more the sampling favors one subgroup, the more biased the results can be. Another sampling error comes from what is called a self-selected sample, in which respondents choose to take part rather than being chosen at random. For example, results from a poll run on an organization’s own website will necessarily be slanted.
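
To see how badly a self-selected sample can skew results, here is a small simulation in Python. All of the numbers (a population in which 35 percent prefer Business A, and a pool of the company's own website visitors in which 70 percent do) are invented purely for illustration:

```python
import random

random.seed(42)

# Hypothetical numbers, invented for illustration:
# 35 percent of the full population prefers Business A, but 70 percent of
# the company's own (self-selecting) website visitors do.
population = ["A" if random.random() < 0.35 else "B" for _ in range(100_000)]
site_visitors = ["A" if random.random() < 0.70 else "B" for _ in range(100_000)]

def share_preferring_a(sample):
    return sample.count("A") / len(sample)

random_sample = random.sample(population, 1000)     # proper random sample
self_selected = random.sample(site_visitors, 1000)  # poll run on the company's site

print(f"True preference for A:  {share_preferring_a(population):.0%}")
print(f"Random sample of 1,000: {share_preferring_a(random_sample):.0%}")
print(f"Self-selected sample:   {share_preferring_a(self_selected):.0%}")
```

The random sample tracks the true figure within a few points; the self-selected one misses it by a mile, no matter how many people clicked.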

Sample size

Math governs how many people you need to question to obtain results within a specific margin of error at a given confidence level, and for large populations that number depends surprisingly little on the population’s overall size. In general, the more people answering the poll, the more precise it can be; the smaller the sample, the more you should question the results. Bigger numbers are also important if you’re looking at the preferences of subgroups. If you have 700 responses overall but only 30 from millennials in Milwaukee, the analysis of that subgroup rests on far too few responses to be projected reliably to the larger group, let alone the nation as a whole.
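
You can turn the margin-of-error formula around to estimate how many respondents a given level of precision requires: n ≈ z²p(1 − p)/e², where e is the target margin of error. A rough Python sketch, with target margins picked only for illustration:

```python
import math

def required_sample_size(margin, p=0.5, z=1.96):
    """Respondents needed for a target margin of error (simple random sample).

    margin -- desired margin of error as a fraction, e.g. 0.03 for 3 points
    p      -- assumed proportion (0.5 is the conservative worst case)
    z      -- z-score for the confidence level (1.96 is roughly 95 percent)
    """
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# Target margins below are assumptions chosen for illustration.
for target in (0.03, 0.04, 0.10):
    print(f"+/- {target:.0%} needs about {required_sample_size(target):,} respondents")
```

Roughly 1,100 respondents buys you 3 points at 95 percent confidence and about 600 buys you 4 points, while a subgroup of 30 carries a margin of error of nearly 18 points, far too wide to support any confident claim.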

Polling timetable

People aren’t isolated from the world when they consider answers to a poll. Ask when the poll was conducted and then look at what major news stories appeared during that time period that could have affected public opinion.

Questions and their order

This is often the sleeper issue. The way questions are phrased and the order in which they are asked have an enormous influence on how people answer. If you can’t get a copy of the questions and how they were organized, consider not using the survey. One of the best examples of how this works comes from the old BBC comedy Yes, Minister.

Check this page from the National Council on Public Polls for more questions a journalist can ask about poll results.


Reporter’s Takeaway

• Be skeptical. Companies often publicize surveys in order to underscore and legitimize their marketing message, not to offer unbiased information.

• Important details to focus on include a survey’s confidence level—how often a repeated poll could be expected to land within its margin of error of the true figure. A confidence level of 95 percent is considered standard. If a confidence level is much lower, even a tight margin of error doesn’t mean the survey is sound.

• Don’t rely only on a survey’s answers without seeing the actual survey. It’s vital to examine the questions, as well as the order in which they were asked, to understand how a poll might have been slanted.

Author

  • Erik Sherman

    Erik is an independent journalist and author who primarily covers business, economics, finance, technology, politics, and legal and regulatory issues, explaining complex subjects clearly and often incorporating data analysis.
