Friday, April 14, 2006

'Hey, kids, let's do a poll!'

"Hey, kids, let's put on a show!" Mickey Rooney would say to Judy Garland and others as part of the classic old film formula that was a box office bonanza.

It started with "Babes in Arms," about teens who save their vaudevillian families from ruin by putting on a show. Three more musicals with the same theme followed and made Garland a star.

"Amateurs rule" was the message.

These days, the theme seems to be, "Hey, kids, let's do a poll!"

Yesterday it was St. Norbert College providing the fodder for political spin across the state, while offering up numbers that meant almost nothing.

Today it's UW-Milwaukee, with its political poll that uses an interesting methodology.

One problem is that the survey was stretched out over 15 days. If a poll is supposed to be a snapshot of opinion, it's best done in 3-4 days. The UWM poll and the St. Norbert poll, which stretched over 10 days, are blurry photos at best.

But this is what should set off alarms:

In the survey of 471 Wisconsin residents who intended to vote, 44 percent indicated they would vote for Doyle, while 33 percent said they would choose Green, if the gubernatorial election were held today. But 21 percent had not made up their minds, and 107 respondents said they probably would not vote.

It is customary, when doing a poll to try to find out what's happening in an election campaign, to ask people who actually intend to vote.

If 107 people said they probably would not vote, they should have been excluded from the survey to begin with.

When almost 20% of your sample consists of non-voters, it skews all of the results. The survey did use likely voters for the governor's race matchup, which showed Doyle 11 points ahead, and for the question on the amendment to ban gay marriage and civil unions, which was favored 53-37. But the other questions used the whole sample. For example, 22% said they don't know whether they like the job Doyle is doing as governor. Those are people who aren't paying attention, and they probably are among the non-voters.
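For illustration, here's a rough back-of-the-envelope sketch in Python of how that dilution works. The sample sizes come from the poll coverage quoted above; the split of the "don't know" answers between voters and non-voters is a made-up assumption, since the poll didn't report that breakdown. The point is only to show the mechanics, not to reconstruct the actual numbers.

# Back-of-the-envelope arithmetic on the UWM sample described above.
# The split of "don't know" answers between voters and non-voters is a
# hypothetical assumption; the poll did not report that breakdown.

likely_voters = 471
non_voters = 107
total_sample = likely_voters + non_voters      # 578

print(f"Non-voters: {non_voters / total_sample:.1%} of the whole sample")   # ~18.5%

# Reported on the whole sample: 22% "don't know" on Doyle's job performance.
dont_know_total = round(0.22 * total_sample)   # about 127 respondents

# Hypothetical: suppose 60 of those answers came from the 107 non-voters.
dont_know_from_non_voters = 60
dont_know_from_voters = dont_know_total - dont_know_from_non_voters

print(f"'Don't know' among likely voters alone: "
      f"{dont_know_from_voters / likely_voters:.1%}")                       # ~14.2%

The exact split doesn't matter. The point is that any whole-sample percentage blends two very different groups, and you can't tell how much of it comes from people who won't be voting anyway.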

The problem is that the news media treat all polls much the same, although there is a wide range in their accuracy and credibility. And the campaigns add to the confusion, twisting them to their advantage.

The most accurate polls, of course, are the ones you never hear or read about, that are done by professional political pollsters for the campaigns themselves. They are used for strategic purposes, not publicity. To campaigns, the horse race numbers are less important than the issues and messages they test, to find out how to move voters in their direction. When you see a poll that's leaked by a campaign, pay attention to whether the campaign has released the entire poll or just one or two questions that make their candidate look good. To be able to judge its credibility, you need to see the context, methodology and questions they asked.

Next best are those done by professional polling firms for media outlets. You'll see many of those done for national media outlets, but few in Wisconsin. In those cases the numbers are mostly reliable. One variable is the analysis -- does it come from an objective polling firm, or from a political reporter for the media outlet that sponsored the poll?

A friend in the business says: "National media polls are usually pretty good because the sponsors have the resources to pay to do them right. (Often they combine with other news outlets to share the cost, like NY Times-CBS, ABC-WashPost, or NBC-Wall St. Journal.) But local media polls can be of varying quality because they don't have the resources to spend, and it is a business where it costs more to do it right: using trained/experienced interviewers, calling back to get representative samples instead of whoever is home, interviewing large enough samples. Unlike some other states, Wisconsin does not have a history of a reliable statewide media poll, and that's why these university polls gain so much attention."

Other tips from a longtime pollster:

Be wary of people who say they are doing it without a client (like Strategic Vision, whom the GOP will cite all year). If they are doing it right, it is a pretty costly proposition to do it for publicity purposes. Either someone is sponsoring it in some indirect way, or they are cutting costs.

Be wary of polls like Zogby that take internet volunteers--the cheapest of all polls.

Xoff's bottom line: Don't believe anything you hear, and only half of what you see. And, as my grandfather used to say in bygone days, "Never bet against the Yankees, Notre Dame or the Democrats."

3 Comments:

At 1:11 AM, Blogger James Wigderson said...

Would your grandfather bet in an Indian casino that contributed to the governor?

 
At 12:47 PM, Blogger xoff said...

Actually, I knew the size of the total sample. 107 non-voters is 18.5% of 578. I rounded it off to 20%, well within your margin of error, in writing about it.


578 doesn't provide "a realistic picture of the mood of voters," as you claim, when almost 20% of them are NOT voters.

You've just proven my point.

PS: I'm a professional political consultant who's been reading polls for 20 years. But in this case I guess I was an amateur because no one was paying me.

 
At 5:58 PM, Blogger xoff said...

At the risk of prolonging this discussion ad infinitum:

I understand, and always did, that the 107 non-voters were not included in the questions about who they would vote for. But what were they doing in the sample at all?

As to the UWM poll being evidence that two-week surveys are the wave of the future, I can only say that even a blind pig finds an acorn once in a while.

I don't disrespect students, whether they are at UW-Madison, St. Norbert or Slippery Rock. I'm sure this exercise is a good learning experience.

But I would never rely on those polls if I were trying to get elected governor.

In truth, the most accurate polling about the last governor's race was the internal polling done by the Doyle campaign's professional pollster. But those were never made public.

My real gripe is that the media cannot distinguish between a reliable poll and an unreliable one, so they treat them all as though they are equally valid. I'm sure you'll agree that's not true.

I didn't comment on the St. Norbert poll because I didn't like the results. I didn't like the methodology or the coverage. You did; we disagree.

But at least we're both Democrats. Let's shake on that and agree to disagree on the rest.

 
