Katrina Polling – Part One


[Support Katrina relief!]

Obviously, we have all been horrified by the human tragedy wrought by Hurricane Katrina.  As a pollster and student of public opinion, MP has also been very curious about how other Americans are reacting to the tragedy and how polls will measure that reaction.  I assume that many of MP’s readers share a similar interest.  Some initial results and a few notes of caution about them follow.

Unfortunately, the timing of the crisis could not have been worse for those who conduct public polls.  Americans were learning the true dimensions of the crisis toward the middle and end of the week, too late to launch full three-night surveys before the holiday weekend.  Most public pollsters try to conduct all or most of their interviews on weekday evenings, when Americans are most likely to be at home.  By contrast, a long holiday weekend like this one is one of the worst times to do so.  A much higher than usual percentage of Americans was away from home over the last few days, so conditions for telephone survey work were poor.

Thus, as of this hour, most of the national pollsters have yet to release surveys.  MP trusts that if we are patient, we will have surveys from virtually all of them by the end of the week, perhaps within the next 24-48 hours.  With the usual perspective gained from multiple surveys, we will have a pretty good sense of Americans’ first reactions.

For now, however – or at least as of this posting – we have three early surveys using *very* different methodologies:

  • Rasmussen Reports added a few questions about the Katrina relief efforts to its usual automated nightly tracking (500 interviews per night using a recorded voice that asks respondents to answer by pressing their telephone keypad).  They are currently reporting results from Friday through Sunday night here.
  • SurveyUSA conducted at least three surveys, one each on Wednesday, Thursday, and Friday nights, contacting a random-digit-dial sample of 1,200 adults each time (links to Friday morning press release, results from all three nights).  SurveyUSA also uses an automated, recorded-voice method.
  • The Washington Post/ABC News poll, discussed below, used conventional live interviewers but fielded for just one night, Friday, with a deliberately short questionnaire.

MP is posting early this morning in part to warn readers that all three of these surveys push the envelope of survey methodology.  The WaPo/ABC poll is the most conventional, but it compromises by fielding for only one night, and on the Friday night of a holiday weekend at that.  As such, its response rate will be lower than usual (fewer days to try to contact those not home, and fewer people home on this particular Friday).  The impact on the sample is unknown, although they will weight their sample as always so that demographic characteristics like gender, age, and education match the Census estimates for the nation (or most of it... see below).  The WaPo/ABC poll is also very short, in order to interview as many respondents as possible in a short time; they did not have time to ask questions like the standard Bush job rating.

The computers used by the automated pollsters get around the time limitations inherent in a limited number of live interviewers.   They can dial hundreds (perhaps thousands) of numbers simultaneously, so they are able to complete many more interviews in less time.  However, without a human being to help persuade the respondent to participate, they are likely getting lower response rates than the live interviewers (how much lower, we do not know; none of the three pollsters released response rate data this weekend).  Automated pollsters typically do not attempt to pick a random respondent in the household.  They will also weight by demographic characteristics to match Census estimates; Rasmussen additionally weights by party ID, SurveyUSA does not.  These trade-offs are part of an ongoing debate about the quality and reliability of automated polling (also referred to as Interactive Voice Response, or IVR).    MP certainly cannot settle these issues in the wee hours this morning, but consumers should be aware that the methodologies of these three polls are *very* different.
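For readers curious about what "weighting to match Census estimates" means mechanically, here is a minimal post-stratification sketch in Python.  The sample skew, the Census shares, and the single weighting variable (gender) are all hypothetical simplifications; real pollsters weight on several demographics at once, often by iterative raking.

```python
# Illustrative post-stratification sketch (hypothetical numbers,
# not any pollster's actual targets or data).

from collections import Counter

def post_stratify(respondents, targets):
    """Return one weight per respondent so the weighted share of each
    cell (here, gender) matches the population target share."""
    counts = Counter(r["gender"] for r in respondents)
    n = len(respondents)
    weights = []
    for r in respondents:
        sample_share = counts[r["gender"]] / n
        weights.append(targets[r["gender"]] / sample_share)
    return weights

# Suppose a one-night sample skews female (common when interviewing
# whoever answers the phone rather than a randomly selected adult):
sample = [{"gender": "F"}] * 60 + [{"gender": "M"}] * 40
census_targets = {"F": 0.52, "M": 0.48}  # hypothetical Census shares

w = post_stratify(sample, census_targets)
# Women get weight 0.52/0.60, men 0.48/0.40 = 1.2, so the weighted
# sample matches the population margins while keeping the total n.
```

Each respondent’s weight is simply the target share divided by the sample share of their cell; the open question raised above is whether weighting can repair a sample collected under such unusual conditions.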

Also, the surveys ask different questions and have slightly different timing.  As such, we should not be surprised that they yield very different results.  For example, with respect to the federal government reaction to the crisis:


  • Washington Post/ABC shows Americans split on "the federal government’s overall response to the situation caused by the hurricane and flooding." Just less than half (48%) give an excellent or good rating, 51% a not so good or poor rating.
  • Rasmussen, whose field period continued for two more nights than WaPo/ABC, found only 28% giving an excellent or good rating to the "Federal response to Hurricane Katrina," 70% a fair or poor rating (Rasmussen did not provide the exact text of the question).
  • SurveyUSA reported a different question, but showed declining confidence in the federal response each night last week.  On Wednesday night, they found 50% saying the "federal government is not doing enough to help the victims of Hurricane Katrina" (40% said it was doing "the right amount," 5% said it was doing "too much").  By Thursday night, the "not enough" response had grown to 59%; by Friday night it was 68%.

Similarly, the two surveys that ask respondents to rate the president’s performance in responding to the Hurricane also show different results. 

  • Washington Post/ABC shows 46% approving and 47% disapproving of "the way George W. Bush is handling the situation caused by Hurricane Katrina."
  • On Friday night, SurveyUSA found 40% approving and 53% disapproving, down from 44%-46% approve-disapprove on Thursday night and 48%-39% on Wednesday night.
  • UPDATE:  MP somehow overlooked the CBS survey, which added a Bush hurricane job performance question on the last two nights (Tuesday and Wednesday, 8/30-31) of a three-night survey conducted last week:  54% approved of "the way George W. Bush has handled the response to Hurricane Katrina," 12% disapproved, and 34% were not sure.  The release did not specify the size of the "reduced sample," although the full three-night sample was n=871 adults.

One can speculate all night long about the similarities and differences among these surveys and which methodology is most reliable under these circumstances.  But the bottom line is that all of these surveys use less-than-optimal methodologies, and we will need complete surveys conducted over three or more nights to come to truly reliable conclusions.  Moreover, public opinion is probably quite fluid at the moment.  Americans have no doubt been jarred by the events and images of the last week.  Their opinions are likely to move around as they digest all the information and try to come to terms with the consequences.

"This national emergency," the Post’s Morin and Deane wrote Sunday, "has not united Americans the way the terrorist attacks of Sept. 11, 2001, did."   On that much, all three surveys agree.  In a few days (or perhaps hours) we’ll know much more.

UPDATE 9/07:  New data and comments from SurveyUSA editor Jay Leve posted here.

——-

Note:  Any national survey done this week will obviously miss those in the affected areas who are currently without working telephone service, as well as most of those who have relocated.  Some pollsters will intentionally withdraw from their samples the telephone numbers in the affected areas; others will simply call all of their randomly selected phone numbers, knowing that they will not get through to households in those areas.  It is a finer issue, but most surveys will also miss those who fled the storm and are staying with family or friends in another household in another state.  None of the three surveys provided details on whether or how they altered their methodologies.

The missing respondents are not likely to make much of a difference in the national results, even if they hold very different opinions.  For example, the full populations of the states of Louisiana and Mississippi amount to roughly 3% of the U.S. population (based on these 2003 Census estimates).  MP will pass along details on how the national pollsters are dealing with these unique challenges as he learns them.
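That 3% figure puts a hard cap on how much the missing households can move a national number, which a back-of-the-envelope calculation makes concrete (all opinion figures here are hypothetical, chosen only to show the bound):

```python
# Illustrative only: how much can excluding ~3% of the population
# move a national estimate, even if the excluded group differs sharply?

excluded_share = 0.03      # LA + MS are roughly 3% of the U.S. population
measured = 0.46            # hypothetical figure from the covered 97%
excluded = 0.10            # hypothetical, very different excluded opinion

# The true national figure is a population-weighted mix of the two groups.
true_figure = (1 - excluded_share) * measured + excluded_share * excluded
shift = measured - true_figure   # = 0.03 * (0.46 - 0.10), about 1.1 points
```

Even with a 36-point gap in opinion, the national estimate moves by only about one percentage point, well within the sampling error of these surveys.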

[Apologies in advance for typos missed above – it’s very late (or very early) and I assume I’ve missed a few.  Feel free to email me your edits].

Mark Blumenthal

Mark Blumenthal is a political pollster with deep and varied experience across survey research, campaigns, and media. The original "Mystery Pollster" and co-creator of Pollster.com, he explains complex survey concepts, and how data informs politics and decision-making, to a wide range of audiences. As a researcher and consultant, he crafts effective questions and identifies innovative solutions to deliver results; as an award-winning political journalist, he brings insight and compelling narratives to chaotic data.