Nate Silver rates the polls

Nate Silver has rated the various polling companies, many of which feed into his algorithms. When you look at the results, you can see why Romney conservatives were so far off: they were relying on Gallup and Rasmussen, and those polls were so dreadfully wonky it is lucky the firms have any credibility at all.

There were roughly two dozen polling firms that issued at least five surveys in the final three weeks of the campaign, counting both state and national polls. (Multiple instances of a tracking poll are counted as separate surveys in my analysis, and only likely voter polls are used.)
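A minimal sketch of that inclusion filter, assuming a hypothetical CSV of 2012 polls with columns named firm, population, and end_date (none of these names come from Silver's piece):

```python
import pandas as pd

# Hypothetical schema: one row per survey (each instance of a tracking
# poll is its own row), with the poll's end date and sampled population.
polls = pd.read_csv("polls_2012.csv", parse_dates=["end_date"])

# Keep only likely-voter polls from the final three weeks of the campaign.
window_start = pd.Timestamp("2012-10-16")   # three weeks before Nov. 6
recent_lv = polls[(polls["population"] == "LV") &
                  (polls["end_date"] >= window_start)]

# Firms with at least five qualifying surveys, state and national combined.
counts = recent_lv.groupby("firm").size()
qualifying_firms = counts[counts >= 5].index.tolist()
```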

For each of these polling firms, I have calculated the average error and the average statistical bias in the margin it reported between President Obama and Mitt Romney, as compared against the actual result nationally or in the relevant state.

For instance, a polling firm that had Mr. Obama ahead by two points in Colorado — a state that Mr. Obama actually won by about five points — would have had a three-point error for that state. It also would have had a three-point statistical bias toward Republicans there.

The bias calculation measures in which direction, Republican or Democratic, a firm’s polls tended to miss. If a firm’s polls overestimated Mr. Obama’s performance in some states, and Mr. Romney’s in others, it could have little overall statistical bias, since the misses came in different directions. In contrast, the estimate of the average error in the firm’s polls measures how far off the firm’s polls were in either direction, on average.
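Here is a rough sketch of how that error-versus-bias distinction could be computed. This is not Silver's actual code, and the sign convention (positive margin = Obama ahead, negative bias = a miss toward Republicans) is my assumption:

```python
# error = average |poll margin - actual margin|, direction ignored;
# bias  = average signed miss, so opposite misses can cancel out.
def rate_firm(polled_margins, actual_margins):
    misses = [p - a for p, a in zip(polled_margins, actual_margins)]
    avg_error = sum(abs(m) for m in misses) / len(misses)
    avg_bias = sum(misses) / len(misses)
    return avg_error, avg_bias

# The Colorado example from the text: Obama +2 polled vs. roughly +5 actual
# is a three-point error and a three-point miss toward the Republicans.
error, bias = rate_firm([2.0], [5.0])
print(error, bias)   # 3.0, -3.0
```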
