Nate Silver of the New York Times writes: "Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in 'herding' toward the end of the campaign, changing their methods and assumptions such that their results are more in line with those of other polling firms."
Silver's article features a chart indicating that, on average, polls in the three weeks before the election actually leaned a point or two toward Mitt Romney rather than toward Obama. But if Hurricane Sandy boosted Obama by a point or two (as the exit polls suggest), then it makes sense that polls taken up until a week before the election were a point or two more favorable to Romney.
Perhaps a better way of measuring pollster accuracy would be to analyze the samples of their polls over the three weeks leading up to the election. CNN's final poll used a D+11 sample in what turned out to be a D+6 election, yet because its headline number was closer to the actual result, CNN is ranked higher than, say, Rasmussen, whose polls mostly used D+4 or D+5 samples.
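The difference between scoring a pollster on its last poll versus on all its polls from the final three weeks can be sketched numerically. The following is a minimal illustration with entirely hypothetical poll margins (not actual 2012 data); the pollster names and numbers are made up to show how the two scoring rules can rank the same firms differently.

```python
# Hypothetical sketch: last-poll error vs. three-week average error.
# Margins are in points, positive = Democratic advantage. All data invented.

def avg_error(margins, actual):
    """Mean absolute error of a pollster's margins against the actual result."""
    return sum(abs(m - actual) for m in margins) / len(margins)

actual_margin = 4.0  # hypothetical actual election margin (D+4)

# Hypothetical final-three-weeks margins for two made-up pollsters.
polls = {
    "Pollster A": [1.0, 2.0, 4.0],  # herds late: only the last poll is close
    "Pollster B": [3.0, 3.5, 3.0],  # consistently close across all three weeks
}

for name, margins in polls.items():
    last_err = abs(margins[-1] - actual_margin)
    three_week_err = avg_error(margins, actual_margin)
    print(f"{name}: last-poll error {last_err:.1f}, "
          f"three-week avg error {three_week_err:.1f}")
```

By the last-poll rule, Pollster A looks perfect; by the three-week average, Pollster B comes out ahead, which is exactly the herding effect Silver's method is designed to catch.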