Polls are facts and data.
That's one of the most ridiculous things you have ever posted.
Polls in the context we are using are *opinion* polls - at the very BEST they are still a subjective gauge of opinion.
They may BE data, but so is a random list of numbers - it's how the data are collected and put together that gives them any relevance.
And THAT is HIGHLY subjective. Doesn't mean it's always wrong, but it's an educated guess.
Now while my bailiwick is statistics, I don't do opinion polls - they are about the single most subjective thing I could measure.
Even with the forms my agency collects, we still have experts who craft the form's content, because we can't have valuable data ruined by a question that was ambiguous, confusing, or touched on a subject the responder didn't want to answer. And if the response rate is too low, we can't really publish.
It is difficult enough for an opinion pollster to craft a decent sample - or to find a means to collect it accurately. The world changes: relying on land lines no longer yields the results you want, and cell phones don't always reach the kind of person you want to represent. You also have to decide whether you want a snapshot of the population as it is - and how you will determine what that is.
If polls were facts - they would all yield very close to the same results - and those results would be borne out by events, such as elections.
They are not. They're a measurement of what the pollster thinks the sample thinks. And the sample itself and the means used to get the data can be flawed.
While my opinion of Nate Silver has definitely been weakened by his most recent efforts - he HAS observed that averages of different polls tend to be more accurate.
Call it the wisdom of the crowd - an average works better when the different polls are different, because it means they're not parroting each other.
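There's a simple statistical reason the average holds up: as long as each pollster's house effect points in its own direction, the errors partially cancel when you average. Here's a minimal sketch of that idea - the true support level, sample sizes, and bias ranges are all made-up numbers for illustration, not real polling data:

```python
import random

random.seed(42)

TRUE_SUPPORT = 0.52   # hypothetical "real" level of support
N_POLLS = 10
SAMPLE_SIZE = 800

def run_poll(bias):
    """Simulate one poll: sample respondents, each answering yes with
    probability TRUE_SUPPORT shifted by the pollster's house bias."""
    p = TRUE_SUPPORT + bias
    yes = sum(1 for _ in range(SAMPLE_SIZE) if random.random() < p)
    return yes / SAMPLE_SIZE

# Independent house effects: each pollster errs in its own direction,
# which is the "different polls are different" condition above.
biases = [random.uniform(-0.03, 0.03) for _ in range(N_POLLS)]
polls = [run_poll(b) for b in biases]

average = sum(polls) / len(polls)
average_error = abs(average - TRUE_SUPPORT)
worst_individual_error = max(abs(p - TRUE_SUPPORT) for p in polls)

print(f"poll results: {[round(p, 3) for p in polls]}")
print(f"error of the average: {average_error:.4f}")
print(f"worst single-poll error: {worst_individual_error:.4f}")
```

By the triangle inequality the average's error can never exceed the worst single poll's error, and when the house biases are genuinely independent it usually beats the typical poll too. If every pollster leaned the same way - "parroting each other" - the biases wouldn't cancel and averaging would buy you nothing.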
I *have* participated in 'push' polls, although I had to wade deep into it before I realized what was being done.
I'd be asked an innocent question - THEN I'd be "enlightened" with some commentary and asked if I STILL thought the same thing.
It wasn't long before I could tell which way the pollster wanted the answers to go - so after fifteen to twenty minutes, I said, sorry.
Don't bother with any more of my answers, because you're not interested in them.
Having been a participant - and having read what they CLAIMED were the questions (without the challenges they posed) - I realized that some pollsters
want to shape the opinion rather than collect it. And that is dishonest.