9 Comments
May 14 · edited May 14 · Liked by Carl Allen

Just one thought on communications here. You're a sharp man who makes some really key points. The difference between a "poll" and a "forecast" seems obvious after you discuss it, but that's true about almost all important insights. I learned a lot here because you made me see the elephant in the room.

That said, you might get more people to hear you if you didn't use words like "lie" and "fraud." Sure, you could conceivably be right, I grant you. But if you want to change the field and get people to hear your insights, that just creates a wall for them. They won't listen to you.

You also don't necessarily understand their motives--which is what "fraud" and "lie" indicate. The reality is we don't know what their motives are for sure. We don't even know if they comprehend your points. They might be so immersed in their field that they can't step outside it and see what's going on.

I would suggest using language that isn't based on motivation. Saying something is not "scientific" is a valid criticism. It does not ascribe motivation, and it becomes an argument once you explain yourself. This way you're focusing on objective analysis and criteria.

You need to get these ideas out, and sometimes the best way to do that is to stick to the facts and the evidence. And you have that on your side. So you don't need to do the other stuff.

author

My tone needs work, that's for sure.

When I say someone is "lying" or something is "fraudulent" I'm not ascribing intent, though I can see why it is taken that way.

The best recent example I can give is the London mayoral polls, in which YouGov reported:

"47% of people who intend to vote say they will back Khan (Labour)"

This is a lie.

It is literally not true.

I don't know how else to say it. Not true? Wrong?

Their data did not show this.

I understand the person who reported this probably doesn't have some nefarious intent to deceive people, though deception is the result.

That report came from data that (from memory) showed something like 35% of people who said they intend to vote backing Labour, with another 18% undecided, many of whom said they intended to vote but didn't yet know for whom.

The "47%" figure required excluding a large number of people who said they intended to vote.
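The denominator trick described here can be sketched numerically. This is a minimal Python sketch using the approximate figures recalled above (35% Labour, 18% undecided), not YouGov's actual crosstabs, so the inflated share lands near 43% rather than the reported 47%:

```python
# Illustrative numbers only, based on the figures recalled above --
# not YouGov's actual data.

intend_to_vote = 1000   # hypothetical respondents who say they intend to vote
labour = 350            # 35% of them back Labour
undecided = 180         # 18% intend to vote but haven't picked a candidate yet

# Honest share: Labour support among ALL respondents who intend to vote
honest_share = labour / intend_to_vote
print(f"{honest_share:.0%}")    # 35%

# Reported-style share: undecided voters silently dropped from the denominator
reported_share = labour / (intend_to_vote - undecided)
print(f"{reported_share:.1%}")  # 42.7%
```

Shrinking the denominator by excluding undecideds, while leaving the numerator untouched, mechanically inflates the reported percentage.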

I understand I could probably present my findings and issues in a kinder way. But (between you and me) being extremely nice just makes you easier to ignore.

There are trade-offs.


I don't think it's a matter of kindness, but of perceived objectivity. The word "lie" does in fact indicate intent; that's what "lie" means. Something that's "literally not true" is more likely an error than a lie. They're making a mistake, but probably not lying.


Btw, I saw you on Twitter because I made a comment on the NYT polling, arguing that it's possibly not trustworthy because of their now uncovered news bias in Biden coverage. You made a comment about clickbait polls (and I agree with you). Even I was a little surprised at how upset people were when we questioned the accuracy of a NYT poll. The whole discussion was totally defensive because everyone knows each other, and I'm definitely an outsider. They're not willing to listen to anything that criticizes their colleagues. So I got criticized there too. That's why it's important sometimes to be as careful as we can when taking on an establishment position. I understand you want attention (I get that), but it's also important to get a hearing if we can.

author
May 14 · edited May 14 · Author

I don't want to make it sound like I'm saying certain things for attention; that's not the case. I just don't censor myself when it comes to what I write in what should be scientific/academic discussions.

I'm not inflammatory for the sake of being inflammatory; I mean what I say. But you're right that ascribing intent or "value" to a statement, which is not how I mean it to be taken, can be mistaken for vitriol when it's just meant as an (adamant, meaningful) observation.

If I were in an argument with (insert forecaster here) and I said

"But your forecast said Hillary was 100% to win!"

They'd call me a liar. And it is a lie.

They wouldn't very calmly and nicely explain probability to me, or engage in any amount of polite discourse, they'd call me a liar and probably never speak to me.

When pollsters report "47% of people who say they intend to vote say they'll vote for Labour" (despite that being as objective a lie as the hypothetical one I offered),

I'm expected to be calm, nice, and explain the flaw in the method.

Don't get me wrong, you're right on about challenging the establishment, etc.

I'm just very aware of the hypocrisy in it.

May 14 · Liked by Carl Allen

Well, no matter what, your work is very important. I'm rooting for you.

author

I appreciate it. Can I ask, what is your background - any relationship to the field, or just interested?

May 14 · Liked by Carl Allen

What is your view of pollsters who push undecided interviewees to choose a candidate? Is that a scientifically valid practice in your view, or does it depend on how the question(s) is (are) asked?

author
May 14 · edited May 15 · Author

Awesome question.

I have no problem with pollsters pushing undecideds to a "lean."

There is no perfect way to ask plan-poll questions (questions about the future), so trying to "squeeze" out a lean can be a good way to see whether a voter has some preference, even if it's "soft," as opposed to truly undecided.

Two notes:

1) While most pollsters report "lean" support alongside "solid" support as if they were the same, it's good for forecasters to know how much of each there is.

25% lean and 25% solid is very different from 35% lean and 15% solid.

This is a small quibble relative to the more egregious issues I have with how poll data is reported, but it's not nothing.

2) A diversity of methodology is almost certainly better than a "box" in which everyone asks the same questions the same way.

If some pollsters offer "undecided" as an option, but others don't, comparing the two sets can uncover some valuable info.

Likewise for those who ask squeeze questions and those who don't.

But even pollsters who ask squeeze questions are often left with undecideds - and more than a few. How they treat them (if they report them truthfully, versus if they guess about how they will eventually vote) isn't an area that falls under the "methodological diversity" umbrella.
