For negativity bias, my wife just told me a great technique she uses: come up with a list of people whose opinions matter to you. Any time you question yourself, imagine how each person on that list would react to what you did. Since those are the only people whose opinions matter to you, if the reactions are mostly positive, then you should feel proud of your choice.
What bias is it if the only entry I’ve read in this table is the one for confirmation bias?
Probably… Selection bias?
Actually, the reason I order the last item the server mentioned is crippling social anxiety.
Same for not standing up in the middle of everyone to walk out of a bad movie at the cinema.
To be clear, sometimes authority bias is good and proper. For instance, valuing the opinion of a climate scientist who has been studying climate chaos for thirty years over that of your aunt, who saw Rush Limbaugh call climate change a hoax in the 1990s, is normal and rational.
Basically, authority bias as a reasoning flaw stems from misidentifying who is authoritative on a subject.
In a vacuum, appealing to authority is fallacious. An idea must stand on its own merits.
IRL, things get fuzzy. No one has the expertise or the time to derive everything from first principles and redo every experiment ever performed, so sadly we have to place some level of trust in other people.
As long as the paper documents the experiment well and the study is double-blind, you don’t need to appeal to authority.
Counterpoint: the replication crisis
Well, most people will trust a politician or an actor over an unknown Nobel Prize winner. That’s how we got here.
What do I win once I tick them all off?
a senate seat.
I’m out here actively going against my biases and selling someone else’s house above market value 😤
YSK: the Dunning-Kruger effect is controversial because it’s part of psychology’s replication problem.
Other famous psychology experiments, like the Stanford prison experiment and the Milgram experiment, fail to show what you learned in Psych 101. The prison experiment was so flawed as to be useless, and variations on the Milgram experiment show the opposite effect from the original.
For those familiar with the Milgram experiment: one variation of the study replaced the “scientist” running the test with a policeman or a military officer. In those circumstances, almost everybody refused to administer the high-voltage shocks.
Controversial in the sense that it can be easily applied to anyone. There is some substance to the idea that a person can trick themselves into thinking they know more than they do based on limited info. A lot of these biases are like that: they aren’t cut and dried but more of a gray area where people can be fooled in various ways. Critical thinking is hard even when it’s taught, and it’s not taught well enough, or at all.
And all of that is my opinion and falls into various biases, but oh well. The easiest person to fool is yourself, because our brains are hardwired to want to be right, rewarding us when we find things that confirm our beliefs even when the evidence isn’t valid. I think the best way to try to avoid the pitfalls is to always back up your claim with something. I’ve found myself often(!) erasing a reply to someone because the point I was about to make didn’t have the data behind it that I thought it did, and after digging a bit I couldn’t find anything to show I was correct.
I almost deleted this comment for that very reason, but I want to see how it hits. I feel that knowing there are a lot of biases anyone can fall into can help you form better reasoning and arguments.