In light of the public comments that accompany every single major event since the invention of language, I wanted to sanitize my own little corner of the world when it comes to opinions.
Disclaimer: if you're already offended by the title, I'm sorry; please refrain from reading further. I won't convince anyone in such a state of mind and will only ruin your day even more, which I have no intention of doing.
A few articles with a somewhat similar title specifically treat the common saying "everyone is entitled to his opinion" as a rhetorical device; you can find a great one by Patrick Stokes on The Conversation. That is not the goal of the present article, however, which focuses on the hardware we form opinions with, the motivations for displaying them publicly, and their consequences.
Humans are pretty terrible at opinions
Old brain
Even though humans have one of the biggest brains relative to total body size, that brain still has to juggle, on top of the usual life-support functions, such complex tasks as standing on two feet, a continuous stream of consciousness, feelings and abstract reasoning.
This latter task seems to be the most challenging in the whole of nature, since it leads to language and logic, two features never found anywhere in the animal kingdom as developed as they are in the human species. And yet we are far from being perfect thinking brains. Maybe that will happen someday, but so far the time scale of human existence doesn't even begin to compare with evolution's, and we have come up with technology that renders the normal process of evolution (favoring genetic traits that spread through reproduction) largely void for humans. We may be able to genetically engineer ourselves (or our children) to be more rational with that big brain of ours, but until then we are stuck with an organ that has changed very little since Homo sapiens sapiens appeared 200,000 years ago.
This means that if Doc and Marty jumped back 200,000 years, they could teach Homo sapiens sapiens how to drive the DeLorean. It goes the other way too: in dangerous situations, modern humans can revert to very primitive instincts, losing the varnish of modern civilization in the blink of an eye.
New science
Outside of both those extreme situations, our brains routinely use long-standing thought shortcuts that the recent psychosocial sciences have started cataloguing as "cognitive biases". These unconscious behaviors are extremely common, whatever one's IQ or education level, and make us believe that we are flawlessly logical and rational even when we aren't; this happens even if one knows about cognitive biases! Because they are hardwired in our brains, overcoming them takes iron mental discipline and frequent self-checks of our own thoughts.
The twist is that cognitive biases are set in stone in the brain's circuitry because they made sense for Homo sapiens sapiens' reproduction. The downside is that knowing about cognitive biases can make us believe we have gotten rid of them and believe even more strongly in our own self-righteousness, while our actual thought process hasn't changed, only our self-perception of it.
Wrong on so many levels
The bottom line is that these cognitive biases play a huge role in how wrong we can be, beyond simple ignorance of all the relevant facts. Among other things, they dictate which known facts we will find relevant (usually those that confirm an existing belief) or discard as unrelated or inconsequential (those that contradict an existing belief), which argument we will find the most logically sound just because we already agree with its conclusion, and so on.
Coupled with our lack of omniscience and our need to rely on human intermediaries (themselves probably biased, if only by omission) to obtain facts, this means that each time we form an opinion, especially about the future (for which facts do not yet exist) or about a group of people we don't belong to (and about whom we have limited knowledge), there is a very good chance that we will be far from the actual truth, from reality itself.
Even science, whose counter-intuitive rules for forming and testing hypotheses are quite demanding, is not exempt from biases. For example, authority built on past publishing success and awards can sometimes give undue credibility to a poorly testable theory. A famous example is Einstein's positions on quantum mechanics, which were proven wrong only relatively recently; who knows how much progress in the field was slowed by other physicists trusting his view on the strength of his scientific credibility?
Even the Bayesian philosophy [fr], which aims to be even more stringent about the accuracy of beliefs, is not fool-proof, as it has to be applied with extreme rigor to be effective. In probability terms, the truth inside the space of possible beliefs is a needle in a huge haystack, but cognitive biases make us see a huge pile of needles, and the one within closest reach seems good enough. A lot of self-work is required to see the hay again.
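To make the Bayesian idea concrete, here is a minimal sketch in Python (my own illustrative example, not part of any cited work) of a single belief update with Bayes' rule. The numbers are made up: they simply show how a very strong prior can keep the posterior high even when the evidence actually favors the opposite hypothesis, which is one way sloppy use of the framework lets a bias survive the math.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    # P(H|E) = P(E|H)*P(H) / (P(E|H)*P(H) + P(E|not H)*P(not H))
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1.0 - prior))

# Illustrative numbers only: the evidence is twice as likely under the
# opposing hypothesis, yet a 0.95 prior barely moves.
posterior = bayes_update(prior=0.95,
                         p_evidence_given_h=0.3,
                         p_evidence_given_not_h=0.6)
print(f"posterior belief: {posterior:.2f}")  # about 0.90, still very confident

Choosing the prior and the likelihoods honestly is exactly the "extreme rigor" mentioned above; the formula itself cannot compensate for biased inputs.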
Next, we will consider the motivations for publicly declaring an opinion that has a good chance of being wrong, and the potential consequences for its recipients.
Motivations and consequences
Why am I telling/writing this?
No, we aren't always shouting opinions in the public space, be it a family dinner or Facebook, because they are particularly insightful. There are many other motivations, more or less defensible. For example, the boundary between wanting to transmit knowledge and simply showing off is thin. So is the one between convincing people who disagree with you that they're wrong and reassuring people who already agree with you that you are on their side. Or between heartfelt compassion and moral grandstanding. In each of these cases, one is an actual message; the other is just unnecessary noise.
It seems opportune to me to ask ourselves the simple question "Why am I about to tell/write this?", especially before posting a snarky reply to a stranger on Facebook about a hotly debated topic. Is it really worth it? Are you really going to convince someone they're wrong while belittling them as best you can? Are you bringing new arguments to an old debate, or are you just signaling your allegiance to one of two tribes shouting the same catchphrases at each other from a distance? Are you sure wishful thinking is not clouding your judgment right now?
Closely related, you can follow up with the question of consequences.
What consequences can my sayings/writings have?
Remember, you're about to tell/write something that is probably wrong, for questionable reasons, even after the previous self-interrogation, whose answer yet another cognitive bias may well have skewed in your favor.
One last check: the lasting effect of your opinions on your recipients. Is the tone of the assertion positive or negative? Are you talking about a whole group of people you don't belong to as if it had a single, consistent behavior, even if (or rather, especially if) it's supposed to be humor? And no, resisting "political correctness" doesn't make you smarter; it just shows that you don't care about the people concerned, and your opinions about them are therefore neither relevant nor necessary in a public space.
Bottom line: your opinions can hurt people, and the sooner you realize it, the sooner you can stop spreading them, whether you care about those people or not. Or maybe you really are mean; but nobody is a villain in their own story, so I don't expect anyone to step forward and admit they're mean, even if their actions alone say they are.
Conclusion
Most countries guarantee freedom of speech, but that doesn't mean you should exercise it indiscriminately. First, this freedom mainly protects citizens from government censorship, and second, remember that you are forming opinions with brain machinery that is 200,000 years old, and they are probably wrong anyway. Asking yourself a few questions before saying/writing anything can help avoid a few pitfalls of public debating.
Post-hoc: this article is of course subject to the same criticism it presents. I wrote it with high expectations for public debates in mind, even though this concern is not universally shared (or even known). While it can be stern at times, I tried to be respectful of people even when I disagree with some of their public behaviors (behaviors I sometimes exhibit myself). My intention was not to hurt anyone, but hopefully to provoke some self-reflection in the reader. I'm not particularly proud of the provocative title, but I gave in to the click-bait rule at least a little.
The top 3 messages you get when interacting with an AI assistant are:
Hence, there is an experience far more frustrating than talking to an opinionated human: talking to a (Bayesian or whatever) perfectly logical being. Don't even try it unless you don't care about your hair.
Do you believe Mr. Spock has no hardcoded cognitive biases?
Do you believe there is one true logic?
Well, it seems you are wrong: there are many (incompatible and/or undecidable) families of logics.
Do you want a classical logic? Do you want a constructive logic?
Do you want a modal logic? Do you want a deontic logic? Do you want a doxastic logic?
Maybe a custom-made logic would be better?
Anyway, you have to cook your own logic salad.
So what do you prefer: hardcoded logic or hardcoded cognitive biases?
The answer is that you prefer hardcoded cognitive biases.
That is the true reason why humans are far from obsolete: they are opinionated rather than blinded by logic.
Moreover, they don't wish for the logically better world you seem to long for.
Your own cognitive bias is that, like any transhumanist, you ignore the meta-logical facts, because they turn your ill-conceived dream into an American fantasy story.
While I don't know if I would enjoy interacting with an AI on a regular basis, at least I know it would be reliable, barring mechanical failure.
But that's not even the point. Just because I would not like to interact with an AI doesn't mean that we, as humans, can't do anything to try to overcome known and documented biases. The false alternative you just offered probably has a name of its own in the fallacy/bias glossary.