Whose opinions can we trust?

I know I’m not the only person who finds it incredibly difficult to state definitively which side of a debate I fall on. Sure, there are certain issues that I don’t have to deliberate over for long before feeling that my position is solid (marriage equality comes to mind), but my intuition is that those are the exception.

I generally take the view that if I were to spend a week researching a particular topic as thoroughly as I could, then my opinion about that topic would almost certainly change. I don’t necessarily mean that I would flip from one side of the issue to the total opposite; rather, I think of it in a more Bayesian sense, i.e. the probability I’d assign to each side being correct would shift along a continuum in one direction or another.
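To make that concrete, here is a toy worked example (the numbers are invented purely for illustration). Bayes’ theorem says that my confidence in a position A after seeing evidence E should be

P(A|E) = P(A) × P(E|A) / P(E)

So if I start out 60% confident in a position, and during my week of research I come across an argument that I’d expect to find 90% of the time if the position were true but only 40% of the time if it were false, then my confidence should shift to (0.6 × 0.9) / (0.6 × 0.9 + 0.4 × 0.4) ≈ 77%. I haven’t flipped sides; the probability has simply moved along the continuum.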

Consider abortion. I think of myself as pro-choice but not “anti-pro-life”. In other words, I find it possible to put myself in the shoes of someone who is pro-life. I can think to myself, “Okay, if I had had the same life experience as this person then I’m sure their arguments would seem far more logical to me and mine less so”.

Does that mean I’m pro-choice simply because of my life experience? I’d like to think that I hold whatever view I hold because I’ve considered all of the arguments for and against and have come to a rational conclusion. Having said that, most of my peers (at least the ones who seem to have given it sufficient thought) are pro-choice. That makes it incredibly easy for me to hold this set of views rather than the alternative, and whenever that is the case one should examine one’s beliefs more closely.

If all of your opinions are the same as those of your friends then there are two possibilities: either you have favoured one set of views because you want to fit in (consciously or unconsciously), or else you have evaluated all of the arguments and just happened to come to the exact same conclusions. The latter might apply to some of your views, but the chance of it applying to all of them is low indeed.

On the other hand, that argument seems to imply that one should give one’s own opinions more weight than the combined opinions of a whole host of people. Is that too hubristic? I criticised people for adopting a set of views based on what other people believe but perhaps they are just modest and trust the wisdom of the crowd over anything they could come up with themselves. The idea that “the average opinion of humanity will be a better and less biased guide to the truth than my own judgement” has been referred to as “philosophical majoritarianism”.

A question, then: should my own views get special epistemic privilege just because they’re mine? Or should I defer to the average opinion of everyone else?

Damn the hubris, I’ll bite the bullet and say that yes, I trust my own views more than the average opinion of everyone else. This avoids the middle-ground fallacy, i.e. the belief that the truth always lies in the middle of two extremes. If Person A claims that child abuse is morally good and Person B claims that child abuse is morally evil, then screw the middle ground, I’m siding with Person B.

This reminds me of a story recounted by Richard Feynman, the great physicist. Feynman was asked to help the Board of Education decide which science books to include in their curriculum. There was one book in particular that he rubbished but which the board approved because the average rating given by sixty-five engineers at a particular company indicated that the book was great.

Feynman’s response to this was: “I didn’t doubt that the company had some pretty good engineers, but to take sixty-five engineers is to take a wide range of ability–and to necessarily include some pretty poor guys! It was once again the problem of averaging the length of the emperor’s nose, or the ratings on a book with nothing between the covers. It would have been far better to have the company decide who their better engineers were, and to have them look at the book. I couldn’t claim that I was smarter than sixty-five other guys–but the average of sixty-five other guys, certainly!”

Having said that, I’m sure that Feynman would not have made that claim about every other subject under the sun. In the case of evaluating a science book it does seem reasonable to trust the opinion of a Nobel Prize winner over and above that of the average engineer. But we surely wouldn’t trust Feynman’s opinion about the Israel-Palestine conflict more than the average diplomat’s, or his opinion about the works of Homer more than the average Classics professor’s.

This suggests that we should adopt the philosophical majoritarianism approach whenever we know less than the average person but not when we know more. However, we could surely improve our opinions further by taking not the average of all humanity but the average of the experts in the relevant field. As Feynman said, the company should first have selected the best of their engineers.
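To sketch the difference (this framing is mine, not Feynman’s): philosophical majoritarianism amounts to taking an unweighted average of everyone’s estimate, whereas the refined version weights each person’s estimate by their expertise. If person i gives estimate x_i and is assigned weight w_i, then

estimate = (w_1·x_1 + w_2·x_2 + … + w_n·x_n) / (w_1 + w_2 + … + w_n)

Under majoritarianism every w_i is equal; under the Feynman-style alternative the weights are concentrated on the best-informed people, and the real argument becomes how those weights should be assigned.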

I think it is fair to say that the opinions of people who are considered to be experts in their fields should be given higher probabilities of being correct, when those opinions pertain to the field that they are experts in. For example, I listen to what Richard Dawkins has to say about evolutionary biology but I care less for his thoughts on social issues.

Ideally these well-informed people should also desire to have beliefs that match reality (rather than being tied to one ideology), and they should be trustworthy.

The idea of having beliefs that match reality is best seen in the scientific community. A particularly good example comes from a tale told by the aforementioned Richard Dawkins:

“I have previously told the story of a respected elder statesman of the Zoology Department at Oxford when I was an undergraduate. For years he had passionately believed, and taught, that the Golgi Apparatus (a microscopic feature of the interior of cells) was not real… Every Monday afternoon it was the custom for the whole department to listen to a research talk by a visiting lecturer. One Monday, the visitor was an American cell biologist who presented completely convincing evidence that the Golgi Apparatus was real. At the end of the lecture, the old man strode to the front of the hall, shook the American by the hand and said – with passion – ‘My dear fellow, I wish to thank you. I have been wrong these fifteen years.’”

One finds it more difficult to imagine such a Road to Damascus conversion in other fields, although they do happen. For instance, Elizabeth Warren, the ideal Presidential candidate of many liberals, actually used to be a voting Republican. That very fact makes me rate Elizabeth Warren’s opinion more highly than many other people’s, because it proves that she has past experience of setting ideology aside and thinking for herself.

I submit then that a good heuristic for finding people who want a map that matches the territory is to look for those who have demonstrated their ability to update their opinions.

I don’t want to give the impression that people who flip-flop from one side of an issue to another are the most trustworthy people of all. It’s not as if I’d trust someone who thinks creationism and evolution both have points in their favour more than I’d trust someone who has resolutely supported evolution for decades. Perhaps that’s because evolution has so much evidence behind it that there is little need for me to factor opinions into the calculus.

The scenario could be different when considering something like raising the minimum wage. When I combine my lack of knowledge about economics with my observation that the experts in the field are divided, I think I would trust someone who says that both sides have valid points more than I would trust an economist who maintains that only their own side is reasonable.

This can probably be most succinctly codified as “Trust those who base their opinions on evidence”.

Is there a similar rule of thumb that we could use to decide who is trustworthy? An obvious solution is to trust someone who is trusted by someone we already trust, but that merely pushes the problem back a step. We could easily find ourselves stuck in an echo chamber where we only expose ourselves to views that confirm whatever views we already hold (see confirmation bias). I’ve made a conscious effort in the past to read articles by people on the opposite side of the spectrum (and to follow them on Twitter), but that isn’t quite enough, because I’m still likely to read their writing with a more critical eye than writing that confirms my preconceptions.

To quote Terry Pratchett (specifically his character Lord Vetinari): “What people think they want is news, but what they really crave is olds…Not news but olds, telling people that what they think they already know is true”.

Perhaps the best answer is to trust the people we have found using the previous heuristic, i.e. trust people with experience of updating their opinions, and trust the people they trust. I’m not totally satisfied with this, but it’s a start.

I hope to revisit this topic soon to expand upon my thoughts, but I’m going to post it now anyway as I’m trying to avoid obsessing over my projects. Finished is better than perfect.

 
