Tribalism in Science (and Economics)

If you ask the average person what ‘science’ is, they’ll probably answer something like ‘it’s what we know about the world’. To the lay person, ‘science’ is a body of facts.

To the trained scientist, however, ‘science’ means something different. It’s not a body of knowledge. It’s a method for determining what’s true and what’s not. To determine the way the world works, science appeals to evidence.

The ideal of science is beautifully summarized by the motto of the Royal Society: nullius in verba. It means ‘take nobody’s word for it’. In science, there is no authority. There are no gods, no kings, and no masters. Only evidence.

In this post, I reflect on how ‘taking nobody’s word for it’ cuts against some of our deepest instincts as humans. As social animals, we have evolved to trust members of our group. Among these group members, our instinct is to ‘take their word for it’. I call this the ‘tribal instinct’.

When we do science, we have to fight against this tribal instinct. Not surprisingly, we often fail. Rational skepticism gets overpowered by the instinct to trust members of our group. If the group happens to be powerful — say it dominates academia in a particular discipline — then false ideas get entrenched as ‘facts’.

This is a problem in all areas of science. But it’s a rampant problem in economics. The teaching of economics is dominated by the neoclassical sect, which has managed to entrench itself in academia. Among this sect, I believe, tribal instincts trump the rational appeal to evidence.

Recent empirical work highlights this fact. Neoclassical economists, it seems, pay deference to fellow members of their sect. But before getting to the fascinating empirical results, we’ll take a brief foray into evolutionary biology. This will help us understand why skepticism and the rational appeal to evidence so often get trumped by the tribal instinct to believe members of our group.

Our evolved sociality

While economists like to pretend otherwise, humans are social animals. We spend the vast majority of our lives in groups, and much of our time is spent negotiating social relations. Like all other social animals, we’ve evolved to behave this way. Sociality is an instinctual behavior.

In my opinion, the best explanation for the evolution of sociality comes from multilevel selection theory. Like orthodox Darwinism, multilevel selection theory appeals to ‘survival of the fittest’. The difference, however, is that multilevel selection looks at many ‘levels’ of selection. Orthodox Darwinism, in contrast, is concerned only with selection between individuals. One individual out-reproduces another, and so passes more of its genes to future generations.

Multilevel selection keeps this selection between individuals, but adds other units of selection — both smaller and larger. At the smaller level, there is selection between cells within organisms. This explains how multi-cellular organisms evolved in the first place. At the larger level, there is selection between groups of organisms. This explains how sociality evolves.

To explain the evolution of sociality, multilevel selection proposes that there must be competition between groups. The fiercer this competition, the stronger the selection for sociality. Like Darwin’s original theory of natural selection, the premise of this group selection is simple. Here’s how David Sloan Wilson and E.O. Wilson put it:

Selfishness beats altruism within groups. Altruistic groups beat selfish groups. Everything else is commentary. (Source)

If there is strong competition between groups (especially if larger groups beat smaller groups), we expect sociality to evolve. The catch is that strong selection at the group level seems to be rare. We infer this because only a minuscule fraction of all species are eusocial (or ‘ultra-social’).

Yet when group selection does happen, it produces potent results. Ultra-social species may be rare (in terms of the number of species), but they are spectacularly successful. In The Social Conquest of Earth, evolutionary biologist E.O. Wilson observes that the few ultra-social species that do exist — bees, ants, termites … and humans — dominate the planet (in terms of biomass).
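
To see the Wilsons’ dictum in action, here’s a minimal toy simulation, a sketch in Python. The model structure and every parameter are my own illustrative assumptions, not anything drawn from the multilevel selection literature. Within each group, selfish members out-reproduce altruists; between groups, altruist-rich groups out-reproduce altruist-poor ones. Toggling the between-group competition on and off shows which force wins.

```python
import random

def simulate(n_groups=50, group_size=20, generations=200,
             b=0.5, c=0.2, group_competition=True, seed=1):
    """Toy multilevel-selection model (illustrative parameters).
    Each individual is an altruist (1) or selfish (0). Altruists
    pay a fitness cost c to create a benefit b that is shared by
    the whole group."""
    random.seed(seed)
    # Start every group half altruist, half selfish.
    groups = [[1] * (group_size // 2) + [0] * (group_size // 2)
              for _ in range(n_groups)]
    for _ in range(generations):
        # Within-group selection: selfish members free-ride on the
        # shared benefit, so they always out-reproduce altruists.
        new_groups = []
        for g in groups:
            shared = b * sum(g) / len(g)
            weights = [1 + shared - c * x for x in g]
            new_groups.append(random.choices(g, weights=weights, k=len(g)))
        groups = new_groups
        if group_competition:
            # Between-group selection: altruist-rich groups
            # out-reproduce altruist-poor ones (the bottom half of
            # groups is replaced by copies of the top half).
            groups.sort(key=sum, reverse=True)
            top = groups[: len(groups) // 2]
            groups = top + [list(g) for g in top]
    # Return the final fraction of altruists in the population.
    return sum(map(sum, groups)) / (len(groups) * group_size)

print("altruist share, with group competition:   ",
      simulate(group_competition=True))
print("altruist share, without group competition:",
      simulate(group_competition=False))
```

Under these made-up parameters, the altruist share tends to collapse when group competition is switched off and to persist when it is switched on: selfishness beats altruism within groups, but altruistic groups beat selfish groups.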

Natural selection for tribalism: ‘taking somebody’s word for it’

With group selection in mind, let’s think about how the human instinct for tribalism — ‘taking somebody’s word for it’ — might evolve.

Imagine that you’re a member of a small tribe of hunter-gatherers. A member of your tribe returns from a scouting mission and warns that a rival tribe is about to attack.

What do you do?

Do you take the scout’s word for it? Or are you skeptical until you see the evidence first-hand?

Let’s imagine how these two options might play out.

If you (and every other member of the tribe) take the scout’s word for it, then you immediately prepare for battle. If the scout was lying, the worst thing that could happen is that your tribe needlessly prepares for battle. This wastes your time, but little else. But if the scout was telling the truth, your tribe potentially avoids a devastating defeat (worst-case scenario … everyone dies).

Now imagine your tribe is filled with skeptical scientists. You (and every other member of the tribe) take nobody’s word for it. Seeking first-hand evidence that a rival tribe is actually approaching, each tribe member leaves the camp undefended. If the scout was lying, you avoid needlessly preparing for battle. But if the scout was telling the truth, your tribe potentially gets massacred by its rival. Voilà, your tribe of skeptics is eliminated from the gene pool.

If we repeat this scenario a few hundred thousand times, we can see how selection for trust of group members would occur. The groups that follow the ideals of science — ‘taking nobody’s word for it’ — slowly get wiped out. The groups that reject the ideals of science and ‘take a tribe-member’s word for it’ win out.
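
We can put hypothetical numbers on this parable. The sketch below (Python; all of the probabilities are made-up assumptions, chosen only to illustrate the asymmetric payoffs) pits trusting tribes against skeptical ones over many repeated warnings. A false alarm costs a trusting tribe nothing but time, while a true warning can wipe out a skeptical tribe.

```python
import random

def survives(trusting, rounds=1000, p_attack=0.02,
             p_die_if_unprepared=0.9, seed=None):
    """Does one tribe survive `rounds` scout warnings?
    All probabilities are hypothetical, chosen for illustration."""
    rng = random.Random(seed)
    for _ in range(rounds):
        attack = rng.random() < p_attack   # is the warning true?
        if attack and not trusting:
            # Skeptics leave camp to gather first-hand evidence,
            # so a real attack finds the camp undefended.
            if rng.random() < p_die_if_unprepared:
                return False               # tribe eliminated
        # Trusting tribes waste time on false alarms but survive.
    return True

n = 1_000
trusting = sum(survives(True, seed=i) for i in range(n))
skeptics = sum(survives(False, seed=i) for i in range(n))
print(f"trusting tribes surviving:  {trusting / n:.1%}")
print(f"skeptical tribes surviving: {skeptics / n:.1%}")
```

With these numbers, every trusting tribe survives, while a skeptical tribe faces a 1.8% chance of annihilation per round; over a thousand rounds its survival probability is about (1 - 0.018)^1000, which is effectively zero. Selection does the rest.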

The point I want to make with this parable is that in many situations, trust of group members is a more adaptive trait than rational skepticism. This is certainly the case with warfare — violent competition between groups. And as Peter Turchin convincingly argues, warfare may have been the driving force behind human sociality. Warfare rewards groups that are able to function collectively, and punishes those that are not.

Tribalism in science

In light of the evolution of sociality, it’s not surprising that humans have an instinct to ‘take the word’ of fellow group members. What is surprising is our capacity for rational skepticism. Clearly we do have the ability to ‘take nobody’s word for it’. Science depends on this ability. But it’s by no means our dominant instinct.

Doing science, I argue, is a precarious act. The scientist must foster rational skepticism and suppress the instinct to conform to the ideas of the group. At the same time, testing scientific theories often requires large-scale cooperation. Experiments in particle physics, for instance, involve the cooperation of thousands of people. These scientists must maintain skepticism while simultaneously having faith in the actions of fellow group members.

This balancing act can easily veer in the wrong direction. Thus we should not be surprised when tribalism prevails, and false ideas get ensconced as ‘facts’. And we should celebrate (because of its improbability) when the rational appeal to evidence wins the day.

Measuring tribalism in science

Here’s a fun idea: what if we scientifically studied tribalism in science? It would be quite simple to do. To study tribalism, we’d measure the degree to which scientists hold the following ideals:

Ideal of Tribalism: Take group members’ word for it.

Ideal of Science: Take nobody’s word for it.

Of course, we can’t directly ask scientists which ideal they hold. They’ll almost certainly respond that they respect the ideals of science. The problem of tribalism, I suspect, is an unconscious one. Scientists know (or at least profess to know) that they should respect evidence. But tribal instincts get in the way. For instance, a scientist might selectively interpret evidence (or even ignore it entirely) based on the ideas of his/her group. In most cases, the scientist will be unaware of what’s going on.

To test for tribalism, we need to measure unconscious bias. Here’s one way we could do it. We test if scientists’ agreement with a given statement is affected by its attribution.

If tribalism dominates, we expect scientists to agree with a statement if it is attributed to a member of their tribe. Conversely, they should disagree with the same statement if it is attributed to a non-member of their tribe.

Here’s a fun example applied to the most tribal of human activities — organized religion. Suppose we asked American Christians if they agree with the following statements.

“God is great.” — Pope Francis

“God is great.” — Osama Bin Laden

My guess is that agreement would drop precipitously when the statement is attributed to Osama Bin Laden. His ‘God’, after all, is not the Christian God.

Conversely, suppose we asked physicists if they agree with the following statements:

“The Earth is round.” — Stephen Hawking

“The Earth is round.” — Donald Trump

Because there’s overwhelming evidence that the Earth is round, we expect 100% agreement with both statements. In other words, the evidence trumps physicists’ loathing of Trump (who is clearly not a member of their physics tribe).

These are toy examples, meant to capture the extremes of tribalistic and scientific ideals. But they illustrate a point. If the ideals of science dominate a discipline, attribution shouldn’t affect scientists’ agreement with a statement. Conversely, if the ideals of tribalism dominate a discipline, attribution should strongly affect agreement with a statement. Scientists should be more likely to agree with a statement if it’s attributed to a member of their tribe.
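
Here’s what the analysis of such an attribution-swap experiment might look like in code. This is a minimal sketch (Python) of the core comparison, not the actual method used in any study: each respondent is randomly shown a mainstream or heterodox attribution, and we test whether mean agreement differs between the two conditions using a permutation test. The agreement scores below are made up for illustration.

```python
import random
import statistics

def attribution_effect(mainstream, heterodox, n_perm=10_000, seed=0):
    """Difference in mean agreement (heterodox minus mainstream
    attribution) plus a two-sided permutation-test p-value.
    Inputs are lists of agreement scores (e.g. 1-5 Likert)."""
    rng = random.Random(seed)
    observed = statistics.mean(heterodox) - statistics.mean(mainstream)
    pooled = mainstream + heterodox
    n = len(mainstream)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # break any attribution-agreement link
        diff = statistics.mean(pooled[n:]) - statistics.mean(pooled[:n])
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_perm

# Made-up agreement scores on a 1-5 scale:
mainstream_scores = [4, 5, 4, 4, 3, 5, 4, 4, 5, 3]
heterodox_scores = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]

effect, p = attribution_effect(mainstream_scores, heterodox_scores)
print(f"attribution effect: {effect:+.2f} points (p = {p:.3f})")
```

Under the ideals of science, the estimated effect should hover around zero. Under tribalism, it should be sharply negative: agreement drops when the same words are attributed to an outsider.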

Measuring tribalism in economics

This brings us to the results that you’ve been waiting for — the evidence of tribalism in economics.

Mohsen Javdani and Ha-Joon Chang recently conducted a survey of economists that mirrors the logic described above. Javdani and Chang asked economists if they agreed with a given statement. With one exception, the statements came from members of the mainstream tribe in economics (the neoclassical sect). But unbeknownst to the economists, Javdani and Chang randomly switched the attribution of the statement from its original (neoclassical) source to a heterodox (non-mainstream) source. They then tested how this change in attribution affected economists’ agreement with the statement.

Before getting to the results, let me frame my expectations. After studying economics for a decade, I’ve come to believe that the field is incredibly tribal. It’s dominated by a sect of neoclassical economists. Among these economists, deference to authority (prestigious members of the tribe) is everything. Evidence is an afterthought. In-house critic Paul Romer puts it this way:

Progress in the field [of economics] is judged by the purity of its mathematical theories, as determined by the authorities.

Given my experience in economics, I wasn’t surprised when I read Javdani and Chang’s results. They found that agreement with a given statement strongly depended on its attribution. Economists were more likely to agree with a statement if it was attributed to a mainstream source than to a heterodox source. Here are their results:

[Figure: javdani_chang]

Evidence for tribalism in economics. This chart shows the difference in economists’ agreement with a statement when attribution was changed from a mainstream to a non-mainstream source. The table below shows the economists to which each statement was attributed. The first name is the actual source. The bold name is the non-mainstream economist. Source: “Who Said or What Said?” by Mohsen Javdani and Ha-Joon Chang.

To interpret this chart, look at the sign of the plotted point. If it’s positive, agreement increased when the statement was attributed to a non-mainstream source. If it’s negative, agreement decreased. All but one of the points are negative, meaning economists agree more with a statement if it is attributed to a mainstream economist.

These results nicely demonstrate how tribalism dominates economics. Most economists are members of the neoclassical tribe. And rather than ‘take nobody’s word for it’, economists preferentially take fellow tribe-members’ word for it.

Rationalizing tribalism

Perhaps even more interesting than Javdani and Chang’s results is neoclassical economists’ reaction to them. I recently posted the above figure on Twitter, causing a bit of a firestorm. Here are some of the reactions:

Many of the heterodox economists listed are … known to not know much of value so brand name capital is either minimal or negative. Real finding of paper is there aren’t ten heterodox economists that are even well known. (Source)

The Econ Nobel is a good indicator of being within the scientific consensus, if anything it’s less of an appeal to authority than trusting a random heterodox academic on economic[s] because “they’re an economist”. (Source)

I’m … not sure that reliance on authority is necessarily fallacious if authority only granted on basis of expertise and judgment earned through years of careful study. (Source)

Rather than refute the study’s findings, these reactions demonstrate that mainstream economists know little about the ideals of science. Among members of the tribe, deference to authority is the ‘rational’ course of action. And we all know how neoclassical economists worship rationality.

A broader problem?

Lest we be too hard on economics, I think Javdani and Chang’s findings would probably replicate in other social sciences, and to a lesser extent in the natural sciences. But I suspect that the results would be less spectacular. My experience is that economics is far and away the most tribal of disciplines.

I’d like to see this type of research extended to all disciplines, and studied over time. Given our evolutionary heritage, we should expect to find tribal instincts at work in science, even when scientists profess to respect only the evidence.

The ideal of science — to take nobody’s word for it — contradicts our social instincts. It’s far easier to believe people we trust — members of our tribe. That science works at all is something we should marvel at. And when we fall short of the ideals of science, we shouldn’t obfuscate. We should admit that we’ve given in to base urges, and that we need to do better.


Support this blog

Economics from the Top Down is where I share my ideas for how to create a better economics. If you liked this post, please consider becoming a patron. You’ll help me continue my research, and continue to share it with readers like you.






This work is licensed under a Creative Commons Attribution 4.0 License. You can use/share it any way you want, provided you attribute it to me (Blair Fix) and link to Economics from the Top Down.


3 comments

  1. Fascinating. Finding a method like Javdani and Chang’s for capturing instances of blind deference in scholarship is something I have been thinking about a lot in recent years. But, unlike them, I have failed to come up with anything as elegant as their experiment.

    In any case, there is a discussion in social epistemology around the role of trust in knowledge. It touches specifically on the “balancing act” you mentioned (“These scientists must maintain skepticism while simultaneously having faith in the actions of fellow group members”).

    Here are two relevant readings:

    https://www.researchgate.net/profile/John_Hardwig/publication/259863793_The_Role_of_Trust_in_Knowledge/links/5570700708aeffcab353c729/The-Role-of-Trust-in-Knowledge.pdf

    https://onlinelibrary.wiley.com/doi/pdf/10.1111/j.1933-1592.2001.tb00093.x

    I think that in the “balancing act” you briefly mentioned there is a hint of what one might call — at the risk of self-contradiction — rational trust. Quite often, one is not in a position to fully verify the say-so of another person, particularly in matters involving expertise. However, as Goldman points out in the above reading, even a novice can sometimes extract verifiable bits from the most arcane say-so, and acquire tentative reasons for rational belief (he discusses, among other things, track record and the dialectical performance of an expert in a debate).

    On the basis of introspection, I would also add to Goldman’s heuristics the fuzzy, qualitative impression that the scholars I tend to trust have a way of anticipating and articulately addressing what you have called elsewhere “bullshit alarms” in their audiences’ minds (vis-a-vis widely held beliefs and vis-a-vis their own work). I’ve had this experience with your work and Nitzan and Bichler’s work, even when I had trouble getting through the technical bits. But I realize this can easily deteriorate into blind trust, unless I maximize the opportunities for direct verification.

    In the spirit of skepticism, let me pose the following challenge. I wonder if your evolutionary account of the tribal instinct indeed applies to blind deference among economists, who may not interpret each encounter with a heterodox argument as an instance where physical survival (or even career survival) is at stake. And if so, their goals would differ from the immediate survival goals of the hunter-gatherer tribe in your example. Don’t the goals of the group matter in assessing whether the tribal instinct is at play in any given case?

  2. Excellent points, Yigal. My understanding is that evolutionary arguments are meant to explain the existence of instincts, not how these instincts get applied in every situation (especially those that are far different than the ancestral environment).

    The default assumption for most theories of human evolution is that we evolved in small groups. Our tribal instincts are to trust members of this small group.

    Obviously modern societies no longer look like this. The ‘tribe’ is no longer a distinct physical group. Being a member of a ‘tribe’ becomes more abstract — a group of people whom we trust and agree with.

    I’m not sure that defining an instinct as tribal depends on the goals of the group. In my view, a tribal instinct is deference to those we trust, regardless of evidence.

    • “My understanding is that evolutionary arguments are meant to explain the existence of instincts, not how these instincts get applied in every situation (especially those that are far different than the ancestral environment).”

      As ignorant as I am about the relevant literature, my superficial encounters with experts on the topic suggest you’re absolutely right about the explanatory agenda of evolutionary arguments, and I should have been more careful in phrasing my comment. So yes, evolutionary arguments do not seek to explain how the instincts get applied. They only explain why instincts exist. My question — upon reflection — is whether an explanation of why an instinct exists logically suffices to explain why it is activated/triggered in any given case.

      In other words, wouldn’t we also need to explain why the instinct wasn’t suppressed or counteracted (e.g. by another biological, physical or social circumstance) in the case of neoclassical economists?

      On a side note, I’ve recently finished reading Adam Becker’s book “What is Real” about the intellectual history of quantum mechanics. Clearly, blind deference did not skip over physics. And yet, if physics had been, on the whole, as given to blind deference as neoclassical economics, I doubt it would enjoy the remarkable breakthroughs it had.

      Put otherwise, if the tribal instinct unequally manifests among different academic/research communities, it seems to call for an additional explanatory step, beyond the explanation of its mere existence. As you point out above: “My experience is that economics is far and away *the most tribal of disciplines*.” (my emphasis) — this suggests that the relatively higher level of tribal behavior in economics (as compared to other disciplines) needs to be explained, beyond the very fact of the tribal instinct. But I could be wrong.
