The wrong way round
We seem intuitively to understand the meaning of statistical distributions, but our interpretation can be the wrong way round...
You don’t need to be a statistician to know that there are fewer very small or very large houses than average-sized ones, or that there are fewer people with a very high or very low IQ than with an IQ around 100 (the average). The size and weight of most living creatures (including us) tend to be distributed such that the more extreme individuals in either direction are less prevalent, whether it concerns humans, frogs, shrubs, cows or blue whales. Even natural phenomena like the flow of a river might be close to the average most of the time, with highs and lows more infrequent the further they deviate from the average. In other words, we seem to be intuitively quite familiar with the so-called bell curve or normal distribution, if not in its mathematical definition, then at least in its manifestation all around us.
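The intuition that extremes are rare can be made concrete with a quick calculation. The sketch below uses only Python’s standard library and illustrative IQ figures (a mean of 100, as in the text; the standard deviation of 15 is a common convention, assumed here):

```python
from statistics import NormalDist

# Illustrative IQ distribution: mean 100, standard deviation 15 (assumed)
iq = NormalDist(mu=100, sigma=15)

# Fraction of people within one standard deviation of the mean (85-115)
typical = iq.cdf(115) - iq.cdf(85)

# Fraction more than two standard deviations out (below 70 or above 130)
extreme = iq.cdf(70) + (1 - iq.cdf(130))

print(f"IQ between 85 and 115: {typical:.1%}")       # ≈ 68.3%
print(f"IQ below 70 or above 130: {extreme:.1%}")    # ≈ 4.6%
```

Roughly two-thirds of the population sits within one standard deviation of the mean, while fewer than one in twenty sit beyond two – which is why very high or very low IQs feel like a rare sight.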
If extremes are rare, is what is rare extreme?
This gives us a powerful heuristic, a rule of thumb. Provided we have some idea of what the average of a particular quantitative characteristic of a category is, we can judge how prevalent a member of that category is, based on this characteristic. If we know, for example, that an average family car weighs around 1.4 tonnes, we can say with some confidence that a car weighing less than a tonne, or more than two, is going to be a pretty unusual sight. Of course, we need to be careful with our assumption about the average. To a Belgian, a skyscraper with 30 storeys will be well above average (the tallest building in this diminutive country is barely 150m in height); in Hong Kong, however, there are more than 500 buildings that are taller. Similarly, you would find very few women 1m 74cm (5’9”) tall in Macedonia, where the average female height is 1m 54cm (5’1”), while in Sweden, where that is the average, there would be plenty.
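This point – that how unusual a value is depends entirely on the population it is measured against – can be expressed as a distance from the mean in standard deviations (a z-score). A minimal sketch, using the heights from the text and an assumed standard deviation of 7cm (a hypothetical figure, purely for illustration):

```python
def rarity(value, mean, sd):
    """How unusual a value is: its distance from the mean, in standard deviations."""
    return abs(value - mean) / sd

# Illustrative female height distributions in cm
# (means from the text; the 7cm standard deviation is an assumption)
macedonia = (154, 7)
sweden = (174, 7)

height = 174  # cm, i.e. 5'9"
print(f"Macedonia: {rarity(height, *macedonia):.1f} sd from the mean")  # 2.9 - a rare sight
print(f"Sweden:    {rarity(height, *sweden):.1f} sd from the mean")     # 0.0 - perfectly average
```

The same 1m 74cm woman is almost three standard deviations out in one population and dead average in the other; the heuristic only works once we know which average applies.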
A closely related heuristic links more subjective characteristics to prevalence. If we evaluate an item as positive, we tend to evaluate an identical item that is scarcer as more positive. An early example of this comes from research by social psychologist Stephen Worchel and colleagues dating back to 1975, in which cookies in scarce supply were rated as more desirable than identical cookies available in abundance. This rule of thumb too has a plausible explanation. Consider two vital commodities that are essential for our survival: oxygen and water. Both are inherently extremely valuable, but while oxygen is mostly abundantly available, water is comparatively scarce. Hence, we don’t pay for the former, but we do for the latter, according to its scarcity (it will most likely be more expensive in a Sahara oasis than in your local supermarket). Even before our ancestors invented bartering and trading, they will have learned to devote more resources to fulfilling their need for scarcer commodities (like food or shelter) than for equally important but more abundant ones.
Interestingly, we also tend to perceive characteristics we consider as negative as more extremely so when they are rarer – intuitively we would typically consider rare diseases or accidents as worse. In one experiment, participants were asked to evaluate a fictitious enzyme, according to whether it was presented as beneficial or detrimental to health. Unsurprisingly, it was rated as positive in the former case, and negative in the latter, but when it was stated as being rare, the evaluations were more positive for the favourable framing, and more negative for the harmful framing. Similar results were found in experiments with personal characteristics and decisions.
Sound as this scarcity heuristic, popularized by psychologist Robert Cialdini, might be, we sometimes apply it the wrong way round. When a product quickly sells out, we tend to assume that it must be good, because everyone is buying it. If a restaurant is fully booked for the next several months, we conclude that it has to be extraordinary, because so many people want to eat there. But those are not the only possible reasons for scarcity – the supply of goods (or indeed the publicity around it) can easily be manipulated by astute marketeers and salespeople. When products and services are offered in limited editions, in a few exclusive outlets, or in restricted order sizes, the sellers exploit the perception of scarcity as a sign of superior value. If something is extremely good, it is likely to be rarer than its moderately good equivalent, but the fact that something is rare does not mean it is necessarily extremely good.
Minorities are seen as more extreme
So, assuming that low prevalence implies extremity in some pertinent characteristic or other is a cognitive bias. According to research by psychologists Yvonne Emig and Hans-Peter Erb at the Helmut Schmidt University in Hamburg, this bias extends to how we judge demographic minorities. The authors first established that the term ‘minority’ (rather than ‘majority’) is generally associated with a higher degree of extremity, both in the abstract and for specific natural (e.g., vegetarians/meat eaters; swimmers/non-swimmers; smokers/non-smokers) or fictitious groups (“Suchomi”/“Abchasen”; orange and purple stick figures).
Next, they confirmed that minorities with fewer members would be seen as more extreme than minorities with more members. They also found that participants tended to assign specific hypothetical men to a majority group if they were described in moderate terms (e.g., an accountant who likes pasta, has a pet dog and plays football in his spare time), and to a minority group if they were described in more unusual/extreme terms (e.g., an undertaker who loves no dish more than lobscouse, whose favourite animals are calamari and who goes bungee jumping in his leisure time).
In a further study, participants were asked to rate both a minority group (vegetarians) and a majority group (non-vegetarians) on scales between opposite extremes (e.g., ugly/beautiful, strict/mild, and relaxed/temperamental). Here too, the minority group (and its members) was rated as more extreme (in either direction) than the majority group (and its members).
The two final studies showed, (a) for 64 natural groups (e.g., pensioners, scientists, teenagers), that the less prevalent ones (e.g. artists or self-employed people) were stereotyped to a greater extent, and (b) that for two fictitious groups (to control for stereotypes), the minority group tended to be rated more extreme on a warmth vs coldness scale (e.g., caring, social, tolerant vs selfish, unpleasant, temperamental).
The authors conclude that there is strong evidence for the existence of what they call the Minority Extremity Bias, a tendency to judge minority groups in a population as more extreme than majority groups, and that this bias is stronger the smaller a minority is. This result supports the idea that we automatically equate low prevalence with extreme characteristics, and vice versa, and that we are not too bothered whether we interpret this link correctly or the wrong way round.
Of course, there is no reason why vegetarians, artists or politicians would be considerably more good-looking or ugly than meat eaters, non-artists or non-politicians. And given our tendency to form snap judgements, rely on first impressions and copy others who do so, our intuitive inference that someone who belongs to a minority group is more likely to exhibit any random characteristic to a more extreme extent than a member of the majority could lead us astray.
Wouldn’t it be ironic if we became the minority that does not exaggerate the traits of minorities?