Below average
The other day, this meme cropped up on
Facebook.
I surmise that the intent is to
ridicule the innumeracy of those who would consider such a revelation
shocking when in reality it is a perfectly banal observation. What
could be more obvious than that it's the nature of an 'average'
(presumably the median – see here if you were absent the day they
taught averages)
that half will be above it and half below?
Or is it?
First of all, the assertion assumes
that we know what 'intelligence' means. In reality, the only rigorous
definition I've encountered is in terms of performance on
intelligence tests. It may be circular, but that's what I'll assume the meme is talking
about.
In a population with an odd number of
observations, the median is the one exactly in the middle of the
distribution when ranked from highest to lowest. So in a population
of five with intelligence test scores ('IQ') of 120, 110, 105, 90,
85, the median is 105, and clearly 50% are not 'below average' –
40% are. In such populations, the proportion 'below average' will
always be less than 50%, even if, with over 316 million observations,
the difference is vanishingly slight.
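For anyone who would rather check the arithmetic than take my word for it, here is a minimal sketch in Python of the odd case, using the five hypothetical scores above:

```python
from statistics import median

scores = [120, 110, 105, 90, 85]     # the hypothetical population of five
m = median(scores)                   # odd count: the middle value, 105
below = sum(s < m for s in scores)   # scores strictly below the median
print(m, below / len(scores))        # 105 0.4 -> 40% 'below average'
```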
When the number of observations is
even, however, the median is the arithmetic mean of the two
observations in the middle. So in a population of four, with IQs of
110, 100, 90, 85, the two central observations – 100 and 90 –
average to 95, and in this case exactly 50% really are below average.
But what if their scores were 110, 100, 100, 90? The two observations
to average are both 100, so the median is 100 and only 25% are 'below
average'. Since IQ scores are deliberately calibrated to conform to a
normal distribution, it is, if I'm not mistaken, inevitable that
scores will cluster exactly in the middle of the range, so there will
never be exactly 50% below average: some proportion, probably a
plurality, will always be 'average'.
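The even cases, tie and all, can be checked the same way:

```python
from statistics import median

# Distinct central values: the median is their mean, and exactly
# half the population falls below it.
scores = [110, 100, 90, 85]
m = median(scores)                                  # (100 + 90) / 2 = 95
print(m, sum(s < m for s in scores) / len(scores))  # 95 0.5

# Tied central values: the median equals the tied score, and only
# the scores strictly below it count.
scores = [110, 100, 100, 90]
m = median(scores)                                  # (100 + 100) / 2 = 100
print(m, sum(s < m for s in scores) / len(scores))  # 100 0.25
```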
Even if it were somehow possible that not one of the 316,044,000
Americans had an IQ of exactly 100, it transpires that intelligence
test scores are grouped into ranges, and the range 90-109 (sometimes
85-115) is, not coincidentally, denominated 'Average'.
About 50% of the population fall into the 90-109 range and 68% into
the 85-115 range, so only 25% in the first case and 16% in the second
would be 'below average'.
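Those figures are easy to verify against the normal curve itself. The sketch below assumes the conventional calibration of mean 100 and standard deviation 15, and treats the integer band 90-109 as the continuous interval from 90 up to 110:

```python
from scipy.stats import norm

iq = norm(loc=100, scale=15)      # IQ calibrated to mean 100, SD 15

print(iq.cdf(110) - iq.cdf(90))   # ~0.50 fall in the 90-109 band
print(iq.cdf(115) - iq.cdf(85))   # ~0.68 fall in the 85-115 band
print(iq.cdf(90))                 # ~0.25 'below average' on the first reading
print(iq.cdf(85))                 # ~0.16 on the second
```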
If it were true that a study showed 50% of Americans
'have below-average intelligence', then that really would be a shocker.
Sorry, but I have to disagree with Ernie on this one.
1. Firstly, it is not necessary for the discussion for everyone to agree on a definition of intelligence. Rather, it is merely necessary to agree that intelligence is, in principle, definable and also, in principle, measurable in a way that would allow an average to be calculated.
2. Secondly, if we take a population of 316,044,000 and add one, to get an odd number, then the number who are below the median would be 158,022,000, which is 49.9999998% of the population. Most people would round that to 50%.
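That figure is easy to check:

```python
# Add one observation to get an odd count; the median is then the
# 158,022,001st observation, leaving 158,022,000 below it.
n = 316_044_000 + 1
below = (n - 1) // 2
print(below, 100 * below / n)   # 158022000 49.99999984...
```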
Thanks for your comment, Abim.
My view is that it is not really possible to define intelligence in such a way as to be measurable in principle unless the definition is an artifact of the measurement itself. But I'm content to define it that way for this purpose.
If the median IQ of a population of 316,044,000 has a value of 100 and 31,604,400 (10%) have an IQ of exactly 100, then only 45% of the total population will actually have IQs below that value, as I understand the term 'below'.
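A sketch of that scenario, assuming the 10% scoring exactly 100 leaves the remainder split evenly above and below the median:

```python
n = 316_044_000
at_median = 31_604_400          # 10% score exactly 100
below = (n - at_median) // 2    # the rest split evenly above and below
print(below / n)                # 0.45 -> 45% strictly below the median
```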
In the more plausible scenario, construing 'average intelligence' as a range of IQ scores, the proportion 'below average' is much smaller than 50%.
Well, as a representative of the lower half, I can tell you it's not so bad, as long as you don't try and think too much, cause that hurts. I get dizzy and everything sort of quivers, and I half to sit down for a while.