Sunday, October 7, 2018

245: How Far Apart Are Numbers?

Audio Link

When you draw a number line, say representing the numbers 1 through 10, how far apart do you space the numbers? You might have trouble even comprehending the question: if you've been educated in any modern school system in a developed country, you would probably think it's obvious that the numbers are naturally placed at evenly spaced intervals along the line. But is this method natural, or does it simply reflect what we have been taught? In fact, if you look at studies of people from primitive societies, or of American kindergarten students who haven't been taught much math yet, they do things slightly differently. When asked to draw a number line, they put a lot of space between the earlier numbers, and then less and less space for each successive one, with high numbers crowded together near the end. As you'll see in the show notes, ethnographers have found similar results when dealing with primitive Amazon tribesmen. Could this odd scaling be just as 'natural' as our evenly spaced number line?

The simplest explanation for this observation might be that less educated people, due to their unfamiliarity with the task, simply don't plan ahead. They start out using lots of space, and are forced to squish the later numbers closer together simply because they neglected to leave enough room. But if you're a fellow math geek, you have probably recognized that they could in fact be drawing a logarithmic scale, where equal distances along the line correspond to equal ratios rather than equal differences. So, for example, the space between 1 and 2 is roughly the same as the space from 2 to 4, from 4 to 8, and so on. This scale has many important scientific applications, such as its use in constructing a slide rule, the old-fashioned device for performing complex calculations in the days before electronic calculators. Is it possible that humans have some inborn tendency to think in logarithmic scales rather than linear ones, and we somehow unlearn that during our early education?
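To make that contrast concrete, here is a minimal Python sketch (my own illustration, not something from the episode or its references) that prints where the numbers 1 through 10 land on each kind of number line, with both lines normalized so that 1 sits at position 0 and 10 sits at position 1:

```python
import math

def linear_position(n, lo=1, hi=10):
    """Evenly spaced placement: equal differences get equal distances."""
    return (n - lo) / (hi - lo)

def log_position(n, lo=1, hi=10):
    """Logarithmic placement: equal ratios get equal distances."""
    return (math.log(n) - math.log(lo)) / (math.log(hi) - math.log(lo))

for n in range(1, 11):
    print(f"{n:2d}   linear: {linear_position(n):.2f}   log: {log_position(n):.2f}")
```

Running this, the logarithmic column shows the gaps shrinking just like the kindergartners' drawings: 1 and 2 land about 0.30 apart, while 9 and 10 end up only about 0.05 apart, and the distance from 1 to 2 matches the distance from 2 to 4 and from 4 to 8.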

Actually, this idea isn't as crazy as it might sound. Suppose you are a primitive hunter-gatherer in the forest gathering berries, and each of your children puts a pile of their berries in front of you. You need to decide which of your children gets an extra hunk of that giant platypus you killed today for dinner. If you want to actually count the berries in each pile, that might take a very long time, especially if your society hasn't yet invented a place-value system or names for large numbers. However, spotting that one pile is roughly twice or three times the size of another can probably be done visually. In practice, quickly estimating the ratio between two quantities is often much more efficient than trying to actually count items, especially when the numbers involved are large. So thinking in ratios, which leads to a logarithmic scale, could very well be perfectly natural, and developing this sense may have been a useful survival trait for early humans. As Dehaene et al put it in their study linked in the show notes, "In the final analysis, the logarithmic code may have been selected during evolution for its compactness: like an engineer's slide rule, a log scale provides a compact neural representation of several orders of magnitude with fixed relative precision."
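That "fixed relative precision" is easy to check numerically. Here's a quick sketch, using made-up berry counts of my own rather than anything from the study: pairs of piles with the same ratio sit exactly the same distance apart on a log scale, no matter how large the raw counts get.

```python
import math

# Pairs of pile sizes, each pile twice the other (hypothetical counts).
pairs = [(2, 4), (20, 40), (200, 400), (2000, 4000)]

for a, b in pairs:
    # On a log scale, the distance between a and b depends only on the ratio b/a.
    distance = math.log(b) - math.log(a)
    print(f"{a:5d} vs {b:5d}: ratio = {b/a:.1f}, log-scale distance = {distance:.3f}")
```

Every pair prints the same distance, ln(2) ≈ 0.693, so a brain encoding magnitudes logarithmically needs no more "resolution" to compare the big piles than the small ones.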

The same study notes that even after American children learn the so-called "correct" way to create a number line for small numbers, they continue for a few years of elementary school to draw a logarithmic view when asked about higher numbers, such as those in the thousands. But eventually their brains are reprogrammed by our educational system, and they learn that all number lines are "supposed" to be drawn in the linear, equally-spaced way. However, additional studies have shown that even in adults, if the task is made more abstract, for example using collections of dots too large to count or using sound sequences, a logarithmic type of comparison can be triggered. So it looks like we do not truly lose this inherent logarithmic sense, but are taught to override it in certain contexts.
   
I wonder whether mathematical education could be improved by taking both views into account from the beginning. It seems like we might benefit from harnessing this innate logarithmic sense, rather than hiding it until students reach much more advanced levels of math in school. On the other hand, I could easily imagine young children getting very confused by having to learn to draw two types of number lines depending on the context. As with everything in math education, there really is no simple answer.

And this has been your math mutation for today.

References: