Leading Ones Prevail In Nature

What Are Leading Ones?

In some posts to come I will be addressing what I call “leading ones” and their occurrence in nature; this first post serves as an introduction. The list below shows the ten largest localities (tätorter) in Sweden as of 31 December 2017, and illustrates what I mean by leading ones. The first digit in each population number is the leading digit, indicated by the blue border. Leading ones are simply numbers whose leading digit equals one.

Top 10 locality populations in Sweden

Many years ago I noticed that leading ones were in the majority in lists like the one above. It could just be a coincidence of course, but to me it seemed natural that small digits should dominate in a set of growth-related numbers. Not everyone I mentioned this to agreed, though: why would nature prefer leading ones?

My thinking went something like this: given two populations, say, 1000 and 9000, what would it take to grow them until their leading digit changes? Well, starting with 1000 it needs to grow by 100% to reach 2000. On the other hand, the growth from 9000 to 10000 is just over 11%. Needless to say, under most circumstances, doubling a population takes longer than growing it by 11%. So, if you sample such a number at a random point in time, you are more likely to find a low leading digit.
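To put rough numbers on this, here is a minimal sketch of my own (assuming, for illustration only, a constant 2% growth per step) counting how many steps each transition takes:

```python
import math

def steps(a, b, rate=0.02):
    """Number of constant-growth steps needed to grow from a to b."""
    return math.log(b / a) / math.log(1 + rate)

print(f"1000 -> 2000:  {steps(1000, 2000):.1f} steps")   # ~35 steps spent with leading digit 1
print(f"9000 -> 10000: {steps(9000, 10000):.1f} steps")  # ~5.3 steps spent with leading digit 9
```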

Benford’s Law

This idea recently came back to me, and when searching for information I found that this phenomenon has a hundred-year-old history. It is commonly known as Benford’s law, which roughly states that “in many naturally occurring collections of numbers, the leading significant digit is likely to be small”.

In contrast to a flat distribution, where each digit 1 to 9 would be equally likely at about 11%, Benford’s law states that a leading 1 should occur about 30% of the time. Higher digits then occur with progressively lower probability, ending with about a 5% chance of a leading 9. The probability distribution (using \(d\) for the leading digit) can be expressed as

\(P(d)=\log_{10}\left(1+\frac{1}{d}\right) \tag{1}\)
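Tabulating equation (1) takes only a few lines; a minimal sketch in Python:

```python
import math

# Benford probability for each leading digit d = 1..9, per equation (1).
for d in range(1, 10):
    print(f"P({d}) = {math.log10(1 + 1 / d):.3f}")
# P(1) = 0.301, P(2) = 0.176, ..., P(9) = 0.046
```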

Benford’s law applies not only to populations and other growth-related numbers, but to a wide variety of data sets, for example electricity bills, river lengths and physical constants. It works best when the numbers span multiple orders of magnitude.

This has led Benford’s law to find practical use in audits to detect fabricated numbers (e.g. financial and voting fraud). See for instance this short introductory paper: Understanding and Applying Benford’s Law.
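As a rough illustration of how such a check might look (my own sketch, not taken from the linked paper), one can compare the observed leading-digit frequencies of a data set against the Benford expectations, for instance with a chi-square statistic; `amounts` is simply a placeholder for whatever numbers are under audit:

```python
import math

def leading_digit(x):
    # First significant digit of a nonzero number, via scientific notation.
    return int(f"{abs(x):e}"[0])

def benford_chi_square(amounts):
    counts = [0] * 10
    for x in amounts:
        counts[leading_digit(x)] += 1
    n = sum(counts[1:])
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)
        chi2 += (counts[d] - expected) ** 2 / expected
    return chi2  # large values suggest the data deviates from Benford's law
```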

A Chessy Example

A story told since the invention of chess involves a king who wants to reward his adviser for good services. The adviser requests grain, and cheekily asks that the amount follow a simple rule: one grain is placed on the first square of a chess board, two grains on the second, four on the third, and so on, doubling the number of grains for each of the board’s 64 squares. The king laughs at what seems like a meager reward, only to later realize that the amount far exceeds his total storage.

The illustration below shows the growth through the first 14 squares of the board. Square number 14 holds 8192 grains, while the last square will hold close to \(10^{19}\) grains. To get a rough sense of this number, assume 10 grains fit in one cubic centimeter. The grains on the last square would then be enough to cover the whole of Sweden in a 2 meter thick layer.
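For the curious, the estimate is easy to check; the snippet below assumes a total area for Sweden of roughly 450,000 square kilometers (a rounded figure of my own, not from the illustration):

```python
grains = 2 ** 63                    # grains on the 64th square, roughly 9.2e18
volume_m3 = grains / 10 / 1e6       # 10 grains per cm^3, 1e6 cm^3 per m^3
sweden_area_m2 = 450_000 * 1e6      # ~450,000 km^2, a rounded assumption
print(volume_m3 / sweden_area_m2)   # ~2.0 (meters of grain over all of Sweden)
```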

Collecting the leading digits in the resulting set of numbers and counting how often each occurs, we can plot the result against Benford’s distribution. The graph below shows the clear resemblance between the occurrences (blue bars) and Benford’s distribution (red line). The flat distribution is included (green line) for comparison. Note that the low number of data points (64) results in a noisy distribution; if we extended the iteration to hundreds of numbers (a much larger chessboard), the occurrences would get very close to the Benford distribution.

chess digit distribution
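The digit counts behind the graph can be reproduced with a few lines of Python (a minimal sketch; the plotting itself is left out):

```python
import math
from collections import Counter

grains = [2 ** n for n in range(64)]              # squares 1..64: 1, 2, 4, ..., 2^63
counts = Counter(int(str(g)[0]) for g in grains)  # leading digit of each grain count

for d in range(1, 10):
    observed = counts[d] / len(grains)
    benford = math.log10(1 + 1 / d)
    print(f"{d}: observed {observed:.3f}, Benford {benford:.3f}")
```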

Scale Invariance

To widen our understanding, consider a different set of numbers. It starts at 10, and subsequent numbers follow by repeated multiplication by a factor of 1.02; in other words, there is 2% growth from one number to the next. We stop the process 348 iterations later, just shy of 10000.

Numbers then sort into “bins” according to both their leading digit and order of magnitude. This means that not only do, say, 10 and 20 sort into different bins, but 10 and 100 also sort into different bins. To represent the bins I use [x, y), where “[x” is the lower limit including x, and “y)” is the upper limit not including y. For example [10, 20) is a bin containing numbers from 10 to 19.999… but does not include 20.

Next, we distribute all 349 numbers into their respective bins and plot them as a histogram together with the expected Benford distribution. In this case there is a very close resemblance, and we also clearly see the dominance of lower leading digits.

constant growth distribution
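A sketch of how the numbers can be generated and sorted into bins (the histogram plotting is omitted):

```python
import math
from collections import Counter

numbers = [10 * 1.02 ** k for k in range(349)]   # 10, 10.2, ..., just shy of 10000

def bin_of(x):
    # Bin by order of magnitude and leading digit, e.g. 1234.5 -> [1000, 2000).
    magnitude = 10 ** math.floor(math.log10(x))
    digit = int(x / magnitude)
    return digit * magnitude, (digit + 1) * magnitude

counts = Counter(bin_of(x) for x in numbers)
for (low, high), n in sorted(counts.items()):
    print(f"[{low}, {high}): {n}")
```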

The underlying structure is independent of scale. Dividing the numbers in the above set by \(\pi\) results in new numbers, but their distribution still follows that of Benford.

Constant growth distribution rescaled
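The rescaling is equally easy to test numerically; a minimal sketch, reusing the same 349 numbers:

```python
import math
from collections import Counter

numbers = [10 * 1.02 ** k for k in range(349)]
rescaled = [x / math.pi for x in numbers]

def leading_digit(x):
    return int(f"{x:e}"[0])   # first significant digit via scientific notation

for label, data in (("original", numbers), ("divided by pi", rescaled)):
    counts = Counter(leading_digit(x) for x in data)
    freqs = [round(counts[d] / len(data), 3) for d in range(1, 10)]
    print(label, freqs)
```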

Further Reading

This post has introduced leading ones and Benford’s law, and given some examples of why the law manifests in growth-related numbers. I plan to follow up with a couple of posts on actual population and administrative area number distributions, and more on the mathematics behind them.

Update: I have made two posts on the topic: leading ones in population distribution numbers and leading ones in Swedish locality areas.
