
The History of Numbers

The exact origins of numbers are still unknown today. But it’s safe to say that as civilization advanced, numbers advanced with it; and it is equally safe to say that civilization could not have advanced without them.

Common intuition, and recently discovered evidence, indicates that numbers and counting began with the number one (even though, in the beginning, people likely didn’t have a name for it). The first solid evidence of the number one, and of someone using it to count, appears about 20,000 years ago: a series of uniform lines cut into a bone. It’s called the Ishango Bone.


The Ishango Bone (the fibula of a baboon) was found in the Congo region of Africa in 1960. The lines cut into the bone are too uniform to be accidental. Archaeologists believe the lines were tally marks, used to keep track of something, but what that was isn’t clear.

But numbers and counting didn’t truly come into their own until the rise of cities; indeed, they weren’t really needed until then. Formal counting began around 4,000 BC in Sumeria, one of the earliest civilizations. With so many people, livestock, crops, and artisanal goods located in the same place, cities needed a way to organize and keep track of it all as it was used up, added to, or traded. Their method of counting began as a series of tokens. Each token a man held represented something tangible, say five chickens: if a man had five chickens, he was given five tokens, and when he traded or killed one of his chickens, one of his tokens was removed. This was a big step in the history of numbers and counting, because with it came subtraction, and with subtraction, arithmetic.

In the beginning, Sumerians kept a group of clay cones inside clay pouches. The pouches were then sealed up and secured, and the number of cones inside was stamped on the outside of the pouch, one stamp for each cone. Someone soon hit upon the idea that the cones weren’t needed at all: instead of keeping a pouch filled with five cones with five marks written on the outside, why not just write those five marks on a clay tablet and do away with the cones altogether? This is exactly what happened.

This development of keeping track on clay tablets had ramifications beyond arithmetic, for with it, the idea of writing was also born.

But if you’re keeping track of your wealth with marks made on a clay tablet, what’s to stop you from making your own tablet, stamping in 50 marks, and trading those 50 marks for grain?

To prevent this from happening, the Sumerians needed an official method of keeping track, and an official group of people who kept track. Only a select few were allowed to enter this group; they essentially became the world’s first accountants. A farmer might make his own clay tablet with 50 marks on it and claim it proved he owned 50 chickens, but if the tablet didn’t carry an official seal from the accountants, it was worthless.


It was the Egyptians who transformed the number one from a unit for counting things into a unit for measuring things. In Egypt, around 3,000 BC, the number one came to be used as a unit of measurement for length. If you’re going to build pyramids, temples, canals, and obelisks, you need a standard unit of measurement and an accurate method of applying it to real objects. What the Egyptians invented was the cubit, which they considered a sacred measurement: the length of a man’s forearm, from elbow to fingertips, plus the width of his palm. Because the cubit was sacred, officially ordained cubit sticks were kept in the temples, and if copies were needed, they were made from one of the originals. Thanks to this very official, very guarded, and very precise unit of measurement, the Egyptians were able to create colossal buildings and monuments with wondrous accuracy.

The Egyptians were also the first civilization to invent different symbols for different numbers. The symbol for one was just a line; the symbol for ten was a rope; the symbol for a hundred was a coil of rope. They also had symbols for a thousand and ten thousand. The Egyptians were the first to dream up the number one million, and its symbol was a prisoner begging for forgiveness: a figure on its knees, hands raised in the air, in a posture of humility.

Greece made further contributions to the world of numbers and counting, much of it under the guidance of Pythagoras. He studied in Egypt and upon returning to Greece, established a school of math, introducing Greece to mathematical concepts already prevalent in Egypt. Pythagoras was the first man to come up with the idea of odd and even numbers. To him, the odd numbers were male, the evens were female. He is most famous for his Pythagorean Theorem, but perhaps his greatest contribution to math was laying the groundwork for Greek mathematicians who would follow him.


Pythagoras was one of the world’s first theoretical mathematicians, but it was another famous Greek mathematician, Archimedes, who took theoretical mathematics to a level no one had reached before. Archimedes is considered the greatest mathematician of antiquity and one of the greatest of all time. He enjoyed doing experiments with numbers and playing games with them.

But as trivial as his math games may have seemed to outsiders, they often led to results that proved practical in the real world, some of which we still benefit from today. One example: Archimedes wondered whether the surface of a sphere could be mapped onto a cylinder, and if so, how the two areas would compare. Archimedes successfully worked the problem out, and to him that was the end of it, but thanks to the formulas he left behind, later mapmakers were able to turn the surface of the globe into a flat map.
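Archimedes’ result, stated in modern terms, is that a sphere’s surface area exactly equals the lateral (side) surface area of the cylinder that just encloses it. A minimal sketch of that equality, in Python (the language choice here is ours, not the article’s):

```python
import math

def sphere_surface(r):
    # Surface area of a sphere: 4 * pi * r^2
    return 4 * math.pi * r ** 2

def cylinder_lateral_surface(r, h):
    # Lateral (side) area of a cylinder: circumference times height
    return 2 * math.pi * r * h

# A cylinder that just encloses a sphere of radius r has height 2r.
r = 3.0
assert math.isclose(sphere_surface(r), cylinder_lateral_surface(r, 2 * r))
```

This area-preserving relationship between sphere and cylinder is what later mapmakers exploited in equal-area cylindrical projections.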

Archimedes is also famous for the Archimedes’ screw, a circular inclined plane (a screw) inside a tube that raises water from one level to a higher one. He is equally famous for inventing a method of determining the volume of an object with an irregular shape. The answer came to him while he was bathing; he was so excited he leapt from his tub and ran naked through the streets shouting “Eureka!”, which is Greek for “I have found it.”

But the Greeks’ role in mathematics ended, quite literally, with Archimedes, who was killed by a Roman soldier during the Siege of Syracuse in 212 BC. Under the rule of Rome, mathematics entered a dark age, for a couple of different reasons.

The main reason was that the Romans simply weren’t interested in mathematics (they were more concerned with world domination). The second was that Roman numerals were so unwieldy they couldn’t be used for anything more complicated than recording the results of calculations. The Romans did all their calculating on a counting board, an early version of the abacus, and because of that, Roman mathematics couldn’t, and didn’t, go far beyond adding and subtracting. Their use of numbers amounted to a simple counting system, no more advanced than the notches on the Ishango Bone. There’s a good reason there are no famous Roman mathematicians.

The next big advance (and it was a huge advance) in the world of numbers and mathematics came around 500 AD, in India. It would be the most revolutionary advance in numbers since the Sumerians invented math. The Indians invented an entirely new number: zero.

Under Hinduism, the Indians possessed concepts such as Nirvana and eternity, very abstract concepts that need some abstract math to help describe them. Take, for instance, the Rajju: the distance a deity can fly in a six-month period. Or the Palya: the length of time it would take to build a cube of lamb’s wool ten kilometers high, laying one strand of wool every century. Try expressing ideas like those with Roman numerals.

The Indians needed a way to express very large numbers, and so they created a method of counting that could deal with them. It was they who created a different symbol for every number from one to nine. These are known today as Arabic numerals, but they would more properly be called Indian numerals, since it was the Indians who invented them; they had been using them since about 500 BC.

Once zero was invented, it transformed counting and mathematics in a way that would change the world. Zero is still considered India’s greatest contribution to the world. For the first time in human history, the concept of nothing had a number.

Zero by itself wasn’t necessarily all that special; the magic happened when you paired it with other numbers. With the invention of zero, the Indians gained the ability to make numbers infinitely large or infinitely small, and that enabled Indian scientists to advance far ahead of other civilizations that lacked it, thanks to the extraordinary calculations it made possible. Indian astronomers, for example, were centuries ahead of the Christian world. With the help of the very plastic and fluid Arabic numbers, Indian scientists worked out that the Earth spins on its axis and moves around the sun, something Copernicus wouldn’t figure out for another thousand years.

The next big advance in numbers, the invention of fractions, came in 762 AD in the newly founded city of Baghdad. The people there were Muslims, and it was their adherence to the Koran and the teachings of Islam that led to the invention of fractions.

The Koran taught that the possessions of the deceased had to be divided among their descendants. Unlike Christianity at the time, Islam, which was scarcely a hundred years old, divided a deceased person’s belongings among women as well as men, though women received a lesser share. Working all of that out required fractions, but prior to 762 AD there was no system of mathematics sophisticated enough to do a proper job. Enter Arabic numbers.
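To see why this demands fractions, consider a purely illustrative split in which each son receives twice a daughter’s share (the 2:1 ratio the Koran prescribes for children). A sketch in Python, using exact fractions rather than rounded decimals; the function name and family sizes are our own invention:

```python
from fractions import Fraction

def inheritance_shares(sons, daughters):
    # Each son counts as two "share units", each daughter as one.
    units = 2 * sons + daughters
    son_share = Fraction(2, units)
    daughter_share = Fraction(1, units)
    return son_share, daughter_share

# Hypothetical estate with two sons and three daughters:
son, daughter = inheritance_shares(sons=2, daughters=3)
print(son)       # 2/7
print(daughter)  # 1/7
assert 2 * son + 3 * daughter == 1  # the shares exhaust the estate exactly
```

Numbers like 2/7 simply cannot be written, let alone computed with, in Roman numerals, which is exactly the gap Arabic numbers filled.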

It’s not known for certain how Arabic numbers came to the Islamic world, but the most prevalent theory states that one day an ambassador from India arrived in Baghdad and presented the Caliph with the greatest gift he could think of: the gift of Arabic numbers.

Using Arabic numbers, Muslim mathematicians invented entirely new methods of mathematics. Beyond simple fractions, they used Arabic numbers to develop quadratic equations and algebra, and these breakthroughs enabled science, mathematics, and astronomy to reach new levels in the Middle East.

By 1200 AD, Arabic numerals made their way to North Africa, and from there, thanks to the curious son of an Italian merchant, they would soon make their way to Europe.

Leonardo Pisano Bigollo, who would later be known as Fibonacci, had been raised using Roman numerals. He was first introduced to Arabic numbers in Algeria while traveling with his merchant father, and he became enthralled with this new method of counting and its practical, flexible powers. He introduced Arabic numbers to Europe when he returned to Italy: in 1202 he published a book of mathematics called Liber Abaci, and it was through that book that Europe met Arabic numbers.

But the Roman numeral system was deeply entrenched in Europe, and it took a while for the Arabic system to catch on. The Italian name for zero, cifra, was regarded with such suspicion that it became the word for a secret code: cipher. What finally caused the Arabic number system to catch on was good old-fashioned human greed, and a merchant class who could use it to calculate interest on their goods and properties quickly, easily, and more precisely.

Prior to the Reformation, Christians weren’t allowed to charge interest on loans, because the Catholic Church said it was a sin to do so. But once charging interest was allowed, the merchant class quickly adopted the new Arabic system, because interest could be calculated out to twelve decimal places, which, when a merchant is dealing in compound interest, worked to his advantage. An abacus, the old Roman system of counting, could only calculate interest out to two decimal places.
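The precision advantage is easy to illustrate. A sketch in Python (our choice of language), using the standard `decimal` module to carry compound interest to twelve decimal places; the principal, rate, and term are invented for illustration:

```python
from decimal import Decimal, getcontext

getcontext().prec = 28  # far more working precision than a two-place counting board

def compound(principal, rate, periods):
    # Compound interest: principal * (1 + rate) ** periods
    return principal * (Decimal(1) + rate) ** periods

# Hypothetical loan: 100 units at 5% per period, compounded over 12 periods.
total = compound(Decimal("100"), Decimal("0.05"), 12)
print(total.quantize(Decimal("1.000000000000")))  # shown to twelve decimal places
```

On an abacus the intermediate products would be truncated to two places at every step, and those small losses compound right along with the interest.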

From there, Arabic numbers spread to conquer the world.

The next big evolution in numbers came in Germany in 1679. The German mathematician Gottfried Leibniz invented a system of counting that used only ones and zeros: what would eventually be called the binary system. In binary, ones stand for something and zeros stand for nothing.
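The idea is the same positional principle behind Arabic numerals, only with two digits instead of ten: each position marks a power of two, and a one or a zero says whether that power is present. A minimal sketch in Python (again, the language is our choice):

```python
def to_binary(n):
    # Build the binary string by repeatedly taking the lowest bit (n % 2).
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits
        n //= 2
    return bits

for n in range(6):
    print(n, to_binary(n))
# 0 0
# 1 1
# 2 10
# 3 11
# 4 100
# 5 101
```

Every value a computer stores, however large, is counted out this way in ones and zeros.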

Leibniz even went so far as to design a machine that would count in binary. The digital age, it seemed, had arrived. But, and this is big, though he designed his binary machine he never built it, and the world would have to wait another 265 years before one and zero would usher in the modern world.

The machine that would usher in the digital age was named Colossus, built in England in 1944, during World War II, as a codebreaking apparatus. Colossus could perform millions of rapid calculations, and with its help the Allies cracked numerous Nazi codes. Thanks to Colossus, Allied codebreakers often knew what the Germans had said even before Hitler did. Some experts believe Colossus may have shortened the war by as much as two years.

From there the binary system was adopted and used in every computer ever built. Computer code—and with it the internet, space exploration, indeed modern life—would all be impossible without it.

