The Tyranny of Numbers: Why Counting Can’t Make Us Happy
It was too late. Pfungst’s report became a legend in experimental psychology. He argued, completely convincingly, that Hans was able to pick up the slight incline of the questioners’ heads when they had finished asking the question and expected the answer to be tapped out. When Hans had reached the right number of taps, he was able to notice the tiny relaxation, the minute straightening up or raised eyebrow with which the questioners betrayed themselves, and he stopped tapping. Hans also tapped faster when he knew it was a long answer (a practice that added to his intellectual reputation) and this too, said Pfungst, he was able to deduce from tiny changes of facial expression.

Pfungst’s own reputation was made, and modern science had been vindicated – animals could not count. Von Osten died a few months later. History does not relate what happened to Hans, but I’m not hopeful.

It was, of course, the dawn of the century of numbers. A hundred years later, we prove our humanity every time we open our newspapers with the mass of statistics on offer. Numbers are our servants, the tools of human domination. For centuries, counting was accepted as one of the key differences between human beings and animals. ‘Brutes cannot number, weigh and measure,’ said the great pioneer of quantification, the fifteenth-century cardinal Nicholas of Cusa. The arrival of a mathematical horse was a serious challenge to the numerical world view.

But 1904 was not just the year of Rolls-Royce and the entente cordiale, it was a moment of fantasy and wish-fulfilment. Peter Pan was on stage for the first time, British troops were taking the mysterious Tibetan city of Lhasa, and there was an absolute rash of ‘clever’ animals on offer, each one challenging the accepted view of numeracy as exclusively human. There was the English bulldog Kepler, owned by Sir William Huggins, which barked out its numerical answers. There was Clever Rosa, the so-called Mare of Berlin, and doyenne of the local music-hall stage. There were the clever dog of Utrecht and the reading pig of London, all forerunners of Babe in their own way. Pfungst despatched many of their reputations, but by the time Lady, the talking and fortune-telling horse of Virginia, appeared, he was too old to investigate her.

Lady managed to count and tell fortunes by flipping up letters on a special chart. Pfungst’s biographer told the story of a colleague who visited Lady to ask where his missing dog had gone. The horse spelled out the word DEAD. Actually, the dog turned up alive and well a few days later, and Pfungst – having studied Hans in such detail – gave his opinion that Lady had probably been able to sense the man’s conviction that the dog was dead.

So we can all breathe a sigh of relief – animals can’t count; numbers are safely human. But a century later, I still want to shake them all and say: ‘Hang on a minute!’ Here was a horse that was apparently able to read minds and spell correctly, never mind counting.

The accepted order of things is not absolutely safe, but we will never be able to set the clock back long enough to find out. Lady and Hans have long since gone to the knackers, and modern science is blind to strange phenomena like that. But the issue of counting and who is entitled to do so is still with us. Numbers have been in constant use for the past 6,000 years, but we have never quite resolved what they are. Are they intellectual tools for humans, invented by us for our own use? Or are they fantastical concepts, pre-existing in the universe before Adam, which we had to discover along with America and the laws of thermodynamics? Which came first: man or numbers? Are they available for any species to use or just an aspect of mankind? Are they real or human?

The consensus moves backwards and forwards through the centuries, and always with political implications. If numbers are a mysterious aspect of the universe put there by God, we tend to become subject to control and manipulation by accountant-priests. If they are a method by which humanity can control chaos, they become part of the tools of a technocratic scientific elite. The modern world is firmly in the second camp. We have rejected rule by priests in favour of rule by science. Measuring is something humans have invented for themselves, and animals – by definition – can’t hack it. They might be able to spell or pick up astonishingly subtle body language, but it is important for our world view that they can’t count.

The other view – that numbers have meaning in their own right – was represented by the Greek philosopher Pythagoras, in the sixth century BC, who was the great believer in the natural God-given beauty of numbers. For Pythagoras, numbers corresponded to a natural harmony in the universe, as bound up with the music of the spheres as they are with calculations. Music and beauty were underpinned by numbers. The story goes that Pythagoras listened to a blacksmith hammering away and heard the musical notes made by the anvil. He realized that they were generated by different lengths of hammer, and that there were perfect ratios of halves, thirds and quarters which generated perfect chords. They were the secret harmonies generated by the real numbers in nature. Another legend says that he learned about such things from the wisest people among the Egyptians and Phoenicians, and spent 12 years studying with the Magi after being taken captive and imprisoned in Babylon.
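
As an aside, the arithmetic behind the legend is easy to state. Here is a minimal sketch in Python (my own illustration, not anything from the text): pitch is inversely proportional to string length, so halving a string gives the octave, while two-thirds and three-quarters of its length give the perfect fifth and fourth – the ‘halves, thirds and quarters’ of the story.

    # A minimal sketch (my illustration, not the book's): the Pythagorean
    # consonances as simple ratios. Pitch is inversely proportional to
    # string length, so halving the string doubles the frequency.
    base = 440.0  # assumed modern reference pitch in Hz; any would do

    for name, fraction in [("octave", 1 / 2),
                           ("perfect fifth", 2 / 3),
                           ("perfect fourth", 3 / 4)]:
        frequency = base / fraction  # shorter string, higher note
        print(f"{name}: {fraction:.3g} of the string sounds at "
              f"{frequency:.1f} Hz against {base:.0f} Hz")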

Numbers existed even before the universe itself, according to Pythagoras. But even that was too mild for St Augustine of Hippo, who declared that six was such a perfect number that it would be so even if the world didn’t exist at all. ‘We cannot escape the feeling,’ said the mathematician Heinrich Hertz, ‘that these mathematical formulae have an independent existence and an intelligence of their own, that they are wiser than we are, wiser even than their discoverers, that we get more out of them than was originally put into them.’

Numbers rule the universe, said Pythagoras and his followers. Irrational numbers, which fitted no such harmony, were ‘unutterable’, and initiates were sworn to secrecy about them. According to his follower Proclus, the first people who mentioned such possibilities all died in a shipwreck. ‘The unutterable and the formless must needs be concealed,’ he said. ‘And those who uncovered and touched this image of life were instantly destroyed and shall remain forever exposed to the play of the eternal waves.’

It was irrational numbers that eventually did for Pythagoras. Once his descendants had opened up a whole new world of paradoxes – irrationality, bizarre computations, negative numbers, square roots – nothing ever seemed quite the same again. And although technocrats might breathe a sigh of relief at this evidence of modern rationality breaking through, we may also have lost something of that sense of pre-existing perfection.

II

The tyranny of numbers over life began with the simple counting of things with marks on wood. You find notched reindeer antlers from 15000 BC, well before Britain separated itself from continental Europe. These methods lasted into modern times, and were known in the English medieval treasury as ‘tally sticks’. Tally sticks were finally abandoned by the British civil service as a method of keeping track of public spending as late as 1783. After that, the old ones hung around for a generation or so, piled into the Court of Star Chamber until the room was needed. Someone then had the bright idea of burning them in the furnace that was used to heat the House of Lords. The furnace set light to the panelling and caused the conflagration of 1834 which burned down the Palace of Westminster, giving us the world-famous monstrosity we know today, complete with Big Ben and mock Gothic.

A few more of these dangerous items were found during repairs to Westminster Abbey in 1909, and they were put safely into a museum, where they could do less damage.

Notches probably came before language. Prehistoric people probably had words like ‘one’, ‘two’ and ‘three’, and something like ‘many’ for anything more complicated. In fact, sometimes ‘three’ might mean ‘many’. Take the French, for example: ‘trois’ (three) and ‘très’ (very). Or the Latin: ‘tres’ (three) and ‘trans’ (beyond). A tribe of cave dwellers discovered in the Philippines in 1972 couldn’t answer the question ‘How many people are there in your tribe?’ – yet they could list all 24 of them. Counting, then, is a philosophical problem, because you have to categorize. You have to be able to see the similarity in things and their differences, and decide which are important, before you can count them. You have to be able to do Venn diagrams in your head. ‘It must have required many ages to discover that a brace of pheasants and a couple of days were both instances of the number two,’ said the philosopher Bertrand Russell. But once you have grasped that concept, there are still many other categories you have to create before you can count how many people there are in your tribe. Do you count children? Do you count foreigners who happen to live with you? Do you count people who look completely different from everybody else? Counting means definition and control. To count something, you have to name it and define it. It is no coincidence that it was the ancient Sumerian civilization, the first real empire, which developed the idea of writing down numbers for the first time. They had to if they were going to manage an imperial culture of herds, crops and people. Yet any definition you make has to be a compromise with the truth. And the easier it is to count, the more words give way to figures, the more counting simplifies things which are not simple. Because although you can count sheep until you are blue in the face, no two sheep are actually the same.
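
To make the point concrete, here is a small sketch (entirely my own, with invented names and data) of how the answer to ‘how many?’ changes with the categories you settle on first:

    # A small sketch with invented data: the count you get depends on
    # the categories you define before counting, not on the people.
    tribe = [
        {"name": "Ama",   "adult": True,  "born_here": True},
        {"name": "Bilit", "adult": False, "born_here": True},
        {"name": "Dul",   "adult": True,  "born_here": False},
    ]

    everyone = len(tribe)
    adults_only = sum(1 for p in tribe if p["adult"])
    natives_only = sum(1 for p in tribe if p["born_here"])

    # Three defensible answers to 'How many people are in your tribe?'
    print(everyone, adults_only, natives_only)  # -> 3 2 2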

The old world did not need precision. If Christ’s resurrection was important, it wasn’t terribly vital to know the actual date. Instead Europeans used numbers for effect – King Arthur was described as killing tens of thousands in battle all by himself. Politicians are the last remaining profession to do this, claiming unwieldy figures they have achieved personally and pretending a spurious accuracy by borrowing the language of statistics, when actually they are using numbers for impact like a medieval chronicler. Nor were the old numbers much good for calculation. Nowadays Roman numerals survive only for things which powerful people want to look permanent – like television programmes or the US World Series – but which are actually very impermanent indeed.

The new world needed accuracy and simplicity for its commerce. Although they were briefly banned by an edict at Florence in 1299, the new Arabic numbers – brought back from the Middle East by the crusaders – began to be spread by the new mercantile classes. These were the literate and numerate people – their quill pens tracing the exchange of vast sums – plotting the despatch of fleets for kings and managing the processing of wool with the new counting boards.

And soon everybody was counting with the same precision. King John’s Archbishop of Canterbury, Stephen Langton, had already organized a system of chapters for the Bible, all numbered and meticulously indexed, which by the following century used the new Arabic numerals. Soon the new numbers were being used to measure much more elusive things. By 1245, Gossoin of Metz had worked out that if Adam had set off the moment he was created, walking at the rate of 25 miles a day, he would still have to walk for another 713 years to reach the stars. The great alchemist Roger Bacon, who tried to measure the exact arc of a rainbow from his laboratory above Oxford’s Folly Bridge, calculated shortly afterwards that someone walking 20 miles a day would take 14 years, seven months and just over 29 days to get to the moon.
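
Out of curiosity, the arithmetic is easy to check. The calculation below is mine, not the chroniclers’, and each figure amounts to an implied estimate of distance:

    # Checking the medieval walking arithmetic (my calculation, not the
    # chroniclers'). Each figure implies an estimate of distance.
    days_per_year = 365.25

    # Roger Bacon: 20 miles a day for 14 years, 7 months and ~29 days.
    bacon_days = 14 * days_per_year + 7 * days_per_year / 12 + 29
    print(f"Bacon's implied distance to the moon: "
          f"{20 * bacon_days:,.0f} miles")

    # Gossoin of Metz: 25 miles a day with 713 years still to walk.
    print(f"Gossoin's remaining distance to the stars: "
          f"{25 * 713 * days_per_year:,.0f} miles")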

It’s a wonderful thought, somehow akin to Peter Pan’s famous directions for flying to Never Never Land: ‘second to the right and straight on till morning’. But it was a different time, when land was measured by the area that could be ploughed in a day and time was dominated by the unavoidable changes between day and night. There were 12 hours in the medieval day, and 12 hours in the night too, but without proper tools for measuring time, these were expanded and compressed to make sure the 12 hours fitted into the light and the dark. An hour in the summer was much longer than an hour in the winter, and actually referred to the ‘hours’ when prayers should be said.
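
A rough sketch shows just how stretchy those hours were. The daylight spans below are my own approximations for a mid-northern latitude, not figures from the text:

    # Medieval 'unequal hours': however long the daylight, it was divided
    # into twelve. Daylight spans are rough assumed values for a
    # mid-northern latitude, not figures from the text.
    for season, daylight in [("midsummer", 16.5), ("equinox", 12.0),
                             ("midwinter", 7.5)]:
        minutes = daylight * 60 / 12  # one daytime 'hour' in modern minutes
        print(f"{season}: a daytime hour lasts about {minutes:.0f} minutes")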

Nobody knows who invented clocks, though legend has it that it was the mysterious Gerbert of Aurillac, another medieval monk who spent some time in Spain learning from the wisdom of the Arabs, and who, as Sylvester II, was the Pope who saw in the last millennium. He was said to be so good at maths that contemporaries believed he was in league with the Devil. It was not for 250 years that clocks arrived in the mass market, but once they had, you could not argue with their accuracy. From the 1270s, they dominated European townscapes, insisting that hours were all the same length and that trading times and working times should be strictly regulated. Counting in public is, after all, a controlling force, as the people of Amiens discovered in 1335 when the mayor regulated their working and eating time with a bell, attached to a clock.

Clocks had bells before they had faces, and were machines of neat precision, as you can see from the fourteenth-century one still working in the nave of Salisbury Cathedral, with its careful black cogs swinging backwards and forwards, the very model of the new medieval exactitude. Soon every big city was imposing heavy taxes on itself to afford the clock machinery, adding mechanical hymns, Magi bouncing in and out and – like the one in Strasbourg in 1352 – a mechanical cockerel which crowed and waggled its wings.

Where would they stop, these medieval calculators? Scholars at Merton College, Oxford in the fourteenth century thought about how you can measure not just size, taste, motion, heat, colour, but also qualities like virtue and grace. But then these were the days when even temperature had to be quantified without the use of a thermometer, which had yet to be invented. They must have been heady days, when the whole of quality – the whole of arts and perception – seemed to be collapsing neatly into science.

Renaissance humans were putting some distance between themselves and the animals, or so they believed. Anyone still dragging their feet really was holding back history. Some dyed-in-the-wool conservatives insisted that people knew pretty well when it was day and night, and when the seasons changed, without the aid of the new counting devices. But anyone who thinks that, said the Protestant reformer Philip Melanchthon, deserves to have someone ‘shit a turd’ in his hat. The new world of number-crunchers had arrived.

III

To get down to the business of measuring life, two important ideas about numbers were still needed – a concept of zero and a concept of negative numbers. But to emerge into common use, both had to run the gauntlet of the old battle lines about numbers drawn across medieval Europe. On one side were the adherents of the old ways of the abacus, whose computations were not written down, and whose ritual movements as they made their calculations were inspired by the old wisdom of Pythagoras. On the other side were the new computations, all written down. They had no mystery. There was something open and almost democratic about them, and they needed no priests to interpret them. Calculation was no longer a mysterious art carried out by skilled initiates.

And the big difference between them was zero. Its arrival in Europe was thanks to a monk, Raoul de Laon – a particularly skilful exponent of the art of the abacus – who used a character he called sipos to show an empty column. The word zero itself came from the Arabic sifr, meaning ‘empty’, the origin of the word ‘cypher’. One way or another, the old abacus could now be put away in the medieval equivalent of the loft.

Inventing zero turns numbers into an idea, according to the child psychologist Jean Piaget. It is a difficult idea too: up to the age of six and a half, a quarter of all children write 0+0+0 = 3. And once people had begun to grasp it, they tended to regard zero with suspicion. Division by zero meant infinity, and infinity meant God, yet there it was, bandied about in the least important trade calculations for fish or sheep, for everyone to see. Even more potent were the objections of the Italian bankers, who were afraid this little symbol would lead to fraud. It can, after all, multiply another figure by ten at one slip of the pen.

So zero was among the Arabic numbers banned in 1299. But the enormous increase in trade brought by the crusades and the activities of the Hanseatic League meant that something of the kind was needed. Italian merchants increasingly used zero as an underground sign for ‘free trade’. Bootleggers and smugglers embraced the idea with enthusiasm. Like the V sign across the continent under Nazi tyranny, zero became a symbol of numerical freedom, a kind of medieval counterculture.

What normally happens with countercultures is that they get adopted by everyone, and that is exactly what happened here. Soon everyone was using zero quite openly, adding and subtracting happily with pen and ink. The abacus fell so far out of use that it became a source of fascination. One of Napoleon’s generals was given one in Russia when he was a prisoner-of-war, and he was so astonished that he brought it back to Paris to show the emperor. Don’t let’s dismiss the abacus completely, though. In occupied Japan in 1946, the US army organized a competition between an automatic calculator and skilled Japanese abacus-users. The abacus turned out to be both quicker and more accurate for every computation except multiplication.

The people of Western Europe resisted negative numbers for much longer. They called them ‘absurd numbers’, believing they were futile and satanic concepts, corresponding to nothing real in the world. Now, of course, our lives are dominated by them, because the debts they represent correspond to positive numbers at the bank. Debt opened the way to negatives via the world-shattering invention of double-entry book-keeping. This may not have been the brainchild of a friend of Leonardo da Vinci, a Milanese maths teacher called Fra Luca Pacioli, but it was Pacioli’s destiny to popularize it. The writer James Buchan described his method as a ‘machine for calculating the world’. It was one of the ‘loveliest inventions of the human spirit’, according to Goethe. It could work out, at any moment, when your complex deals were profitable, allowing you to compare one deal with another.

Pacioli was a Franciscan who knew all about profit. He had special dispensation from the Pope (a friend of his) to own property. ‘The end and object of every businessman is to make a lawful and satisfactory profit so that he may sustain himself,’ he wrote. ‘Therefore he should begin with the name of God.’ Pacioli and his followers duly wrote the name of God at the beginning of every ledger. Before Pacioli, traders tended to give any fractions to the bank. After Pacioli they could record them. They could grasp at a glance where they stood while their cargoes were on the high seas, or while they waited two years or more for them to be fabricated into something else. They could make them stand still to be counted.

A Neo-Platonist, fascinated by Pythagoras and his ideas of divine proportion, Pacioli filled his book with other stuff like military tactics, architecture and theology. He chose a potent moment to publish it: the year after Columbus arrived back from discovering America. But despite his Pythagorean roots, Pacioli provided the foundations for a more complex idea of profit and loss, of assets and liabilities, making all of them clearly measurable. His critics feared he had abolished quality altogether. All that you could put down in the double entries were quantities – numbers of sheep, amounts of wool: there was no column for qualities like good or bad. The numbers had taken over, simplifying and calculating the world in their own way.

‘If you cannot be a good accountant, you will grope your way forward like a blind man and may meet great losses,’ said Pacioli, the first accountant. He explained that it was all a matter of taking a piece of paper and listing all the debit totals on one side and all the credit totals on the other. If the two sides agree and there is a profit, the result is happiness, he said, sounding like a Renaissance Mr Micawber. If not, you have to find out where the mistake is – as millions of frustrated amateur accountants have been doing ever since.
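
As a minimal sketch of the procedure Pacioli describes – with invented ledger entries, and no claim to his actual way of arranging the page – the trial balance boils down to this:

    # A minimal sketch of Pacioli's trial balance, with invented entries.
    # Every transaction is recorded twice, once as a debit and once as a
    # credit, so the two totals must agree if the books are right.
    ledger = [
        ("wool purchase", "debit",  120),
        ("cash paid out", "credit", 120),
        ("cloth sold",    "credit", 200),
        ("cash received", "debit",  200),
    ]

    debits = sum(amount for _, side, amount in ledger if side == "debit")
    credits = sum(amount for _, side, amount in ledger if side == "credit")

    if debits == credits:
        print(f"The books balance at {debits}: happiness, as Pacioli promised")
    else:
        print(f"Out by {abs(debits - credits)}: go and find the mistake")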

Within three centuries, accountants had developed into the professionals you called in after bankruptcy, a kind of undertaker for the business world, which is why the Companies Act of 1862, which regulated such matters, became known as ‘the accountant’s friend’. ‘The whole affairs in bankruptcy have been handed over to an ignorant set of men called accountants, which was one of the greatest abuses ever introduced into law,’ said Mr Justice Quain during a bankruptcy case in 1875. In 1790, the Post Office directory for London listed just one accountant. By 1840 there were 107 of them, and by 1845 – right in the middle of the railway boom – there were 210, ready to help clean up the mess of the financial collapse the following year. Maybe they were even responsible for the rash of suicides in London in 1846; maybe they helped prevent more. We shall never know. Either way, it was just the beginning for the accountants. By the turn of the century there were over 6,000 in England and Wales. Now there are 109,000, but – as far as I know – no counting horses left at all.

IV

Pacioli and his spiritual descendants have helped to create the modern world with its obsession with counting, and the strange idea that once you have counted the money, you have counted everything. There is a hard-headed myth that numbers are serious and words are not – that counting things is a rigorous business for a serious man’s world. ‘When you can measure what you are speaking of and express it in terms of numbers, you know something about it,’ said the scientist Lord Kelvin. ‘When you cannot express it in terms of numbers your knowledge of it is of a meagre kind.’

Armed with this attitude, Lord Kelvin dismissed radio as pointless, aeroplanes as impossible and X-rays as a hoax, so we might wonder if he was right. But is my knowledge really of a meagre kind? Can I express something about myself in numbers? If Lord Kelvin’s successors managed to express my entire genetic code in numbers, would they know me better than I do myself when I can do no such thing? Well, in some ways, maybe they can – but I doubt it. Any more than the Nazis could know anything about the victims in concentration camps by branding a unique number on their arms.

We are more than branded now. We live in a world obsessed with numbers, from National Insurance and interest rates to buses, from bank balances and bar codes to the cacophony of statistics forced on us by journalists, politicians and marketeers. They seem to agree with Lord Kelvin that numbers provide us with a kind of exactitude. Actually, numbers are exact about some of the least interesting things, but silent on wider and increasingly important truths.

We have to count. I have used piles of statistics in this book. Not counting is like saying that numbers are evil, which is even more pointless than saying that money is evil. We need to be able to count, even if the results aren’t very accurate. ‘Without number, we can understand nothing and know nothing,’ said the philosopher Philolaus in the fifth century BC. Twenty-five centuries later, the French philosopher Alain Badiou put the other point of view: ‘what arises from an event in perfect truth can never be counted’. Both of them are right. But the more we rely on numbers to understand problems or to measure human life, the more life slips through our fingers, and we find ourselves clinging to something less than we wanted. Because every person, every thing, every event is actually unique and unmeasurable.
