The Slow Fix: Solve Problems, Work Smarter and Live Better in a Fast World

Language: English
Year of publication: 2018

Politics is also steeped in the quick fix. Elected officials have every incentive to favour policies that will bear fruit in time for the next election. A cabinet minister may need results before the next reshuffle. Some analysts argue that each US administration enjoys only six months – that window between the Senate’s confirming its staff and the start of electioneering for the mid-term elections – when it can look beyond the daily headlines and polling numbers to concentrate on strategic decisions over the long term. Nor does it help that we tend to favour decisive, shoot-from-the-hip leadership. We love the idea of a lone hero riding into town with a ready-made solution in his saddle bag. How many figures have ever won power by declaring ‘It will take me a long time to work out how to solve our problems’? Slowing down to reflect, analyse or consult can seem indulgent or weak, especially in moments of crisis. Or as one critic of the more cerebral Barack Obama put it: ‘We need a leader, not a reader.’ Daniel Kahneman, author of Thinking, Fast and Slow and only the second psychologist ever to win the Nobel Prize for Economics, believes our natural preference for politicians who follow their gut turns democratic politics into a carousel of quick fixes. ‘The public likes fast decisions,’ he says, ‘and that encourages leaders to go with their worst intuitions.’

Nowadays, though, it is no longer just politicians and business chiefs who believe they can wave a magic wand. We’re all at it in this age of bullshit, bluster and blarney. Look at the parade of tone-deaf wannabes vowing to be the next Michael Jackson or Lady Gaga on The X Factor. With so much pressure to stand out, we embellish our CVs, post flattering photos on Facebook and holler for attention on blogs and Twitter. A recent study found that 86 per cent of 11-year-olds use social media to build their ‘personal brand’ online. Some of this chest-thumping may win friends and influence people, but it can also drive us into the arms of the quick fix. Why? Because we end up lacking the humility to admit that we do not have all the answers, that we need time and a helping hand.

The self-help industry must take some of the blame for this. After years of reading and writing about personal development, Tom Butler-Bowdon fell out of love with his own field. Too many motivational gurus, he decided, hoodwink the public with short cuts and quick fixes that do not really work. As a riposte, he published Never Too Late to Be Great, which shows how the best solutions in every field, from the arts to business to science, usually have a long gestation period. ‘By glossing over the fact that it takes time to produce anything of quality, the self-help industry has bred a generation of people that expect to fix everything tomorrow,’ he says.

The media add fuel to that fire. When anything goes wrong – in politics, business, a celebrity relationship – journalists pounce, dissecting the crisis with glee and demanding an instant remedy. After the golfer Tiger Woods was outed as a serial philanderer, he vanished from the public eye for three months before finally breaking his silence to issue a mea culpa and announce he was in therapy for sex addiction. How did the media react to being made to wait that long? With fury and indignation. The worst sin for a public figure on the ropes is to fail to serve up an instant exit strategy.

That impatience fuels a tendency to overhype fixes that later turn out to be complete turkeys. An engineer by training, Marco Petruzzi worked as a globetrotting management consultant for 15 years before abandoning the corporate world to build better schools for the poor in the United States. We will meet him again later in the book, but for now consider his attack on our culture of hot air. ‘In the past, hard-working entrepreneurs developed amazing stuff over time, and they did it, they didn’t just talk about it, they did it,’ he says. ‘We live in a world now where talk is cheap and bold ideas can create massive wealth without ever having to deliver. There are multi-billionaires out there who never did anything but capture the investment cycle and the spin cycle at the right moment, which just reinforces a culture where people don’t want to put in the time and effort to come up with real and lasting solutions to problems. Because if they play their cards right, and don’t worry about the future, they can get instant financial returns.’

From most angles, then, the quick fix looks unassailable. Everything from the wiring of our brains to the ways of the world seems to favour band-aid solutions. Yet all is not lost. There is hope. Wherever you go in the world today, and in every walk of life, more people are turning away from the quick fix to find better ways to solve problems. Some are toiling below the radar, others are making headlines, but all share one thing in common: a hunger to forge solutions that actually work.

The good news is the world is full of Slow Fixes. You just have to take the time to find and learn from them.

CHAPTER TWO

CONFESS: The Magic of Mistakes and the Mea Culpa

Success does not consist in never making mistakes but in never making the same one a second time.

George Bernard Shaw

On a crisp night in early September, four Typhoon fighter jets roared across the sky above the freezing waters of the North Sea. Locked in a two-on-two dogfight, they swooped, banked and sliced through the darkness at up to 500 miles per hour, searching for a kill-shot. It was a training exercise, but to the pilots it all seemed very real. Strapped into his cockpit, with 24,000 pounds of killing machine throbbing at his fingertips, Wing Commander Dicky Patounas was feeling the adrenaline. It was his first night-time tactical sortie in one of the most powerful fighter jets ever built.

‘We’re in lights off because we’re doing this for real, which we don’t do very often, so it’s pitch black and I’m on goggles and instruments only,’ Patounas recalls. ‘I’m working the radar, putting it in the right mode by shortening the range, changing the elevation, all basic stuff. But the plane was new to me, so I’m maxed out.’ And then something went wrong.

A few months later Patounas relives that night back on the ground. His air base, RAF Coningsby, is in Lincolnshire, an eastern county of England whose flat, featureless terrain is prized more by aviators than by tourists. Dressed in a green flight suit festooned with zippers, Patounas looks like a Top Gun pilot from central casting – square jaw, broad shoulders, ramrod posture and cropped hair. He whips out pen and paper to illustrate what happened next on that September night, speaking in the clipped tones of the British military.

Patounas was flying behind the two ‘enemy’ Typhoons when he decided to execute a manoeuvre known as the overshoot to a Phase 3 Visual Identification (VID). He would pull out to the left and then slingshot back onto his original course, popping up right behind the trailing enemy plane. But something unforeseen happened. Instead of holding their course, the two rival jets up ahead banked left to avoid a helicopter 20 miles away. Both pilots announced the change on the radio but Patounas failed to hear it because he was too distracted executing his manoeuvre. ‘It’s all quite technical,’ he says. ‘You’ve got to do 60 degrees angle of bank through 60 degrees and then roll out for 20 seconds, then put your scanner down by 4 degrees, then change your radar to 10-mile scale, and after 20 seconds you come right using 45 degrees angle of bank, you go through 120 degrees, you roll out and pick up the guy on your radar and he should be at about 4 miles. So I’m working all this out and I miss the radio call stating the new heading.’

When Patounas rolled back out of the manoeuvre, he spotted an enemy Typhoon in front of him just as expected. He was pumped. ‘This aircraft now appears under my cross where I put it for the guy to appear, so I think I’ve done the perfect overshoot,’ he says. ‘I’ve set my radar up, pitched back in and the guy I’m looking for is under my cross in the pitch black. And I go, “I’m a genius, I’m good at this shit.” I was literally thinking I’ve never flown one so perfectly.’

He shakes his head and laughs wryly at his own hubris: it turned out the wrong Typhoon was in his crosshairs. Instead of ending up behind the trailing jet, Patounas was following in the slipstream of the frontrunner – and he had no idea. ‘It was my mistake: I basically lost awareness of two of the aircraft,’ he says. ‘I knew they were there but I didn’t ensure I could see two tracks. What I should have done was bump the range scale up and have a look for the other guy, but I didn’t because I said to myself, “This is perfect.”’

The result was that Patounas passed within 3,000 feet of the rear Typhoon. ‘It wasn’t that close but the key is I had no awareness, because I didn’t even know he was there,’ he says. ‘It could have been three feet, or I could have flown right into him.’ Patounas falls quiet for a moment, as if picturing the worst-case scenario. On that September night his wingman watched the whole fiasco unfold, knew there was no real danger of a collision and allowed the exercise to continue, but a similar mistake in real combat could have been catastrophic – and Patounas knew it.

The rule of thumb in civil aviation is that a typical air accident is the result of seven human errors. Each mistake on its own may be harmless, even trivial, but string them together and the net effect can be lethal. Flying modern fighter jets, with their fiendishly complex computer systems, is an especially risky business. While enforcing the no-fly zone over Libya in 2011, a US F-15E crashed outside Benghazi after a mechanical failure. A month earlier, two F-16s from the Royal Thai air force fell from the sky during a routine training exercise.

What was surprising about the Typhoon incident over the North Sea was not that it happened but how Patounas reacted: he told everyone about his mistake. In the macho world of the fighter pilot, mea culpas are thin on the ground. As a 22-year veteran of the RAF and commander of a squadron of 18 Typhoon pilots, Patounas had a lot to lose yet still gathered together his entire crew and owned up. ‘I could have come away from this and not said anything, but the right thing to do was to raise it, put it into my report and get it in the system,’ he says. ‘I briefed the whole squadron on how I make mistakes and the mistake I made. That way people know I’m happy to put my hand up and say I messed up too, I’m human.’

This brings us to the first ingredient of the Slow Fix: admitting when we are wrong in order to learn from the error. That means taking the blame for serious blunders as well as the small mistakes and near misses, which are often warning signs of bigger trouble ahead.

Yet highlighting errors is much harder than it sounds. Why? Because there is nothing we like less than owning up to our mistakes. As social animals, we put a high premium on status. We like to fare bella figura, as the Italians say, or look good in front of our peers – and nothing ruins a nice figura more than screwing something up.

That is why passing the buck is an art form in the workplace. My first boss once gave me a piece of advice: ‘Remember that success has many fathers but failure is an orphan.’ Just look at your own CV – how many of your mistakes from previous jobs are listed there? On The Apprentice, most boardroom showdowns involve contestants pinning their own blunders on rivals. Even when big money is at stake, companies often choose to bury their heads in the sand rather than confront errors. Nearly half of financial services firms do not step in to rescue a floundering project until it has missed its deadline or run over budget. Another 15 per cent lack a formal mechanism to deal with a project’s failure.

Nor does it help that society often punishes us for embracing the mea culpa. In a hyper-competitive world, rivals pounce on the smallest error, or the tiniest whiff of doubt, as a sign of weakness. Though Japanese business chiefs and politicians sometimes bow and beg for forgiveness, their counterparts elsewhere bend both language and credibility to avoid squarely owning up to a mistake. In English, the word ‘problem’ has been virtually excised from everyday speech in favour of anodyne euphemisms such as ‘issue’ and ‘challenge’. Hardly a surprise when studies show that executives who conceal bad news from the boss tend to climb the corporate ladder more quickly.

In his retirement, Bill Clinton makes it a rule to say ‘I was wrong’ or ‘I didn’t know that’ at least once a day. If such a moment fails to arise naturally, he goes out of his way to engineer one. He does this to short-circuit the Einstellung effect and all those other biases we encountered earlier. Clinton knows the only way to solve problems in a complex, ever-changing world is to keep an open mind – and the only way to do that is to embrace your own fallibility. But can you imagine him uttering those phrases while he was President of the United States? Not a chance. We expect our leaders to radiate the conviction and certainty that come from having all the answers. Changing direction, or your mind, is never taken as proof of the ability to learn and adapt; it is derided as flip-flopping or wimping out. If President Clinton had confessed to making mistakes, or entertaining doubts about his own policies, his political enemies and the media would have ripped him to pieces.

The threat of litigation is another incentive to shy away from a proper mea culpa. Insurance companies advise clients never to admit blame at the scene of a traffic accident, even if the crash was clearly their fault. Remember how long it took BP to issue anything resembling an official apology for the Deepwater Horizon oil spill? Nearly two months. Behind the scenes, lawyers and PR gurus pored over legal precedents to fashion a statement that would appease public opinion without opening the door to an avalanche of lawsuits. Nor is it just companies that shrink from accepting blame. Even after they leave office and no longer need to woo the electorate, politicians find it hard to own up to their errors. Neither Tony Blair nor George W. Bush has properly apologised for invading Iraq in search of weapons of mass destruction that did not exist. Remove individual ego from the equation, and collectively we still shy away from mea culpas. Britain waited nearly four decades to issue a formal apology for the Bloody Sunday massacre in Northern Ireland in 1972. Australia only apologised in 2008 for the horrors visited upon its aboriginal peoples, followed a year later by the US Senate apologising to African-Americans for the wrongs of slavery.

Even when there are no witnesses to our slip-ups, admitting we are wrong can be wrenching. ‘Nothing is more intolerable,’ Ludwig van Beethoven noted, ‘than to have to admit to yourself your own errors.’ Doing so forces you to confront your frailties and limitations, to rethink who you are and your place in the world. When you mess up, and admit it to yourself, there is nowhere to hide. ‘This is the thing about fully experiencing wrongness,’ wrote Kathryn Schulz in her book Being Wrong. ‘It strips us of all our theories, including our theories about ourselves … it leaves us feeling flayed, laid bare to the bone and the world.’ Sorry really is the hardest word.

This is a shame, because mistakes are a useful part of life. To err is human, as the saying goes. Error can help us solve problems by showing us the world from fresh angles. In Mandarin, the word ‘crisis’ is rendered with two characters, one signifying ‘danger’, the other ‘opportunity’. In other words, every screw-up holds within it the promise of something better – if only we take the time to acknowledge and learn from it. Artists have known this for centuries. ‘Mistakes are almost always of a sacred nature,’ observed Salvador Dalí. ‘Never try to correct them. On the contrary: rationalise them, understand them thoroughly. After that, it will be possible for you to sublimate them.’

That same spirit reigns in the more rigorous world of science, where even a failed experiment can yield rich insights and open new paths of inquiry. Many world-changing inventions occurred when someone chose to explore – rather than cover up – an error. In 1928, before leaving to spend August with his family, Sir Alexander Fleming accidentally left a petri dish containing staphylococcus bacteria uncovered in his basement laboratory in London. When he returned a month later he found a fungus had contaminated the sample, killing off all the surrounding bacteria. Rather than toss the dish in the bin, he analysed the patch of mould and found it contained a powerful infection-fighting agent. He named it Penicillium notatum. Two decades later, penicillin, the world’s first and still most widely used antibiotic, hit the market, revolutionising healthcare and earning Fleming a Nobel Prize in Medicine. ‘Anyone who has never made a mistake,’ said Einstein, ‘has never tried anything new.’

Military folk have always known that owning up to mistakes is an essential part of learning and solving problems. Errors cost lives in the air force, so flight safety has usually taken precedence over fare bella figura. In the RAF’s long-running monthly magazine, Air Clues, pilots and engineers write columns about mistakes made and lessons learned. Crews are also fêted for solving problems. In a recent issue, a smiling corporal from air traffic control received a Flight Safety Award for overruling a pilot and aborting a flight after noticing a wingtip touch the ground during take-off.

In the RAF, as in most air forces around the world, fighter pilots conduct no-holds-barred debriefings after every sortie to examine what went right and wrong. But that never went far enough. RAF crews tended to share their mistakes only with mates rather than with their superiors or rival squadrons. As one senior officer says: ‘A lot of valuable experience that could have made flying safer for everyone was just seeping away through the cracks.’

To address this, the RAF hired Baines Simmons, a consulting firm with a track record in civil aviation, to devise a system to catch and learn from mistakes, just as the transportation, mining, food and drug safety industries have done.

Group Captain Simon Brailsford currently oversees the new regime. After joining the RAF as an 18-year-old, he went on to fly C130 Hercules transport planes as a navigator in Bosnia, Kosovo, northern Iraq and Afghanistan. Now 46, he combines the spit-and-polish briskness of the officers’ mess with the easy charm of a man who spent three years as the Equerry to Her Majesty Queen Elizabeth II.

On the whiteboard in his office he uses a red felt-tip pen to sketch me a picture of a crashed jet, a dead pilot and a plume of smoke. ‘Aviation is a dangerous business,’ he says. ‘What we’re trying to do is stop picking up the deceased and the bits of the broken aeroplane on the ground and pull the whole story back to find out the errors and the near misses that can lead to the crash, so the crash never happens in the first place. We want to solve issues before they become problems.’

Every time crew members at RAF Coningsby catch themselves doing something that could jeopardise safety, they are now urged to submit a report online or fill in one of the special forms pinned up in work stations all over the base. Those reports are then funnelled to a central office, which decides whether to investigate further.

To make the system work, the RAF tries to create what it calls a ‘just culture’. When someone makes a mistake, the automatic response is not blame and punishment; it is to explore what went wrong in order to fix and learn from it. ‘People must feel that if they tell you something, they’re not going to get into trouble, otherwise they won’t tell you when things go wrong, and they might even try to cover them up,’ says Brailsford. ‘That doesn’t mean they won’t get told off or face administrative action or get sent for extra training, but it means they’ll be treated in a just manner befitting what happened to them, taking into account the full context. If you make a genuine mistake and put up your hand, we will say thank you. The key is making sure everyone understands that we’re after people sharing their errors rather than keeping it to themselves so that we’re saving them and their buddies from serious accidents.’

RAF Coningsby rams home that message at every turn. All around the base, in hallways, canteens and even above the urinals, posters urge crew to flag even the tiniest safety concern. Toilet cubicles are stuffed with laminated brochures explaining how to stay safe and why even the smallest mishap is worth reporting. Hammered into the ground beside the main entrance is a poster bearing a photo of the Station Flight Safety Officer pointing his finger in the classic Lord Kitchener pose. Printed above his office telephone number is the question: ‘So what did you think of today?’ The need to admit mistakes is also baked into cadets at military academy. ‘It’s definitely drilled into us from the start that “we prefer you mess up and let us know”,’ says one young engineer at RAF Coningsby. ‘Of course, you get a lot of stick and banter from your mates for making mistakes, but we all understand that owning up is the best way to solve problems now and in the future.’

The RAF ensures that crew see the fruits of their mea culpas. Safety investigators telephone all those who flag up problems within 24 hours, and later tell them how the case was concluded. They also conduct weekly workshops with engineers to explain the outcome of all investigations and why people were dealt with as they were. ‘You can see their eyebrows go up when it’s clear they won’t be punished for making a mistake and they might actually get a pat on the back,’ says one investigator.

Group Captain Stephanie Simpson, a 17-year veteran of the RAF, is in charge of safety in the engineering division at Coningsby. She has quick, watchful eyes and wears her hair scraped back in a tight bun. She tells me the new regime paid off recently when an engineer noticed that carrying out a routine test on a Typhoon had sheared off the end of a dowel in the canopy mechanism. A damaged canopy might not open, meaning a pilot trying to eject from the cockpit would be mashed against the glass.

The engineer filed a report and Simpson’s team swung into action. Within 24 hours they had figured out that an elementary mistake during the canopy test could damage the dowel. There was no requirement to go back and check afterwards. Flight crews immediately inspected the suspect part across the entire fleet of Typhoons in Europe and Saudi Arabia. The procedure was then changed to ensure that the dowel is no longer damaged during the test.

‘Ten years ago this would probably never have been reported – the engineers would have just thought, “Oh, that’s broken, we’ll just quietly replace it,” and then carried on,’ says Simpson. ‘Now we’re creating a culture where everyone is thinking, “Gosh, there could be other aircraft on this station with the same problem that might not be spotted in future so I’d better tell someone right now.” That way you stop a small problem becoming a big one.’

Thanks to Patounas’s candour, an RAF investigation discovered that a series of errors led to the near miss above the North Sea. His own failure to hear the order to bank left was the first. The second was that the other pilots changed course even though he did not acknowledge the fresh heading. Then, after Patounas overshot, the whole team failed to switch on their lights. ‘It turned out a whole set of factors were not followed and if anyone had done one of the things they should have, it wouldn’t have happened,’ says Patounas. ‘The upside is this reminds everyone of the rules for doing a Phase 3 VID at night. So next time we won’t have the same issue.’

Others in his squadron are already following his lead. Days before my visit, a young corporal pointed out that certain procedures were not being properly followed. ‘What she said was not a particularly good read, but that’s going in her report as a positive because she had the courage of her convictions to go against the grain when she could have been punished,’ says Patounas. ‘Twenty years ago, she wouldn’t have raised the question or if she had she’d have been told, “Don’t you say how rubbish my squadron is! I want my dirty laundry kept to me,” whereas I’m saying thank you.’
