The Death of Truth

Such absurd details are unnerving rather than merely comical because this is not simply a Twilight Zone case of one fantasist living in a big white house in Washington, D.C. Trump’s proclivity for chaos has not been contained by those around him but has instead infected his entire administration. He asserts that “I’m the only one that matters” when it comes to policy making, and given his disdain for institutional knowledge he frequently ignores the advice of cabinet members and agencies, when he isn’t cutting them out of the loop entirely.

Ironically, the dysfunction that these habits fuel tends to ratify his supporters’ mistrust of Washington (one of the main reasons they voted for Trump in the first place), creating a kind of self-fulfilling prophecy, which, in turn, breeds further cynicism and a reluctance to participate in the political process. A growing number of voters feel there is a gross disconnect between their views and government policies. Commonsense policies like mandatory background checks for gun purchases, supported by more than nine out of ten Americans, have been stymied by Congress, which is filled with members who rely on donations from the NRA. Eighty-seven percent of Americans said in a 2018 poll that they believe Dreamers should be allowed to stay in the United States, and yet DACA has remained a political football. And 83 percent of Americans (including 75 percent of Republicans) say they support net neutrality, which was overturned by Trump’s FCC.

THE DECLINING ROLE of rational discourse—and the diminished role of common sense and fact-based policy—hardly started with Donald J. Trump. Rather, he represents the culmination of trends diagnosed in prescient books by Al Gore, Farhad Manjoo, and Susan Jacoby, published nearly a decade before he took up residence at 1600 Pennsylvania Avenue. Among the causes of this decline, Jacoby (The Age of American Unreason) cited an “addiction to infotainment,” the continuing strength of religious fundamentalism, “the popular equation of intellectualism with a liberalism supposedly at odds with traditional American values,” and an education system that “does a poor job of teaching not only basic skills but the logic underlying those skills.”

As for Gore (The Assault on Reason), he underscored the ailing condition of America as a participatory democracy (low voter turnout, an ill-informed electorate, campaigns dominated by money, and media manipulation) and “the persistent and sustained reliance on falsehoods as the basis of policy, even in the face of massive and well-understood evidence to the contrary.”

At the forefront of Gore’s thinking was the Bush administration’s disastrous decision to invade Iraq and its cynical selling of that war to the public, distorting “America’s political reality by creating a new fear of Iraq that was hugely disproportionate to the actual danger” posed by a country that did not attack the United States on 9/11 and lacked the terrifying weapons of mass destruction that administration hawks scared Americans into thinking it possessed.

Indeed, the Iraq war remains a lesson in the calamities that can result when momentous decisions that affect the entire world are not made through a rational policy-making process and the judicious weighing of information and expert analysis, but are instead fueled by ideological certainty and the cherry picking of intelligence to support preconceived idées fixes.

From the start, administration hawks led by Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld pressed for “forward-leaning” intelligence that would help make the case for war. A shadowy operation called the Office of Special Plans was even set up at the Defense Department; its mission, according to a Pentagon adviser quoted by Seymour M. Hersh in The New Yorker, was to find evidence of what Rumsfeld and Deputy Secretary of Defense Paul Wolfowitz already believed to be true—that Saddam Hussein had ties to al-Qaeda and that Iraq possessed a huge arsenal of biological, chemical, and possibly nuclear weapons.

Meanwhile, planning for the war on the ground ignored sober warnings from experts, like the army chief of staff, Eric K. Shinseki, who testified that postwar Iraq would require “something on the order of several hundred thousand soldiers.” His recommendation was quickly shot down, as were reports from the Rand Corporation and the Army War College, both of which also warned that postwar security and reconstruction in Iraq would require a large number of troops for an extended period of time. These assessments went unheeded—with fateful consequences—because they did not mesh with the administration’s willfully optimistic promises that the Iraqi people would welcome American troops as liberators and that resistance on the ground would be limited. “A cakewalk,” as one Rumsfeld ally put it.

The failure to send enough troops to secure the country and restore law and order; the sidelining of the State Department’s Future of Iraq Project (because of tensions with the Pentagon); the ad hoc decisions to dissolve the Iraqi army and to ban all senior members of the Baath Party: such disastrous and avoidable screwups resulted in a bungled American occupation that one soldier, assigned to the Coalition Provisional Authority, memorably described as “pasting feathers together, hoping for a duck.” In fact, the Iraq war would prove to be one of the young century’s most catastrophic events, exploding the geopolitics of the region and giving birth to ISIS and a still unspooling set of disasters for the people of Iraq, the region, and the world.

ALTHOUGH TRUMP frequently criticized the decision to invade Iraq during the 2016 campaign, his White House has learned nothing from the Bush administration’s handling of that unnecessary and tragic war. Instead, it has doubled down on reverse-engineered policy making and the repudiation of experts.

For instance, the State Department has been hollowed out as a result of Steve Bannon’s vow to fight for the “deconstruction of the administrative state” and the White House’s suspicion of “deep state” professionals. The president’s son-in-law, Jared Kushner, a thirty-six-year-old real-estate developer with no government experience, was handed the Middle East portfolio, while the shrinking State Department was increasingly sidelined. Many important positions stood unfilled at the end of Trump’s first year in office. This was partly because of downsizing and dereliction of duty, partly because of a reluctance to appoint diplomats who expressed reservations about the president’s policies (as in the case of the crucial role of ambassador to South Korea), and partly because of the exodus of foreign service talent from an agency that, under new management, no longer valued their diplomatic skills, policy knowledge, or experience in far-flung regions of the world. Combined with Trump’s subversion of longtime alliances and trade accords and his steady undermining of democratic ideals, the carelessness with which his administration treated foreign policy led to world confidence in U.S. leadership plummeting in 2017 to a new low of 30 percent (below China and just above Russia), according to a Gallup poll.

In some respects, the Trump White House’s disdain for expertise and experience reflected larger attitudes percolating through American society. In his 2007 book, The Cult of the Amateur, the Silicon Valley entrepreneur Andrew Keen warned that the internet not only had democratized information beyond people’s wildest imaginings but also was replacing genuine knowledge with “the wisdom of the crowd,” dangerously blurring the lines between fact and opinion, informed argument and blustering speculation.

A decade later, the scholar Tom Nichols wrote in The Death of Expertise that a willful hostility toward established knowledge had emerged on both the right and the left, with people aggressively arguing that “every opinion on any matter is as good as every other.” Ignorance was now fashionable.

“If citizens do not bother to gain basic literacy in the issues that affect their lives,” Nichols wrote, “they abdicate control over those issues whether they like it or not. And when voters lose control of these important decisions, they risk the hijacking of their democracy by ignorant demagogues, or the more quiet and gradual decay of their democratic institutions into authoritarian technocracy.”

THE TRUMP White House’s preference for loyalty and ideological lockstep over knowledge is on display throughout the administration. Unqualified judges and agency heads were appointed because of cronyism, political connections, or a determination to undercut agencies that stood in the way of Trump’s massive deregulatory plans benefiting the fossil fuel industry and wealthy corporate donors. Rick Perry, who was famous for wanting to abolish the Department of Energy, was named to head it, presiding over cutbacks to renewable energy programs; and the new EPA head, Scott Pruitt, who had repeatedly sued the Environmental Protection Agency over the years, swiftly began dismantling and slow-walking regulations designed to protect the environment.

The public—which opposed the GOP tax bill and worried that its health care would be taken away—was high-handedly ignored when its views failed to accord with Trump administration objectives or those of the Republican Congress. And when experts in a given field—like climate change, fiscal policy, or national security—raised inconvenient questions, they were sidelined, or worse. This, for instance, is what happened to the Congressional Budget Office (created decades ago as an independent, nonpartisan provider of cost estimates for legislation) when it reported that a proposed GOP health-care bill would leave millions more uninsured. Republicans began attacking the agency—not just its report, but its very existence. Trump’s director of the Office of Management and Budget, Mick Mulvaney, asked whether the CBO’s time had “come and gone,” and other Republicans proposed slashing its budget and cutting its staff of 235 by 89 employees.

For that matter, the normal machinery of policy making—and the normal process of analysis and review—was routinely circumvented by the Trump administration, which violated such norms with knee-jerk predictability. Many moves were the irrational result of a kind of reverse engineering: deciding on an outcome the White House or the Republican Congress wanted, then trying to come up with rationales or selling points afterward. This was the very opposite of the scientific method, whereby data is systematically gathered and assessed to formulate and test hypotheses—a method the administration clearly had contempt for, given its orders to CDC analysts to avoid using the terms “science-based” and “evidence-based.” And it was a reminder that in the dystopia of Orwell’s 1984 there is no word for “science,” because “the empirical method of thought, on which all the scientific achievements of the past were founded,” represents an objective reality that threatens the power of Big Brother to determine what truth is.

In addition to announcing that it was withdrawing from the Paris climate accord (after Syria signed on, the United States was left as the lone country repudiating the global agreement), the Trump administration vowed to terminate President Obama’s Clean Power Plan and reverse a ban on offshore oil and gas drilling. Scientists were dismissed from government advisory boards, and plans were made to cut funding for an array of research programs in such fields as biomedicine, environmental science, engineering, and data analysis. The EPA alone faced proposed White House cuts of $2.5 billion to its annual budget—a reduction of more than 23 percent.

IN APRIL 2017, the March for Science, organized in Washington to protest the Trump administration’s antiscience policies, grew into more than four hundred marches in more than thirty-five nations, participants marching out of solidarity with colleagues in the United States and also out of concern for the status of science and reason in their own countries. Decisions made by the U.S. government about climate change and other global problems, after all, have a domino effect around the world—affecting joint enterprises and collaborative research, as well as efforts to find international solutions to crises affecting the planet.

British scientists worry about how Brexit will affect universities and research institutions in the U.K. and the ability of British students to study in Europe. Scientists in countries from Australia to Germany to Mexico worry about the spread of attitudes devaluing science, evidence, and peer review. And doctors in Latin America and Africa worry that fake news about Zika and Ebola is spreading misinformation and fear.

Mike MacFerrin, a graduate student in glaciology working in Kangerlussuaq, a town of five hundred in Greenland, told Science magazine that the residents there had practical reasons to worry about climate change because runoff from the ice sheet had partially washed out a local bridge. “I liken the attacks on science to turning off the headlights,” he said. “We’re driving fast and people don’t want to see what’s coming up. Scientists—we’re the headlights.”

ONE OF THE most harrowing accounts of just how quickly “the rule of reason”—faith in science, humanism, progress, and liberty—can give way to “its very opposite, terror and mass emotion,” was laid out by the Austrian writer Stefan Zweig in his 1942 memoir, The World of Yesterday. Zweig witnessed two globe-shaking calamities in his life—World War I, followed by a brief respite, and then the cataclysmic rise of Hitler and descent into World War II. His memoir is an act of bearing witness to how Europe twice suicidally tore itself apart within a few decades—the story of the terrible “defeat of reason” and “the wildest triumph of brutality,” and a lesson, he hoped, for future generations.

Zweig wrote about growing up in a place and time when the miracles of science—the conquest of diseases, “the transmission of the human word in a second around the globe”—made progress seem inevitable, and even dire problems like poverty “no longer seemed insurmountable.” An optimism (which may remind some readers of the hopes that surged through the Western world after the fall of the Berlin Wall in 1989) informed his father’s generation, Zweig recalled: “They honestly believed that the divergencies and the boundaries between nations and sects would gradually melt away into a common humanity and that peace and security, the highest of treasures, would be shared by all mankind.”

When he was young, Zweig and his friends spent hours hanging out at coffeehouses, talking about art and personal concerns: “We had a passion to be the first to discover the latest, the newest, the most extravagant, the unusual.” There was a sense of security in those years for the upper and middle classes: “One’s house was insured against fire and theft, one’s field against hail and storm, one’s person against accident and sickness.”

People were slow to recognize the danger Hitler represented. “The few among writers who had taken the trouble to read Hitler’s book,” Zweig writes, “ridiculed the bombast of his stilted prose instead of occupying themselves with his program.” Newspapers reassured readers that the Nazi movement would “collapse in no time.” And many assumed that if “an anti-semitic agitator” actually did become chancellor, he “would as a matter of course throw off such vulgarities.”

Ominous signs were piling up. Groups of menacing young men near the German border “preached their gospel to the accompaniment of threats that whoever did not join promptly, would have to pay for it later.” And “the underground cracks and crevices between the classes and races, which the age of conciliation had so laboriously patched up,” were breaking open again and soon “widened into abysses and chasms.”

But the Nazis were careful, Zweig remembers, not to disclose the full extent of their aims right away. “They practiced their method carefully: only a small dose to begin with, then a brief pause. Only a single pill at a time and then a moment of waiting to observe the effect of its strength”—to see whether the public and the “world conscience would still digest this dose.”

And because they were reluctant to abandon their accustomed lives, their daily routines and habits, Zweig wrote, people did not want to believe how rapidly their freedoms were being stolen. People asked what Germany’s new leader could possibly “put through by force in a State where law was securely anchored, where the majority in parliament was against him, and where every citizen believed his liberty and equal rights secured by the solemnly affirmed constitution”—this eruption of madness, they told themselves, “could not last in the twentieth century.”

2

THE NEW CULTURE WARS

The death of objectivity “relieves me of the obligation to be right.” It “demands only that I be interesting.”

—STANLEY FISH

IN A PRESCIENT 2005 ARTICLE, David Foster Wallace wrote that the proliferation of news outlets—in print, on TV, and online—had created “a kaleidoscope of information options.” Wallace observed that one of the ironies of this strange media landscape, which had given birth to so many ideological news outlets (including those on the right, like Fox News and The Rush Limbaugh Show), was that it created “precisely the kind of relativism that cultural conservatives decry, a kind of epistemic free-for-all in which ‘the truth’ is wholly a matter of perspective and agenda.”

Those words were written more than a decade before the election of 2016, and they uncannily predict the cultural landscape of Trump’s America, where truth increasingly seems to be in the eye of the beholder, facts are fungible and socially constructed, and we often feel as if we’ve been transported to an upside-down world where assumptions and alignments in place for decades have suddenly been turned inside out.

The Republican Party, once a bastion of Cold Warriors, and Trump, who ran on a law-and-order platform, shrug off the dangers of Russia’s meddling in American elections, and GOP members of Congress talk about secret cabals within the FBI and the Department of Justice. Like some members of the 1960s counterculture, many of these new Republicans reject rationality and science. During the first round of the culture wars, many on the new left rejected Enlightenment ideals as vestiges of old patriarchal and imperialist thinking. Today, such ideals of reason and progress are assailed on the right as part of a liberal plot to undercut traditional values or as suspicious signs of egghead, eastern-corridor elitism. For that matter, paranoia about the government has increasingly migrated from the left—which blamed the military-industrial complex for Vietnam—to the right, with alt-right trolls and Republican members of Congress now blaming the so-called deep state for plotting against the president.

The Trump campaign depicted itself as an insurgent, revolutionary force, battling on behalf of its marginalized constituency and disingenuously using language that strangely echoed that of radicals in the 1960s. “We’re trying to disrupt the collusion between the wealthy donors, the large corporations, and the media executives,” Trump declared at one rally. And at another he called for replacing this “failed and corrupt political establishment.”

More ironic still is the populist right’s appropriation of postmodernist arguments and its embrace of the philosophical repudiation of objectivity—schools of thought affiliated for decades with the left and with the very elite academic circles that Trump and company scorn. Why should we care about these often arcane-sounding arguments from academia? It’s safe to say that Trump has never plowed through the works of Derrida, Baudrillard, or Lyotard (if he’s even heard of them), and postmodernists are hardly to blame for all the free-floating nihilism abroad in the land. But some dumbed-down corollaries of their thinking have seeped into popular culture and been hijacked by the president’s defenders, who want to use its relativistic arguments to excuse his lies, and by right-wingers who want to question evolution or deny the reality of climate change or promote alternative facts. Even Mike Cernovich, the notorious alt-right troll and conspiracy theorist, invoked postmodernism in a 2016 interview with The New Yorker. “Look, I read postmodernist theory in college. If everything is a narrative, then we need alternatives to the dominant narrative,” he said, adding, “I don’t seem like a guy who reads Lacan, do I?”

SINCE THE 1960s, there has been a snowballing loss of faith in institutions and official narratives. Some of this skepticism has been a necessary corrective—a rational response to the calamities of Vietnam and Iraq, to Watergate and the financial crisis of 2008, and to the cultural biases that had long infected everything from the teaching of history in elementary schools to the injustices of the justice system. But the liberating democratization of information made possible by the internet not only spurred breathtaking innovation and entrepreneurship; it also led to a cascade of misinformation and relativism, as evidenced by today’s fake news epidemic.

Central to the breakdown of official narratives in academia was the constellation of ideas falling under the broad umbrella of postmodernism, which arrived at American universities in the second half of the twentieth century via such French theorists as Foucault and Derrida (whose ideas, in turn, were indebted to the German philosophers Heidegger and Nietzsche). In literature, film, architecture, music, and painting, postmodernist concepts (exploding storytelling traditions and breaking down boundaries between genres, and between popular culture and high art) would prove emancipating and in some cases transformative, resulting in a wide range of innovative works from artists like Thomas Pynchon, David Bowie, the Coen brothers, Quentin Tarantino, David Lynch, Paul Thomas Anderson, and Frank Gehry. When postmodernist theories were applied to the social sciences and history, however, all sorts of philosophical implications, both intended and unintended, would result and eventually pinball through our culture.

There are many different strands of postmodernism and many different interpretations, but very broadly speaking, postmodernist arguments deny an objective reality existing independently of human perception, contending that knowledge is filtered through the prisms of class, race, gender, and other variables. In rejecting the possibility of an objective reality and substituting the notions of perspective and positioning for the idea of truth, postmodernism enshrined the principle of subjectivity. Language is seen as unreliable and unstable (part of the unbridgeable gap between what is said and what is meant), and even the notion of people acting as fully rational, autonomous individuals is discounted, as each of us is shaped, consciously or unconsciously, by a particular time and culture.

Out with the idea of consensus. Out with the view of history as a linear narrative. Out with big universal or transcendent meta-narratives. The Enlightenment, for instance, is dismissed by many postmodernists on the left as a hegemonic or Eurocentric reading of history, aimed at promoting colonialist or capitalistic notions of reason and progress. The Christian narrative of redemption is rejected, too, as is the Marxist road to a Communist utopia. To some postmodernists, the scholar Christopher Butler observes, even the arguments of scientists can be “seen as no more than quasi narratives which compete with all the others for acceptance. They have no unique or reliable fit to the world, no certain correspondence with reality. They are just another form of fiction.”

THE MIGRATION OF postmodern ideas from academia to the political mainstream is a reminder of how the culture wars—as the vociferous debates over race, religion, gender, and school curricula were called during the 1980s and 1990s—have mutated in unexpected ways. The terrorist attacks of 9/11 and the financial crisis of 2008, it was thought, had marginalized those debates, and there was hope, during the second term of President Barack Obama, that the culture wars in their most virulent form might be winding down. Health-care legislation, the Paris climate accord, a stabilizing economy after the crash of 2008, same-sex marriage, efforts to address the inequities of the criminal justice system—although a lot of essential reforms remained to be done, many Americans believed that the country was at least set on a progressive path.
