Saturday, February 29, 2020





William Barr: The Carl Schmitt of Our Time

Attorney General William Barr delivering remarks to a conference at the Securities and Exchange Commission, Washington, D.C., October 3, 2019
Chip Somodevilla/Getty Images

US Attorney General William Barr’s defense of unchecked executive authority in his recent speech to the Federalist Society had an unpleasant familiarity for me. It took me back to a time in my life—during the late 1990s, as a graduate student in England, and the early 2000s, teaching political theory in the politics department at Princeton University—when I seemed to spend altogether too much time arguing over the ideas of a Nazi legal theorist notorious as the “crown jurist” of the Third Reich.
Carl Schmitt’s work had then become popular in universities, and particularly in law schools, on both sides of the Atlantic. The frequent references to his “brilliance” made it evident that in the eyes of his admirers he was a bracing change from the dull liberal consensus that had taken hold in the wake of the cold war. Schmitt’s ideas thrived in an air of electrifyingly willed dangerousness. Their revival wasn’t intended to turn people into Nazis but to rattle the shutters of the liberal establishment.
Schmitt was supposed to be a realist. For him, laws and constitutions didn’t arise from moral principles. At their basis, there was always a sovereign authority, a decision-maker. Schmitt stipulated that the essential decision was not a moral choice between good and evil but the primally political distinction between friend and enemy. And that distinction, in order to be political in the most important sense, had to generate such intense commitment that people would be prepared to die for it. He set out this view in a brief work, The Concept of the Political, in 1932. Only a few years later, he and many of his fellow Germans showed that they were prepared to kill for it. And kill and kill and kill.
Nazi jurist Carl Schmitt speaking in Germany, 1930
Despite the dismal role his ideas had played in recent history, I found that there were always one or two students for whom the simplicity of Schmitt’s dualistic choice was appealing. All the complex reflections and reasoning that enter into responsible political judgment could be dismissed in favor of a single binary opposition. We all want our justifications to end somewhere, to bring the process of giving reasons for our beliefs, values, and actions to a close, and those prepared to let an authority tell them exactly where that end should be are relieved of the burden. I liked to think that most of the students who embraced Schmitt’s work were attracted to it merely as a theoretical position and were as lacking in real enmity as the anemic liberals whom Schmitt despised.
But in the same period that I encountered these tousle-haired Schmittians in flip-flops, there was in the United States a tremendous craving for action to avenge the terror attacks of September 11, 2001, and prevent further loss of American lives. Impatience at endless diplomatic wrangling, including the United Nations’ failure to take seriously the Bush administration’s case for invading Iraq, made simple decisionism and the idea of the unchecked executive ferociously appealing in US political life. The concept of the “enemy combatant” became sufficiently vague for the executive branch to apply it more or less at will to whatever person it pleased. And for many of those advocating pre-emptive war, or rendition and torture, on the grounds that terrorists might attack us with weapons of mass destruction, the sense of “it’s them or us” developed a new and ugly intensity.
The problem of containing the terrorist threat was not construed as one of detection and policing. It was a war. And for prominent thinkers and politicians on the right, it was a civilizational battle for “Judeo-Christian values” against “Islamo-fascism,” a friend-enemy distinction that remains deeply rooted in parts of American society today. This ill-defined enemy is the specter constantly evoked by the people Trump has chosen as his advisers and officials. Their mythic world-historical struggle has become detached from the actuality of counter-terrorism operations. General Michael Flynn, a veteran of the “global war on terror,” wrote in his 2016 book The Field of Fight:
We’re in a world war against a messianic mass movement of evil people, most of them inspired by a totalitarian ideology: Radical Islam. But we are not permitted to speak or write those two words, which is potentially fatal to our culture.
The idea that “evil people” might destroy “our culture” is alarmingly bellicose rhetoric, but it is evidently persuasive to certain people. Steve Bannon uses it, too. In a speech delivered via Skype to a Vatican conference in 2014, he stated that “we are in an outright war against jihadist Islamic fascism,” and that these are just the beginning stages “of a very brutal and bloody conflict.” He called upon the “church militant” to fight this “new barbarity.” In the White House, Bannon’s main initiative was the racist and deeply harmful travel ban on people from Muslim countries. Trump’s Executive Order 13769 restricted entry to the United States for citizens of Iraq, Iran, Libya, Somalia, Sudan, Syria, and Yemen on the grounds—disputed by most counter-terrorism experts—that it was an essential counter-terrorism measure.
So the views expressed in Barr’s Federalist Society address were not new ones. Not only has this Schmittian approach to constitutional doctrine come to be accepted as one respectable view among others in law schools, it has increasingly taken hold as a feature of American politics as well. Barr described the travel ban as “just the first of many immigration measures based on good and sufficient security grounds that the courts have second guessed since the beginning of the Trump Administration.” On this view, the president, when acting, or claiming to act, in the interests of national security, should be allowed unlimited scope for decision-making. Barr told his audience that the Framers’ view of executive power—
entailed the power to handle essential sovereign functions—such as the conduct of foreign relations and the prosecution of war—which by their very nature cannot be directed by a pre-existing legal regime but rather demand speed, secrecy, unity of purpose, and prudent judgment to meet contingent circumstances. They agreed that—due to the very nature of the activities involved, and the kind of decision-making they require—the Constitution generally vested authority over these spheres in the Executive.
Constitutional experts have denounced this view of the unchecked executive not only as a misinterpretation of Article II of the Constitution, but as a dangerous attempt to place the president above the law.
Barr’s view is the final reductio ad Schmitt of our political era. As US attorney general, at the head of the Justice Department, he is charged with upholding the rule of law, yet what he admires is lawlessness in moments of crisis. He claims in his Federalist Society speech that America’s greatness has been achieved through its most savage conflicts, from the Civil War, through World War II “and the struggle against fascism,” to, most recently, “the fight against Islamist Fascism and international terrorism.” Barr even folds into this narrative the “struggle against racial discrimination,” while using his office to defend a blatantly racist president. These “critical junctures” when the country has been most challenged, he claims, are the moments that have brought the Republic “a dynamism and effectiveness that other democracies have lacked.” Such moments of decision gained their significance, in his telling, precisely because they were the occasions on which the presidency “has best fulfilled the vision of the founders.” By this, Barr means the urgent, secret decisions that forge a strong and vital sovereign authority.
This is not just a theoretical position for Barr. He previously supported President George H.W. Bush’s pardoning of Caspar Weinberger, Reagan’s secretary of defense, who was charged with obstruction of federal investigations and lying to Congress about the Iran-contra affair. That reckless criminal enterprise presumably represented the kind of “dynamism and effectiveness” that the liberal rule of law tries to suppress. Barr admires action.
He also fears that the commitments motivating sovereign decisions and actions will become etiolated if removed from the sustaining light of Christian faith. Religion is the only basis, he has claimed, for the kind of commitment that is conducive to greatness. In a recent address at Notre Dame, Barr poured contempt on the “high-tech popular culture” that distracts us from our fundamental commitments. Willfully disregarding the First Amendment’s establishment clause, he told his audience that “as Catholics, we are committed to the Judeo-Christian values that have made this country great.”
Barr claims to have extremely high-minded motivations. Unlike the frivolous and profane culture that surrounds him, he has understood how to lead a dignified human life:
Part of the human condition is that there are big questions that should stare us in the face. Are we created or are we purely material accidents? Does our life have any meaning or purpose? But, as Blaise Pascal observed, instead of grappling with these questions, humans can be easily distracted from thinking about the “final things.”
The “militant secularism” rampant in America, he insists, endangers this spiritual vocation. Will Barr be remembered, I wonder, for his determination to confront important and intractable theological and philosophical problems? Surely not.
He will be remembered, rather, for defending an impeached president who tried to bribe a foreign power to dig up dirt on his domestic political rivals. And for mischaracterizing Robert Mueller’s report into Russian interference in the 2016 election, then lending his support to the bizarre view, promoted by Donald Trump and Vladimir Putin, that Ukraine was responsible for the electoral meddling. Barr may also be remembered for eccentrically jetting around Italy, Britain, and Australia, in an effort to gin up support for a baseless conspiracy theory about the “deep state” involving George Papadopoulos, a junior adviser to the Trump campaign and convicted felon. He will be remembered, too, for accusing the FBI of “spying” on the Trump campaign and for dismissing the Inspector General’s report that vindicated the FBI on the substantive issues on the same day that the president called the agency’s officers “scum.”
Like Barr, Carl Schmitt took a moralizing view of secularism and liberalism, regarding both as inadequate to the true seriousness of life. But commentators have been puzzled by what this “seriousness” consists of and whence the demand for such deep enmity arises. The political philosopher Leo Strauss criticized Schmitt for implicitly resting “the political” on moral foundations, even as Schmitt claimed the realm of politics was entirely distinct from morality. One prominent German scholar, Heinrich Meier, made an influential argument that this moral basis was supplied by Schmitt’s Catholic faith. The “ultimate values” that demand the “ultimate sacrifice” must derive from the hidden theistic basis of his thought. Is this what Schmitt has in common with Barr?
A more obvious explanation for Schmitt’s demand for enmity, for his condemnations of a liberal culture that weakens that hostile will and of the rule of law that obstructs it, is racism. He was, after all, not just a Catholic but a Nazi. His actions and unpublished writings betray a virulent anti-Semitism. We do not need to invoke an esoteric theism to account for what can be sufficiently explained by bigotry. And many would argue that in our own time the mythic crusade against the civilizational threat of “Islamo-fascism,” and the manufactured fear of the supposed barbarian hordes who must be prevented from flooding in from the south by an impregnable border wall, are barely disguised manifestations of a similar racism. It has nothing at all to do with the meaning of human existence, or “final things,” or anybody’s God.
The reality of the “war on terror,” though, is becoming further and further detached from the mythic crusade against “Islamo-fascism.” The greatest terrorist threat America now faces comes from domestic terrorists. And national security policy has pivoted away from the Islamic world toward more traditional geopolitical rivals such as Russia and China. We’re not engaged in a binary civilizational battle. As the myth becomes unsustainable, Barr and his allies will have to find a new enemy to justify their view of the president as the indispensable, unitary sovereign at the head of an executive branch that is perpetually on a war footing. They will be forced to ramp up the rhetoric of apocalyptic conflict—as Barr is indeed doing.
This doesn’t mean, of course, that they will need a foreign war (though, as the escalating confrontation with Iran shows, this may remain an option); all authoritarian regimes create their own internal enemies. No one should doubt Barr’s capacity and will to do this. In his original tenure as attorney general, from 1991 to 1993, Barr disbanded the counterintelligence team of the FBI that had been focused on the Soviet Union during the cold war but whose expertise and Russian language skills seemed less relevant in 1992. He instead turned the FBI’s attention to gang-related crime and violence and became a leading advocate of the mass incarceration policies that have since so disproportionately affected America’s black population.
And during his second tenure leading the Justice Department, in December 2019, he returned to these themes—apparently disputing the right of Americans of color to protest police violence. Law enforcement must be respected, he said, adding ominously that “if communities don’t give that support and respect, they might find themselves without the police protection they need.” In effect, he reserved the right to make decisions about who should and should not be afforded the protection of the rule of law, a notably Schmittian reflex.
Barr is not just another craven defender of Trump. Just as Carl Schmitt’s identification of parliamentary democracy’s weaknesses in the 1920s and his increasingly authoritarian rhetoric in that period had a basis that was quite independent of the cult of Hitler, so William Barr represents a cast of legal thinking (or perhaps, more accurately, anti-legal thinking) that has its own origins and supporters. One clear ally is John Durham, whom Barr has placed in charge of his inquiry into the FBI’s conduct in the Trump–Russia investigation.
Durham’s colleagues have consistently assured us of his integrity, but his natural tendency, too, is to speak in the crudest terms of friends and enemies. In a lecture to the Thomistic Institute at Yale University in November 2018, Durham was discussing his investigation into the destruction of tapes showing CIA torture (an investigation that resulted in no indictments) when he chose to describe the alleged malfeasance revealed by the tapes as no more than “the unauthorized treatment of terrorists by the CIA.” This is a troubling description: as Durham must know, not all of the young Muslim men subjected to torture were terrorists. At least twenty-six of them were found to have been wrongfully detained. He also chose a bizarre analogy to describe his deeper motivations: he compared himself to the 1970s movie cop “Dirty Harry,” a white officer with little respect for due process who often shoots to kill black criminals and displays a sociopathic lack of conscience about doing so. Durham told his audience a story about Harry Callahan encountering a “black militant” who can’t comprehend Callahan’s unselfish motivation “to serve.”
Durham now serves a master who views lawless conflict—the “critical junctures” that “demand speed, secrecy, unity of purpose, and prudent judgment”—as the most important engine of progress, the vital heart of American power, that will bring a wider reckoning with “final things.” Independently of Trump and this presidency, William Barr, his henchmen, and his Federalist Society supporters represent a powerful threat to the fundamental values of liberal democracy.


The Tyranny of the Minority, from Iowa Caucus to Electoral College

Corey Robin
Jessica Rinaldi/The Boston Globe via Getty Images
A book stall sign at a store in Des Moines, Iowa, the day of the Iowa caucuses, February 3, 2020

It has been more than two weeks since the Iowa caucuses, and we still don’t know who won. That should give us pause. We don’t know in part because of a combination of technological failing and human error. But we’re also in the dark for a political reason. That should give us further pause.

No one disputes that Bernie Sanders won the most votes in Iowa. Yet Pete Buttigieg has the most delegates. While experts continue to parse the flaws in the reporting process, the stark and simple fact that more voters supported Sanders than any other candidate somehow remains irrelevant, obscure.

America’s democratic reflexes have grown sluggish. Not only has the loser of the popular vote won two out of the last five presidential elections, but come November, he may win a third. Like the children of alcoholics, we’ve learned to live with the situation, adjusting ourselves to the tyranny of its effects. We don’t talk anymore about who will win the popular vote in the coming election. We calculate which candidate will win enough votes in the right states to secure a majority in the Electoral College. Perhaps that’s why the scandal coming out of Iowa is the app that failed and the funky math of the precinct counters—and not the democratic embarrassment that the winner of the most votes doesn’t automatically win the most delegates.

In the original edition of his definitive history The Right to Vote, which came out two months before the 2000 election, Harvard scholar Alexander Keyssar never mentioned the Electoral College. Trying, in an afterword he wrote later, to account for his omission, Keyssar explained that before the election, he didn’t think the Electoral College had much to do with voting rights. He thought of it simply as a “device for aggregating” the popular vote.

At the time, this was an understandable belief. (Keyssar has since devoted himself to writing what is likely to be the definitive history of the Electoral College. It is scheduled to be published in June.) Not since 1888 had a candidate won the presidential election while losing the popular vote. So insignificant had the Electoral College become by 2000 that CBS News had to remind people on the eve of the election that the Electoral College wasn’t an “institution of higher learning” but the means by which the president is chosen. A divergence between the electoral vote and the popular vote seemed a relic of the nineteenth century, unlikely to recur in the twentieth and twenty-first.

Like the Electoral College, the Iowa caucuses don’t aggregate people’s votes; they weigh them, assigning different values to the votes depending on where they are cast. Much has been made of the alchemy whereby support for a candidate is converted into “state delegate equivalents,” or SDEs. Less attention has been paid to the fact that the number of SDEs each caucus is assigned and distributes among the candidates depends not on the size of the local population or turnout at the caucus but on a formula biased toward rural parts of the state. In 2016, forty-five citizens in remote, sparsely populated Fremont County could effectively select one SDE. In more populous Jefferson County, it required two hundred and thirteen citizens to select one SDE; a vote in Fremont County thus carried nearly five times the weight of one in Jefferson County. That’s how it’s possible to win the popular vote and lose Iowa—even if the app works fine and the caucus chairs are good at math.

“The basic principle of representative government,” the US Supreme Court declared in 1964, is that “the weight of a citizen’s vote cannot be made to depend on where he lives.” In Iowa, as in other parts of the American constitutional order, that principle is not in effect.


Even with their acceptance of slavery and a highly restricted franchise, many of the Framers were uneasy about the notion that some people’s votes might count more than others. When one group of delegates proposed that each state, regardless of the size of its population, should have an equal vote in Congress, James Madison denounced the plan as “confessedly unjust,” comparing it to the scheme of “vicious representation in Great Britain.” State-based apportionment, claimed Pennsylvania’s James Wilson, would only reproduce the inequality of Britain’s rotten boroughs, where a nearly depopulated Old Sarum—described at the time as sixty acres without a home—had two representatives in Parliament, while London, with 750,000 to one million residents, had four.

Madison and Wilson lost that debate; the United States Senate is the result. Within a year of the ratification of the Constitution, the 50,000 free residents of Delaware, the least populous state in the nation, had the same number of senators as the 455,000 free residents of Virginia, the most populous state. That makes for a ratio of power of nine to one. Today, according to a recent report by the Roosevelt Institute, that ratio has expanded to sixty-seven to one. Wyoming’s 583,000 residents enjoy as much power in the Senate as the nearly 40 million residents of California. (In the Electoral College, the power ratio is four to one.) 
Bill Clark/CQ Roll Call via Getty Images
Placards at the Women’s March, the day after the Inauguration of Donald Trump as president, Washington, D.C., January 21, 2017

Eighteen percent of the American population—on average, whiter and older than the rest of the population—can elect a majority of the Senate. If those senators are not united in their opposition to a piece of legislation, the filibuster enables an even smaller group of them, representing 10 percent of the population, to block it. Should legislation supported by a vast majority of the American people somehow make it past these hurdles, the Supreme Court, selected by a president representing a minority of the population and approved by senators representing an even smaller minority, can overturn it.

The problem of minority rule, in other words, isn’t Trumpian or temporary; it’s bipartisan and enduring. It cannot be overcome by getting rid of the filibuster or racist gerrymanders—neither of which has any basis in the Constitution—though both of these reforms would help. It’s not an isolated embarrassment of “our democracy,” restricted to newly problematic outliers like the Electoral College and the Iowa caucuses. Minority rule is a keystone of the constitutional order—and arguably, given the constitutional provision that “no state, without its consent, shall be deprived of its equal suffrage in the Senate,” not eliminable, at least not without a huge social upheaval.

In the second half of the twentieth century, the United States had two extremely close presidential elections: 1960 and 1968. Given how effectively the Soviet Union propagandized about American apartheid, prodding US officials to push for desegregation in order to avoid bad publicity in the decolonizing world, it’s not inconceivable that the cold war also may have helped prevent those elections from producing awkward splits between the electoral vote and the popular vote. If American elites feared that the tyranny of a white majority couldn’t withstand scrutiny in the Global South, what kind of legitimation crisis might the tyranny of a white minority have provoked? As the United States today approaches a multiracial majority, the tyranny of a white minority is precisely what we are hurtling toward, without the countervailing pressure of an ideologically challenging superpower to shame us into stopping it.

It was also during the cold war that the United States grew increasingly uncomfortable with the rotten boroughs of the American South. “To say that a vote is worth more in one district than in another,” the Supreme Court declared in Wesberry v. Sanders (1964), would “run counter to our fundamental ideas of democratic government.” Though malapportionment in the South produced power ratios—three to one in Georgia, forty-one to one in Alabama—that seem quaint in light of today’s Senate, they were sufficiently large to provoke Chief Justice Earl Warren, in Reynolds v. Sims (1964), to spell out, in simple numerical terms, the challenge they posed to basic principles of democratic morality.

“It would appear extraordinary,” wrote the chief justice, if the votes of citizens in one part of a state were “given two times, or five times, or ten times the weight of votes of citizens” in another part of the state. That would mean that the first group of citizens “could vote two, five, or ten times for their legislative representatives” while the second group “could vote only once.” Under such a scheme of representation, the right to vote would not be “the same right” for all citizens. “To sanction minority control” of a legislative body “would appear to deny majority rights in a way that far surpasses any possible denial of minority rights.”

Warren clearly was worried about the implications of his argument for federal institutions like the Electoral College and the Senate. That is why he devoted several pages of his opinion to a preemptive rebuttal of “the so-called federal analogy.” Much of Warren’s rebuttal depended on the invocation of history. The apportionment scheme of the Senate and the Electoral College was a necessary compromise “arising from unique historical circumstances” of thirteen sovereign states contracting to form a national government. That claim from history sits uneasily with Warren’s claim, later in the opinion, that “history alone” is not a “permissible factor” in justifying departures from one person, one vote, and that “citizens, not history or economic interests, cast votes.”

More important, however, is the history Warren invoked. Like the court’s other liberal justices, Warren cast the constitutional settlement over representation as a compromise between large and small states. As the Yale legal scholar Akhil Reed Amar has noted, this is “the stodgy version” of constitutional history that many of us grew up with. The less stodgy and increasingly accepted version holds that the battles over representation at the Convention had more to do with slavery than with the size of states. As Madison pointed out, “the real difference of interests” at the Convention “lay, not between large and small but between Northern and Southern States. The institution of slavery and its consequences formed the line of discrimination.”

It’s clear why such historical truths could not be acknowledged during the cold war. Given the centrality of the Senate to Jim Crow and the near impossibility of eliminating the Senate, as well as the improbability of reforming the Electoral College, it made sense to describe these institutions as musty and ancient compromises between large and small states. In the 1960s, ironically, anachronism was less threatening than relevance. Now that the cold war is over, however, we can admit these truths. Now that the cold war is over, that may not matter.

We are at a strange moment in American history. On the one hand, the country has never been more interested in, and desperate to know, what the majority wants. As the rise of data geeks like the two Nates—Silver of fivethirtyeight.com and Cohn of The New York Times—and outlets like Vox show, our appetite for polling is ravenous; our capacity to digest the results, prodigious. On the other hand, we have an electoral system that makes it ever more difficult to determine the will of the majority, and a political system that makes that will ever more difficult to enact. Something’s gotta give.

Or not. In her 1997 collection of critical essays The End of the Novel of Love, Vivian Gornick remarks on that “climactic moment” in a John Cheever story “when the husband realizes his wife holds him in contempt, or the wife knows the husband is committing adultery.” With mounting dread, the reader wonders how either character can go on after this moment of truth. What makes the story truly “large, awesome, terrible,” however, is when the reader realizes that the characters do “go on like this.” That moment of truth leaves the reader “staring into space, the void opening at her feet.”

February 21, 2020, 8:38 am


Foolish Questions

Art Spiegelman
March 12, 2020 issue

SCREWBALL!: The Cartoonists Who Made the Funnies Funny
by Paul C. Tumey
Library of American Comics, 303 pp., $59.99


The Art of Rube Goldberg
an exhibition at the Museum of Pop Culture, Seattle, February 11–April 23, 2017; the Grand Rapids Art Museum, May 21–August 27, 2017; Citadelle Art Foundation and Museum, Canadian, Texas, September 15–November 26, 2017; the Contemporary Jewish Museum, San Francisco, March 15–July 8, 2018; the Portland Public Library, Portland, Maine, August 3–September 22, 2018; the National Museum of American Jewish History, Philadelphia, October 12, 2018–January 21, 2019; the Evansville Museum of Arts, History and Science, April 28–July 21, 2019; and the Queens Museum, October 6, 2019–February 9, 2020

King Features Syndicate/The Library of American Comics
Panels from Count Screwloose of Tooloose by Milt Gross, April 5, 1931

Two ladies on an outing to the Queens Museum one weekend last fall wander into “The Art of Rube Goldberg” exhibition. They enter casually and chuckle at a monitor playing a few moments from Charlie Chaplin’s Modern Times. A factory worker is immobilized in a complicated lunch-feeding contraption inspired by Rube Goldberg, a pal of Chaplin’s. It shovels some soup into his mouth, then short-circuits as it rams a whirring cob of corn up against his teeth, force-feeds him a couple of loose bolts, shoves a slice of cream pie into his dazed mug and then smears it with an automated napkin. Next, there’s a clip from a 1930 comedy, Soup to Nuts, written by Goldberg. (It includes a memorable antiburglar contraption but today is better known for featuring Larry, Moe, and Shemp before they became the Three Stooges.) The women glance at some of the original art on the walls as they drift out and one says, “Gosh, I never knew he was a cartoonist, too!”

Being “a cartoonist, too” was the price of immortality for an artist so famous that he became an adjective in Merriam-Webster’s dictionary as early as 1931: “accomplishing by complex means what seemingly could be done simply.” The adjective still has currency, as in a recent Foreign Policy opinion piece that describes the Electoral College as “that cockamamie Rube Goldberg mechanism that never quite worked as intended.” (It shows up often in discussions of government policy and single-payer-health-care math.)

Rube Goldberg was the Christopher Columbus of the screwball contraption, finding a way to get from point A to point B by traveling through all the other letters of the alphabet. And, as with Columbus, a number of other intrepid explorers had gotten there first. At least two years before Goldberg, the renowned British illustrator and cartoonist Heath Robinson began publishing deadpan-droll tableaux that featured useless inventions, as did Denmark’s hidden treasure, cartoonist and humorist Storm Petersen. Both “Heath Robinson” and “Storm P.” were adjectivized in their own nations’ lexicons. None of this has anything to do with plagiarism; it’s a marker of the disorienting Machine Age these artists were born into, and of cartooning’s singular role as a Zeitgeist barometer.

Goldberg was born on July 4, 1883, to a Prussian-Jewish immigrant father who became a fixture in San Francisco Republican politics. Fearful that his son would become an artist, Max Goldberg insisted that Rube study to be an engineer at UC Berkeley. He graduated in 1904 to a job mapping out sewer mains for the city of San Francisco but bailed just four weeks later to become a sports cartoonist for the San Francisco Chronicle. Goldberg’s engineering background allowed his ingratiatingly lumpy cartoons to retain the diagrammatic clarity both comics and patent drawings demand. So “Father Was Right”!—to quote one of the many pre-Internet memes Goldberg generated in the more than sixty series he drew in his lifetime. Others include “No matter how thin you slice it, it’s still baloney,” “Mike and Ike they look alike!” (identical twins, one Irish and one Jewish), and his first big hit, “Foolish Questions,” from 1908 (as in Foolish Questions—No. 40,976: “Son, are you smoking that pipe again?” “No, Dad,” says the son sucking a pipe larger than his head, “this is a portable kitchenette and I’m frying a smelt for dinner”).

Goldberg is said to have produced about 50,000 drawings in his lifetime, and his inventions made up only a small part of his vast and mixed-up mix of features. He was himself a master of reinvention: in his early days a vaudeville performer, then an animator, song lyricist, radio personality, short story and essay writer for popular magazines, toastmaster, and star of his own TV show. His last long goodbye as a cartoonist was drawing political cartoons from 1939 to 1964, before he “retired” and became a sculptor until his death at eighty-seven in 1970. His editorial cartoons were drawn in the style of Herblock, but with regrettable anti–New Deal and occasional pro-McCarthy stances (perhaps shaped by Goldberg’s class interests—his cartoons had made him wealthy, he was married to the White Rose Tea heiress, and apparently he had inherited his father’s Republicanism; as I mentioned, “Father Was Right”!). 

Abrams ComicArts/© 2020 Heirs of Rube Goldberg
‘Peace Today’; cartoon by Rube Goldberg from The New York Sun, July 22, 1947

The traveling retrospective at the Queens Museum left out Goldberg’s more embarrassing political cartoons, and didn’t show even a tear-sheet of his powerful 1948 Pulitzer Prize–winning emblem of cold war anxiety, “Peace Today.” It depicts a suburban American family lounging on the lawn next to their two-story home, sitting atop a giant A-bomb that teeters over an abyss labeled “World Destruction.”

The exhibition supplemented the dozens of comic art originals on the walls with full broadsheet-size Sunday comic pages, vitrines over-stuffed with book covers, licensed games, postcards, buttons, and other ephemera, all to show the artist as an observer of social foibles with an acute sense of the absurd. The visitor was encouraged to linger over Goldberg’s deft yet humble grotesqueries and also to savor the rhythms of his copious prose. Back in the golden age of newspaper comics, there used to be space and time for written language.

The Art of Rube Goldberg, the definitive coffee table book from 2013 that served as the catalyst for this exhibit, provides over seven hundred well-selected images and several valuable historical and biographical essays. In the spirit of excess that the artist was known for, it even comes with a paper-engineered moving contraption operated by a pull-tab on its cover that will make the book enticing to any child near that coffee table. Whatever childhood pleasures Goldberg’s work may offer, as Adam Gopnik points out in his introduction,

there seems, to adult eyes, to be in [Goldberg’s] work some fatal, almost unconscious, commentary on the madness of science and the insanity of modern invention…. He doubtless would have laughed, or shaken his head in disbelief, if asked how his work related to Duchamp’s machine aesthetic, or to Dada—and yet every mark an artist makes takes place in a moment of time, and within a common frame of meaning.1

Duchamp and Man Ray embraced Goldberg as a fellow Dada traveler by putting one of his cartoons in their 1921 issue of New York Dada, but the feeling wasn’t exactly mutual. Like many American cartoonists of his day, Goldberg was dismissive of nonobjective art. As Peter Marzio, his biographer, wrote in 1973, “Rube believed that fine art was good only if it won public acceptance. Sales were Rube’s test of beauty.”2 Still, the cartoonist’s inventions showed up in MoMA’s landmark 1936 exhibit, “Fantastic Art, Dada and Surrealism,” and were also part of its 1968 show “The Machine As Seen at the End of the Mechanical Age.”
Abrams ComicArts/© 2020 Heirs of Rube Goldberg
Drawing by Rube Goldberg from Cosmopolitan, 1928

In 1970, two weeks before his death, a full retrospective of his work, entitled “Do It the Hard Way: Rube Goldberg and Modern Times,” was unveiled at the Smithsonian Institution. The show, according to Marzio, who cocurated the exhibition, was something of a blockbuster, with over two thousand guests attending the opening. Goldberg’s cartoons were also something of a “block buster” in the sense of opening up the redlined ghetto of “low” art, welcoming it into the hallowed precincts of High Culture. They were among the earliest examples of comic art ever to be displayed on art museum walls.

Over on the comics side of the collapsing high-low divide, Goldberg’s influence can be found in the work of generations of important cartoonists, including Dr. Seuss, Harvey Kurtzman, and Robert Crumb—all of whom have now been exhibited in museums. In fact, with categories of every kind crumbling around us daily, seeing comic art on walls has become delightfully commonplace, though the celebration of Goldberg’s pioneering art in Queens was met with smaller crowds than he deserves. (The museum’s organizers tried to entice their audience, even having a well-intended if less-than-successful “Machine for Introducing an Exhibition” built to stand in front of the wall leading into the first gallery. The press of a button sets off a chain reaction involving an electric fan, a windmill, a die-cut Rube Goldberg drawing of a boot, a watering can, and three separate computer screens, each with simple animations of animals, in a process that eventually unfurls a welcome banner. Mixing analog and digital technologies could have provoked thoughts about what the creator of useless complexity might have thought about life in our age of sleek electronics, but the whole device—barely a gizmo, let alone a contraption—looked minimalist and wan rather than deliriously tangible and maximalist, like the artist it was meant to introduce. The Saturday I visited, I pressed the green start button, and nothing happened. Then I noticed a sign on a stand nearby: “This work is temporarily out of order. We apologize for the inconvenience.”)
Abrams ComicArts/© 2020 Heirs of Rube Goldberg
A 1995 US postage stamp adapted from artwork by Rube Goldberg in Collier’s, September 26, 1931

*Foolish Question #25,743,000: “So are you somehow trying to say that Rube Goldberg was a serious Fine Artist???”

“No, you Boob! I’m pointing out to the uninitiated that Rube Goldberg was a fine Screwball Artist!” 
*

Now that comics have put on long pants and started to strut around with the grownups by calling themselves graphic novels, it’s important to remember that comics have their roots in subversive joy and nonsense. For the first time in the history of the form, comics are beginning to have a history. Attractively designed collections of Little Nemo, Krazy Kat, Thimble Theater, Barnaby, Pogo, Peanuts, and so many more—all with intelligent historical appreciations—are finding their way into libraries.

Paul Tumey, the comics historian who co-edited The Art of Rube Goldberg book seven years ago, has recently put together a fascinating and eccentric addition to the expanding shelves of comics history.3 The future of comics is in the past, and Tumey does a heroic job of casting a fresh light on the hidden corners of that past in Screwball!: The Cartoonists Who Made the Funnies Funny. It’s a lavish picture book with over six hundred comics, drawings, and photos, many of which haven’t been seen since their twenty-four-hour life-spans in newspapers around a century ago. The book is a collection of well-researched short biographies of fifteen artists from the first half of the twentieth century, accompanied by generous helpings of their idiosyncratic cartoons. Goldberg—whose name schoolchildren learn when their STEM studies bump into chain reactions—is the perfect front man to beckon you toward the other less celebrated newspaper cartoonists who worked in the screwball vein that Tumey explores.

Screwball is an elusive attitude in the language of laughs and, like pornography, it’s hard to define but easy to recognize. Tumey prowls for common denominators and trails of influence that connect these odd ducks and their droppings. But the closer one looks, the less they seem to have in common. Virtually all the earliest newspaper comics were designed to be funny, but not all the funnies were screwball. The book is a survey, not in the sense of a Comics 101 history course serving up a knowledgeable overview, but more like a deep exploratory mining dig that samples underground specimens to assay what’s of value. The project is hardly arbitrary, but it doesn’t seem exactly definitive, either. It’s actually sort of, well, screwy—and it may just be that screwball is its own shortest definition.

One foot of the slippery screwball stretches back through vaudeville to commedia dell’arte with its stock situations and characters; the other foot strides forward toward Dada, surrealism, and the theater of the absurd—while the third foot of this ungainly creature remains firmly balanced on a banana peel. Screwball comics tend toward the manic, excessive, over-the-top, obsessive, irrational, anarchic, and grotesque; they can veer toward parody or satire, but at their core they are an assault on reason and its puny limitations. They wage a gleeful war on civilization and its discontents—armed mostly with water-pistols, stink bombs, and laughing gas.

The cinematic analogs of screwball comics would include the Marx Brothers’ Duck Soup and Olsen and Johnson’s Hellzapoppin, as well as the early animated shorts of the Fleischer brothers, Tex Avery, et al. Screwball comics have little to do with the more attended-to genre of romantic “screwball comedy”—movies like Howard Hawks’s Bringing Up Baby or Frank Capra’s It Happened One Night—except in their velocity. Screwball strips, designed for family newspapers, had even fewer hints of sex than those screwball romantic comedies, though their punchlines did elicit sublimated climaxes with the so-called straight man flying out of the last box, feet in the air.
King Features Syndicate
Thimble Theater by E.C. Segar, January 4, 1934

In Smokey Stover, Bill Holman’s essence-of-screwball fireman strip, rapid-fire puns rage through all the panels like kindling for a four-alarm newsprint conflagration. “Plop-take” feet sail out of their shoes, revealing toes that poke through sock holes; bowties pop off shirt collars while mustaches, hairpieces, eyeglasses, false teeth, and even ears explode clear off of heads. Another symptom of this approach—rechanneled id erupting in the release of a belly laugh—is what Tumey dubs “the screwball spin,” a blurry mandala of repeating heads and limbs that form a proto-Futurist pinwheel of frenzied slapstick action. It was, for example, how Elzie Segar drew Popeye pummeling an adversary in the ring—like a rapidly rotating phénakistiscope.
Tribune Content Agency
Panels from Smokey Stover by Bill Holman, November 15, 1936

I can only tour you through a few of the giant screwballs spinning around in this treasure chest of salvaged newsprint, and will start, as the book does, with Frederick Burr Opper. A founding father of the funnies, he’s credited with making speech balloons a regular part of the comics’ formal vocabulary. He was already a seasoned and highly regarded artist of forty-one by the time he was recruited by William Randolph Hearst’s Journal in 1899 as a big gun in the epic newspaper war between Hearst and Joseph Pulitzer.

Pulitzer had developed a high-speed four-color newspaper press, hoping to bring the great art of the ages to his masses. When he found that the imprecise, out-of-register printing made the Old Masters look like blurry Impressionists, he settled on a comics supplement with black outlines containing flat colors. The funnies became a major weapon in the battle for circulation—and Hearst soon set up his own color supplement, announcing it as “eight pages of iridescent polychromous effulgence that makes the rainbow look like a lead pipe!”

Opper, a formidable draftsman, had become a star in Puck, the color-lithographed, Progressive Era satirical weekly. Instead of bringing gravitas to Hearst’s paper, Opper remade himself as a king of comedy, working in a playful, casual mode. His first and longest-lived hit, Happy Hooligan, featured a hapless hobo with a tin can for a hat; his well-meaning but dimwitted attempts to be helpful brought swirls of multipanel havoc that often ended with a screwball spin of cops brandishing nightsticks and dragging our hero off to the slammer. He was a precursor of Chaplin’s tramp, Goldberg’s long-lived Boob McNutt, and, a half century later, another beautiful loser named Charlie Brown. Happy Hooligan is a genial version of the xenophobic caricatures of simian-featured Irish immigrants that Thomas Nast had angrily drawn for Harper’s Weekly and that Opper, following in Nast’s footsteps, had produced for Puck.
King Features Syndicate/Library of American Comics
A panel from Happy Hooligan by Frederick Burr Opper, May 23, 1909

Ah, stereotypes! Cartoons are a visual language of simplification and exaggeration whose vocabulary was entirely premised on them. It’s as if the N-word were the only word in the dictionary to describe people of color, and even the poetry that comics can offer had to be written in this debased language. We humans are hard-wired toward stereotyping, and, alas, comics echo the way we think. It’s part of the medium’s danger and its power. Tumey’s collection of historical material comes with a trigger warning:


These comic strips were created in an earlier time and may include racial and other stereotypes; we reproduce them in historical context with the understanding that they reflect a thankfully bygone era.

He scrupulously tries to depict the work of the era accurately without grinding our eyeballs into an overdose of toxic images. Still, it’s hard to guide an uninitiated reader to distinguish between intentional insults and images that—considering the form and our nation’s history—are only ambiently offensive, reflecting the time in which they were made.

Which brings us to Eugene “Zim” Zimmerman, who was one of America’s most famous cartoonists at the turn of the twentieth century. A consummate graphic artist, Zim had an unfortunate predilection for the ethnic and racial themes that were especially popular at the time, and—though this material may represent only, say, 30 percent or so of his prolific output—he was brutally skillful at it. Zim once joked that he and his fellow cartoonists at Puck treated the various races and creeds that made up America with gloves, the kind boxers wear. It may explain why—despite the large influence he had on other cartoonists of his time—Zim has been more or less canceled from comics histories. Still, Rube Goldberg deeply admired Zim’s art and eulogized him as “the dean of grotesque pictorial humor.”
Judge Company of New York
‘Getting There’; cartoon by Eugene Zimmerman from Judge, September 20, 1902

George Herriman, the creator of Krazy Kat, sits far from Zim on the screwball spectrum, on as high a throne as a comics canon can offer. Krazy Kat’s relentless vaudeville variations of a kat getting whacked by a mouse hurling a brick might make the work the ultimate expression of screwball, but its ethereal and gentle subtlety beckons the strip into a transcendent world of its own. Knowing that Krazy Kat is now widely available again, Tumey has chosen instead to offer a slice of Herriman’s far more obscure Stumble Inn. Krazy was getting an unenthusiastic response from most readers and most newspaper editors—Hearst often had to insist that his newspapers run it—so the cartoonist doubled his workload by simultaneously providing his syndicate with a more conventionally funny comic strip. Happily, he didn’t do conventional very well; Stumble Inn is a strip about a fleabag hotel that seems to anticipate John Cleese’s Fawlty Towers. It looks a bit like Mutt and Jeff if that strip had been drawn with the precision of a Renaissance master—some of the most breathtakingly beautiful cartooning I’ve ever seen.

Herriman was born in New Orleans in 1880. A Creole of color, he and his family left the city when he was ten years old and, as Michael Tisserand documents in his meticulous and revelatory biography, relocated to Los Angeles, where they passed for white for the rest of their lives.4 Reading Krazy Kat through that lens adds new layers of complexity to a strip about a black cat and the white mouse (pink on Sundays) who loathes him.5

The most poignant panel in Tumey’s book is in a wonderfully convoluted Stumble Inn sequence in which Mr. Stumble, Owl Eye (the hotel’s house detective), and a deadbeat boarder they’re now trying to lure back to their inn are each disguised in hats, long coats, and false beards. Soda Popp, the sweet young bellboy, with black face and large red lips (Herriman always drew his black humans according to the then standard cartoon physiognomic code), looks at the camouflaged Owl Eye and says, “’at ole ‘Owl Eye’ he’s so disguised I bet he dont know who he is his’sef.”
King Features Syndicate/Library of American Comics
Stumble Inn by George Herriman, April 5, 1924

While Stumble Inn sits in a quiet and conventional suburb of Coconino County, in a dingy small town elsewhere on the comics pages we find Our Boarding House, established in 1921 by Gene Ahern. The homely daily panel orbits around the landlady’s lazy gasbag of a husband, Major Hoople, a hybrid of Falstaff, Munchausen, and W.C. Fields. He chases one hopeless get-rich-quick scheme after another and regales the other lodgers with tall tales about his big-game hunting or his heroism as a prisoner in the Boer War sneaking messages out hidden in alphabet soup.

Syndicates back in the day often required their artists to provide “toppers” for their Sunday pages—small “throwaway” strips that could independently sit atop the main feature so papers could brag about having, say, thirty-two strips in their supplements rather than sixteen, or, even better, they could replace the feature with an ad. Comics always existed in the interstice between art and commerce, and Ahern turned the “minor” toppers into something simultaneously ridiculous and sublime. The Nut Bros., Ches and Wal, sat above Our Boarding House, breaking the fourth wall by offering their pun-laden old chestnuts with a self-aware wink and a surrealistic edge—goofy sight gags and costume changes from one panel to the next.

In 1936 Ahern moved to a larger syndicate at twice the pay but had to leave Major Hoople and the strip’s title behind. A near clone, Judge Puffle, now lived under a new logo, Room and Board. The Squirrel Cage, his topper for Room and Board, developed into one of the underappreciated hidden glories in the history of comics. It started as a direct continuation of The Nut Bros. but transformed into a strip that didn’t just have a surreal edge—it was surreal to its core. 
The Squirrel Cage by Gene Ahern, January 5, 1947

A bearded Little Hitchhiker—the direct model for R. Crumb’s Mr. Natural a generation later—started to pop up in the strip’s shifting landscapes with thumb extended, inscrutably asking: “Nov shmoz ka pop?” The vaguely Slavic-sounding gibberish was pretty much the only thing he ever said, and it became an unanswerable catchphrase, the kind that screwball strips were often able to wormhole into readers’ brains through satisfying repetition.
King Features Syndicate/Library of American Comics
A panel from The Squirrel Cage by Gene Ahern, July 1, 1945

Ahern’s shifting backgrounds and props are less graceful than those in Krazy Kat. The characters, no matter how odd, seem to walk through their uncanny environment and impossible situations with the same resignation as if they were waiting for Godot. It all makes the irreal seem…real. The unearthly world of the top strip exists in a dialectical relationship to the drab boarding house in the strip below that contains the outsized fantasies of Judge Puffle.

Of all the wise guys6 gathered in Screwball!, Milt Gross is perhaps the essence of the idiom—cartooning distilled into precious drops of Banana Oil. (“Banana Oil,” for the uninitiated, was one of those aforementioned wormhole phrases, Gross’s equivalent of Rube Goldberg’s “Baloney!”) Gross was born to Russian-Jewish immigrants in 1895 and raised in the Bronx. In the early 1920s he created Banana Oil, among other strips, as well as an illustrated syndicated weekly newspaper column called Gross Exaggerations that crosscut conversations heard through the dumbwaiter of a small tenement building. Talk about finding one’s voice! It was written in Gross’s fractured Yiddishized English (Is diss a lengwitch? Dunt esk!) and gathered into a best-selling book called Nize Baby in 1926 before becoming a comic strip. His malapropisms and phonetic spelling ache to be read out loud for comprehension. His 1927 skirmish in the war against Christmas was a retelling of the Clement Clarke Moore poem, “De Night in De Front from Chreesmas,” which starts:


’Twas de night befurr Chreesmas und hall troo de house
Not a critchure was slipping—not ivvin de souze,
Wot he leeved in de basement high-het like a Tsenator,
Tree gasses whooeezit—dot’s right—it’s de jenitor!

Gross was doubly gifted: an irresistibly risible writer and visually a comics genius. His cartoons are pure doodle: effortless and effervescent. The art looks like he was giggling uncontrollably while the cartoons just shpritzed out of his pen—and his laugh is infectious, bouncing off the page so you laugh too. Of his many creations, Count Screwloose of Tooloose, a Sunday page that launched in 1929, may be his screwiest. It reveals the thematic heart of all the screwball works in this book and beyond: The Count, a half-pint, sausage-nosed, cross-eyed resident of Nuttycrest Sanitarium, has an even smaller companion, a yellow dog named Iggy who wears a Napoleon hat. In each episode the Count devises a nutty new way to escape the institution and reenter the world outside its walls. When he sees just how out of their minds the people out there are, he flees back to Nuttycrest, where his pup rapturously welcomes him home, as the Count exclaims, “Iggy, keep an eye on me.” Gross’s theme is an inverted way of expressing what Salvador Dalí famously said a few years later: “There is only one difference between a madman and me. I am not mad.” Count Screwloose deploys Gross’s spontaneous and flexible pen line to search for the difference between the delusional and the rational.
King Features Syndicate/Library of American Comics
Count Screwloose of Tooloose by Milt Gross, January 25, 1931

The clearest expression of that search can be found in the subversive work of Harvey Kurtzman, the cartoonist who founded Mad. He is not included in Screwball!, since Tumey felt he had to limit himself to newspaper cartoons from the late nineteenth century through World War II to keep his project manageable—and Kurtzman’s Mad, originally a comic book launched in 1952, falls outside those parameters. But in a short afterword, Tumey writes, “Much of the material in Mad belongs to the lineage traced in this book. In fact, this book could be seen as the road to Mad.”

Indeed, the early Mad is the apotheosis of the aesthetic presented in Screwball! Kurtzman’s precisely timed comics look like a slower, more methodical and cerebral take on Gross’s mishegoss. The core tropes of the Smokey Stover take-no-prisoners chaos—its wacky signage that fills up all extra white space along with backgrounds that burst with sight gags—deeply informed the Mad that Kurtzman wrote and edited. Those “Easter eggs” in the backgrounds (what he and his lifelong collaborator, Bill Elder, called “chicken fat,” and my generation of underground cartoonists called “eyeball kicks”) are clear symptoms of a cartoonist irrepressibly interested in amusing himself as well as the reader.
William M. Gaines Agent, Inc.
‘Mickey Rodent!’; cartoon by Harvey Kurtzman and Bill Elder from MAD no. 19, 1955

If the road to Mad was a loopy rollercoaster, the road from it has been riddled with potholes and has finally run into a wall. Mad was a revolutionary comic book. (Kurtzman transformed it into a magazine in 1954 and left in 1956 after an altercation with the publisher.) Its pointed parodies and satires, its anarchic questioning of authority, and its class-clown silliness shaped the generation that grew up to protest the Vietnam War. Kurtzman was concerned not only with being funny but with interrogating and deconstructing his subjects with a self-reflexive irony: he needed to locate something he could say that was true. (In Mad’s parody of Mickey Mouse, Kurtzman and Elder find something sinister in Disney’s Magic Kingdom—“Mickey Rodent” has stubble on his face and rat-traps on his nose and finger. In the splash panel, the Disney police are seen dragging off “Horace Horseneck” for not wearing the mandatory white gloves.) Reflecting on his work in 1977, Kurtzman said, “Truth is beautiful. What is false offends.”7 Even after Kurtzman left the magazine, Mad retained just enough of its promethean spark to wise up the generation or two after who found it. It has influenced American comedy—from Saturday Night Live to The Simpsons and Colbert’s Late Show—where the spark continues to glow.

But, alas, revolutions grow old and die. I was once told that Rudolph Giuliani grew up with a complete set of Mad. It may have been “fake news,” but the information crushed me: the vaccine that inoculated us against the suffocating 1950s was not a panacea.

This past October, the geriatric remains of Mad were put into cryonic deep freeze, to exist mostly as bimonthly specialty-shop reprints with a planned annual of new material to keep it in half-life in case any swell merchandising opportunities come along. The death knell was sounded last May, when Trump, hoping to tar the Democratic candidate Pete Buttigieg with one of his sophomoric and indelible zingers, announced that “Alfred E. Neuman cannot become president of the United States.”8 Asked about it, the thirty-seven-year-old mayor responded (either cannily or candidly, or both), “I’ll be honest, I had to Google that. I guess it’s just a generational thing. I didn’t get the reference.”

Yet the legacy of Mad is still with us. Trump is often referred to in the press as a “screwball,” but “screwball”—an ironic term of endearment, a synonym for “lovable eccentric”—just won’t do for a pathological, lying narcissist with dangerous sociopathic tendencies.

The existential threat facing screwball humor today comes from a “screwball” president who has weaponized postmodernism. Mad taught me to be skeptical of all mass media and to question reality (including my beloved Mad), but the lesson requires a belief that there might actually be something like consensual reality. Nonsense assumes there’s such a thing as sense and puts it in relief by denying reality’s power even if just for a moment.

*

Foolish Question #25,743,001: “So, is screwball humor dead?”
“Sorry, I can’t hear you, I have a banana in my ear.”

In early December, a banana duct-taped to a wall—Maurizio Cattelan’s Comedian—sold for $120,000 at Art Basel in Miami. It made headlines all around the world for a minute, as either an immortal work of twenty-first-century art or an event destined to be more ephemeral than any of the pages in Screwball! A few days later a New York City–based performance artist pulled the banana off the wall and ate it, declaring the installation “very delicious.”

This caper brings to mind Goldberg’s “warning” at the front of Chasing the Blues, his first anthology of cartoons, in 1912:


I must burden you with a terrible confession. This is not a work of art!…


My artistic deficiencies remove me far from the sphere of Rembrandt and Michael Angelo. My ever-present realization of the material virtues of kidney stew and gorgonzola cheese has permanently destroyed whatever of the ethereal that may have been born within me…. A touch of art may nourish the soul, but a good laugh always aids the digestion.

*


Foolish Question #25,743,001.75: “So, can screwball comics ever be art?”
“No, you sap. Humor is the last thing one can take seriously—it’s priceless.”


1. Jennifer George, The Art of Rube Goldberg: (A) Inventive (B) Cartoon (C) Genius (Harry N. Abrams), p. 16.

2. Peter C. Marzio, Rube Goldberg: His Life and Work (Harper and Row), p. 305.

3. Full disclosure: the tribe of obsessive comics scholars interested in this sort of thing is a small one. Tumey and I became friends through a screwball blog that he started in 2012 to contemplate the subject. See screwballcomics.blogspot.com.

4. Michael Tisserand, KRAZY: George Herriman, a Life in Black and White (Harper, 2015).

5. Chris Ware wrote about Herriman and race for the NYR Daily: “To Walk in Beauty,” January 29, 2017.

6. And in my best post–David Foster Wallace footnote mode, might I add what may not need saying at all: every one of these wise guys was a guy. While there have been many female screwballs in the history of the performing arts—Fanny Brice, Beatrice Lillie, Carole Lombard, and Gracie Allen come to mind, as do the two broads in Broad City—there don’t appear to have been any female screwballs at all in the overwhelmingly male domain of early-twentieth-century newspaper comics. I refer interested readers to historian and comics artist Trina Robbins’s several books devoted to casting light on women cartoonists and their accomplishments.

7. Bill Schelly, Harvey Kurtzman: The Man Who Created Mad and Revolutionized Humor in America (Fantagraphics, 2015), p. ix.

8. For those kiddies too young to know, the venerable “What—Me Worry?” gap-toothed grinning idiot served as Mad’s mascot from 1956 until its demise.

Was Psychedelic Guru Terence McKenna Goofing About 2012 Prophecy?
By John Horgan on June 6, 2012 
Rational Scientific American readers surely scoff at claims—based on ancient Mayan calendars and other esoterica—that life as we know it will end this December, especially now that NASA experts have "crushed" the prophecy. But many folks out there are reportedly worried. Perhaps I can allay their anxieties by relating my encounter with a prominent popularizer of the 2012-doomsday meme, psychedelic guru Terence McKenna.

In his books and lectures, McKenna extolled psychedelic drugs as a spiritual path superior to that of any mainstream religion. His book The Food of the Gods (Bantam, 1992) was a rigorous argument—complete with footnotes and bibliography—that mind-expanding plants and fungi catalyzed the transformation of our brutish ancestors into cultured modern humans. The visions inspired in our ancestors by these substances—and particularly by plants containing psilocybin, dimethyltryptamine (DMT) and other psychedelics—were the seeds from which language sprang, followed by the arts, religion, philosophy, science and all of human culture, McKenna asserted. By outlawing psychedelics, he said, we have cut ourselves off from the wellspring of our humanity.

Food of the Gods showed that McKenna could play the serious scholar when he chose. But he was truer to himself in True Hallucinations (HarperSanFrancisco, 1993), a memoir packed with psychedelic tall tales and wild riffs on the nature of reality. McKenna was less a scientist or even philosopher than a performance artist or jester, and I mean that as a compliment. What follows is an edited excerpt from Rational Mysticism (Houghton Mifflin, 2003), which describes my 1999 meeting with McKenna and my attempt to find out what he really thought would happen in December 2012.

The Man in the Purple Sparkly Suit

Raised in a Colorado ranching town, Terence McKenna discovered psychedelics in 1965 when he enrolled at the University of California, Berkeley, where he studied ecology and shamanism (ah, the Sixties). He became an itinerant scholar-adventurer, traveling to the Far East and South America in search of exotic mind-altering philosophies and substances, including two of his favorites, magic mushrooms and ayahuasca, a DMT-laced tea brewed from plants native to Amazonia.

In the early 1970s, McKenna and his younger brother Dennis (who became a slightly more sober authority on psychedelics) trekked into the jungles of Colombia, where they ingested enormous quantities of ayahuasca, marijuana and mushrooms. McKenna's visions persuaded him that tryptamines—a class of chemicals that includes DMT and psilocybin—were messages from an alien civilization, or "overmind."

McKenna's trips also inspired his "timewave" theory, which holds that existence and even time itself emerge from the interaction between two opposing forces, one conservative and the other creative. McKenna devised a mathematical model that charted the ebbs and surges of creative, "novel" events—including wars, revolutions, famines, plagues and scientific and technological advances—throughout human history. When McKenna extrapolated the model into the future, it predicted a huge spike in novelty in December 2012.

McKenna first suggested that something dramatic might happen in 2012 in his 1975 book The Invisible Landscape, co-written with Dennis, and he elaborated on his prediction—and the timewave theory—in True Hallucinations. In the latter book, McKenna's arch, ultra-hip tone gave way now and then to moments of genuine poignancy. He confessed that, as he was finishing his book, his sixteen-year marriage to his wife Kat, with whom he had two children, was dissolving. All his psychedelic insights, he said, had "done nothing to mitigate or ward off the ordinary vicissitudes of life. Like the Soul in Yeats's poem I am still an eternal thing fastened to the body of a dying animal."

I found this blend of earnestness and irony intriguing. McKenna himself seemed unsure whether his psychedelic visions were genuine revelations or just delusions, whether he was serious when he talked about the timewave and the impending apocalypse or just goofing. I hoped to get a better sense of McKenna’s true beliefs in May 1999, when he left his home in Hawaii and traveled to New York City to give a talk.

As I took my tape recorder and pad out of my backpack on the night of McKenna's performance at an auditorium in mid-town Manhattan, a man in front of me, whom I'll call Glen, turned around to chat. He was bearded, about my age, a bit wild-eyed. Glen had never seen "Terence" before, but he owned all his books and two of his taped lectures. He was blown away by Terence's prediction that in December 2012 the shit was going to hit the fan. Glen hoped that tonight Terence would get a little more specific about what was going to happen.

Glen had dropped a lot of acid when he was young. In fact, he liked drugs so much that he became a pharmacologist. Working in a pharmacy got boring after a while, so he became a psychotherapist specializing in "energy healing." Glen hadn’t done acid in a long time. He really wanted to do some again, but he didn’t know where to get it. Glen looked at me expectantly. When I shrugged, he glumly turned his back on me.


Applause erupted as McKenna strode onto the stage and sat in an armchair. Tall, bearded, owlish, with curly brown hair going grey, he looked like a gaunt, Gaelic Allen Ginsberg. His voice had a wheedling, nasal edge. His speech was much like his prose, a bouillabaisse of scientific and high-tech visions, primordial lore, environmental fear-mongering, anarchic political commentary and psychedelic metaphysics.

He kept wandering down apparent digressions that somehow circled back to his main theme: The purpose of existence is novelty-generation, and our culture is generating novelty at an accelerating rate. The internet, nanotechnology, pharmacology, superstring theory, genetic engineering and artificial intelligence are shattering our old paradigms and bearing us toward some fantastic, psychedelic future. And according to his calculations, on December 21, 2012, something big would happen, something apocalyptic, that would bring about "the end of life as we know it."



One possible catalyst, McKenna said, was artificial intelligence, which would soon produce machines smarter than any human. (Yes, McKenna was an early proponent of the Singularity.) "If the hyper-intelligent AI is not filled with bodhisattva compassion," McKenna warned, "then our ass is probably grass." McKenna presented all of this with a mischievous grin, daring us to take him seriously.

I was still unsure what to make of McKenna when I met him the next day for lunch atop the Millennium Hotel, a gleaming ebony monolith in downtown Manhattan just across from the World Trade Center (which was still standing). We sat in a booth beside a window, McKenna with his back to the glass, beyond which skyscrapers loomed. McKenna looked even more owlish up close than he had on stage. He wore a black t-shirt emblazoned with a bronze dancing figure. The gap between his front teeth enhanced his air of raffishness.

When I said that his talk the previous evening had been quite entertaining, McKenna rolled my descriptor slowly around in his mouth—"En-ter-tain-ing"—as if he wasn’t sure he liked its taste. I added that I thought he had exaggerated the extent to which old scientific paradigms were crumbling and yielding to a radical new "psychedelic" vision of reality. I admitted that I had once been a staff writer for Scientific American, and I still shared that magazine's rather skeptical, conservative perspective.

Scientific American, McKenna assured me, was one of his favorite magazines. It was "incredibly psychedelic," and a major source of his inspiration. Anyone who reads Scientific American, McKenna continued, can see that science is in the throes of "an enormous crisis, or maybe not crisis but turmoil, based on the breakdown of paradigms." Just look at superstrings, parallel universes, hyper-dimensions, time travel and other bizarre notions emerging from physics, and technological advances like artificial intelligence. "Nobody knows what mind is," McKenna said, when it is "operating at multi-gigahertz speeds in virtual realities unconstrained by gravity and economy of any sort."


I told McKenna that many of the supposed revolutionary advances he had read about in Scientific American and elsewhere were grossly oversold. Artificial intelligence, far from being on the verge of creating "hyper-intelligent" machines, is a joke, a failure, with a string of failed prophecies behind it. As for superstrings, they are so small that it would take a particle accelerator the size of the Milky Way to detect them; that is why many physicists believe that superstring theory and its variants will turn out to be a theoretical dead end.

McKenna shrugged. Whether or not superstring theory pans out, no one can deny that physics is "a field in rich ferment, in need of broad new paradigmatic visions. I mean, sort of where physics was circa 1898, when no one could make sense of Planck's black-body studies and Einstein was chasing girls around bars in Basel, and it just hadn't quite frozen out to be what it was." What was most impressive about McKenna’s riffs was their apparent effortlessness.

When I asked McKenna if he seriously believed that psilocybin mushrooms represent messages from an alien intelligence, he told me that his proposal was not as whimsical as it sounded. Mushroom spores, he said, can survive the cold of outer space; in fact, mushroom cultivators here on earth store the spores in liquid nitrogen. "So if somebody were designing a bio-informational package, a spore is how you would go. Millions of them pushed around by light pressure and gravitational dynamics would percolate throughout the galaxy."

Psilocybin's unusual chemical structure suggested an unusual origin, McKenna added. It is "the only 4-phosphorylated indole in all of nature," which indicated "that maybe it came from outside the terrestrial ecosystem." The personality of the mushroom, as revealed by the experiences it triggers in humans, also had an alien, science-fiction quality. "It presents itself as this particular slice of alien, aesthetic motif from Hollywood—the shiny metallic surfaces, the mercuroid forms, the piercing, instantaneous biointelligence." McKenna was enjoying himself; he uttered "aesthetic motif" and "mercuroid forms" with a kind of tactile relish.

I said my impression was that he was often... kidding. McKenna guffawed. "I'm Irish! What's your excuse!" He added soberly, "I'm cynical, and a fair cynic must also be cynical about himself." He called himself a "visionary fool," who "propounds this thing which is a trillion to one shot"—the timewave theory—and then "gets to live out the inevitably humorous implications of that."

He recognized that some people think he is "softheaded" because of his rants about "self-transforming machine elves from hyperspace and all that." Actually, he had a "keen nose for other people's bullshit." He despised New Agers who gave credibility to alien-abduction rumors or claimed to be channeling the spirits of long-dead sages. "These things are like intellectual viruses loose in the theater of discourse," he said scornfully. "And you can't really argue with these people, because they don't understand the rules of argument." McKenna added that "the howling tide of unreason beats against pure fact with incredible fury."

When I told him that his writing sometimes reminded me of one of my favorite authors, the Argentinian fabulist Jorge Luis Borges, McKenna was delighted. He was a Borges fan, too. McKenna shared Borges's intuition that "scattered through the ordinary world there are books and artifacts and perhaps people who are like doorways into impossible realms, of impossible and contradictory truth. DMT being the chief example of this." DMT transports us not to heaven or hell but to a "parallel dimension that is somehow intimately tangled up with our own language processes and how we make reality." Modern science, McKenna remarked, "operates on the assumption that there are no miracles at the macro-physical level. Well, I would put it to you, DMT is a miracle at the macro-physical level. And the smarter you are, the more impressive it is."

I said that his riffs on extraterrestrial psilocybin spores and the coming apocalypse struck me as intellectual performance art, not to be taken too seriously, but I suspected that many of his fans took his ideas literally. I told him about my conversation the previous evening with Glen, the man who sat in front of me at McKenna's lecture. McKenna nodded ruefully. "My function is largely pedagogical," he said, "trying to teach people, first of all, that the world is a weird, weird place. And then, so what do you do about it? Do you become a Scientologist? Do you return to your Irish Catholic roots? What is the response," he said, "to the discovery that the world really is totally weird?"

When I told McKenna that I wasn’t sure exactly how his timewave theory worked, he launched into a vigorous explication of it. The essence of the theory is that existence emerges from the clash of two forces: not good and evil but habit and novelty. Habit is entropic, repetitious, conservative; novelty is creative, disjunctive, progressive. "In all processes at any scale, you can see these two forces grinding against each other. You can also see that novelty is winning."

As novelty increases, so does complexity. From the big bang on, McKenna elaborated, "the universe has been complexifying, and each level of complexity achieved becomes the platform for a further ascent into complexity. So fusion in early stars creates heavy elements and carbon. That becomes the basis for molecular chemistry. That becomes the basis for photobionic life. That becomes the basis for eukaryotic stuff. That becomes the basis for multicellular organisms, that for higher animals, that for culture, that for machine symbiosis, and on and on."

Modern science often depicts humanity as an accident, a bit player in the universe, but the timewave theory puts us at center stage in the cosmic drama, according to McKenna. If he had to define God, he would define it as this novelty-generating process. This definition could serve as the basis for a new moral order. "Anything which destroyed novelty would be bad, and anything which helped build it up and advance it would be good."

What about Nazi Germany? I asked. Wasn’t that novel? Or the hydrogen bomb? Or AIDS? McKenna acknowledged that novelty may be accompanied by increased suffering and death, but in general progress of some kind emerges out of these catastrophes. In the case of Nazi Germany, “the twentieth century had to deal with the issue of fascism. It couldn't close its eyes and waltz past that. And it did! So in that sense Nazi Germany, with its science-fiction production values and its silly rhetoric, served a useful purpose.” McKenna, deep down, was apparently an optimist.

As early as the 1970s, McKenna sought to make his drug-inspired insight precise and quantitative. He discovered that fractals, mathematical objects whose patterns repeat themselves at different scales, provide an excellent model of the entropy-novelty dialectic. "The fall of the dynasty, the collapse of the love affair, the ruin of the corporation, and the death of the amoeba are all somehow dramas with the same energy points and flows imbedded in them."

So what did McKenna really think would happen on December 21, 2012? "If you really understand what I'm saying," he replied, "you would understand it can't be said. It's a prediction of an unpredictable event." The event will be "some enormously reality-rearranging thing." Scientists will invent a truly intelligent computer, or a time-travel machine. Perhaps we will be visited by an alien spaceship, or an asteroid. "I don't know if it's built into the laws of spacetime, or it's generated out of human inventiveness, or whether it's a mile and a half wide and arrives unexpectedly in the center of North America."

But did he really think the apocalypse would arrive on December 21, 2012? "Well..." McKenna hesitated. "No." He had merely created one mathematical model of the flow and ebb of novelty in history. "It's a weak case, because history is not a mathematically defined entity," he said. His model was "just a kind of fantasizing within a certain kind of vocabulary." McKenna still believed in the legitimacy of his project, even if his particular model turned out to be a failure. "I'm trying to redeem history, make it make sense, show that it obeys laws," he said.

But he couldn't stop there. His eyes glittering, he divulged a "huge—quote unquote—coincidence" involving his prophecy. After he made his prediction that the apocalypse would occur on December 21, 2012, he learned that thousands of years ago Mayan astronomers had predicted the world would end on the very same day. "And now there has been new scholarship that they were tracking the galactic center and its precessional path through the ecliptic plane. What does all this mean?" McKenna leaned toward me, his eyes slitted and his teeth bared. "It means we are trapped in software written by the ghost of Jorge Luis Borges!" He threw his head back and cackled. "Tell that to the National Academy of Sciences!"

Two weeks after I met him in New York, just after he returned to his home in Hawaii, McKenna collapsed in the throes of a seizure. Tests revealed an enormous, malignant tumor deep inside his brain. McKenna's choices were grim. The physician recommended gamma-ray surgery, in which converging beams of radiation bombard the tumor. This treatment might give McKenna another six months to a year, but it could also cause dementia and other side effects. Untreated, McKenna would probably die within a month. McKenna chose the radiation surgery. He made it past the great millennial cusp, but he went downhill rapidly after that. He died on April 3, 2000, less than eleven months after I met him. He was 53.

During my lunch with McKenna at the Millennium Hotel, I had asked him if all his psychedelic excursions had mitigated his fear of death. His reply revealed how hard-headed he was, beneath all the phantasmagoric blarney. "I wouldn't say I have no fear of death," he said. "I am interested. Ultimately, my assumption is that, if I have the opportunity, I would embrace it if I saw it coming. But I'm scientific in my approach to my own knowledge of death. In other words, DMT may show you what the dying brain is like… But dying is not death. Near-death experiences are not death experiences."

Just before his death, another psychedelic advocate told Wired that McKenna's outrageousness, like that of Timothy Leary, may have harmed the psychedelic cause: "Some people would certainly argue that it doesn't help to have the most famous second-generation psychedelicist be another man in a purple sparkly suit." McKenna's attempts to serve as a serious advocate for psychedelics were no doubt undermined by his irony and wit, his penchant for Borgesian fantasy, but those were precisely the qualities that I found so appealing in him. To McKenna, wonder was the essence of gnosis. As he told me during our interview, all his confabulations were intended to make us see that the world is "a weird, weird place."

To shake us out of our perceptual torpor, McKenna played the holy fool, the crazy wisdom sage. He pushed our faces in the most exotic, lurid inventions of modern science and technology, including superstring theory, time travel, virtual reality and artificial intelligence. He even stooped to speculating about extraterrestrials and to forecasting the end of life as we know it. What elevated him above most other prophets was that he delivered his prophecies with a wink, an implicit acknowledgement that ultimately reality is stranger than we can say or even imagine.


Addendum: Fans of McKenna will want to check out this wonderful series of videos, the Terence McKenna Omnibus, including one in which McKenna says: "Shamanism is just show business and philosophy is just a branch of that vaudevillian impulse." http://www.cognitionfactor.net/cinema/screen-7.aspx

The views expressed are those of the author(s) and are not necessarily those of Scientific American.
ABOUT THE AUTHOR(S)

John Horgan directs the Center for Science Writings at the Stevens Institute of Technology. His books include The End of Science, The End of War and Mind-Body Problems, available for free at mindbodyproblems.com.