Tuesday, August 08, 2023

Oppenheimer: Beyond the Hollywood Lens

 IEH JODAT – SAM ROSENTHAL


Christopher Nolan’s new biographical thriller film, Oppenheimer, chronicles the life of J. Robert Oppenheimer, known as the “father of the atomic bomb.” In Nolan’s typical Hollywood-for-thinking-people style, the film strikes an ambiguous tone, neither triumphal nor admonishing of Oppenheimer and his team’s dubious accomplishment. As we follow Oppenheimer’s life and professional trajectory, we’re presented with the image of a person living through an era of uncertainty. Oppenheimer himself is depicted as conflicted, maybe even tormented, by his pivotal role in creating and developing this weapon of mass murder, while his own government finds Oppenheimer alternately to be an object of worship and suspicion, culminating in the 1954 Atomic Energy Commission security hearing that stripped him of his clearance.

What the film fails to explicitly portray, however, is the profound global effect that Oppenheimer’s personal saga and his work with the Manhattan Project – the secret U.S.-funded research and development program during the Second World War which led to the successful creation of the atomic bomb – still have on the world today. As Danish physicist Niels Henrik David Bohr (played by Kenneth Branagh) warns Oppenheimer in the film, “The power you are about to reveal will forever outlive [World War II], and the world is not prepared.” Even to this day, one can make a strong argument that humanity is still not prepared for the power that Oppenheimer and his team unleashed on the world. Oppenheimer has been largely received as a historical document, and it is, but seen through another lens, it could almost serve as a public service announcement, reminding us of the very real and imminent threat that nuclear weapons still pose.

The arrival of Oppenheimer, during a time of heightened global tensions among the most powerful nuclear powers in the world, brings a feeling of profound portent. In January of this year, the “Doomsday Clock,” which is managed by the Bulletin of the Atomic Scientists, was moved to 90 seconds until midnight, serving as a haunting reminder of the immediate risk of a human-made nuclear catastrophe. This is the closest the clock has been to midnight since its inception in 1947. A cursory look at geopolitical history from 1947 through the present reveals several moments of intense tension between nuclear powers: the first hydrogen bomb test, the Cuban Missile Crisis, the proliferation of nuclear weapons to India and Pakistan. That the Bulletin believes the nuclear risk today exceeds that of these earlier historical moments should fill us all with a deep sense of dread.

Our epoch, too, is marked by a worrisome decline in diplomatic dialogue and cooperation between nuclear powers. In Oppenheimer’s time, there were just one, then two, nuclear powers, bound together by the paradoxical, but compelling, logic of mutually assured destruction. Today, China, France, India, Israel, North Korea, Pakistan, Russia, the United Kingdom and the U.S. together possess roughly 12,500 nuclear weapons, many of which are far more powerful than “Little Boy,” the bomb that was dropped on Hiroshima. An additional six bases in five NATO countries host U.S. nuclear weapons, increasing the physical dispersion of these weapons of mass destruction and death. Recently, Russia’s Vladimir Putin announced he had deployed Russian nukes to neighboring Belarus; this comes after he earlier raised the specter of using nuclear weapons in the Ukraine War. There should be little doubt that the use of a nuclear weapon on the European continent would trigger a full-scale nuclear war and a near-certain global apocalypse.

So, when we go to see Oppenheimer this summer, we can view it for what it is — a Hollywoodified summation of one of the most consequential scientific breakthroughs in world history — but we should also remember that the world that Oppenheimer created remains our reality today. The threat of nuclear weapons is far from a relic of a bygone era, and it remains a contemporary concern. So, long after the theater lights fade, and we are no longer immersed in the theatrics, we should think about the grave implications of this reality — and collectively work to avert the haunting specter of a nuclear calamity.

This first appeared on Progressive Hub.

Oppenheimer, the Hero? Selling America by the Trinitrotoluene Ton


 
 AUGUST 4, 2023

Photograph Source: Los Alamos National Laboratory – https://about.lanl.gov/history-innovation/badges/

As a physicist and baby boomer whose parents both served in World War II, I had to see Christopher Nolan’s cinematic creation about one of the twentieth century’s most enigmatic figures, J. Robert Oppenheimer, the self-styled Destroyer of Worlds known as Oppie to most. Having seen the 1980 BBC miniseries on the Manhattan Project with Sam Waterston, read numerous books on the mysterious workings at Los Alamos, and taught quantum mechanics (the same subject Oppie first taught), I knew a bit of his story, but I was keen to see how Hollywood depicted what some consider the defining moment of modern history, of which Oppie was lead architect. I should have known better. At least they got the physics right.

The figures for Hiroshima and Nagasaki are well known in all their statistical horror – 34 kilotons of TNT, 68,000 buildings destroyed, 170,000 people dead (10,000 per square mile) from two 1,800-foot “air bursts.” Some of the hundreds of thousands of surviving hibakusha were so badly disfigured they would never again show themselves in public or have children because of the fear of birth defects. Not that such horror is shown in Oppenheimer the movie as the human consequences of detonating an atomic bomb are oddly neutered throughout.

Oppenheimer is not about a bomb or the destruction of two cities at the end of a war. It’s about the rise, fall, and rise of its creator, ever tormented at the hands of a divided American political class. Is he a Red? A security risk? Is he a reformed anti-war activist? Nobody knows, the dichotomy of his persona presented on par with the duality of energy and matter, spectacle and reality, life and death. The movie excelled in turning destruction into victory, Oppie’s victory, while the bomb becomes essentially a $2-billion McGuffin exploded two-thirds in.

And so for two hours we get a biopic of the man who oversaw the building of a bomb, followed by some prosecutorial drama between Oppie and Lewis Strauss, the head of the US Atomic Energy Commission, the pronunciation of whose name is whimsically dramatized. Strauss orchestrated Oppie’s downfall by having his security clearance renewal denied in 1954 amid a growing Cold War of “missile gaps” and “Red scares.” All because Oppie was against developing the “Super,” which could unleash even more destruction from the unseen mysteries of matter (a more powerful fusion H-bomb detonated by an already well-developed fission A-bomb). Oppie’s undoing may have started at a 1949 General Advisory Committee meeting after he stated that the destructive power of the H-bomb is unlimited and that such a “weapon of genocide … should never be produced.”[1] Could sanity prevail after the madness, mass destruction, and horrors of world war? Maybe curses could be returned to Pandora’s box. Maybe Prometheus could be freed. Unfortunately, the possibility of a non-nuclear world doesn’t look good in digital detail.

In the preface to American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer, the 2005 book by Kai Bird and Martin J. Sherwin upon which the movie is based, Oppie’s answer to nuclear annihilation is to eliminate nuclear weapons altogether. Yet the movie is more interested in Oppie’s political comeuppance than exploring any Faustian bargain. Enrico Fermi says nothing, while Niels Bohr gets one memorable line, “New world. New Weapons,” and little else. The movie and book both treat Oppie’s maltreatment as the unique selling point. American exceptionalism is delivered via technical superiority and political infighting without a depiction of mass carnage or real victims.

In truth, many scientists were opposed to the use of atomic weapons. Often called the Father of the Bomb because of his famous energy-matter equation and 1939 letter to Roosevelt, Einstein said, “Had I known the Germans would not succeed in producing an atomic bomb, I never would have lifted a finger.”[2] He also sent a second letter to Roosevelt dated March 25, 1945, about the lack of contact between scientists and the government, but FDR died before he could read it. Einstein would spend the rest of his life campaigning for arms control, reduced militaries, and a “supernational” security authority.

Fermi wanted the focus after the war to shift from making weapons to peaceful aims, hoping “to devote more and more activity to peaceful purposes and less and less to the production of weapons.”[3] Bohr even met FDR to talk about “atomic diplomacy” with the Soviets and about limiting the escalation of a coming conflict into an unholy arms race, an idea shelved by Winston Churchill. And so the bomb-making spree began, estimated at $5 trillion for the US and likely as much or more for the Soviets in the decades of insanity that followed. Imagine $10 trillion spent on Shiva’s other incarnations.

Okay, that’s what Oppenheimer wasn’t about, so what is it about? The three hours did pass by quickly, so I must have been entertained despite an often-intrusive musical score. Part battle between the boffins and the brass. Could the head military honcho General Leslie Groves control the supposed free-thinking lefty scientists like Oppie, Enrico Fermi, and Leo Szilard to deliver the bomb before the Nazis and afterwards when their consciences returned in peacetime? It’s one thing to build a doomsday machine to deter future aggression that can never be used, but a whole other world-ending deal to use it. Could Oppie and Co. be controlled after their nuclear toolmaking was over?

Part stargazing. Matt Damon was suitably gruff as the pragmatic Groves, more socially maladroit than the suave on-screen Oppenheimer played by Cillian Murphy, although the gruff Groves oddly turned to mumbles in 1956 about the horror of ground zero, including the effects of lingering radiation and the pain of living with mutated genes. The real Groves grimly stated that an atomic blast was “a very pleasant way to die.”[4] Okay, blockbusters don’t do ugly, so instead we get Groves bullying a recalcitrant scientist who expressed doubts about the project: “How about because this is the most important fucking thing to ever happen in the history of the world?” Arrogant, gruff, and cinematic fiction. Artistic license.

Was Oppenheimer an anti-war movie? That rendering is problematic when so much screen time is dedicated to building the bomb, while no scenes of real people dying are shown, the ultimate reality of the biggest-ever effing weapon of mass destruction. Nor are there scenes of arms talks, protest marches, or a ticking Doomsday Clock. Millions of people were and are against nuclear weapons, not just the mute master-creator.

Was Oppenheimer serious about arms reduction as suggested by the black-and-white versus colour temporal jumps to build up, tear down, and reconstruct Oppie’s supposed activist credentials? That possibility gets lost in the conflated spectacle of personal redemption within an odious political culture, Oppie unstuck in time like poor Billy Pilgrim in Kurt Vonnegut’s Slaughterhouse-Five. Vonnegut’s book was based on the real fire-bombing of Dresden, six months before Hiroshima and Nagasaki, where Vonnegut as a POW survived. Anti-war on every page. In Oppenheimer, however, we get Oppie’s pain as he is tortured over and over again. You can almost see his liver being pecked out.

Oppenheimer is a movie about vendettas and settling scores. Oppie versus Strauss, Oppie versus Groves (somewhat), Oppie versus Truman (and the military-industrial complex), and the US v. Germany, Japan, and Russia. Japan paid for Pearl Harbor. They were always going to pay, even if the goal of the Manhattan Project was to beat Germany to the bomb. Exonerated and yet humiliated at the same time, Oppie is cut down to size by Gary Oldman’s Harry Truman, who says to him in the Oval Office, “You think anyone in Hiroshima, Nagasaki, gives a shit who built the bomb? They care who dropped it. I did.” That is, the United States, just or unjust.

Forget the misfit prophet versus establishment theme as in genius Mozart versus jealous Salieri. Ditto overcoming personal doubts about the harm caused in life and death. Not when an old lover’s suicide is made out to be more important than the hundreds of thousands who perished on August 6 and 9.

Redemption of a reputation? How can a Destroyer of Worlds’ reputation be redeemed? In From Faust to Strangelove, Roslynn Haynes noted that physicists could no longer be considered innocents after Hiroshima and Nagasaki, their moral superiority questioned as was “their ability to initiate a new, peaceful society.”[5] Indeed, we still have wars despite the weapons (roughly 12,500 nuclear weapons at last count[6]). Physics may be to blame in all its rock-turning obsession, aided by unlimited government spending.

Are we meant to see ourselves in a flawed Destroyer of Worlds’ life? In America’s flaws? We are all post-war tech children now, controlled by so much military cum commercial material: transistors, microwaves, the internet (AI robot drone delivery vehicles coming soon). Much of the science and engineering of today comes from military development. We are children of the bomb, but how can we be faulted other than by some artificially manufactured sin?

Alas, tools always get used. And so 12,500 theoretically impotent yet real world-destroyers await their eventual use. As a former US Air Force ICBM operator noted, “a good day in nuclear missile operations is a quiet one,” and fortunately for him and for us, most days were.[7] I don’t want to know about a bad day. Was that Nolan’s message, that nuclear annihilation is inevitable unless we do something? Even in a sanitized spectacle that denies real pain.

Oppenheimer didn’t dare to explore the reasons behind dropping atomic bombs on two cities in a country where the war was essentially over: revenge, arrogance, and superiority. That movie would be called Black Rain, showing how one can be more than intellectually opposed to killing. That Oppie had more than just figurative blood on his hands. The wrong Greek myth was employed. The wrong script.

Indeed, Oppenheimer suffers from the same criticism levelled by the anti-war wife of a buddy of Kurt Vonnegut’s in Slaughterhouse-Five, which he then turns into the theme of his book. Nolan needed a Mary O’Hare to tell him that war is not glory. It never is. That no one’s reputation can be redeemed by mass destruction. As Mary accuses him in the introduction: “You’ll pretend you were men instead of babies, and you’ll be played in the movies by Frank Sinatra and John Wayne or some of those other glamorous, war-loving, dirty old men. And war will look just wonderful, so we’ll have a lot more of them. And they’ll be fought by babies like the babies upstairs.”

What a waste to rationalize the horrors of the nuclear era in one man’s rise and fall and rise? Or to glorify making enemies of friends? Or to measure human progress and security in megatons of unusable TNT? Oppie the hero? All that was missing was the cape. “So it goes.”

Notes

[1] Baggott, J., Atomic: The First War of Physics and the Secret History of the Atomic Bomb, p. 456, Icon Books, London, 2019.

[2] Isaacson, W., Einstein: His Life and Universe, p. 485, Simon & Schuster, London, 2007.

[3] Fermi, E., “Discovery of fission,” American Institute of Physics, 1952.

[4] “US Congress, Senate, Special Committee on Atomic Energy,” 79th Congress, Washington, D.C., November 1945.

[5] Haynes, R.D., From Faust to Strangelove: Representations of the Scientist in Western Literature, p. 303, The Johns Hopkins University Press, Baltimore, 1994.

[6] “Nuclear Weapons: Who Has What at a Glance,” Arms Control Association, June 2023.

[7] Smith, C., “I was a US nuclear missile operator. I’m grateful for the Oppenheimer film,” The Guardian, July 24, 2023.

John K. White is a former lecturer in physics and education at University College Dublin and the University of Oviedo. He is the editor of the energy news service E21NS and author of Do The Math!: On Growth, Greed, and Strategic Thinking (Sage, 2013). Do The Math! is also available in a Kindle edition. He can be reached at: johnkingstonwhite@gmail.com

Americans Need to Acknowledge Washington’s Wartime Atrocities


Renewed attention to two historical episodes has revived debate about the wartime conduct of the United States. One was the release in theaters of the movie Oppenheimer, which highlights the role played by physicist J. Robert Oppenheimer in developing the first atomic bomb. The other episode was the 70th anniversary of the Korean Armistice, which brought a halt to the bloody conflict in that country.

The reaction among both the public and the foreign policy blob underscored how too many Americans remain unwilling to confront the ugly realities of Washington’s behavior. Yet there is compelling evidence that U.S. leaders committed horrifying war crimes. President Harry Truman’s decision to approve the dropping of atomic bombs on Hiroshima and Nagasaki has attracted considerable attention over the decades. Far less attention has been paid to the massive US bombing campaign against North Korea during the Korean "police action." Both cases, however, involved gratuitous assaults that killed or maimed huge numbers of innocent civilians.

A two-part excuse for the incineration of Hiroshima and Nagasaki emerged promptly, and it has remained largely unchanged over the decades. Part one of the justification is that Japan initiated the war with its unprovoked attack on Pearl Harbor, and Tokyo therefore was to blame for all of the resulting wartime tragedies. It was a cynical rationale that should have been uncompelling to any objective observer. The attack was hardly "unprovoked," given the prewar conduct of Franklin D. Roosevelt’s administration, especially the imposition of steel and oil embargoes that were strangling Japan’s economy. But even if the Pearl Harbor attack had been cowardly and unprovoked, that episode did not justify the Hiroshima and Nagasaki bombings. Pearl Harbor was a military target; the two Japanese cities were civilian population centers of little military value. Japan’s attack was an act of war; the US assaults on Hiroshima and Nagasaki were war crimes.

The second part of the conventional excuse circulated by US officials, the news media, and a distressing number of historians is that the only alternative to the atomic bombings would have been a massive invasion of the Japanese homeland. President Truman and his supporters contended that such an invasion would have killed American and Japanese troops in the millions. Therefore, the rationale went, using the atomic bombs actually saved lives – including Japanese lives. That perspective remains the conventional view.

Such a justification, though, is based on the dubious assumption that the only alternative to the bombings was a full-scale invasion. Yet a blockade obviously was another option, and that approach would at least have given Japanese leaders time to contemplate their country’s hopeless position. Peace overtures by Tokyo could not be ruled out under such conditions. That scenario was even more plausible if Washington had been willing to drop its rigid demand for Japan’s unconditional surrender. (Persisting in that maximalist approach almost certainly prolonged the European phase of World War II as well, resulting in tens of thousands of needless additional casualties). Ironically, Washington ultimately accepted Tokyo’s capitulation with an important condition attached – that the emperor could remain on the throne. Conveying flexibility on that issue (and perhaps a few others) might have ended the war weeks or even months before mushroom clouds appeared above Hiroshima and Nagasaki.

A clear plurality of Americans, though, continue to defend the decision to drop the atomic bombs. They appear even less likely to confront the reality of US brutality in Korea. A new article by journalist James Bovard highlights outright atrocities by the US military, including the strafing of civilians who dared to move at night. There also was the systematic carpet bombing of targets throughout North Korea that other experts have documented.

Bovard notes that "Slaughtering civilians en masse became routine procedure," especially after the Chinese army intervened in the Korean War in late 1950. "[Gen. Douglas] MacArthur spoke of turning North Korean-held territory into a ‘desert.’ The US military eventually ‘expanded its definition of a military target to any structure that could shelter enemy troops or supplies.’ General Curtis LeMay summarized the achievements: ‘We burned down every town in North Korea…and some in South Korea, too.’ Yet, despite the hit-anything-still-standing bombing policy, most Americans believed the US military acted humanely in Korea." There is little evidence that the American public would adopt a different view today.

If American officials, opinion leaders, and much of the public are not willing to acknowledge war crimes that occurred more than 7 decades ago, it is hardly surprising that they are unwilling to face evidence of more recent US misconduct in such places as Lebanon, Serbia, Afghanistan, Iraq, Libya, and Syria. However, Americans who believe in democratic accountability must keep up the pressure to thwart attempts to whitewash ugly historical episodes. In a democratic system, officials theoretically act as representatives and agents of the people. When they commit atrocities and war crimes, we the people are ultimately responsible.

Ted Galen Carpenter is a senior fellow at the Randolph Bourne Institute and a senior fellow at the Libertarian Institute. He also served in various senior policy positions during a 37-year career at the Cato Institute. Dr. Carpenter is the author of 13 books and more than 1,200 articles on international affairs.

ANTIWAR.COM

Decades Later, the US Government Called Hiroshima and Nagasaki ‘Nuclear Tests’

In 1980, when I asked the press office at the U.S. Department of Energy to send me a listing of nuclear bomb test explosions, the agency mailed me an official booklet with the title “Announced United States Nuclear Tests, July 1945 Through December 1979.” As you’d expect, the Trinity test in New Mexico was at the top of the list. Second on the list was Hiroshima. Third was Nagasaki.

So, 35 years after the atomic bombings of those Japanese cities in August 1945, the Energy Department – the agency in charge of nuclear weaponry – was categorizing them as “tests.”

Later on, the classification changed, apparently in an effort to avert a potential P.R. problem. By 1994, a new edition of the same document explained that the bombings of Hiroshima and Nagasaki “were not ‘tests’ in the sense that they were conducted to prove that the weapon would work as designed . . . or to advance weapon design, to determine weapons effects, or to verify weapon safety.”

But the atomic bombings of Hiroshima and Nagasaki actually were tests, in more ways than one.

Take it from the Manhattan Project’s director, Gen. Leslie Groves, who recalled: “To enable us to assess accurately the effects of the bomb, the targets should not have been previously damaged by air raids. It was also desirable that the first target be of such size that the damage would be confined within it, so that we could more definitely determine the power of the bomb.”

A physicist with the Manhattan Project, David H. Frisch, remembered that U.S. military strategists were eager “to use the bomb first where its effects would not only be politically effective but also technically measurable.”

For good measure, after the Trinity test in the New Mexico desert on July 16, 1945, which used plutonium as its fission source, the military was able in early August to test both a uranium-fueled bomb on Hiroshima and a second plutonium bomb on Nagasaki to gauge their effects on big cities.

Public discussion of the nuclear era began when President Harry Truman issued a statement that announced the atomic bombing of Hiroshima – which he described only as “an important Japanese Army base.” It was a flagrant lie. A leading researcher of the atomic bombings of Japan, journalist Greg Mitchell, has pointed out: “Hiroshima was not an ‘army base’ but a city of 350,000. It did contain one important military headquarters, but the bomb had been aimed at the very center of a city – and far from its industrial area.”

Mitchell added: “Perhaps 10,000 military personnel lost their lives in the bomb but the vast majority of the 125,000 dead in Hiroshima would be women and children.” Three days later, when an atomic bomb fell on Nagasaki, “it was officially described as a ‘naval base’ yet less than 200 of the 90,000 dead were military personnel.”

Since then, presidents have routinely offered rhetorical camouflage for reckless nuclear policies, rolling the dice for global catastrophe. In recent years, the most insidious lies from leaders in Washington have come with silence – refusing to acknowledge, let alone address with genuine diplomacy, the worsening dangers of nuclear war. Those dangers have pushed the hands of the Doomsday Clock from the Bulletin of the Atomic Scientists to an unprecedented mere 90 seconds to cataclysmic Midnight.

The ruthless Russian invasion of Ukraine in February 2022 quickly escalated the chances of nuclear war. President Biden’s response was to pretend otherwise, beginning with his State of the Union address that came just days after the invasion; the long speech did not include a single word about nuclear weapons, the risks of nuclear war or any other such concern.

Today, in some elite circles of Russia and the United States, normalized talk of using “tactical” nuclear weapons has upped the madness ante. It can be shocking to read wildly irresponsible comments coming from top Russian officials about perhaps using nuclear weaponry in the Ukraine war. We might forget that they are giving voice to Russia’s strategic doctrine that is basically the same as ongoing U.S. strategic doctrine – avowedly retaining the option of first use of nuclear weapons if losing too much ground in a military conflict.

Daniel Ellsberg wrote near the close of his vital book The Doomsday Machine: “What is missing – what is foregone — in the typical discussion and analysis of historical or current nuclear policies is the recognition that what is being discussed is dizzyingly insane and immoral: in its almost-incalculable and inconceivable destructiveness and deliberate murderousness, its disproportionality of risked and planned destructiveness to either declared or unacknowledged objectives, the infeasibility of its secretly pursued aims (damage limitation to the United States and allies, “victory” in two-sided nuclear war), its criminality (to a degree that explodes ordinary visions of law, justice, crime), its lack of wisdom or compassion, its sinfulness and evil.”

Dan dedicated the book “to those who struggle for a human future.”

A similar message came from Albert Einstein in 1947 when he wrote about “the release of atomic energy,” warning against “the outmoded concept of narrow nationalisms” and declaring: “For there is no secret and there is no defense; there is no possibility of control except through the aroused understanding and insistence of the peoples of the world.”

Norman Solomon is the national director of RootsAction.org and executive director of the Institute for Public Accuracy. He is the author of a dozen books including War Made Easy. His latest book, War Made Invisible: How America Hides the Human Toll of Its Military Machine, was published in June 2023 by The New Press.

Korean Armistice at 70: Redefining Atrocities as Victory


 
 AUGUST 4, 2023

Last week was the 70th anniversary of the armistice that ended the fighting between North and South Korea.  In a low-key commemoration, the White House Press Office issued a statement from President Biden calling to “renew our commitment to the democratic values for which [American troops] served and sacrificed.”  In reality, almost 40,000 American soldiers died pointlessly in that conflict to buttress the principle that presidents could deceive the nation and intervene wherever they damn well chose.

If politicians and policymakers were honest and prudent, the Korean War would have vaccinated America against the folly and evil of foreign intervention. Instead, the war was retroactively redefined. As Barack Obama declared in 2013, “That war was no tie. Korea was a victory.”

The war began with what Harry Truman claimed was a surprise invasion on June 25, 1950, by the North Korean army across the dividing line with South Korea that was devised after World War Two. But the U.S. government had ample warnings of the pending invasion. According to the late Justin Raimondo, founder of antiwar.com, the conflict actually started with a series of attacks by South Korean forces, aided by the U.S. military: “From 1945-1948, American forces aided [South Korean President Syngman] Rhee in a killing spree that claimed tens of thousands of victims: the counterinsurgency campaign took a high toll in Kwangju, and on the island of Cheju-do — where as many as 60,000 people were murdered by Rhee’s US-backed forces.”

The North Korean army quickly routed both South Korean and U.S. forces. A complete debacle was averted after Gen. Douglas MacArthur masterminded a landing of U.S. troops at Inchon. After he routed the North Korean forces, MacArthur was determined to continue pushing northward regardless of the danger of provoking a much broader war.  By the time the U.S. forces drove the North Korean army back across the border between the two Koreas, roughly 5,000 American troops had been killed. The Pentagon had plenty of warning that the Chinese would intervene if the U.S. Army pushed too close to the Chinese border. But the euphoria that erupted after Inchon blew away all common sense and drowned out the military voices who warned of a catastrophe. One U.S. Army colonel responded to a briefing on the Korea situation in Tokyo in 1950 by storming out and declaring, “They’re living in a goddamn dream land.”

The Chinese military attack resulted in the longest retreat in the history of America’s armed forces — a debacle that was valorized in the 1986 Clint Eastwood movie, Heartbreak Ridge. By 1951, the Korean War had become intensely unpopular in the United States — more unpopular than the Vietnam War ever was. Truman insisted on mislabeling the war as a “police action,” but it destroyed his presidency regardless.  When the ceasefire was signed in 1953, the borders were nearly the same as at the start of the war.

While the Friends of Leviathan paint Truman as the epitome of an honest politician, he was as demagogic on Korea as Lyndon Johnson was on Vietnam. When Republicans criticized the Korean war as useless, President Harry Truman condemned “reckless and irresponsible Republican extremists” and “the false version of history that has been copyrighted by the extremists in the Republican Party.”

Perhaps the biggest disaster of the Korean war was that intellectuals and foreign-policy experts succeeded in redefining the Korean conflict as an American victory. As Georgetown University professor Derek Leebaert noted in his book Magic and Mayhem, “What had been regarded as a bloody stalemate transformed itself in Washington’s eyes; ten years later it had become an example of a successful limited war. Already by the mid-1950s, elite opinion began to surmise that it had been a victory.” Leebaert explained, “Images of victory in Korea shaped the decision to escalate in 1964-65 helping to explain why America pursued a war of attrition.” Even worse, the notion that “‘America has never lost a war’ remained part of the national myth, and the notion of having ‘prevailed’ in Korea became a justification for going big in Vietnam.” But as Leebaert noted, “in Vietnam, [the U.S. Army] had forgotten everything it had learned about counterinsurgency in Korea as well.”

On last year’s armistice anniversary, President Biden proclaimed, “During the Korean War, nearly 1.8 million Americans answered the call to serve and defend the freedoms and universal values that the people of South Korea enjoy today.” The “call to serve” mostly came in the form of summonses from draft boards for military conscription. American media commemorations of the Korean War have almost entirely ignored perhaps the war’s most important lesson: the U.S. government has almost unlimited sway to hide its own war crimes.

During the war, Americans were deluged with official pronouncements that the U.S. military was taking all possible steps to protect innocent Korean civilians. Because the evils of communism were self-evident, few questions arose about how the United States was thwarting Red aggression. When a U.S. Senate subcommittee appointed in 1953 by Sen. Joseph McCarthy investigated Korean War atrocities, the committee explicitly declared that “war crimes were defined as those acts committed by enemy nations.”  This same standard prevailed in Vietnam, Iraq, Afghanistan, and practically any other place where the U.S. militarily intervened.

In 1999, forty-six years after the cease fire in Korea, the Associated Press exposed a 1950 massacre of Korean refugees at No Gun Ri. U.S. troops drove Koreans out of their village and forced them to remain on a railroad embankment. Beginning on July 25, 1950, the refugees were strafed by U.S. planes and machine guns over the following three days. Hundreds of people, mostly women and children, were killed. The 1999 AP story was widely denounced by American politicians and some media outlets as a slander on American troops.

The Pentagon promised an exhaustive investigation. In January 2001, the Pentagon released a 300-page report purporting to prove that the No Gun Ri killings were merely “an unfortunate tragedy” caused by trigger-happy soldiers frightened by approaching refugees.

President Bill Clinton announced his “regret that Korean civilians lost their lives at No Gun Ri.” In an interview, he was asked why he used “regret” instead of “apology.” He declared, “I believe that the people who looked into it could not conclude that there was a deliberate act, decided at a high-enough level in the military hierarchy, to acknowledge that, in effect, the Government had participated in something that was terrible.” Clinton specified that there was no evidence of “wrongdoing high-enough in the chain of command in the Army to say that, in effect, the Government was responsible.”

But the atrocities against civilians had been common knowledge among U.S. troops 50 years earlier. As Charles Hanley, Sang-Hun Choe, and Martha Mendoza noted in their 2001 book, The Bridge at No Gun Ri, the Pentagon in 1952 “withdrew official endorsement from RKO’s One Minute to Zero, a Korean War movie in which an Army colonel played by actor Robert Mitchum orders artillery fire on a column of refugees.” The Pentagon fretted that “this sequence could be utilized for anti-American propaganda” and banned the film from being shown on U.S. military bases.

In 2005, Sahr Conway-Lanz, a Harvard University doctoral student, discovered a letter in the National Archives from the U.S. ambassador to Korea, John Muccio, sent to Assistant Secretary of State Dean Rusk on the day the No Gun Ri massacre commenced. Muccio summarized a new policy from a meeting between U.S. military and South Korean officials: “If refugees do appear from north of U.S. lines they will receive warning shots, and if they then persist in advancing they will be shot.” The new policy was radioed to Army units around Korea on the morning the No Gun Ri massacre began. The U.S. military feared that North Korean troops might be hiding amidst the refugees.  The Pentagon initially claimed that its investigators had never seen Muccio’s letter.  Louis Caldera, who was Army secretary in 2001, declared, “Millions of pages of files were reviewed and it is certainly possible they may have simply missed it.”  But Muccio’s letter was in the specific research file used for the official exoneration report.

Conway-Lanz’s 2006 book Collateral Damage: Americans, Noncombatant Immunity, and Atrocity after World War II quoted an official U.S. Navy history of the first six months of the Korean War stating that the policy of strafing civilians was “wholly defensible.” An official Army history noted, “Eventually, it was decided to shoot anyone who moved at night.” A report for the aircraft carrier USS Valley Forge justified attacking civilians because the Army insisted that “groups of more than eight to ten people were to be considered troops, and were to be attacked.”

In 2007, the Army recited its original denial: “No policy purporting to authorize soldiers to shoot refugees was ever promulgated to soldiers in the field.” But the Associated Press exposed more dirt from the U.S. archives: “More than a dozen documents — in which high-ranking U.S. officers tell troops that refugees are ‘fair game,’ for example, and order them to ‘shoot all refugees coming across river’ — were found by the AP in the investigators’ own archived files after the 2001 inquiry. None of those documents was disclosed in the Army’s 300-page public report.” A former Air Force pilot told investigators that his plane and three others strafed refugees at the same time as the No Gun Ri massacre; the official report claimed that “all pilots interviewed … knew nothing about such orders.” Evidence also surfaced of other massacres like No Gun Ri. On September 1, 1950, the destroyer USS DeHaven, at the Army’s insistence, “fired on a seaside refugee encampment at Pohang, South Korea. Survivors say 100 to 200 people were killed.”

Slaughtering civilians en masse became routine procedure after the Chinese army intervened in the Korean war in late 1950. MacArthur spoke of turning North Korean-held territory into a “desert.” The U.S. military eventually “expanded its definition of a military target to any structure that could shelter enemy troops or supplies.” Gen. Curtis LeMay summarized the achievements: “We burned down every town in North Korea … and some in South Korea, too.”  Yet, despite the hit-anything-still-standing bombing policy, most Americans believed the U.S. military  acted humanely in Korea.   Historian Conway-Lanz noted: “The issue of intention, and not the question of whose weapons literally killed civilians or destroyed their homes, became the morally significant one for many Americans.”

A million civilians may have been killed during the war. A South Korean government Truth and Reconciliation Commission uncovered many previously unreported atrocities and concluded that “American troops killed groups of South Korean civilians on 138 separate occasions during the Korean War,” the New York Times reported.

Truth delayed is truth defused. The Pentagon strategy on Korean War atrocities succeeded because it left facts to the historians, not the policymakers. The truth about No Gun Ri finally slipped out — ten presidencies later. Even more damaging, the Rules of Engagement for killing Korean civilians were covered up for four more U.S. wars. If U.S. policy for slaying Korean refugees (or anyone who “moved at night”) had been exposed during that war, it might have curtailed similar killings in Vietnam (many of which were not revealed until decades after the war).

Former congressman and decorated Korean War veteran Pete McCloskey (R-Calif.) warned, “The government will always lie about embarrassing matters.” The same shenanigans permeate other U.S. wars. The secrecy and deceit surrounding U.S. warring have had catastrophic consequences in this century. The Bush administration exploited the 9/11 attacks to justify attacking Iraq in 2003, and it was not until 2016 that the U.S. government revealed documents exposing the Saudi government’s role in financing the 9/11 hijackers (15 of the 19 were Saudi citizens). The Pentagon covered up the vast majority of U.S. killings of Iraqi civilians until Bradley Manning and WikiLeaks exposed them in 2010.

When politicians or generals appear itching to pull the United States into another foreign war, remember that truth is routinely the first casualty.  Governments that recklessly slay masses of civilians won’t honestly investigate and announce their guilt to the world. Self-government is a mirage if Americans do not receive enough information to judge killings committed in their name.

This piece was originally published by the Libertarian Institute.

James Bovard is the author of Attention Deficit Democracy, The Bush Betrayal, Terrorism and Tyranny, and other books. Bovard is on the USA Today Board of Contributors. He is on Twitter at @jimbovard. His website is at www.jimbovard.com. This essay was originally published by the Future of Freedom Foundation.

Should There Be a Supreme Court?


Its Role Has Always Been Anti-Democratic



 
COUNTERPUNCH

“Brown, J., an’ Harlan, J., is discussin’ th’ condition in th’ Roman Impire befure th’ fire …” Political cartoon by Frederick Opper, 1890. Library of Congress.

Vested interests create “checks and balances” primarily to make political systems non-responsive to demands for social reform. Historically, therefore, the checks are politically unbalanced in practice. Instead of producing a happy medium, their effect often has been to check the power of the people to assert their interests at the expense of the more powerful. Real reform requires a revolution – often repeated attempts. The Roman Republic suffered five centuries of fighting to redistribute land and cancel debts, all of which failed as the oligarchy’s “checks” imposed deepening economic dependency and imbalance.

The Supreme Court is America’s most distinctive check. Its deepening bias since its takeover by “conservatives” claiming to be “originalist” interpreters of the Constitution has led to the most widespread protests since Franklin Roosevelt threatened to pack the court in the 1930s by expanding its membership to create a more democratic majority. Although the judges are appointed by presidents and confirmed by the Senate, their lifetime tenure imposes the ideology of past elections on the present.

So why are they needed at all? Why not permit Congress to make laws that reflect the needs of the time? The Court’s judges themselves have pointed out that if Congress doesn’t like their rulings, it should pass its own laws, or even a constitutional amendment, to provide a new point of reference.

That is not a practical solution in today’s world. The most obvious reason is that Congress is locked in a stalemate, unable to take a firm progressive step because the U.S. political as well as judicial system has long been dominated by corporate and financial interests, which have wielded enormous sums of money to corrupt the election process, even at the nomination stage to determine the candidates, especially since the Citizens United ruling in 2010. The Federalist Society has embarked on a five-decade lobbying effort to groom and promote judges to serve the vested interests.[1] When today’s Supreme Court justices act as mediums to ask what the original drafters of the Constitution wanted or meant, they simply are using these ghostly spirits as proxies for today’s ruling elites.

Long before the U.S. Supreme Court’s “originalist” seances rejecting as unconstitutional laws that most Americans want – on the excuse that they are not what the wealthy New England merchants and southern slave-owners who drafted the Constitution would have intended – classical Greek and Roman oligarchies created their own judicial checks against the prospect of Sparta’s kings, Athenian popular assemblies and Roman consuls enacting laws at the expense of the vested interests.

Sparta had two kings instead of just one, requiring their joint agreement on any new rules. And just in case they might join together to limit the wealth of the oligarchs, they were made subject to a council of ephors to “advise” them. A kindred Roman spirit called for two consuls to head the Senate. To ward against their joining to cancel debts or redistribute land – the constant demand of Romans throughout the Republic’s five centuries, 509-27 BC – the Senate’s meetings could be suspended if religious authorities found omens from the flight of birds or other airy phenomena. These always seemed to occur when a challenge to the oligarchy seemed likely to pass.

The historian Theodor Mommsen called this tactic “political astrology.” The most blatant attempt occurred in 59 BC when Julius Caesar was elected consul and proposed an agrarian law to settle some of Pompey’s veterans as well as urban plebs on public land in Italy. Additional land was to be bought from private owners, using funds from Pompey’s campaign in Asia Minor.

Cato the Younger led the Roman Senate’s Optimates, who feared Caesar’s (or anyone’s) popularity. Opposing any change in the status quo, he started one of his famous all-day speeches. Caesar ordered him led away, but many senators followed Cato out, preventing a vote from being taken. Caesar then simply bypassed the Senate to put the measure before the Centuriate Assembly, composed largely of army veterans. That was a tactic that the reformer Tiberius Gracchus had perfected in 133 BC to promote his own land redistribution (for which he was assassinated, the oligarchy’s traditional fallback defense in all epochs).[2]

When Caesar’s opponents threatened violence to block the popular vote, Pompey threatened to use his own force. And when the time came for the Senate to ratify the law, Caesar and Pompey filled the Forum with their soldiers, and a large crowd gathered. Cato’s son-in-law, M. Calpurnius Bibulus, was Caesar’s annoying co-consul and tried to suspend the voting by claiming to see bad omens, making public business illegal.

Caesar overruled Bibulus, based on his own higher authority as pontifex maximus, leading Bibulus to declare the rest of the year a sacred period in which no assemblies could be held or votes taken. But the crowd drove him away and broke his insignia of consulship, the ceremonial fasces carried by his lictors, and beat the tribunes allied with him. Cato likewise was pushed away when he tried to force his way to the platform to block the vote. He and Bibulus fled, and Caesar’s bill was passed, including a clause requiring all senators to take an oath to adhere to it. Bibulus went home and sulked, insisting that the entire year’s laws be nullified because they were passed under threat of violence. It was the oligarchy, however, that settled matters by assassinating Caesar and other advocates of land and debt reform.

Athens, which turned oligarchic in the 4th century BC after losing the Peloponnesian War with Sparta, used a tactic closer to today’s Supreme Court by trying to subject laws to conformity with an alleged “ancestral constitution” that presumably should never be changed – at least in a way that would favor democracy. Claiming to restore the supposed constitution of Solon, the Thirty Tyrants installed by Sparta’s oligarchy in 404 BC downgraded the Athenian boule, the governing council of five hundred citizens, into a merely “advisory” group whose views had no official weight.[3]

The great watershed in Athenian history had been Solon’s seisachtheia – literally “shedding of [debt] burdens” in 594 BC, cancelling personal debts that bound debtors in near bondage. New demands for debt cancellation and land redistribution remained the primary democratic demands for the next four centuries. Androtion (ca. 344 BC), a follower of the oligarchic Isocrates, sought to claim the authority of Solon while denying that he had actually cancelled debts, claiming that he merely revalued the coinage, weights and measures to make debts more easily payable.[4] But there was no coinage in Solon’s time, so this attempt to rewrite history was anachronistic. That often happens when mediums claim to channel the spirit of the dead who cannot speak.

In a similar tradition, the authors of America’s constitution created the Supreme Court to provide a check on the danger that political evolution might lead Congress to pass laws threatening oligarchic rule. There no longer is a pontifex to block democratic lawmaking by claiming to read auspices in the flight of birds or other airy phenomena. Instead, there is a more secular subordination of new laws to the principle that they must not be changed from what was intended by the authors of the Constitution – as interpreted by their counterpart elites in today’s world. This approach fails to take into account how the world is evolving and how the legal system needs to be modernized to cope with such change.

I have found it to be an axiom of the history of legal philosophy that if the popular political spirit is for democratic reform – especially supporting taxes and other laws to prevent the polarization of wealth between the vested interests and the economy at large – the line of resistance to such progress is to insist on blocking any change from “original” constitutional principles that supported the power of vested interests in the first place.

The U.S. political system has become distorted by the power given to the Supreme Court enabling it to block reforms that the majority of Americans are reported to support. The problem is not only the Supreme Court, to be sure. Most voters oppose wars, support public healthcare for all and higher taxes on the wealthy. But Congress, itself captured by the oligarch donor class, routinely raises military spending, privatizes healthcare in the hands of predatory monopolies and cuts taxes for the financial rent-seeking class while pretending that spending money on government social programs would force taxes to rise for wage-earners.

The effect of the corporate capture of Congress as well as the Supreme Court as the ultimate oligarchic backstop is to block Congressional politics as a vehicle to update laws, taxes and public regulation in keeping with what voters recognize to be modern needs. The Supreme Court imposes the straitjacket of what America’s 18th-century slaveowners and other property owners are supposed to have wanted at the time they wrote the Constitution.

James Madison and his fellow Federalists were explicit about their aim. They wanted to block what they feared was the threat of democracy by populists, abolitionists and other reformers threatening to check their property “rights” as if these were natural and inherent. The subsequent 19th century’s flowering of classical political economists explaining the logic for checking rentier oligarchies was far beyond what they wanted. Yet today’s Supreme Court’s point of reference is still, “What would the authors of the U.S. Constitution, slaveowners fearful of democracy, have intended?” That logic is applied anachronistically to limit every democratic modernization from the right of unionized labor to go on strike, to abortion rights for women, cancellation of student debt and the right of government to tax wealth.

Even if Congress were not too divided and stalemated to write laws reflecting what most voters want, the Supreme Court would reject them, just as it sought for many decades to declare a national income tax unconstitutional under the theory of “takings.” The Supreme Court can be expected to block any law threatening the victory of the Thatcherite and Reaganomics doctrine of privatization, “small” government unable to challenge the power of wealth (but big enough to crush any attempts by labor, women or minorities to promote their own interests), a state of affairs that is an anomaly for a nation claiming to be a democracy.

A nation’s constitution should have the flexibility to modernize laws, taxes and government regulatory power to remove barriers to broadly-based progress, living standards and productivity. But these barriers have been supported by oligarchies through the ages. That was why the Supreme Court was created in the first place. The aim was to leave the economy in the control of property holders and the wealthiest families. That anachronistic judicial philosophy is helping turn the United States into a failed state by empowering a wealthy minority to reduce the rest of the population to economic dependency.

We are repeating the economic polarization of ancient Greece and Rome that I have described in my recent book The Collapse of Antiquity. The 7th- and 6th-century BC crisis of personal debt and land concentration led to social revolution by reformers (“tyrants,” not originally a term of invective) in Corinth, Sparta and other Greek-speaking city-states and Aegean islands. Solon was appointed archon to resolve the crisis in Athens. Unlike reformers in other Greek cities, he did not redistribute the land, but he did cancel the debts and removed the land’s crop-payment stones. The ensuing 6th century saw Solon’s successors lay the groundwork for Athenian democracy.

But the next three centuries saw the rise of creditor oligarchies throughout Greece and Italy, using debt as a lever to monopolize land and reduce citizens to bondage. These increasingly aggressive oligarchies fought, with more and more overt violence, against new reformers seeking to cancel debts and redistribute the land to prevent the economy falling into austerity, clientage and reliance on the dole. Their oligarchic ideology was much like that of today’s right-wing Supreme Court in its approach to constitutional law. The common denominator is an age-old drive to prevent democratic change, above all by using wealth as a means of controlling the political process. That is the philosophy outlined in the Powell Memo, and in the Supreme Court’s Citizens United ruling permitting the political campaign system to be financialized and, in effect, privatized in the hands of the Donor Class.

As in classical antiquity, the exponential rise in debt has polarized wealth ownership. Personal debt bondage no longer exists, but most home buyers and wage earners are obliged to take on a working-lifetime debt burden to obtain a home of their own, an education to get a job to qualify for mortgage loans to buy their home, and credit-card debt simply to make ends meet. The result is debt deflation as labor is obliged to spend an increasing proportion of its income on debt service instead of goods and services. That slows the economy, while creditors use their rising accumulation of wealth to finance the inflation of housing prices, along with stock and bond prices – with yet more debt financing.

The conflict between creditors and debtors is a red thread running throughout American history, from the Whiskey Rebellion of the 1790s to the monetary deflation of the 1880s as “hard money” creditor interests rolled back prices and incomes to be paid in gold, increasing the control of bondholders over labor. Today, U.S. debt and tax policy is passing out of the Congress to the Supreme Court, whose members are groomed and vetted to make sure that they will favor financial and other rentier wealth by leading the Court to impose the founders’ pre-democratic philosophy of constitutional law despite the past few centuries of political reforms that, at least nominally, have endorsed democracy over oligarchy.

The victory of rentier wealth has led to the deindustrialization of America and the resulting predatory diplomacy as its economy seeks to extract from foreign countries the products that it no longer is producing at home. This is why foreign countries are moving to pursue a philosophy rejecting debt deflation, privatization and the shift of economic planning from elected governments to financial centers from Wall Street to the City of London, the Paris Bourse and Japan.

Any resilient society’s constitution should be responsive to the evolution of economic, technological, environmental and geopolitical dynamics. U.S. legal philosophy reflects mainstream economics in trying to lock in a set of principles written by creditors and other rentiers fearful of making the financial system, tax system and distribution of wealth more conducive to prosperity than to austerity and economic polarization. While there no longer is an attempt to roll back the clock to impose the outright slavery that most framers of the Constitution endorsed, the spread of debt deflation and debt dependency has become a form of economic bondage that is the modern “conservative” counterpart to the racial slavery of old. It is what the “original” power elite are thought to have wanted if we choose to go back in a time machine and ask them, instead of looking toward a less oligarchic future.

Notes.

[1] The Lewis Powell memo to the U.S. Chamber of Commerce on August 23, 1971 laid out this plan. https://www.greenpeace.org/usa/democracy/the-lewis-powell-memo-a-corporate-blueprint-to-dominate-democracy/. For a review of how this almost conspiratorial propaganda and censorship attack was financed see Lewis H. Lapham, “Tentacles of Rage: The Republican propaganda mill, a brief history,” Harpers, September, 2004. Available at: http://www.mindfully.org/Reform/2004/Republican-Propaganda1sep04.htm.

[2] See Cassius Dio, Roman History 38.2.2. I discuss this affair in The Collapse of Antiquity, chapter 18.

[3] Athēnaion Politeia 35.2 and Xenophon, Hellenica 2.3.2 and 11.

[4] Plutarch, Solon 15.2.

Michael Hudson’s new book, The Destiny of Civilization, will be published by CounterPunch Books next month.