Friday, August 27, 2021

Blinded by the Light: Remembering Hiroshima and Nagasaki in the Age of Normalized Violence


 
 AUGUST 6, 2021
COUNTERPUNCH

Detail from a U.S. Air Force map of Hiroshima, pre-bombing, with circles drawn at 1,000-foot intervals radiating out from ground zero, the site directly under the explosion. (U.S. National Archives and Records Administration).

On Monday, August 6, 1945, the United States unleashed an atomic bomb on Hiroshima, killing an estimated 140,000 people. 70% of the city was destroyed. A few days later, on August 9th, another atomic bomb was dropped on Nagasaki, killing an estimated 70,000 people.[2] The Japanese government stated that the death toll was much higher than the American estimates, indicating that it was close to half a million. Many died not only because of the lack of medical help, but also from radioactive rain. In the immediate aftermath, the incineration of mostly innocent civilians was buried in official government pronouncements about the success of the bombings of both Hiroshima and Nagasaki. Violence rendered in military abstractions and patriotic platitudes is itself an act of violence. The visceral effect of violence brings to the surface what can only be considered intolerable and unthinkable, yet never unknowable. Perhaps such horror can only be made visible in the language of journalism.

Within a short time after the dropping of the atom bombs on Hiroshima and Nagasaki, John Hersey wrote a devastating account of the misery and suffering caused by the bomb. Removing the bomb from abstract arguments about technique, efficiency, and national honor, Hersey published, first in The New Yorker and later in a widely read book, an exhaustive and terrifying description of the bomb’s effects on the people of Hiroshima, portraying in detail the horror of the suffering it caused. One haunting passage not only illustrates that pain and suffering, but also offers a powerful metaphor for the blindness that overtook both the victims and the perpetrators. He writes:

On his way back with the water, [Father Kleinsorge] got lost on a detour around a fallen tree, and as he looked for his way through the woods, he heard a voice ask from the underbrush, ‘Have you anything to drink?’ He saw a uniform. Thinking there was just one soldier, he approached with the water. When he had penetrated the bushes, he saw there were about twenty men, and they were all in exactly the same nightmarish state: their faces were wholly burned, their eye sockets were hollow, the fluid from their melted eyes had run down their cheeks. Their mouths were mere swollen, pus-covered wounds, which they could not bear to stretch enough to admit the spout of the teapot.[3]

The nightmarish image of fallen soldiers staring with hollow sockets, eyes liquefied on their cheeks and mouths swollen and pus-filled, stands as a warning to those who would blindly refuse the moral witnessing necessary to keep alive for future generations the memory of the horror of nuclear weapons and the need to eliminate them. Hersey’s literal depiction of mass violence against civilians serves as a kind of mirrored doubling, referring at one level to nations blindly driven by militarism and hyper-nationalism, and at another to the need to exorcise a history that now functions as a curse.

The atomic bomb was celebrated by those who argued that its use was responsible for concluding the war with Japan. Also applauded were the power of the bomb and the wonder of science in creating it, especially “the atmosphere of technological fanaticism” in which scientists worked to create the most powerful weapon of destruction then known to the world.[4] The conventional justification for dropping the atomic bombs held that “it was the most expedient measure to securing Japan’s surrender [and] that the bomb was used to shorten the agony of war and to save American lives.”[5] Left out of that succinct legitimating narrative were the growing objections to the use of atomic weaponry put forth by a number of top military leaders and politicians, including General Dwight Eisenhower, then the Supreme Allied Commander in Europe, former President Herbert Hoover, and General Douglas MacArthur, all of whom argued that it was not necessary to end the war,[6] a position later proven correct.

For a brief time, the atomic bomb was celebrated as a kind of magic talisman entwining salvation and scientific inventiveness, and in doing so functioned to “simultaneously domesticate the unimaginable while charging the mundane surroundings of our everyday lives with a weight and sense of importance unmatched in modern times.”[7] In spite of the initial celebration of the bomb and the orthodox defense that accompanied it, whatever positive value the bomb may have had among the American public, intellectuals, and popular media began to dissipate as more and more people became aware of the massive death, suffering, and misery it caused.[8]

Kenzaburo Oe, the Nobel Prize winner for Literature, noted that in spite of attempts to justify the bombing, “from the instant the atomic bomb exploded, it [soon] became the symbol of human evil, [embodying] the absolute evil of war.”[9] What particularly troubled Oe was the scientific and intellectual complicity in the creation of the bomb and in the lobbying for its use, with acute awareness that it would turn Hiroshima into a “vast ugly death chamber.”[10] More pointedly, it revealed a new stage in the merging of military actions and scientific methods, indeed a new era in which the technology of destruction could destroy the earth in roughly the time it takes to boil an egg. The bombing of Hiroshima and Nagasaki foreshadowed a new, industrially enabled kind of violence and warfare in which the distinction between soldiers and civilians disappeared and the indiscriminate bombing of civilians was normalized. But more than this, the American government exhibited a “total embrace of the atom bomb” that signalled support, for the first time, for a “notion of unbounded annihilation [and] the totality of destruction.”[11]

Hiroshima and Nagasaki marked the beginning of the nuclear era in which, as Oh Jung points out, “Combatants were engaged on a path toward total war in which technological advances, coupled with the increasing effectiveness of an air strategy, began to undermine the ethical view that civilians should not be targeted… This pattern of wholesale destruction blurred the distinction between military and civilian casualties.”[12] The destructive power of the bomb and its use on civilians also marked a turning point in American self-identity, in which the United States began to think of itself as a superpower, which, as Robert Jay Lifton points out, refers to “a national mindset–put forward strongly by a tight-knit leadership group–that takes on a sense of omnipotence, of unique standing in the world that grants it the right to hold sway over all other nations.”[13] The power of the scientific imagination and its murderous deployment gave birth simultaneously to the American disimagination machine, with its capacity to rewrite history in order to render it an irrelevant relic best forgotten.

What remains particularly ghastly about the rationale for dropping two atomic bombs is the attempt on the part of its defenders to construct a redemptive narrative through a perversion of humanistic commitment: mass slaughter justified in the name of saving lives and winning the war.[14] This was a humanism under siege, transformed into its terrifying opposite and placed on the side of what Edmund Wilson called the Faustian possibility of a grotesque “plague and annihilation.”[15] In part, Hiroshima and Nagasaki represented the achieved transcendence of a military metaphysics that is now a defining feature of national identity, its ever more poisonous and powerful investment in the cult of scientism, instrumental rationality, and technological fanaticism, and the simultaneous marginalization of scientific evidence and intellectual rigour, even reason itself. That Hiroshima, in particular, was used to redefine America’s “national mission and its utopian possibilities”[16] was nothing short of what the late historian Howard Zinn called a “devastating commentary on our moral culture.”[17] More pointedly, it serves as a grim commentary on our national insanity, which became more exacerbated over time, culminating in a form of fascist politics under the Trump administration. In most of these cases, matters of morality and justice were dissolved into technical questions and a reductive chauvinism concerning governmentally massaged efficiency, scientific “expertise,” and American exceptionalism. As Robert Jay Lifton and Greg Mitchell stated, the atom bomb was treated as symbolic of the power of post-war America rather than as a “ruthless weapon of indiscriminate destruction,” a framing that conveniently put to rest painful questions concerning justice, morality, and ethical responsibility.[18]

This narrative of redemption was soon challenged by a number of historians who argued that the dropping of the atom bomb had less to do with winning the war than with an attempt to pressure the Soviet Union not to expand its empire into territory deemed essential to American interests.[19] Protecting America’s superiority in a potential Soviet-American conflict was a decisive factor in dropping the bomb. In addition, the Truman administration needed to legitimate to Congress the staggering sums of money spent on the Manhattan Project and to secure the future funding necessary to continue military appropriations for ongoing research long after the war ended.[20] Howard Zinn went even further, asserting that the government’s weak defense of the bombing of Hiroshima was not only false but complicit in an act of terrorism. Refusing to relinquish his role as a public intellectual willing to hold power accountable, he writes, “Can we … comprehend the killing of 200,000 people to make a point about American power?”[21] Other historians also attempted to deflate the official defense of Hiroshima by providing counter-evidence that the Japanese were ready to surrender as a result of a number of factors, including the nonstop bombing of 26 cities before Hiroshima and Nagasaki, the success of the naval and military blockade of Japan, and the Soviet Union’s entrance into the war on August 9th.[22]

Employing a weapon of mad violence against the Japanese people, the US government imagined Japan as the ultimate enemy, and then pursued tactics that blinded the American public to its own humanity; in doing so, it became its own worst enemy by turning against its most cherished democratic principles. In a sense, this self-imposed sightlessness functioned as part of what Jacques Derrida once called a societal autoimmune response, one in which the body’s immune system attacks its own defenses.[23] Fortunately, this state of political and moral blindness did not extend to a number of critics who, over the next fifty years, railed against the dropping of the atomic bombs and the beginning of the nuclear age.

In the aftermath of the bombing of Hiroshima, there was a major debate not just about the emergence of the atomic age and the moral, economic, scientific, military, and political forces that gave rise to it, but also about the ways in which the embrace of the atomic age altered the emerging nature of state power, gave rise to new forms of militarism, put American lives at risk, created environmental hazards, produced an emergent surveillance state, furthered the politics of state secrecy, and put into play a series of deadly diplomatic crises, reinforced by the logic of brinkmanship and a belief in the totality of war.[24]

Hiroshima not only unleashed immense misery, unimaginable suffering, and wanton death on Japanese civilians, it also gave rise to anti-democratic tendencies in the United States government that put the health, safety, and liberty of the American people at risk. Shrouded in secrecy, the government machinery of death that produced the bomb did everything possible to cover up not only the most grotesque effects of the bomb on the people of Hiroshima and Nagasaki but also the dangerous hazards it posed to the American people. Lifton and Mitchell argue convincingly that the development of the bomb and its immediate effects were at first shrouded in concealment by the government, and that before long concealment developed into a cover-up marked by government lies and the falsification of information.[25] With respect to the horrors visited upon Hiroshima and Nagasaki, films taken by Japanese and American photographers were hidden for years from the American public for fear that they would create both a moral panic and a backlash against the funding for nuclear weapons.[26] For example, the Atomic Energy Commission lied about the extent and danger of radioactive fallout, going so far as to mount a campaign claiming that “fallout does not constitute a serious hazard to any living thing outside the test site.”[27] This act of falsification took place in spite of the fact that thousands of military personnel were exposed to high levels of radiation within and outside of the test sites.

In addition, the Atomic Energy Commission, in conjunction with the Department of Defense, the Department of Veterans Affairs, the Central Intelligence Agency, and other government departments, engaged in a series of medical experiments designed to test the effects of different levels of radiation exposure on military personnel, medical patients, prisoners, and others at various sites. According to Lifton and Mitchell, these experiments took the shape of exposing people intentionally to “radiation releases or by placing military personnel at or near ground zero of bomb tests.”[28] It gets worse. They also note that “from 1945 through 1947, bomb-grade plutonium injections were given to thirty-one patients” in a variety of hospitals and medical centers, and that all of these “experiments were shrouded in secrecy and, when deemed necessary, in lies…. the experiments were intended to show what type or amount of exposure would cause damage to normal, healthy people in a nuclear war.”[29] Some of the long-lasting legacies of the birth of the atomic bomb also included the rise of plutonium dumps, environmental and health risks, the cult of expertise, and the subordination of the peaceful development of technology to a large-scale interest in using technology for the organized production of violence. Another notable development raised by many critics in the years following the launch of the atomic age was the rise of a government mired in secrecy, the repression of dissent, and the legitimation of a type of civic illiteracy in which Americans were told to leave “the gravest problems, military and social, completely in the hands of experts and political leaders who claimed to have them under control.”[30]

All of these anti-democratic tendencies unleashed by the atomic age came under scrutiny during the latter half of the twentieth century. The terror of a nuclear holocaust, an intense sense of alienation from the commanding institutions of power, and deep anxiety about the demise of the future spawned growing unrest, ideological dissent, and massive outbursts of resistance among students and intellectuals all over the globe, from the sixties until the beginning of the twenty-first century, calling for the outlawing of militarism, nuclear production and stockpiling, and the nuclear propaganda machine. Literary writers from James Agee to Kurt Vonnegut, Jr. condemned the death-saturated machinery launched by the atomic age. Moreover, public intellectuals from Dwight Macdonald and Bertrand Russell to Helen Caldicott, Ronald Takaki, Noam Chomsky, and Howard Zinn fanned the flames of resistance to the nuclear arms race and to the further development of nuclear weapons and technologies.

In the United States, the mushroom cloud over Hiroshima is now connected to much larger forces of destruction, including a turn to instrumental reason over moral considerations, the normalization of violence in America, the militarization of local police forces, an attack on civil liberties, the rise of the surveillance state, and a dangerous turn towards authoritarianism, embodied in the fascist politics unleashed by Trump and his supine, dangerous allies. Rather than working to prevent a nuclear mishap or to curb the expansion of the arms industry, the United States places high on the list of those nations that could trigger what Amy Goodman calls that “horrible moment when hubris, accident or inhumanity triggers the next nuclear attack.”[31] Given the history of lies, deceptions, falsifications, and retreats into secrecy that characterizes the military-industrial-surveillance complex’s strangulating hold on the American government, it would be naïve to assume that the U.S. government can be trusted to act with good intentions when it comes to matters of domestic and foreign policy. Of course, matters of trust, decency, and respect for democracy evaporated under the former Trump administration. State terrorism and an embrace of violence as a national ideal have increasingly become the DNA of American governance and politics, evident in government cover-ups, corruption, and numerous acts of bad faith. Secrecy, lies, and deception have a long history in the United States, and the issue is not merely to uncover such instances of state deception but to connect the dots over time: to map the connections, for instance, between the early attempts to cover up the inhumane destruction unleashed by the atomic bomb on Hiroshima and Nagasaki and the role the NSA and other intelligence agencies play today in distorting the truth about government policies while embracing an all-encompassing notion of surveillance and the squelching of civil liberties, privacy, and freedom. Militarism now pervades every aspect of society, language has become weaponized, state racism has been turned into a tool of political opportunism, and the Republican Party amounts to a criminal organization inflicting lies, conspiracy theories, voter suppression laws, and a denial of science and public health in the midst of a crisis, amounting to untold numbers of deaths.

Hiroshima symbolizes, and continues to remind us of, the fact that the United States commits unspeakable acts of violence, a fact that should make it easier to refuse to rely on politicians, academics, and alleged experts who reject a politics of transparency and serve mostly to legitimate anti-democratic, if not totalitarian, policies. Questioning a monstrous war machine whose roots lie in Hiroshima, and the gangster capitalism that benefits from it, is the first step in declaring nuclear weapons ethically and politically unacceptable. This suggests a further mode of inquiry that focuses on how the rise of the military-industrial complex contributes to the escalation of nuclear weapons and what can be learned by tracing its roots to the development and use of the atom bomb. Moreover, it raises questions about the role played by intellectuals, both in and out of the academy, in conspiring to build the bomb and hide its effects from the American people. These are only some of the questions that need to be made visible, interrogated, and pursued in a variety of sites and public forums.

One crucial issue today is what role intellectuals, cultural critics, journalists, and others who trade in lifting ideas into the public realm might play in making clear the educative nature of politics. How might reviving the public imagination function as part of a sustained pedagogical effort to resurrect the memory of Hiroshima as both a warning and a signpost for rethinking the nature of collective struggle, reclaiming the ideals and promises of a radical democracy, and producing a sustained politics and act of collective resistance aimed at abolishing nuclear weapons forever? One step would be to revisit the conditions that made Hiroshima and Nagasaki possible, to explore how militarism and a kind of technological fanaticism merged under the star of scientific rationality. Another would be to make clear what the effects of such weapons are, to expose the manufactured lie that such weapons make us safe. Indeed, this suggests the need for intellectuals, artists, and other cultural workers to use their skills, resources, and connections to develop massive educational campaigns that make clear the danger both of nuclear war and of a society armed to the teeth.

Such campaigns not only make education, consciousness, and collective struggle the center of politics, but also work systematically to inform the public about the history of such weapons, the misery and suffering they have caused, and how they benefit the financial, governmental, and corporate elite who make huge amounts of money off the arms race, the promotion of nuclear deterrence, and the maintenance of a permanent warfare state. Intellectuals today appear numbed by ever-developing disasters, statistics of suffering and death, the Hollywood disimagination machine with its investment in celluloid apocalypses to which only superheroes can respond, and a consumer culture that thrives on self-interest and deplores collective political and ethical responsibility. In an age when violence turns into a spectacle, mass shootings become normalized, and violence becomes the primary language of politics, it becomes all the more difficult, and yet necessary, to remember the horror and legacy of Hiroshima and Nagasaki.

There are no rationales or escapes from the responsibility of preventing mass destruction through nuclear annihilation; the appeal to military necessity is no excuse for the indiscriminate bombing of civilians, whether in Hiroshima or Yemen. The sense of horror, fear, doubt, anxiety, and powerlessness that followed Hiroshima and Nagasaki up until the beginning of the 21st century seems to have faded in light of the rise of a form of gangster capitalism that embraces white nationalism, white supremacy, the Hollywood apocalypse machine, the mindlessness of consumer culture, the growing spectacles of violence, and a militarism that is now celebrated as one of the highest ideals of American life. In a society governed by militarism, consumerism, and neoliberal savagery, it has become more difficult to assume moral, social, and political responsibility, to believe that democracy matters and is worth fighting for, to imagine a future in which responding to the suffering of others is a central element of democratic life. When historical memory fades and people turn inward, remove themselves from politics, and embrace cynicism over educated hope, a culture of evil, suffering, and existential despair takes hold. Americans now live amid a culture of indifference sustained by an endless series of manufactured catastrophes that offer a source of entertainment, sensation, and instant pleasure.

We live in an age in which violence becomes a form of entertainment rather than a source of alarm, and in which individuals are increasingly too numb to question society and incapable of translating private troubles into larger public considerations. In the age following the use of the atom bomb on civilians, talk about evil, militarism, and the end of the world once stirred public debate and diverse resistance movements; now it promotes a culture of fear, moral panics, and a retreat into the black hole of the disimagination machine. In the wake of the economic crisis of 2008 and the failure of gangster capitalism to address the COVID-19 crisis, it is clear that gangster capitalism cannot provide a vision to sustain a radical democratic society and works largely to destroy it.

The horror of Hiroshima and Nagasaki speaks to what James Baldwin once called the “tension between hope and terror.” In the absence of moral witnessing, and amid a culture of immediacy that hawks support for conditions (environmental, economic, social, and cultural) that embrace rather than reject the incessant drive toward the apocalypse, hope appears meaningless. Gangster capitalism has become a metaphor for the recurring atomic blast, a social, political, and moral embodiment of global destruction that needs to be stopped before it is too late. Returning to the memory of Hiroshima and Nagasaki is necessary not only to break out of the moral cocoon that puts reason and memory to sleep but also to rediscover our imaginative capacities for civic literacy on behalf of the public good, especially if such action demands that we remember, as Robert Jay Lifton and Greg Mitchell remark, that “Every small act of violence, then, has some connection with, if not sanction from, the violence of Hiroshima and Nagasaki.”[32]

Manufactured catastrophes and historical amnesia, and with them a generalized sense of manufactured helplessness, now reign supreme in the new interregnum of late modernity, a kind of liminal space that serves to neutralize action, derail the challenges posed by real social and political problems such as the threat of nuclear annihilation, and substitute an escape into fantasy for any attempt to challenge the terrifying conditions that often accompany a serious crisis. Such retreats from reality blunt civic courage, dull the radical imagination, and dilute any sense of moral responsibility, plunging historical acts of violence such as Hiroshima into the abyss of political indifference, ethical insensitivity, and depoliticization. Catastrophe, as Brad Evans has observed, speaks to an era of late modernity marked by “a closing of the political.”[33] Resignation and acceptance of catastrophe have taken root in the ground prepared by the neoliberal notion that “nothing can be done.”

If, as the late Zygmunt Bauman argued, crisis speaks to the need to address what exactly needs to be done, then what has been lost in the age of catastrophe and historical amnesia, with its overwhelming sense of precarity and uncertainty, is a properly political response in the face of a pending or existing disaster. In the age of Trump, history became a curse and dissent came to be viewed as dangerous, both of them reminders of the horrors of injustice, the collapse of conscience, and the willingness of too many to look away. The future will look much brighter, and new forms of collective resistance will emerge, in part, with the recognition that the legacy of violence, death, and cruelty extending from Hiroshima to the current tsunami of violence being waged on immigrants, people of color, and peaceful protesters makes clear that no one can be a bystander if democracy is to survive.

Notes.

[1] I have drawn in this essay upon some previous ideas of mine published on the seventieth anniversary of Hiroshima and Nagasaki. I revisit them in the hope of reviving historical memory in the service of the search for justice and the need to remember that of which the dead can no longer speak.

[2] Jennifer Rosenberg, “Hiroshima and Nagasaki (Part 2),” About.com – 20th Century History (March 28, 201). Online: http://history1900s.about.com/od/worldwarii/a/hiroshima_2.htm. A more powerful atom bomb was dropped on Nagasaki on August 9, 1945, and by the end of the year an estimated 70,000 had been killed. For the history of the making of the bomb, see the monumental Richard Rhodes, The Making of the Atomic Bomb (New York: Simon & Schuster, 2012).

[3] John Hersey, Hiroshima (New York: Alfred A. Knopf, 1946), p. 68.

[4] The term “technological fanaticism” comes from Michael Sherry, who suggested that it produced an increased form of brutality. Cited in Howard Zinn, The Bomb (New York: City Lights, 2010), pp. 54-55.

[5] Oh Jung, “Hiroshima and Nagasaki: The Decision to Drop the Bomb,” Michigan Journal of History Vol. 1, No. 2 (Winter 2002). Online: http://michiganjournalhistory.files.wordpress.com/2014/02/oh_jung.pdf

[6] See, in particular, Ronald Takaki, Hiroshima: Why America Dropped the Atomic Bomb (Boston: Back Bay Books, 1996).

[7] Peter Bacon Hales, Outside the Gates of Eden: The Dream of America from Hiroshima to Now (Chicago, IL: University of Chicago Press, 2014), p. 17.

[8] Paul Ham, Hiroshima Nagasaki: The Real Story of the Atomic Bombings and Their Aftermath (New York: Doubleday, 2011).

[9] Kenzaburo Oe, Hiroshima Notes (New York: Grove Press, 1965), p. 114.

[10] Ibid., Oe, Hiroshima Notes, p. 117.

[11] Robert Jay Lifton and Greg Mitchell, Hiroshima in America (New York: Avon Books, 1995), pp. 314-315, 328.

[12] Ibid., Oh Jung, “Hiroshima and Nagasaki: The Decision to Drop the Bomb.”

[13] Robert Jay Lifton, “American Apocalypse,” The Nation (December 22, 2003), p. 12.

[14] For an interesting analysis of how the bomb was defended by the New York Times and a number of high-ranking politicians, especially after John Hersey’s “Hiroshima” appeared in The New Yorker, see Steve Rothman, “The Publication of ‘Hiroshima’ in The New Yorker,” herseyhiroshima.com (January 8, 1997). Online: http://www.herseyhiroshima.com/hiro.php

[15]  Wilson cited in Lifton and Mitchell, Hiroshima In America, p. 309.

[16] Ibid., Peter Bacon Hales, Outside The Gates of Eden: The Dream Of America From Hiroshima To Now, p. 8.

[17] Ibid., Zinn, The Bomb, p. 26.

[18] Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America.

[19] See Ward Wilson, Five Myths About Nuclear Weapons (New York: Mariner Books, 2013).

[20] Ronald Takaki, Hiroshima: Why America Dropped the Atomic Bomb (Boston: Back Bay Books, 1996), p. 39.

[21] Ibid., Zinn, The Bomb, p. 45.

[22] See, for example, Gar Alperovitz, Atomic Diplomacy: Hiroshima and Potsdam: The Use of the Atomic Bomb and the American Confrontation with Soviet Power (London: Pluto Press, 1994), and also Gar Alperovitz, The Decision to Use the Atomic Bomb (New York: Vintage, 1996). Ibid., Ham.

[23] Giovanna Borradori, ed., “Autoimmunity: Real and Symbolic Suicides–A Dialogue with Jacques Derrida,” in Philosophy in a Time of Terror: Dialogues with Jürgen Habermas and Jacques Derrida (Chicago: University of Chicago Press, 2004), pp. 85-136.

[24] For an informative analysis of the deep state and a politics driven by corporate power, see Bill Blunden, “The Zero-Sum Game of Perpetual War,” Counterpunch (September 2, 2014). Online: http://www.counterpunch.org/2014/09/02/the-zero-sum-game-of-perpetual-war/

[25] The following section relies on the work of Lifton and Mitchell, Howard Zinn, and M. Susan Lindee.

[26] Greg Mitchell, “The Great Hiroshima Cover-up,” The Nation (August 3, 2011). Online: http://www.thenation.com/blog/162543/great-hiroshima-cover#. Also see Greg Mitchell, “Part 1: Atomic Devastation Hidden For Decades,” WhoWhatWhy (March 26, 2014). Online: http://whowhatwhy.com/2014/03/26/atomic-devastation-hidden-decades; Greg Mitchell, “Part 2: How They Hid the Worst Horrors of Hiroshima,” WhoWhatWhy (March 28, 2014). Online: http://whowhatwhy.com/2014/03/28/part-2-how-they-hid-the-worst-horrors-of-hiroshima/; Greg Mitchell, “Part 3: Death and Suffering, in Living Color,” WhoWhatWhy (March 31, 2014). Online: http://whowhatwhy.com/2014/03/31/death-suffering-living-color/

[27] Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America, p. 321.

[28] Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America, p. 322.

[29] Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America, pp. 322-323.

[30] Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America, p. 336.

[31] Amy Goodman, “Hiroshima and Nagasaki, 69 Years Later,” Truthdig (August 6, 2014). Online: http://www.truthdig.com/report/item/hiroshima_and_nagasaki_69_years_later_20140806

[32] Ibid.,  Lifton and Mitchell, p. 345.

[33] Brad Evans, “The Promise of Violence in the Age of Catastrophe,” Truthout (January 5, 2014). Online: http://www.truth-out.org/opinion/item/20977-the-promise-of-violence-in-the-age-of-catastrophe

Henry A. Giroux currently holds the McMaster University Chair for Scholarship in the Public Interest in the English and Cultural Studies Department and is the Paulo Freire Distinguished Scholar in Critical Pedagogy. His most recent books are America’s Education Deficit and the War on Youth (Monthly Review Press, 2013), Neoliberalism’s War on Higher Education (Haymarket Press, 2014), The Public in Peril: Trump and the Menace of American Authoritarianism (Routledge, 2018), American Nightmare: Facing the Challenge of Fascism (City Lights, 2018), On Critical Pedagogy, 2nd edition (Bloomsbury), and Race, Politics, and Pandemic Pedagogy: Education in a Time of Crisis (Bloomsbury, 2021). His website is www.henryagiroux.com.

Jesús Santrich Explored the Utopian Origins of Marxist and Bolivarian Ideologies


 
 AUGUST 27, 2021
COUNTERPUNCH

Colombian Army commandos killed Jesús Santrich, a 30-year veteran of the Revolutionary Armed Forces of Colombia (FARC), on May 17, 2021. The ambush took place in Zulia Province, in northwestern Venezuela; six others died. Santrich was a spokesperson for the FARC negotiating team in the talks in Havana with Colombia’s government that resulted in a peace agreement in 2016.

Santrich went into hiding in 2019 for two reasons: the killing of FARC combatants was continuing after the agreement, and he was at risk of being extradited to the United States on false drug charges. He joined the “Second Marquetalia,” an offshoot of the original FARC insurgency that was returning to armed conflict.

Santrich’s 22-page essay, written in 2009 in honor of legendary FARC commander Manuel Marulanda, carries the title “Bolivarianism and Marxism – a Commitment to the Impossible” (Bolivarismo y marxismo, un compromiso con lo imposible). There, Santrich examines the utopian underpinnings of the Marxist movement and the liberation struggles of Simón Bolívar.

He offers a long-term, visionary perspective that, in our era, is often lacking in anti-imperialist and socialist theorizing. The blunted response of progressives and radicals to recent developments in U.S. hybrid wars involving Colombia, Haiti, and Cuba is a case in point. Looking far afield, Santrich calls for political leadership and a mode of struggle that attend to the future, embrace larger purposes, and rest on commitment.

His words

In his essay, Santrich writes about Simón Bolívar, the continental fight for independence from Spain, the FARC’s early years, the leadership qualities of Manuel Marulanda, the pioneer socialists of the French Revolution, Bolívar’s teacher Simón Rodríguez, and more. Along the way, Santrich offers ideas that, taken seriously, would strengthen revolutionary endeavors now. What follows are excerpts from his essay.

For example: “Marxists must keep utopia foremost in their consciousness. It drives mass actions. They must assume that a revolutionary movement, whatever its origins, doesn’t qualify as such if it lacks that component manifesting as irrepressible effort towards change categorized as ‘impossible.’ But utopia must always take off from a basis in reality. We humans have the duty to regard the world we want as another world that’s possible. Paraphrasing Bolívar, we are looking for the ‘impossible,’ while leaving the possible up to everyone else, every day …

He continues: “To declare oneself Bolivarian and, as such, declare oneself a revolutionary on the Marxist path implies lifelong motivation derived from the hope of transforming society and finding justice. This is a constant and is strong enough with its broad vision as to point to utopia as a characteristic of political consciousness and the natural result of rational belief.”

He adds: “Utopia is a higher goal of commitment. That’s so because even at the beginning, the matter of possibility or impossibility is already uncertain due to extreme difficulties ahead, or uncertain survival of purpose as historical implementation evolves. But like history itself, utopia does not end.”

Moreover, “In the hopeful quest for realization of the “impossible,” the process calls upon a mixture of illusions, realism, magic, and love for people as a reason for life … The essential interest of the utopian is preservation of man and nature in absolute equilibrium, thus displaying the potentials of historical memory, faith, dignity, and our identity as vital factors for existence.

“Confronting oppression and marching on the path of utopia, the revolutionary no longer is resigned. He or she is unconditionally, permanently, and creatively committed to the poor people of the world … Let’s say then that the Marxist-Bolivarian idea of a revolutionary is of someone who fixes on an ideology that, while encompassing reality, is not yet solidified and is perhaps uncertain. The goal is set of becoming absolutely convinced that this reality will be fulfilled, ‘impossible’ though it may seem….

“The author of the Communist Manifesto, appealing to selfless purpose, was calling for struggle offering the possibility of risks. … Marx was calling for action needing to pass a test of fire in the face of historical commitment prompted by circumstances, even at the risk of death. He was clarifying a concept of living, whose own ethics intermeshed with the dialectics of reality that was moving, but always toward the future. …”

Santrich speculates further: “This kind of thinking envisions Marxists and Bolivarians alike as rising up, in our world, to the level of magical realism. And why not? Magical realism goes beyond mere rationalism. We have symbols, imagination, and creativity – all based on rich traditions rooted in indigenous experience in the Americas. It’s founded also on the syncretism of our mixed and oppressed mestizo peoples. Playing out, this proposition looks toward installing social justice, that is to say, accomplishes what’s ideal for the benefit of humankind.”

The stakes are high: “Perhaps one of the most fateful legacies for revolutionaries is apprehension on facing the danger that imperialism poses for the very existence of the planet with its catastrophic kind of developmentalism. In the face of great challenges, great resolve is necessary, really a triple boldness: action that overcomes determinism; recovery of the role of subjectivity, passion, audacity, and recklessness; and faith in the initiative of the masses, as they face the immediate prospect of “defeat.” In such circumstances, uncertainty and silence are worth nothing. …

“What’s in play is the very survival of the human species, of life, of nature in general, all put at risk through the destructive power of capitalism. But we will not idle around patiently waiting for an automatic end to capitalism and for a communist alternative automatically to flourish. Humanity’s conscious intervention is necessary. It’s our immediate duty. Revolutionaries must connect utopia with liberation practice, at whatever cost …”

Meanwhile, on the ground

The response of anti-imperialists to recent developments in Haiti, Colombia, and Cuba would have gained force, it seems, if they had nurtured far-reaching aspirations, in the manner of Santrich. Statements have been scarce and without focus. Action plans are missing. One looks in vain for an expanded frame of reference – in these instances, just as with larger problems like exploitation, plunder and militarization.

President Jovenel Moïse of Haiti was assassinated on July 7, presumably the result of rivalry among Haitian oligarchs, which apparently suits U.S. needs for a weak Haitian state. All but two of the 28 perpetrators were former Colombian army regulars, now employed as mercenary soldiers. Several had received U.S. training. Colombian mercenaries have fought in the United Arab Emirates, Yemen, Afghanistan, Dubai, Honduras, Venezuela, and elsewhere. The other two assailants were Haitians living in Florida.

CTU Security, a Florida company owned by right-wing Venezuelan émigré Antonio Intriago, arranged for the assassination. Miami-based Intriago has ties to regressive Colombian President Iván Duque; to Juan Guaidó, the U.S. puppet president of Venezuela; and to Christian Sanon, a Haitian physician living in Florida and seeking to be president of Haiti. Intriago’s company in 2018 carried out a drone attack against Venezuelan President Nicolás Maduro.

The four-passenger Cessna owned by the Helidosa company in the Dominican Republic is emblematic of imperialism’s convoluted presence in the region. Intriago, Sanon, and two others traveled to Haiti on that plane to be on hand at the assassination. After surgical care in Florida for wounds suffered during the attack, Martine Moïse, the assassinated president’s widow, returned to Haiti on the same airplane. In 2019, that plane transported right-wing Venezuelans to Barbados for negotiations with President Nicolás Maduro’s representatives, and did so again the following year with Juan Guaidó aboard.

The U.S. blockade of Cuba also comes into view. Campaigning in 2020, President Biden assured voters he would ease the blockade restrictions on Cuba imposed by Donald Trump. Instead, Biden has added new sanctions. For 60 years, the U.S. government has blockaded Cuba in order to cause suffering there. Suffering is mounting now due to sanctions and the adverse health and economic effects of the Covid-19 pandemic.

On July 27, Biden met with Cuban-American elected officials in his office. He stated: “I want Cuban Americans to know that we … hear the cries of freedom coming from the island … We’ve brought to bear the strength of our diplomacy, rallying nations to speak out and increase pressure on the regime.”

Correlations

The narrative of these recent events contributes little to an indictment of power structures that might have been meaningful enough to fuel a sustained political counterattack. Such an outcome would have required a sizable anti-imperialist movement already mobilized. The necessary ingredient for that is collective motivation based on the hopes, aspirations, and ideals of victims and their allies. It’s missing.

Instead, U.S. progressives are reduced to criticizing the opportunism of President Biden as he unabashedly seeks electoral advantage in Florida. Or, in the case of Haiti, many apparently are blind to the association of ungovernability there with the priorities of U.S. capitalism. They are prey to reports that political corruption, profiteering, and street protests characterize Haiti’s situation. In dealing with Cuba and Haiti, it seems, far too many are distracted, pessimistic, and oblivious to the future.

It could have been otherwise. Jesús Santrich defined characteristics of political leadership and permanent mobilization that enable unified struggle and revolutionary optimism. He conceived of a political movement answering to the subjective needs of oppressed peoples. Doing so, he offered them license to come together in fulfillment of aspirations – against all odds. The current situations of Cuba and Haiti could have been folded into that context.

Santrich draws upon a world off limits to multitudes of suffering people, to their great disadvantage. His perspective, that of attending to longings that seem impossible, is uncommon in our time. It represents an essential dimension of struggle. Santrich was in good company. We remember:

Karl Marx, The Civil War in France, 1871 – “The Commune, they explain, intends to abolish property, the basis of all civilization! [Marx replies:] Yes, gentlemen, the Commune [aims] at the expropriation of the expropriators. [They reply:] But this is communism, ‘impossible’ communism! [Says Marx:] If cooperative production is not to remain a sham and a snare; if it is to supersede the capitalist system … what else, gentlemen, would it be but communism, ‘possible’ communism?”

José Mariátegui, founder of the Peruvian Communist Party, 1928 – “We certainly don’t want socialism in America to be a copy or tracing. It must be a heroic creation. We have to give life to Indian-American socialism with our own reality, in our own language.”

Ernesto Che Guevara, Socialism and Man in Cuba, 1965 – “[I]n moments of great peril it is easy to muster a powerful response with moral incentives. Retaining their effectiveness, however, requires the development of a consciousness in which there is a new scale of values. Society as a whole must be converted into a gigantic school.”

Fidel Castro (as recalled by Raúl Castro) – “in the mountains of the Sierra Maestra, on Dec. 18, [1956], with seven rifles and a fistful of combatants, [he] stated, ‘Now we have won the war!’”

The author translated the excerpts appearing here. Santrich’s entire essay may be read, in Spanish, at https://rebelion.org/bolivarismo-y-marxismo-un-compromiso-con-lo-imposible/

 

W.T. Whitney Jr. is a retired pediatrician and political journalist living in Maine.

Roaming Charges: Hour of the Goat


 

AUGUST 27, 2021
COUNTERPUNCH

Baphomet drawing by Éliphas Lévi, from Dogme et Rituel de la Haute Magie (G. Baillière, Paris, 1861)

+ Televangelist Jim Bakker: “How can you go to church and pray when you’re wearing a mask? Do you think God can hear your prayers through a mask?” Is this the beginning of a doctrinal schism with Franklin Graham, who said yesterday that Jesus would oppose government background checks for guns because Almighty and All-Seeing God knows what’s in everyone’s heart and was doing the background checks himself?

+ So Nietzsche was wrong. God isn’t dead. He’s just deaf.

+ Woman at the Trump rally in Alabama (which has the lowest vaccination rate in the country), after Trump recommended getting the vaccine…”I think it is time where God is separating the sheep from the goats. I’m a goat. Because I am not a sheep. I’m not doing what they tell me to do. I’m fighting against it (the vaccine).”

+ There’s been a remarkable transformation in Christian cosmology. Lambs used to be the flock of Jesus, the good shepherd, and goats were the Satanic symbol of yielding to lustful temptation.

+ The relevant passage is in Matthew, where Jesus sets the sheep on the Right and the goats on the Left and orders the goats to “depart into the eternal fire prepared for the devil and his angels. For I was hungry and you gave me nothing to eat, I was thirsty and you gave me nothing to drink, I was a stranger and you did not invite me in, I needed clothes & you did not clothe me, I was sick & in prison and you did not look after me.”

+ Those antinomian goats expelled by Jesus sound like they provided the template for Stephen Miller’s political program.

+ Here’s a list of the prohibited items at the Trump rally in Alabama last weekend. In other words, you can wear a mask, but you can’t bring your sidearm? You can bring your gun everywhere, even Church, except to a Trump rally? Is there a constitutional footnote, awaiting discovery in a dicta by that beer-swilling brat Brett Kavanaugh, that sanctions this intrusion into our Jesus-anointed right to carry? As for the prohibition on balls, they wouldn’t want to embarrass Trump.

+ I’m no fan of Patton, who was overrated as a general and underrated as a sadist, but it would have taken him about 5 seconds to slap the shit out of Don Jr.

READ ON  Roaming Charges: Hour of the Goat - CounterPunch.org


Afghanistan Redux: Malala Yousafzai, White Feminism and Saving Afghan Women

 
 AUGUST 27, 2021
COUNTERPUNCH

Photograph Source: Afghanistan Matters – CC BY 2.0

So the usual “experts” are out and about on airwaves, in newsprint, and on social media, offering their punditry regarding the what-where-how-who-now-what of the situation unfolding in Afghanistan, even as the US-backed government crumpled like a house of cards with the advance of the Taliban once US troops began their announced pullback in earnest, after 20 years of occupation, a few weeks ago. The shock-and-awe tactics of the Bush doctrine dragged the world into a post-9/11 horror picture show euphemistically known as the War on Terror (in truth, it was intended as, and has been experienced by its globally dispersed victims as, a war OF terror). Those tactics seemed appropriately reverse-engineered as the world was treated to images of ordinary Afghans, a 17-year-old aspiring soccer star among them, plunging to spectacularly shocking deaths as they fell from a US Air Force plane they had clung to as it took off, carrying a few hundred of their “luckier” countryfolk to the putative “safety” of the West.

Whilst I refuse to wear the mantle of expertise here, well cognizant, as Rafia Zakaria has put it in a recent book,[1] of how, in the West, “the emphasis on expertise becomes a kind of gatekeeping of power that locks out [certain, though not all] people of color, as well as working-class people, migrants” and others (p. 8), I remain equally skeptical of the ways in which the category of “experience” (too often a stand-in for “third world/global south authenticity”) is mobilized as a counter to expertise. I am not convinced (as Zakaria seems to argue) that claiming your “voice”, in the case of brown and black women for example, even when that process of “voicing” genuinely seeks to interrupt the hegemony of white feminism and its epistemological influence on our world, necessarily and always proffers a more transparent working through, and hence redistribution of, power politics. For this latter progressive political vision to take hold, one needs a clearer understanding of what’s at stake, of who or what the levers of control and domination are, and a willingness to embrace the politics of affiliation over the tribal politics of identity. Only then might a blueprint evolve for challenging structural barriers to a better and more equal world.

The case at hand that prevents me from unqualified rooting for the category of “experience” is the exemplary case of Malala Yousafzai of Pakistan, who has traversed the distance from female “experience” to feminist “expertise” and who, like others before (and since) who have made that journey from the “margins” to the “center” of imperial power, has now switched from being a “voice of the oppressed” to becoming an “expert” who can speak to us and teach us about those authentic “other” women in the global south, in this case Afghan women, on whom her prior proximity (“experience”) renders her an “expert” today. From experience to expertise, then, is a pretty straightforward line, following the predictable path forged also by white feminism in thrall and service to imperial designs past and present. This is the path announced with great fanfare shortly after 9/11 by First Lady Laura Bush and enthusiastically supported by the Feminist Majority Foundation, a path that would “save brown women from brown men” by going into the “backward” country of Afghanistan, overrun by crazy “Moslem” men, in the process unleashing a 20-year war on a population that had had nothing to do with 9/11. The initial military intervention was then followed up over the next two decades with countless “development” schemes that enriched a few at the expense of the many, and when the cost of this unending war became unpopular with the citizenry “back home” in the USA, we left the hapless “natives,” including those very women we had been so concerned with “saving,” at the mercy of anarchy and chaos.

It is against this backdrop of “expertise” (represented back then by policy feminists such as those at the helm of the Feminist Majority Foundation, who supported the war in Afghanistan ostensibly to “save” those poor brown Muslim women from the Taliban) that the “voice” of Malala Yousafzai of Pakistan, who was shot by the Pakistani branch of the Taliban for wanting to attend school, needs to be understood and assessed when she speaks today in the aftermath of that war. Obviously, the spill-over effects of the 20-year war into neighboring Pakistan negatively affected Malala herself (she almost died in the Taliban attack), but those same circumstances also helped her ascend to worldwide fame, leading her to be read/seen as the “voice of experience” (by western feminists and policy makers), whose “voice” could then be mobilized in service of several of the goals of the continuing War on Terror. Because she was “rescued” by the British government after being shot, one could say she crossed over into the realm of “expertise”: a brown Muslim woman today ostensibly speaking for/on behalf of the Afghan women facing another reign of terror under the Taliban, when she tells her readers (without a hint of irony) in the New York Times op-ed essay entitled “I Fear for My Afghan Sisters” (NYT, Aug. 22, p. 5) that she “cannot help but be grateful for my life now.”

Please don’t misunderstand me: I am not saying anyone should begrudge this brave young woman a second chance at life, for brave she was in defying the Taliban fanatics who would then (and, we presume, now) deny women and girls a chance at education and autonomy in leading their lives. What does, however, bear scrutiny is the degree to which the “escape” route permitted to young brown women like Malala can ever enact a politics of feminist solidarity that bypasses the kind of “empowerment” feminism that remains allied to white liberal imperialist formations; that is, can the experience that allows brown women a leg up into the realm of expertise usually reserved for white women ever translate into a feminism for the 99%? Or is their “experience” simply another nail to hammer into our heads the state-sponsored feminism celebrating the individualist ethos of success, of “empowerment,” which is aimed at further advancing the interests of the top 1%? Do these “voices of the (formerly) oppressed” now just become fodder for advancing the same old stories about breaking glass ceilings and joining old (white) boy networks in the pursuit of “equality”? Thus, when Malala says

After graduating from college last year and starting to carve out my own career path I cannot imagine losing it all—going back to a life defined for me by men with guns.

one has to shake one’s head with a different type of desperation. Whereas her Afghan “sisters” have indeed much to fear at the hands of gun-toting Taliban leaders if the past is any indication, Malala seems to have entirely missed the irony that her “freedom” is now in the hands of another set of gun-toting men: white imperialist invaders/drone attackers of lands like Afghanistan, Pakistan, Iraq, Syria and so on. Further, her “cure” for the world’s women is to give them access to education in order to enable them, like herself, to lead independent lives whose definitions of “freedom” and “success” are measured in carving out individuated “career paths.” Malala Yousafzai, in this instance, becomes the avatar of all those “It” girls of consumer-oriented white western feminism that Angela McRobbie has written about in her book The Aftermath of Feminism, in which she theorizes the loss of feminist subjectivity in Britain: a feminist sensibility arising out of a socialist ideology that could have served as the basis of solidarity across race, class, and cultures, but which, as one reviewer points out, is “now entrenched in a post-feminist neo-liberal capitalist global economy.” Unfortunately, Malala’s “voice,” her “experiential” advantage, has been turned into its opposite in this capitalist neocolonial economic model, which extracts what is of use to it and puts it to a different purpose to serve its own agendas. In short, she is now serving at the behest of her “handlers,” the white feminists who once again are poised to “save brown women from brown men.”[2]

Notes.

[1] Rafia Zakaria, Against White Feminism: Notes on Disruption (W.W. Norton and Co., 2021).

[2] I have elsewhere argued that the “Malala effect” enables a “politics of pity.” By this I mean, as Wendy Hesford in Spectacular Rhetorics has explained about the kind of rhetoric celebrating Malala’s spectacular “escape” from the clutches of the Taliban in 2012, that this type of “recognition” “activates certain cultural and national narratives and social and political relations” that produce differential power relations. Such a “recognition of the pitiable/different other” then leads to an argument for the governance of those visibilized through such rhetorical codes as “human rights subjects” who need “our” help. “Spectacular rhetorics,” as theorized by Wendy Hesford (as well as Lilie Chouliaraki), thus produces and circulates images of “suffering others” (in this case, Afghan women as victims), leading western audiences of these images to feel “pity” for these victimized women and hence to ally with the “white savior/rescue” narrative.

Fawzia Afzal-Khan is University Distinguished Scholar at Montclair State University in NJ. Her latest book is Siren Song: Understanding Pakistan Through Its Women Singers. She can be reached at: fak0912@yahoo.com

A Facebook post stated on August 23, 2021:

“FDA approval means nothing when you realize the FDA also approves pesticides in our food!"





In this Aug. 4, 2009, file photo, a crop duster sprays a field of crops just outside Headland, Ala. (AP)


By Jason Asenso
August 27, 2021

Fact-checking a claim about the FDA’s role in approving pesticides in food

The EPA sets limits on pesticide residue in food, while the FDA monitors the food supply and enforces the limits.

The Pfizer vaccine was approved after meeting the FDA’s standards for vaccine safety.

After the U.S. Food and Drug Administration gave final approval to the Pfizer COVID-19 vaccine, a health and wellness Facebook group shared a post that seeks to discredit the agency.

"FDA approval means nothing when you realize the FDA also approves pesticides in our food!" the Aug. 23 post said.

The post was flagged as part of Facebook’s efforts to combat false news and misinformation on its News Feed. (Read more about our partnership with Facebook.)

The post implies that since the FDA allows pesticides in food, people shouldn’t trust the agency’s approval of the Pfizer vaccine. But that misrepresents the role of the FDA in regulating pesticides, and ignores the scientific processes that go into vaccine approval decisions.
How the FDA approves vaccines

The post uses a misleading claim about pesticide regulation to argue that FDA approval means nothing. That’s not the case. Full approval by the FDA means that the agency has high confidence in the safety, effectiveness and quality of a drug.

On Aug. 23, the FDA approved the Pfizer-BioNTech COVID-19 vaccine for people 16 and older. The vaccine was previously used under emergency use authorization, a designation that still applies for people between 12 and 15.

"While this and other vaccines have met the FDA’s rigorous, scientific standards for emergency use authorization, as the first FDA-approved COVID-19 vaccine, the public can be very confident that this vaccine meets the high standards for safety, effectiveness and manufacturing quality the FDA requires of an approved product," acting FDA Commissioner Janet Woodcock said in a press release.

Although the COVID-19 vaccines were already deemed safe and effective with the emergency use authorization, full approval means that businesses and schools may have more leeway to require the Pfizer vaccine, and the shot can stay on the market beyond the health emergency.

The FDA says the approval process includes an analysis of the condition and available treatments, assessment of benefits and risks through manufacturer-provided data, and strategies for managing risks associated with the drug.

The road to FDA approval starts with a vaccine undergoing three different phases of clinical trials, according to University of Chicago Medicine. Each phase of the trial increases the number of participants and the scope of what researchers look for.

Vaccine manufacturers then have to submit a Biologics License Application to the FDA, which includes data from clinical and non-clinical trials, technical information, labeling information, and more. In Pfizer’s case, this application built upon the extensive data submitted to support its emergency use authorization.

Pfizer submitted its application for full FDA approval in May 2021, and it was the fastest FDA approval of a vaccine ever.


"The public and medical community can be confident that although we approved this vaccine expeditiously, it was fully in keeping with our existing high standards for vaccines in the U.S.," said Peter Marks, the director of the FDA’s Center for Biologics Evaluation and Research.
The FDA and pesticides

Regulation of pesticides in the food supply is shared between the Environmental Protection Agency and the FDA. The regulations concern the safety of the pesticides used in growing food, and how much pesticide residue can remain on food without causing harm. This residue is often measured in parts per million.

Simply put, the EPA sets the allowable limits on pesticide residue that would be safe for human consumption, and the FDA is responsible for enforcing these limits. The limits apply to food that’s imported or grown in the U.S.

In setting the limits, the EPA examines factors such as the pesticide’s toxicity, how much growers apply and how it breaks down.

One way the FDA enforces these limits is through its Pesticide Residue Monitoring Program, which tests a variety of food samples every year to see whether they meet federal guidelines for over 800 pesticides and chemicals.

"Very small amounts of pesticides that may remain in or on fruits, vegetables, grains, and other foods decrease considerably as crops are harvested, transported, exposed to light, washed, prepared and cooked," EPA spokesperson Tim Carroll said. "The EPA agrees with experts who recommend consuming a varied diet that includes many fruits and vegetables, and washing them first with plain water to remove dirt, bacteria and pesticide residues."

Under law, the EPA is also required to reevaluate the safety of existing pesticides, although that doesn’t mean the agency isn’t subject to political and industry pressures.

In 2017, then-EPA Administrator Scott Pruitt rejected the agency’s recommendation to ban using chlorpyrifos on food, a pesticide linked to neurological damage in children.

This year, the EPA reversed the decision and announced that it would ban the pesticide, but it will take time to phase out its use on food completely.
Our ruling

A Facebook post claimed that "FDA approval means nothing when you realize the FDA also approves pesticides in our food!"

The FDA does not approve pesticides in food. The EPA sets limits for safe levels of pesticide residue on food, and the FDA monitors the food supply to enforce the limits.

The FDA’s approval process for vaccines begins with three phases of clinical trials to establish their safety and efficacy, and a review of data from clinical and non-clinical trials along with other supporting documentation.

We rate this claim Mostly False.