Monday, December 27, 2021

'Think just a little bit': Reporter busts Chuck Todd for assuming most parents are white
David Edwards
December 26, 2021


NBC host Chuck Todd was admonished by investigative reporter Nikole Hannah-Jones on Sunday for remarks that otherized parents of color.

During a Meet the Press special broadcast about schools and critical race theory, Todd wondered how old children should be before they are taught about race.

"I think this is coming through a racial lens, but there's this, you know, parents are saying, 'Hey, don't make my kid feel guilty,'" Todd told Hannah-Jones. "And I know a parent of color is going, 'What are you talking about? You know, I've got to teach reality.' When do you do it and how do you do it?"

"Well, I think you should think just a little bit about your framing," Hannah-Jones replied. "You said 'parents' and then you said 'parents of color.'"

"White parents and parents of color," Todd interrupted. "No. Fair point."

"As a matter of fact, white parents are representing fewer than half of all public school parents," Hannah-Jones noted. "And yet, they have an outsized voice in this debate."


The sordid underbelly of Christmas past
The Conversation
December 25, 2021


When English Puritans outlawed Christmas in 1647, it was not without good reason. When American Puritans, in turn, outlawed Christmas in Massachusetts between 1659 and 1681, it too was not without good reason.

Christmas past was anything but innocent.

Until the mid-19th century, Christmas was a time for drunkenness and debauchery.

Men dressed like women, women dressed like men, servants dressed like masters, boys dressed like bishops, everyone else either dressed as animals or wore blackface – all to subvert the godly order in the safety of anonymity.

Christmas was a carnival of drink, cross-dressing, violence and lust during which Christians were unshackled from the ethical norms expected of them the rest of the year.

No wonder the Puritans wanted it banned.

The Origins of Christmas Revelry

It was not until the 4th century that the Church of Rome recognised December 25 as the date to celebrate the birth of the messiah. And it did so knowing well that there were no biblical or historical reasons to place Christ’s birth on that day.

There is some evidence the Romans worshipped Sol Invictus, their sun god, on December 25. But what the Romans really celebrated during the month of December was Saturnalia, an end of harvest festival that concluded with the winter solstice. As historian Stephen Nissenbaum pointed out in his acclaimed The Battle for Christmas, the early Church entered into a compromise: in exchange for widespread celebration of the birth of Christ, it permitted the traditions of Saturnalia to continue in the name of the saviour.


Gambling, as seen here in a fresco from Pompeii, was a hallmark of the Roman celebration of Saturnalia.  Wikimedia Commons

Gift-giving, feasting, candles, gambling, promiscuity and misrule were the hallmarks of Saturnalia. Add to this the holly, the mistletoe and (much later) the tree, and we have a Christmas inclusive of a variety of pagan traditions.

But as time went on, Church leaders became increasingly disillusioned by the way the carnival that was Saturnalia simply carried on under a thin veneer of Christian piety.

The 16th century bishop Hugh Latimer lamented that many Christians “dishonoured Christ more in the 12 days of Christmas than in all the 12 months besides.”
Lords and Ladies of Misrule

In early modern England, it was common practice to elect a “Lord of Misrule” to oversee Christmas celebrations. Revellers under the auspices of the “Lord” marched the streets dressed in costume, drinking ale, singing carols, playing instruments, fornicating and causing damage to property.

One account from Lincolnshire in 1637 relates how the revellers decided the Lord must have a “Christmas wife,” and brought him “one Elizabeth Pitto, daughter of the hog-herd of the town.” Another man dressed as a vicar then married the lord and lady, reading the entire service from the Book of Common Prayer, after which “the affair was carried to its utmost extent.” Had they not carried the matter so far, the account continues, “probably there would be no harm.” As it was, “the parties had time to repent at leisure in prison.”


Twelfth-night (The King Drinks), painted by David Teniers the Younger, between 1650 and 1660.
© Museo Nacional del Prado, CC BY-NC-SA

“December was called […] the Voluptuous Month” for a reason, wrote Reverend Increase Mather in 1687. Young men and women often took advantage of the moral laxity of the Christmas season to engage in late-night drinking and sex.

Not surprisingly, such seasonal merrymaking resulted in higher than usual birth rates in the months of September and October, as well as real rather than burlesque marriages.
Wassailing

Even Christmas charity was far from innocent. Gifts, that hallmark of the season, were rarely given freely but were demanded with threats of mischief or violence.

In the practice known as “wassailing” during the 17th and 18th centuries, roving bands of poor men and boys asserted their Christmas right to enter the houses of the prosperous and claim the finest food and drink, singing:
We’ve come here to claim our right,
And if you don’t open up your door,
We will lay you flat upon the floor.


A depiction of wassailing from the Illustrated London News, 1856.
© The Trustees of the British Museum, CC BY-NC-SA

Though most wassailing ended without violence, the occasional stone was thrown through the window of an uncharitable lord. A generous lord, on the other hand, could hope for the goodwill of the wassailers for the rest of the year.
Domesticating Christmas

Ultimately, the efforts of Puritans to ban Christmas failed. The irreligious revelry that marked Christmas past was too deeply entrenched in Western culture. But where the forces of religion failed, the forces of the market would soon succeed in taming Christmas. The sordid behaviour of Christmas past would give way to another type of irreligion: consumerism.

Still, much of the sordid underbelly of Christmas past remains. That family member who always has a bit too much to drink, the overeating, the regretful rendezvous with a colleague at the office party – all telltale signs our oldest Christmas traditions are alive and well.

James A. T. Lancaster, Lecturer in Studies in Western Religious Traditions, The University of Queensland

This article is republished from The Conversation under a Creative Commons license. Read the original article.
Like COVID-19, the Black Death had its own 'truther' movement too
Matthew Rozsa, Salon
December 25, 2021


While the COVID-19 pandemic has been an inflection point of modern history, it is nowhere close to being the deadliest pandemic in human history. That dubious distinction belongs to the infamous "Black Death," a bubonic plague that swept through Europe and the Near East in the mid-14th century. Like COVID-19, the bubonic plague was a terrible way to die, but with very different symptoms. The most notorious were the dreaded buboes (hence 'bubonic plague'), severe inflammations of the lymph nodes that oozed pus and broke out over the groin, armpits and neck. A victim's skin and flesh would eventually turn black and rot, although long before this happened they would experience intense nausea, vomiting, fever, headaches and aching joints. Within days — or, at most, a couple weeks — the infected person would die.

One might imagine that a disease this terrible would have been burned into humanity's collective consciousness. Indeed, the Black Death did have a profound impact on our day-to-day lives, influencing everything from the professionalization of medicine and the decline of feudalism to works of art like Giovanni Boccaccio's book "The Decameron" and Ingmar Bergman's movie "The Seventh Seal." Yet the Black Death is not often mentioned in reference or in contrast to the COVID-19 pandemic — even though there are important parallels. Perhaps most tellingly, both diseases fueled scapegoating and mass hysteria because of widespread ignorance.

While the scientific illiteracy in the COVID-19 era is fueled by a mixture of motivated reasoning, political bias and historically-based concerns about institutional trustworthiness, inhabitants of the Middle Ages lacked modern humanity's sophisticated knowledge about biology. Louis Pasteur did not develop modern germ theory until the 19th century, half a millennium after the Black Death. Today we know that the Black Death was a disease, and that the microorganism responsible was most likely imported from Asia through the Crimea and into Europe and the Near East by way of fleas living on black rats. People who lived in Europe, North Africa and the Middle East in the 1340s and 1350s could not have even imagined what a microorganism was, much less the complex chain of events that would have brought such a deadly one into their homes.

In the absence of knowledge, some ugly alternative theories emerged. Because Jews had been a popular scapegoat in Europe for centuries, a wave of pogroms against Jewish communities broke out during this time as they were blamed for the plague. For years Jews had been collectively blamed for the death of Jesus Christ and accused of sinister blood rituals; around the Crusades, the stereotype also emerged of Jewish wealth, one reinforced in anti-Semitic minds by how Jews were barred from owning land and therefore were disproportionately concentrated in finance. Attacks on Jewish communities were commonplace prior to the Black Death, but now occurred with renewed vigor and effectiveness because the attackers had more motive. Jews were accused of poisoning wells and of other conspiratorial actions, all somehow connected back to alleged vendettas against Christianity, desires to earn money, ominous religious practices or some combination of the three. Victims were tortured into confession and exterminated in large numbers.

There is an obvious parallel between this and the rise of anti-Chinese prejudice during the COVID-19 pandemic. While nowhere near as pervasive as anti-Semitic sentiment during the Black Death, there have been thousands of anti-Asian hate incidents in the United States since COVID-19 reached our shores. These have ranged from taunts and slurs to acts of physical violence. And the rhetoric of certain politicians, who reinforce and encourage the scapegoating of China — or even promote unfounded conspiracy theories that China somehow created the virus — hasn't helped.

"I've been saying this for a year: The rhetoric from Trump has emboldened people to openly speak in an anti-Chinese way, which — being Asian American in the United States, part of the stigma is people can't tell Asians apart, we're forced into a racial group and lumped together," Rosalind Chou, an associate professor of sociology at Georgia State University, told Salon in April. "I'm Taiwanese American, but people walking down the street couldn't differentiate, right? [...] I've been saying for a year that people are going to get hurt if we keep placing blame and calling COVID-19 the 'China virus,' if we have radio talk show hosts and news reports constantly using rhetoric that is anti-Chinese."

Scapegoating during the plague era wasn't confined to Jews. Setting aside lepers and other unfortunate individuals from marginalized groups that were also sometimes blamed for the plague, medieval people had a wide range of theories about who was behind the Black Plague. Some turned to astrology for an explanation of the plague, as well as a possible cure. Many religious people believed it was God's wrath or Satan's scourge; flagellants, or religious penitents who would flog themselves in public and beseech the almighty for forgiveness, became a common sight at this time; more educated people subscribed to the idea that miasmas, or "poisoned air," were responsible for causing disease. (This was probably the closest anyone came to the truth without knowing anything about microbiology.)

There is a lesson in humility there: It is possible that there is much we don't know about COVID-19 in our era that could become common knowledge in a handful of generations. Likewise, there are parallels between the people who saw deities and devils behind every bubonic sore and blister, and those who insist that analogous sinister conspiracies are at work behind current events today. These ideas may seem outlandish, such as claiming that Bill Gates or George Soros is somehow behind the whole thing. On other occasions they have a measure of plausibility, albeit a grossly exaggerated one, such as the idea that the bug may have originated from a Chinese laboratory. Just as the flagellants and anti-Semites of medieval Europe drew from pre-existing religious traditions to color their interpretations of the Black Plague, so too do individuals who were conspiracy theory-minded before the pandemic turn to those types of explanations during it.

"The people who are believing in those conspiracy theories were likely believing in similar conspiracy theories before the COVID-19 pandemic, and they're just applying that style of thinking to this new thing," Joseph E. Uscinski, a political scientist at the University of Miami, told Salon last year. "Basically what we find is that the people who buy into these sorts of conspiracy theories do so because they have what we call underlying conspiracy thinking, meaning that they see the world through a conspiratorial lens."

Not all of the comparisons between the Black Plague and COVID-19 are foreboding. As briefly mentioned earlier, the Black Plague drew attention to how medieval practitioners of medicine usually had no idea what they were doing. This planted seeds that eventually grew into a systematized, scientific approach to healing the human body — in short, the renaissance of modern medicine. While human beings were thankfully much further along in biotechnology by the 2020s, the pandemic helped jump-start a new class of vaccine technology, the mRNA vaccines like those mass produced by Moderna and Pfizer/BioNTech, which could revolutionize medicine. Everything from cancer vaccines to universal influenza inoculations is within the realm of possibility thanks to this platform, which trains cells to produce proteins that the immune system can learn to recognize, in this case proteins associated with the SARS-CoV-2 virus.

By the time its peak (1347 to 1351) had passed and it had ravaged most of the Western world, the Black Death had claimed anywhere from 75 million to 200 million lives. The COVID-19 pandemic's death toll, though no less tragic, is at the time of this writing just shy of 5.4 million, with more than 800,000 of those in the United States. Fortunately, this is a small fraction of the total human population of more than 7.7 billion today; while it is impossible to know for sure how many people were alive in the mid-14th century, most estimates place it around 300 to 400 million. Reflecting on how far humans have come, we can at least be grateful to live at a time when science has brought us so many miracles of medicine — even if it hasn't yet cured the miasma of misinformation.


A 'talking' cat is giving scientists insight into how felines think
Nicole Karlis, Salon
December 25, 2021


Billi, a 13-year-old domestic cat in Florida, presses a button that voices the word "dog" — twice.

She proceeds to sit as if she's waiting for her human parent, Kendra Baker, to respond.

"Dog outside, hmm?" Baker asks Billi, via the buttons. A few minutes later, Billi presses another button for "tummy," twice.

"Accident or premeditated murder? You decide," Baker writes on the caption of the video on Instagram.

Those who follow the travails of internet-famous "talking" animals may be familiar with Bunny the Talking Dog, a TikTok- and Instagram-famous pet. Just like Bunny, Billi the cat uses an augmentative and alternative communication (AAC) device — essentially, a sound board made up of buttons with a different word vocally recorded on each — to "talk" to her human, Baker. Baker, like Bunny's human parent, was inspired to attempt this means of animal-human communication after she observed Christina Hunger, a speech-language pathologist, who taught her dog Stella to use an AAC device.

Of course, unlike Bunny and Stella, Billi is a cat. And while dogs, as social animals, are renowned for being able to understand human speech, cats are a different matter.

That didn't stop Baker. At the start of the pandemic, when she found herself with extra time on her hands, Baker decided to order an AAC device to see if Billi could "talk," too.

"At that point Billi was the first cat that I knew of to try it," Baker tells Salon. "I hadn't seen any cats do it."

Considering Billi's feline status, Baker was naturally a bit skeptical at first.

"I was concerned because they [the buttons] were quite large for a little tiny kitty, and I was not sure that she was actually going to be heavy enough to press them," Baker said. "So I started with a word that I'd really not recommend that you start with, which is 'food,' because it becomes very motivating for them. And Billi loves food."

Baker's concerns quickly washed away once it became clear that Billi was able to press the button "food" — which she appeared to enjoy doing perhaps a little too much.

"She was definitely heavy enough for it," Baker said. "And then I later regretted starting with food because it kind of backfired on me, but it definitely got the ball rolling."

Today, Billi has 50 words on her board, and — like Bunny — is part of the ongoing research project called TheyCanTalk, whose goal is to understand if animals can communicate with humans through AAC devices. While the study is mostly made up of dogs, about 5 percent of the animals using AAC devices are now felines. It turns out that many cats have been successful at using the device.

Leo Trottier, a cognitive scientist, founder of How.TheyCanTalk.org and developer of the FluentPet system Billi uses, admitted to Salon he was "pessimistic" about cats using the buttons, but was pleasantly surprised when researchers started to see felines catch on. Now, he's intrigued by the ways in which cats appear to use the buttons differently from dogs.

"What's interesting is that they [cats] tend to not do that much in the way of multi-button presses, but there's like a lot of single-button presses," Trottier tells Salon. "With cats, you kind of have to find things they really want, and there are just fewer of those than with dogs."

Baker agreed that Billi appears to string words together less frequently than dogs do. For example, Bunny often puts together what appear to be sentences, like "night talk sleep," which Bunny's human interpreted as the dog's attempt to communicate that she was having a dream. But Baker has a theory on why cats like Billi might be more prone to pressing one button to communicate.

"She does string words together, but it is much less frequent than what I see some of the dogs doing, and I don't know exactly why that is but I will say she's more deliberate in her button presses," Baker said. "Billi is very, very deliberate when she presses a button and knows exactly which one she's looking for, she takes her time . . . and if she is going to string a sentence together, she'll take a thinking loop and then she'll come back — very rarely does she go from one directly to another."

Could it be that in observing cats use the AAC device, humans are finding out that the stereotypical differences between cats and dogs are actually true? Perhaps, but researchers have been careful not to jump to any conclusions about these "talking" animals yet. In fact, it's up for debate whether these animals are, scientifically speaking, speaking — or whether they've simply been trained to use specific buttons to conjure specific things. Whether or not their communications are spontaneous has yet to be determined.

Still, the spectacle of an animal talking through speakers is fascinating to observe, and researchers are excited by the prospect that cats are part of the project now.

"I'm very intrigued by the cats that are using the boards, because there's really a dearth in cat cognition studies, particularly those that happen in the home," Gabriella Smith, a cognitive science researcher at CleverPet. "Cats are really kind of overlooked in the companion animal cognition world. I've been a big fan of Billi, and my animal cognition scientist brain just lights up because I see these behaviors that I know from my own cat — but now I'm able to look at it from a cognition lens."

Smith added that having cats as part of the TheyCanTalk study is a great way to study their cognition — and also, perhaps, dispel myths about cats.

"They have this reputation of just doing what they want and not really caring what the humans are doing, and I think this is a great opportunity to see that they actually are paying attention," Smith said. "Seeing that they can be engaged, that they're not just cat automatons, that aren't driven by instinct 24/7 can function a great deal positively for their role in other studies."

In some ways, including cats in the study has opened the door for other species too — like birds. Indeed, some birds are renowned for their ability to mimic and learn human words, so their addition makes sense.

Regardless of what these studies ultimately tell us about cat cognition, Billi's owner has observed a noticeable shift in Billi's happiness since introducing the buttons to the talkative kitty.

"I really believe that the majority of house cats are bored and depressed," Baker said. "We don't give them any stimulation . . . and if this pandemic has taught us anything, it's that staying inside your house all day is terrible. So you know, anything that we can do for them that gives them a better life, I'm for it."
Opinion | 70 Years Ago Black Activists Accused the U.S. of Genocide. They Should Have Been Taken Seriously.

The charges, while provocative, offer a framework to reckon with systemic racial injustice — past and present.



Paul Robeson and other members of the Civil Rights Congress submit a report on police brutality and systemic racism against Black people, accusing the U.S. of genocide, to the United Nations. | Daily Worker & Daily World Photo Collection, Tamiment Library & Robert F. Wagner Labor Archives, New York University


Opinion by ALEX HINTON
12/26/2021 

Alex Hinton is a distinguished professor of anthropology and director of the Center for the Study of Genocide and Human Rights at Rutgers University, Newark. He is also an author, most recently of It Can Happen Here: White Power and the Threat of Genocide in the U.S. and of The Anthropological Witness, a forthcoming book about his 2016 experience testifying as an expert witness on the charge of genocide at the Khmer Rouge tribunal in Cambodia. Follow him on Twitter @AlexLHinton.

Seventy years ago this month, on Dec. 17, 1951, the United Nations received a bold petition, delivered in two cities at once: Activist William Patterson presented the document to the U.N. assembly in Paris, while his comrade Paul Robeson, the famous actor and singer, did the same at the U.N. offices in New York. W.E.B. Du Bois, a leading Black intellectual, was among the petition’s signatories.

The group was accusing the United States of genocide — specifically, genocide against Black people.

The word “genocide” was only seven years old. It had been coined during World War II in a book about Nazi atrocities, and adopted by the United Nations in 1948, though no nation had yet been formally convicted of perpetrating a genocide.

The 240-page petition, “We Charge Genocide: The Crime of Government Against the Negro People,” was meant to be sensational. America had been instrumental in prosecuting the Nazis at Nuremberg, and now its own citizens were turning the lens back on the U.S. in the most horrifying, accusatory terms.

Instead, mainstream media largely ignored it. The New York Times and Washington Post mentioned the petition in brief stories buried in the back pages. The Chicago Tribune condemned it for “shameful lies.” Raphael Lemkin, the Polish jurist who had coined the term “genocide,” publicly disagreed with the whole basis of the petition, saying it confused genocide with discrimination.

The drafters of the document hadn’t expected it to go anywhere; they knew the U.S. had too much power at the U.N. for the petition to be taken up. They had framed it less as a formal charge than as a presentation of an allegation, loosely modeled on a legal brief. They hoped, though didn’t expect, that the General Assembly, Commission on Human Rights or another party at the U.N. might take up the issue for deliberation. That never happened.

But today, 70 years later, the document has a new resonance amid the patent injustices of police brutality that continue to occur and racial inequities in health care on display especially throughout the pandemic. “We Charge Genocide” explored these kinds of issues at length, making a compelling case for thinking about structural racism as genocide, which demands not only condemnation but also redress and repair. To consider the arguments in “We Charge Genocide,” drafted by some of the most notable figures in the midcentury civil rights movement, offers important insights into the current moment and how to move forward.

Two events set the stage for the Black genocide charge in 1951 — one international, and one domestic.

The first was close to a miracle. After two years of political haggling following Nazi atrocities, U.N. member states each agreed to relinquish a small piece of their sacrosanct sovereignty when the body passed the 1948 Convention on the Prevention and Punishment of the Crime of Genocide. This codified a legal definition of genocide, criminalizing group destruction and opening the door to try perpetrators in an international court of law. Since then, the convention has been used both to hold perpetrators of genocide accountable in places like Rwanda, the former Yugoslavia and Cambodia and, importantly, to set up or pave the way for remediation mechanisms after it’s been established that genocide occurred.

The Convention defined genocide as “acts committed with the intent to destroy, in whole or in part, a national, ethnical, racial or religious group, as such.” It also specified that these acts include not just killing but also attempted group destruction through “bodily or mental harm,” impeding reproduction, harsh living conditions and child removal.

Second, the Black civil rights movement gained momentum after World War II. Freedom struggles and racism became a focus of worldwide attention, and the newly established U.N., which included a Commission on Human Rights, provided a new, global platform for protest against racism in the U.S. Black activists used it to internationalize and broaden their movement, in part by reframing it in terms of human rights.

Two prior petitions preceded “We Charge Genocide.” In 1946, the National Negro Congress delivered an eight-page petition to the U.N. Secretary-General asking him to take action on the subjugation of Black Americans, particularly in the South, where 10 million Black people lived in deplorable conditions. The U.N., mired in politics, failed to act. But the petition had some success, particularly abroad, in drawing attention to the plight of Black Americans.

Seeing the potential of such petitions, the NAACP, frustrated by the U.S. government’s slow pace of racial reform, undertook a much more ambitious initiative. On Oct. 23, 1947, it presented an extensive, 95-page “Appeal to the World!” petition to the U.N. The statement, which garnered much more attention, lambasted the U.S. for denying a host of human rights to its Black minority population.

“We Charge Genocide,” a project of the Civil Rights Congress, followed in this tradition, but it was even more ambitious and inflammatory, explicitly situating its documentation of U.S. racism within the framework of genocide — a new kind of crime, freshly outlined by the U.N.

“We Charge Genocide” starts with killing: The petition documents hundreds of killings and other abuses, some involving police, which took place from 1945, when the U.N. was established, to 1951. It asserts that more than 10,000 Black people had been lynched or killed, often for something as small as “failure to say ‘sir’ or tip their hats.” Many more, it adds, suffered from “serious bodily and mental harm” from beatings, assaults and the terror caused by the constant threat of such attacks.

Direct violence was only one part of the charge. Another part was the systematic treatment of Black people within society. The petitioners were taking the minimalist language of the U.N. convention on genocide, about things like conditions of life, and giving it shape and form with real-world examples. The petition outlines the harsh living conditions to which Black Americans were subject: lives diminished in terms of housing, medical care, education, segregation, job opportunities, sharecropping, incarceration, poverty and voter suppression. As a result, the petition states, 30,000 extra Black people were dying each year, and Black Americans were living eight years less than white people. Officials from all three branches of the U.S. government, the petition alleges, were guilty of inciting, conspiring to commit or being complicit in such genocidal acts, with examples ranging from police violence to racist laws promulgated by members of Congress.

Proof of Black genocide, the petition concludes, “is everywhere in American life,” and as “the conscience of mankind,” the U.N. General Assembly should not stand by silently but should instead both condemn the U.S. and demand that the country take steps to stop the genocide and prevent its persistence.

The petition had 94 signatories, including family members of victims. Some were with Robeson when he presented the petition to the U.N. in New York, including the widow of one of the “Martinsville Seven,” the Black men executed after being falsely accused of raping a white woman.

The petition was received by U.N. officials in Paris and New York. Patterson even discussed it with a handful of delegates from countries like Egypt, Ethiopia, Haiti, India and Liberia. But no one agreed to take it forward.

Cold War politics were a large part of the reason why. The U.S. government — worried about bad PR during the Cold War — mounted a campaign to blunt any domestic and international impact it might have.

In 1951, the Red Scare was in full swing. The Soviet Union was nuclear; Mao had taken over China; war was raging in Korea; and Senator Joseph McCarthy was leading now-infamous hearings on the communist infiltration of U.S. Black nationalists. “We Charge Genocide” petitioners were among the targets. The government had already confiscated Robeson’s passport, which is why he presented the document in New York. It seized Patterson’s after he presented the petition in Paris.



Left: Robeson and others wait at the U.N. headquarters in New York on Dec. 17, 1951, to present a petition accusing the U.S. of genocide. Right: Patterson, pictured at his desk in New York in 1962, delivered the same petition to the U.N. in Paris. | AP Photos


The U.S. government launched a campaign to discredit the petition, fearing it would inflame tensions at home and impugn the country’s reputation abroad. The sometimes bombastic, communist overtones of the petition, as well as the communist associations of some of the petitioners, made it easy even for sympathetic Americans to see it in an anti-American, ideological light.

Eleanor Roosevelt gave a U.N. address defending the U.S. in response to the document. In an interview, she emphasized Black progress while calling the genocide accusation “ridiculous.” The U.S. also enlisted more moderate Black luminaries to condemn the petition. Some, like NAACP leader Walter White, agreed to do so since they were wary of Patterson and wanted to distance the Black civil rights movement from communism.

America’s immense international clout also helped bury the issue. The architect of the idea of genocide, Lemkin, was still working tirelessly to achieve U.S. ratification of the U.N. Genocide Convention — which faced resistance, especially from Southern Democrats, precisely because they feared the U.S. would be accused of genocide against its Black population — and couldn’t afford to alienate the U.S.

Lemkin responded to the petition by publicly supporting the official U.S. view, emphasizing the improving situation of Black Americans while claiming the petition was a maneuver by “communist sympathizers” to divert attention from the genocide of “Soviet-subjugated people.” And so, in a moral failing of his otherwise distinguished career, Lemkin downplayed the long history of violence committed against Black Americans. “Genocide means annihilation and destruction,” he stated, “not merely discrimination.”

In fact, the arguments in the document — especially with the benefit of some distance from the Cold War anxieties of the time — look very compelling. While many may think that genocidal annihilation only looks like Nazi mass murder, the U.N. Genocide Convention clearly incorporates more nuanced forms of destruction than that. Lemkin dismissed the lynchings and other killings detailed in “We Charge Genocide” as “actions against individuals — not intended to destroy a race.” But the petition detailed how such violence was not just individual but deeply intertwined in the fabric of American society.

Perhaps most importantly, it made the case that the systemic and structural nature of racism against Black people in the U.S. was what made it genocide, a novel legal argument to expand the understanding of genocide that turned out to be ahead of its time.

In recent years, some genocide scholars have begun to think in terms of “structural genocide.” This sort of understanding could also be applied, for example, to the Uyghur people in China, who face a sort of social death.

Ironically, Lemkin’s own early scholarship provided an important foundation for this view of genocide. When he coined the term “genocide” in his 1944 book Axis Rule in Occupied Europe, Lemkin defined it as a “coordinated plan of different actions aiming at the destruction of essential foundations of the life of national groups.” This process, Lemkin stated, had two phases: “one, destruction of the national pattern of the oppressed groups; the other, the imposition of the national pattern of the oppressor.” Such group destruction was carried out not just by killing but by political, social, cultural, economic, biological, religious, moral and physical means that crushed the “spirit” of the victim group. This is exactly the sort of interwoven tapestry of group diminishment “We Charge Genocide” sought to establish as constituting the genocide of Black Americans.

“Genocide” seems — and is — hugely divisive, and it’s unlikely that the U.N. would formally accuse its wealthiest member of the crime, even in the past. But the petitioners weren’t wrong to situate Black American history in that bleak context, and doing so gives us — maybe surprisingly — some routes forward.

The U.N. remediation guidelines for mass human rights violations like genocide have some clear goals. These include safeguarding basic human rights of the offended group, investigating abuses and providing redress.

What might such remedy look like? Monetary payment is an obvious form of compensation, and an establishment of genocide can help advance current discussions around such reparations. Efforts to remedy historical injustices in other parts of the world offer other examples — education, health services, criminal trials, structural reforms — that have also been implemented in response to genocide. And even when there have been only accusations of genocide, rather than convictions, truth-seeking commissions have been convened to take the charge seriously and to determine and acknowledge the extent of abuse.

As the U.S. reckons with how to address its racial injustices, past and present, such examples of remedy for mass human rights violations give us a framework to think about how to do it justly and with full acknowledgment of the wrongs committed.
Can Democrats break the midterm curse? Maybe — consider the example of 1934
Matthew Rozsa, Salon
December 26, 2021


Now that Joe Manchin has sounded the death knell — at least for the moment — for Joe Biden's Build Back Better package, Democrats are doomed in the 2022 midterm elections.

Or, wait: Are they? Sometimes the "laws" of politics (or economics) are characterized as immutable, akin to the laws of physics. They're not, of course. Not a single ballot has been cast in the midterms. If Democrats can find a way to turn out their voters at unexpectedly high levels, while Republicans don't, they could turn 2022 into another blue wave. Political trends do not govern us; they are the results of human behavior, which is never entirely predictable.

Of course the apparent collapse of Build Back Better doesn't help. But the pattern that every political analyst and historian understands may be the real problem: Since the modern era of American politics began with Franklin D. Roosevelt's election in 1932, the party that controls the White House has lost congressional seats in 19 of the 22 midterm elections. Two of the three exceptions, in 1998 and 2002, were special cases with little relevance to the Democrats' predicament in 2022. In 1998, Democrats benefited from a booming economy and popular backlash against the Republican effort to impeach Bill Clinton over a sex scandal. The 2002 midterm elections came just a year after the 9/11 terrorist attacks; patriotic sentiment was running high and George W. Bush had successfully defined himself (for the moment) as a "war president."

It's that third exception, way back in 1934 — during the first of Roosevelt's three-plus terms as president — that may offer an instructive example. Those midterms came after FDR and the Democrats had passed a series of ambitious and historic laws, known collectively as the New Deal. While the Great Depression certainly didn't end immediately, the New Deal put millions of people to work and did a great deal to relieve suffering and despair. Despite Roosevelt's reputation (then and now) as a progressive president, the underlying premise of the New Deal was more pragmatic than ideological: Economic insecurity, poverty and hunger were a threat to social stability and indeed to the capitalist system; creating a social safety net was understood as a matter of urgent importance. A few years later, Roosevelt put it this way in his 1944 State of the Union address: "We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. 'Necessitous men are not free men.' People who are hungry and out of a job are the stuff of which dictatorships are made."

Within his first 100 days, Roosevelt had done a great deal to restore confidence in government, providing emergency financial aid to those who were struggling, passing the National Industrial Recovery Act and the Securities Act, and implementing a number of other regulatory and relief measures. Indeed, his first term was among the most productive in presidential history, and resulted in a fundamental restructuring of the federal government, which from that point onward was a more direct presence in ordinary people's lives than ever before. Conservatives characterized this as a dangerous intrusion on personal freedom (and have done so ever since); what we now call "liberalism" coalesced around the idea that government action was sometimes necessary to help the most vulnerable people in society.

Despite these legislative successes, however, political success was not guaranteed to follow. Persistent unemployment was still painfully high, and corporate America had begun to impose wage cuts. In the era before sophisticated polling, pundits could not scientifically assess a president's popularity like they can today. For all the Democrats knew, Republican warnings about creeping socialism had led to widespread panic, and conservatives might turn out in record numbers to halt a supposed red menace. On the other end of the spectrum, some Democratic radicals and socialists were frustrated with the Roosevelt administration's piecemeal approach to issues of social and economic justice. Sen. Huey Long, the legendary Louisiana populist who advocated a massive program of wealth redistribution and federal spending, was planning to run against Roosevelt for the Democratic nomination in 1936 (and probably would have, if he hadn't been assassinated in 1935).

In the event, Democrats did remarkably well in the 1934 midterms, gaining nine more seats in the House — and also an extraordinary nine seats in the Senate, giving them a supermajority in that chamber, with 69 of the 96 seats. Richard Walker, director of the Living New Deal Project and professor emeritus of geography at the University of California, Berkeley, told Salon by email that the Democrats' big win resulted from two principal factors. Most obviously, FDR and his party were seen as taking decisive steps to address the economic crisis.

"The Depression was still bad and the Republicans had no new ideas since 1932," Walker wrote, "so even though FDR had not solved anything definitively yet, no one was about to bring back the utterly failed Hooverites. Does that count as missteps of the GOP? Well, they had lots of time to do badly from late 1929 to early 1933 and people hadn't forgotten yet."

Joe Biden is no doubt aware of that history, and very likely intended Build Back Better as his own legacy-setting achievement, substantial enough to shift the political tides. Whether that package can yet be resuscitated remains unclear, but his fundamental problem remains that the Democrats have razor-thin majorities in Congress and remain unwilling to end the Senate filibuster. Without a major legislative win, the Democrats' last remaining hope is to run a negative campaign and convince voters that a Republican victory would be catastrophic. Pending decisions at the Supreme Court, including the possible or likely overturn of Roe v. Wade, could potentially produce a political backlash that helps the Democrats hold control of Congress.

There are again vague similarities to the 1934 midterm elections, when Democrats successfully depicted Republicans as extremists, although in a different sense than today: They were associated with the wealthy elite, with businessmen who lived in mansions and held to laissez-faire dogmas radically out of touch with the lives of ordinary people. Returning to the disastrous economic policies of Herbert Hoover's administration, Democrats argued, would be a dreadful mistake. In 2022, the threat posed by a recently defeated Republican administration has taken a more literal and even more dangerous form, with Donald Trump and his supporters using fascist language and tactics and overtly seeking to overthrow democratic institutions.

Calling out that extremism wasn't enough for Democrats in the recent Virginia gubernatorial election, but that doesn't prove it wouldn't work on a national scale, if pursued more aggressively and effectively. If there's an applicable lesson for Democrats to be found in the 1934 midterms, it might be this: The incumbent party can win, but only if it makes an overwhelmingly persuasive and urgent case that their opponents are dangerous and the future will be truly bleak if they prevail. Given the circumstances, that's a highly plausible argument.
US Multinationals et al, Illicit Financial Outflow and Tax Evasion: Undermining or Promoting Development In Nigeria?
On Dec 27, 2021


Development experts have long engaged in intensive discourse on development in any nation, measured against internationally recognized indices, among which poverty, unemployment and their related indicators dominate the discussion.

Expectedly, these are the notable parameters by which a nation’s development and international competitiveness, or the lack of them, are properly measured.

Nigeria has maintained an infamous position among the world’s most poverty-burdened countries, with 93.9 million people in Africa’s most populous country currently living below the poverty line despite the rich natural, mineral and human resources with which it is endowed.

Given the low level of development in Nigeria, the big capitalist elites in Western nations, who occupy a vantage position in terms of economic prosperity, would be expected to act as a shield protecting the economic rights of the world’s poor nations, considering how often Western developed nations have claimed to be champions of human rights worldwide. However, the activities of some of these Western multinationals, led by those from the US, have left much to be desired, making pundits wonder whether they are really interested in helping the poorest nations escape the poverty bracket, as they have always proclaimed.

The ugly reality is that despite the prevailing poverty level in Nigeria, the country continues to be pummeled by organized illicit financial outflows, including tax evasion and money laundering, sometimes perpetrated in connivance with those who ought to prevent them. The activities of the international oil companies operating in Nigeria, and how they have been milking the economy, came to the fore some time ago in the national dailies when the Lagos Zone of the Tax Appeal Tribunal ordered Mobil Producing Nigeria Unlimited to pay 83.4 million dollars (N13 billion) in education tax to the Federal Inland Revenue Service (FIRS). This is just one of numerous cases of misconduct perpetrated by these powerful multinational concerns. A report released in 2019 by the Nigeria Extractive Industries Transparency Initiative (NEITI) and Trust Africa indicated that Nigeria loses between $15 billion and $18 billion yearly to illicit financial flows, over 92 percent of which reportedly occur in the oil and gas sector. Similarly, in 2015, the Economic Commission for Africa reported that the United States accounted for 29.0 percent of illicit financial flows (IFF) from Nigeria, Spain for 22.5 percent, France for 8.7 percent, Japan for 8.5 percent, and Germany for 7.7 percent. These five countries together accounted for 76.4 percent of total illicit financial flows from Nigeria from 1970 to 2008 and were the key destinations of Nigeria’s oil products at that time.

In another instance, some banks in the United States were accused of helping to advance fraud in the P&ID case, and the Nigerian government is demanding records of 60 companies and individuals from 10 banks in the United States as part of its efforts to overturn the controversial $9.6 billion P&ID judgement. This suggests that Western countries provide soft landings that help a privileged few Nigerians launder money abroad. A Bloomberg report showed how Nigeria was seeking documents from banks, including Citigroup Inc. and JPMorgan Chase & Co., in a bid to overturn the $9.6 billion arbitration award. Nigeria has asked a federal court in New York for permission to subpoena information about transactions involving government officials.


THE INEVITABLY AWKWARD TOKYO OLYMPICS

Admin
December 26, 2021

The fact that the Tokyo Olympics actually happened in 2021 was—depending on one’s point of view—either a miracle or disgrace. It often seemed as though there was no in between. For a global gathering that is portrayed as unifying, this year’s Olympics proved remarkably polarizing. Much of the world saw the Games as a moral exit ramp from the ravages of the pandemic—a necessary path to rejuvenation and interconnectedness after a year of isolation. But Tokyo’s explosive COVID-19 case rates quickly vindicated the other side of the debate, which regarded the Olympics as an irresponsible public relations show that placed the International Olympic Committee’s (IOC) image over the health and safety of the host country’s people. Indeed, the Japanese public was overwhelmingly opposed to hosting the event.

Once the Games began, COVID-19 remained omnipresent—but receded to the media narrative’s background. By 2021, strict pandemic precautions had become so ubiquitous that masking, testing, and distancing protocols were hardly newsworthy.

After the mid-competition withdrawal of U.S. gymnastics superstar Simone Biles, solidarity supplanted rivalry as the Games’ defining theme. This was a new phenomenon. Athletes openly condemned coaches and federations for neglecting their well-being in favor of hardware and prestige as mental health became a topic of intense scrutiny. Meanwhile, Belarusian sprinter Krystsina Tsimanouskaya made headlines for refusing to return home after a spat with Belarusian officials, instead seeking refuge in Poland through a dramatic sequence of events at Tokyo’s Haneda Airport.

All this arguably precipitated an identity crisis for the Olympics, which derives much of its popularity from the hype of interstate competition. That national boundaries between athletes seemed to recede—or, at the very least, change shape—pushed media coverage into a new realm. Time will tell whether the Games themselves can become a more inclusive space for those who eschew—or do not find themselves represented in—the rigidities of competition between nation states.


Here are five of the best Foreign Policy pieces chronicling how the tumultuous Tokyo Olympics moved from would-be COVID-19 catastrophe to an inflection point on the role of nationalism in sports.

1. SUGA’S OLYMPIC-SIZED GAMBLE

by Kazuhiro Maeshima, June 22

To host or not to host the Tokyo Olympics: That was the impossible question consuming then-Japanese Prime Minister Yoshihide Suga for the better part of the past year.

It’s also not one he was supposed to have to answer. Suga took over from long-serving Japanese Prime Minister Shinzo Abe in September 2020 after the latter stepped down due to health concerns. Immediately, the Olympics became the issue that would seal Suga’s fate. As Kazuhiro Maeshima, dean of Sophia University’s Faculty of Global Studies, writes, the new premier was faced with a no-win scenario. Although Japanese public opinion was staunchly opposed to the Games, canceling would have likely also precipitated political turmoil due to its financial costs alone.


“If political tightrope walking were an Olympic sport, the prime minister would no doubt be competing in that event. … And he won’t have a net to catch him if he falls,” Maeshima writes.

Suga’s first (and, it turns out, only) G-7 summit may have helped with his decision-making. The group of leading economies seemed to be in concert with the IOC, which exerted a notoriously ruthless pressure campaign on Japan in the months leading up to the Tokyo Olympics. That international backing ended up—perhaps inevitably, as Maeshima hints—bringing about Suga’s political demise at home. In October, he gave way to Fumio Kishida, the new leader of Japan’s ruling Liberal Democratic Party.
2. THE PATHOLOGICAL OBSESSION WITH MOVING THE OLYMPICS

by David Clay Large, Aug. 2

“While the circumstances behind and motivations for the cancellations, boycotts, and violent attacks [of the Olympics] have varied over the years,” writes David Clay Large, a fellow at the Institute of European Studies, “the Olympic movement has been rendered more vulnerable to such intrusions by one key dimension in its DNA: mobility.”

We may take for granted that the Olympics travel among a rotating list of host cities every four years, but their ambulatory nature is relatively new. The ancient Greek Games took place in one location—Olympia—but even upon the Games’ modern revival in Athens in 1896, it was not a given that they would deviate from this single-host tradition. Despite ample pushback within the nascent IOC, the organization’s then-president, Pierre de Coubertin, was adamant that the Games find a new host every cycle. As he said at the time: “The sole means of assuring the [modern] Games’ success and of rendering them as splendid and brilliant as possible consists in giving them a great variety of aspect.”


Although the “traveling Olympic circus,” as Large puts it, has enriched those at the very top, it has proved a headache for almost everyone else involved in its execution. Ultimately, Large writes, venue variation has “damaged the Olympic project’s (aspirational) image as a promoter of international harmony and goodwill, as well as its viability as a commercial enterprise, which, despite pious claims to the contrary, it most certainly is.”
3. NATIONALISM IS UNDERRATED BY INTELLECTUALS

by Stephen M. Walt, Aug. 3

One reason the Olympics have become such a lucrative venture is that nationalism sells. Nationalism “runs rampant throughout the entire proceeding,” FP’s Stephen M. Walt writes. “Televised coverage is relentlessly jingoistic (at least in the United States), and every broadcast repeats the latest medal count as if this was a revealing indicator of national merit.”

Walt, a Harvard professor and self-described “hard-nosed realist who … should be at least partly immune to this sort of reflexive patriotism,” admits he is, in fact, not. He chalks up this innate attachment to the deep human desire to feel connected to something larger—a void that has come to be filled by nations in the modern era.


And while there is no shortage of examples showing how nationalism has been used for deleterious ends, Walt acknowledges that, throughout the course of his career, he has come around to the concept. “Lately, I’ve been thinking we could use a bit more of it in the United States,” he writes. To Walt, nationalism at its best is not “self-serving and historically ignorant” but rather a “‘we’re all in this together’ spirit.”
4. ATHLETES ARE POST-NATIONAL NOW

by Allison Meakem, July 23

While Walt has shown that nationalism remains alive and well among the billions of people who tune into the Olympics on TV, it may be waning in the actual arenas of competition.

Prior to the opening ceremonies in Tokyo, I had been tracking “an emerging fissure between athletes and the national federations they represent” in women’s gymnastics, spurred largely by the USA Gymnastics sex abuse scandal. I toyed with the thesis that the U.S. female gymnasts headed to Japan might represent the first “post-national” athletes, their voices and destinies entirely separate from—if not more powerful than—the beleaguered federation whose logo appeared on their leotards. This new dichotomy was a result of three main factors: individual and collective athlete trauma, concurrent social movements, and the internet.

Although I focused on gymnastics, I found that my framework held strong across many women’s sports. Japanese tennis star Naomi Osaka’s dispute with the Women’s Tennis Association—in which she forfeited a competition to prioritize her mental health—made waves globally. And, as I wrote, “perhaps nothing is more emblematic of the post-national era of sport than the irony that rooting for female U.S. soccer players has often meant rooting against U.S. Soccer”—the federation that continues to deny them equal pay.

But the single most important test of my thesis was the Olympics themselves. I had theorized that, in the long term, the emergence of post-national athletes would lead to a sidelining of competitive merit in favor of inter-athlete solidarity, but I did not expect it to happen this soon; in fact, I (reflexively and overconfidently) predicted the United States would easily win the women’s gymnastics team competition and Biles would retain her all-around title.


When she instead withdrew from the team competition for the sake of her mental well-being, U.S. female gymnasts single-handedly shifted the Games’ entire discourse, proverbially shedding the stars and stripes to make the event a more inclusive one for all. On the competition floor, they audibly cheered for their Russian counterparts. And they challenged jingoistic narratives. When asked by NBC about how Biles’s absence would affect U.S. prospects going forward, U.S. gymnast Sunisa Lee put it best: “We don’t owe anybody anything. … We’re the ones who had to go through all of this.”
5. GAMING THE STATE SYSTEM

by Josh Kron, Aug. 22

The post-national athletes competing in Tokyo still occupy a relatively privileged position: Although they may personally feel detached from their national colors, their international competition is still made possible by citizenship in a U.N. member state or non-self-governing territory, each of which has its own National Olympic Committee (NOC).

But there is a whole class of athletes who are post-national in another way: members of Indigenous nations that remain unrecognized by the rigid state system they long preceded. Journalist Josh Kron charts the quest of the Haudenosaunee Confederacy—which straddles parts of modern-day Canada and the United States—to gain a berth at the 2028 Summer Olympics in Los Angeles. The year holds particular significance for the Haudenosaunee because it is when lacrosse—the game they invented millennia ago before it was appropriated by white settlers—returns as an Olympic sport.

The Haudenosaunee’s fight for United Nations recognition is not new. “The Haudenosaunee Confederacy was and is highly diplomatic,” Kron writes. “When the United Nations was founded in 1945, the Haudenosaunee traveled to San Francisco for the new world body’s creation.” A 1961 study of U.N. procedure, however, claimed Haudenosaunee membership “could not be entertained.”

In the years thereafter, the Haudenosaunee founded their own national lacrosse team—the Iroquois Nationals—which was quickly able to top global rankings once allowed into the International Lacrosse Federation. But “the Iroquois Nationals’ path to the playing field has been fraught, vexed by the same bylaws of the international system that have long stymied the Haudenosaunee’s own quest for formal recognition,” Kron writes.


Time will tell whether that system sanctions a Haudenosaunee NOC in time for 2028. As Kron argues, doing so may be an opportunity for the U.N. to back its celebrations of Indigenous issues with action. But the organization also has to contend with likely pushback from settler-colonial states, such as Australia, Canada, New Zealand, and the United States—which may be afraid of what such an expression of Indigenous sovereignty might inspire.
The Language of Lenin


In 1924, a group of linguists published a study which aimed to decode the power of Lenin's language – today, a newly-translated version sheds light on the contributions words can make to revolutionary politics.



Vladimir Ilyich Lenin (1870-1924) makes a celebratory speech as head of the first Soviet government in Red Square on the first anniversary of the 1917 Russian Revolution. (P. Otsup / Getty Images)


By Edmund Hardy
26.12.2021

Struggles over words are political struggles. Socialists have to battle over the reality of phrases such as ‘class struggle’ or ‘anti-racism’, or else they have to cede them to the abstractions and reversals sent to smother these words by the Right. But if momentum is lost or gained in language as much as it is on the streets or in campaigns, can theories of speech and literary form help us understand these processes and sharpen left tactics in these battles?

Rab-Rab Press’s recent Coiled Verbal Spring: Devices of Lenin’s Language presents some answers, and shows how poetry as a form connects to this struggle for words. It translates into English a special 1924 issue of LEF on Lenin’s speeches and texts, alongside a 1928 pamphlet by the poet Alexei Kruchenykh on the same subject. The LEF (Left Front for the Arts) group usually focused on poetry; its co-editors included the major Futurist poet Vladimir Mayakovsky and the Formalist theorist of poetic language Osip Brik.

The LEF group grappled with how new literary forms could be constructed to break with the old; their debates centred on how language could be reloaded for permanent revolution. As Sezgin Boynik’s introduction to Coiled Verbal Spring makes clear, poetry was considered primary because it set itself against everyday speech—the preserve of the old way of life, captured in the Russian concept of ‘byt’. Byt doesn’t have a direct translation, but it was conceived of as an immobile, reactionary force, a petty bourgeois pull back into the inertias of capitalist culture, or what the Futurist Sergei Tretyakov described as the ‘rockfish of everyday life which pressed from the past’. Every element of revolutionary life needed to pull away from it.

The sharpest of these writers, Boris Arvatov, argued that the dialectical relationship between poetry and practical language could therefore be radicalised if poetry forged ahead as a communist avant-garde. But, Osip Brik further argued, the resulting experiments needed to be tested in the fire of class struggle. A further intended blow against the language of byt was Zaum, a new language developed by Futurists Alexei Kruchenykh and Velimir Khlebnikov. Roman Jakobson described the results as ‘mustard without the dish’. A taste of a future revolutionary banquet for all, then, or as Kruchenykh put it: ‘We learned how to look at the world backward.’

The special issue on Lenin’s language, published after his death in 1924, was an attempt to apply these ideas, derived from the study of poetic language, to the highly practical language of revolutionary politics. Yury Tynyanov highlights how Lenin’s language quotes and mocks bourgeois and rightist abstractions. Adding to this, Victor Shklovsky sees Lenin’s radical style as a ‘decanonising force’, cutting words down to the right measure—clear statements of necessity which can carry ‘schemes for life’, in Lenin’s phrase.

Building on Tynyanov’s position, Boris Eikhenbaum discerns an intermediary form between poetic and practical language through a close reading of Lenin’s 1918 text ‘The Chief Task of Our Day’, finding that the Tolstoyan rhetoric of the old intelligentsia mixes there with Latin oratorical style and Russian colloquialisms, all of which are broken at the point Lenin draws on them. This is a strategy of acknowledging historical currents in order to redirect or end them. Without losing either its oppositional power or its futurist edge, poetry here is compacted into a co-ordinate for political language as action. Eikhenbaum argues: ‘words were not Lenin’s profession or career, but his real deed.’

What can be done when right-wing speech embodies the positions and structures of socialism in order to hollow them out and render actual progressive voices meaningless? When the Right can claim ‘we are the real anti-racists’, or Fascist groups say ‘we are beyond politics’, these reversals and positionings need to be broken apart before they solidify. Lenin was consistent in warding off the abstract weakening of precise left concepts—for example, saying ‘the word “commune” is being used too freely.’ If the Left tries to imitate right-wing strategies by occupying right-wing rhetoric—on immigration or prisons, for instance—then a collapse of political meaning occurs, partly because left movements propose a break from old or reactionary forms of hierarchy or control and from the language which expresses and subjectivises them.

There is a final loop in the coordinates of how new forms are made which Coiled Verbal Spring touches on. Uprisings and social change everywhere produce new language. ‘The revolutionary masses brought into everyday use expressions that had never before existed,’ notes Kruchenykh in his pamphlet, which reflects on Lenin four years after the LEF publication. ‘Lenin, in all his ingenious sensitivity, detected this stream that invigorated the language and solidified it in his speeches and writings.’ Left politics can work to expand the space for socialism in language now, as a way to counter byt and all the reactionary forms gathering to take back and enclose the world.

About the Author
Edmund Hardy is a filmmaker and writer based in London.
Christmas with the International Brigades


James R Jump was one of thousands of British volunteers to spend Christmas 1937 fighting fascism in Spain. In a diary entry, he remembers festivities on the frontlines with the International Brigades.


By James R Jump
25.12.2021

Aged twenty-one, James R Jump gave up his job as a journalist and went to Spain to fight General Franco’s fascists in the Spanish Civil War of 1936-’39. Leaving his Spanish fiancée in England, he joined the International Brigades and was mentioned in despatches for bravery during the Battle of the Ebro. He would later go on to write for Tribune and many other radical publications.

In this extract from his recently published memoir, he describes Christmas 1937 at the Tarazona base.

A few days before Christmas I was on duty in the orderly-room at about half past ten at night when the phone rang. It was someone ringing up from the headquarters of the Assault Guards in Tarazona. ‘Have you heard the news, comrade?’

‘What news?’

‘We’ve captured Teruel.’

‘What?’ I couldn’t believe my ears.

‘Yes, it’s true. Teruel is in our hands. We have just heard the news on the radio.’

So I had the pleasure of telling Major Allan Johnson [the base commander] that Teruel had been liberated by the Spanish People’s Army. He blinked, put on his glasses and ordered me to take a bottle of brandy and two glasses from a cupboard so that we might toast the success.

The news spread through Tarazona like wild-fire and produced a state of euphoria. What would the Republic’s next objective be – Saragossa? Every soldier became a military expert and worked out plans for the next offensive.

Christmas Day was a holiday and the news from Teruel certainly gave us something to celebrate. We had a ration of cigarettes and each British volunteer received a parcel from Charlotte Haldane’s International Brigade fund, and the Americans and Canadians also received parcels from similar organisations in their countries. Each of our parcels contained tea, sugar, cocoa, chocolate, cigarettes, toothpaste, razor-blades, two Penguin novels and a Christmas card signed by Harry Pollitt.

In the dining room, which was decorated with political slogans for the occasion, we had a special meal – soup, half an egg, boiled fowl with potatoes and stewed plums and custard. And a double ration of wine!

In the evening a party of us had supper with a Spanish family. We had given them the money and they provided us with roasted goat served with red peppers and tomatoes, fresh bread and wine. I sat next to the oldest member of the family who looked 90 but was probably not much over 60. He asked me the question I so often heard in Tarazona: ‘Are you an American from England or from America?’

‘I’m from England.’

‘I have a son who used to live abroad.’

‘Oh yes? Where did he live?’

‘In Bilbao.’

I found that most of the old folk had little knowledge of Britain or America, and to try to explain that a Scot is not an Englishman was to attempt the impossible.

After supper we returned along the dark streets to the HQ, where a sing-song was in progress. After a time the party became rather noisy and hectic, especially when Allan Johnson, who usually kept to himself, appeared with a bottle of cognac and two bottles of rum.

Thus, we had one day’s rest. On Boxing Day reveille sounded at 5.30am and life returned to normal, except that not a few men had hangovers.

This is an extract from ‘The Fighter Fell in Love: A Spanish Civil War Memoir’ by James R Jump (The Clapton Press, 2021).

About the Author
James R Jump was a journalist and a volunteer with the International Brigades to Spain.