
BOOKS

Abraham Lincoln’s Radical Moderation
What the president understood that the zealous Republican reformers in Congress didn’t


ANDREW FERGUSON MARCH 2020 ISSUE

Congress at War: How Republican Reformers Fought the Civil War, Defied Lincoln, Ended Slavery, and Remade America 


BY FERGUS M. BORDEWICH
KNOPF

In the opening days of the Civil War, long before Saturday Night Live appropriated the idea, Louis Trezevant Wigfall earned the distinction in Washington, D.C., of being the Thing That Wouldn’t Leave. Elected to the United States Senate from Texas to fill a vacancy in 1859, Wigfall wasted no time in making himself obnoxious to his colleagues and the public alike. He was lavish in his disdain for the legislative body in which he had sought a seat. On the Senate floor, he said of the flag and, especially, the Union for which it stood, “It should be torn down and trampled upon.” As the southern states broke away, Wigfall gleefully announced, “The federal government is dead. The only question is whether we will give it a decent, peaceable, Protestant burial.”

By then Wigfall had been appointed to the Confederate congress, and the only question that occurred to many of his colleagues was why he was still bloviating from the floor of the U.S. Senate. Wigfall was worse than a mere gasbag. As Fergus M. Bordewich points out in his provocative new book, Congress at War, he “passed on military information to his southern friends, bought arms for the Confederacy, and swaggered around encouraging men to enlist in the secessionist forces.” At last, in March 1861, Wigfall quit the U.S. capital and showed up a few weeks later in South Carolina. Commandeering a skiff after Confederate batteries opened fire on Fort Sumter, in Charleston Harbor, he rowed out to present terms for the fort’s surrender. He had no authorization to do such a thing; he was simply following his passion to make trouble and get attention. He went down in history as a triple threat: a traitor, a blowhard, and a shameless buttinsky.


From April 1861: A Connecticut Yankee visits Charleston during the Fort Sumter standoff

Wigfall, one of the many strange and colorful characters tossed up by the politics of the Civil War, typifies the time in important respects. The years leading to the Civil War, and the war itself, were political intensifiers; radicalism was rewarded and could be made to pay. This was as true of the Republican reformers who are the heroes of Bordewich’s book as it is of secessionists like Wigfall.

Bordewich’s ungainly subtitle—How Republican Reformers Fought the Civil War, Defied Lincoln, Ended Slavery, and Remade America—telegraphs the grand claims he sets out to make for a group of congressmen who mostly styled themselves as Radical Republicans. In his account, it is they who pressed for aggressive military campaigns when the will for war flagged among Abraham Lincoln’s generals; who invented the financial mechanisms that funded the war; who pushed for punitive measures against the southern slaveholders; and who deserve credit (or blame!) for the birth of big government—achievements more commonly attributed to their far less radical president. A popular historian and journalist blessedly free of academic affiliations, Bordewich is a master of the character sketch, summarizing complicated figures in a few swift phrases. But Lincoln himself never comes alive in his pages. Indeed, he scarcely appears. He lurks just offstage, stepping forward now and then to try, briefly and usually without success, to stymie the righteous zeal that propels the Radicals. The last line of the book declares that “a whole generation of politically heroic Republicans … led Congress to victory in the Civil War.” It’s an odd formulation—you probably thought the North won the war.

From June 1865: The place of Abraham Lincoln in history

Bordewich has chosen to tell his sprawling story of legislative activism and ascendancy mainly through four members of Congress: Senators Benjamin Wade of Ohio and William Pitt Fessenden of Maine, and Representatives Thaddeus Stevens of Pennsylvania and Clement Vallandigham of Ohio. Vallandigham is the only Democrat, a leader of an anti-war faction whose preference for the Union was complicated by his pro-slavery sympathies. The rest are Republicans, and two of them, Stevens and Wade, proudly called themselves Radicals and behaved accordingly. Fessenden, at one time a conservative, grew more sympathetic to the Radicals’ aims as the war dragged on.

Congressional power fell in the lap of Republicans, thanks to the departure of Wigfall and his southern colleagues; their seizing of it seems, in retrospect, less a matter of superior gamesmanship than a law of political gravity. Calling for stronger prosecution of the war, immediate liberation of the enslaved, and confiscation of all property owned by the southern belligerents, Radicals quickly took control of the Republican caucus. Perhaps, Bordewich writes, the Radicals “have something to teach us about how our government can function at its best in challenging times, and how crisis may even make it stronger.” Lesson No. 1: Get most of your opponents to leave town before you try anything.

The Radicals were quick on their feet, exploiting national turmoil to break a legislative logjam. For decades Southern Democrats, their numbers swollen by the Constitution’s infamous three-fifths clause, had blocked a series of domestic programs proposed first by the Whigs and then by their Republican successors. Here was the chance to neutralize the Democratic aversion to centralized power and advance a collectivist vision of the commercial republic, laying the foundation, Bordewich writes, “for the strong activist central government that came fully into being in the twentieth century.”

The flurry of legislating was indeed “transformative,” as Bordewich says. He points in particular to four pieces of legislation as landmarks. The Homestead Act promised 160 acres of federal land to any citizen willing to live on it and farm it for five years. The Pacific Railway Act financed the transcontinental railroad and further opened up the western territories to white settlement. The third bill created the federal Department of Agriculture. And the Morrill Land Grant College Act would distribute federal land to states and localities for the purpose of building public institutions of higher learning dedicated to teaching agriculture and other practical arts—a miracle of democratization in the history of American education.

Yet in Bordewich’s telling, Lincoln had little to do with the ambitious measures, as if the bills were signed by autopen during coffee breaks. In fact, two of them were explicitly endorsed in the Republican platform that Lincoln ran on in 1860; he made a special plea for the Department of Agriculture in his first annual message to Congress. Bordewich also downplays the inevitable unintended consequences that accompany government expansion, even what seem to be the most benign reforms. The railway act, with its crony capitalism and funny-money bond issues, led straight to the Gilded Age and the creation of half a dozen robber-baron fortunes. Those “federal lands” that Washington gave away in the railway and homestead acts were not, except in the sneakiest sense, the federal government’s to give away; the land rush they touched off may have guaranteed the otherwise merely predictable genocide of the Native Americans already living there.

In the name of designating the Radicals as the forerunners of contemporary liberalism, Bordewich tries to draw a continuous line from the Civil War Congress to the New Deal and the Great Society. Yet the line has too many zigs and zags and ups and downs to clinch a causal connection. And in fact, many of the features of big government (19th-century style) fell away before long. Calvin Coolidge, for instance, 60 years after the Civil War and a few years before the New Deal, oversaw a federal government that was in most respects closer in size and scope to the antebellum government than to the modern state that was soon to emerge.

From July/August 2009: Christopher Hitchens on Lincoln’s emancipation

If Bordewich oversells the legacy of the Radicals in Congress, his more fundamental misapprehension lies elsewhere: His version of events shortchanges the greatness that humanists of all stripes—not only historians—have found in Lincoln. The problem is partly a failure to appreciate that the Radicals were kibitzers, as many legislators are. But misjudging Lincoln’s role as executive and his commitment to larger obligations is Bordewich’s more telling mistake. Lincoln the executive shouldered the responsibility to lead an entire government and, just as important, an unstable political coalition. From Radicals to reactionaries, Republicans were held together by a single strand: a hostility, varying in degree, to slavery. A collapse of this delicate alliance—brought on by a sudden call for immediate, nationwide abolition, for instance—would have doomed the war effort.

Lincoln was required to be more cautious than a Radical congressman had to be—more serious, in a word. Bordewich credits the Radicals with forcing Lincoln year by year to pursue the war more savagely, culminating in the elevation of General Ulysses S. Grant in 1864. But his evidence is thin that Lincoln paid anything more than lip service to the Radicals’ pleas for bloodshed. Bordewich is a particular admirer of the Joint Committee on the Conduct of the War—“this improvised vigilante committee,” Lincoln called it, “to watch my movements and keep me straight.” It was put together by Benjamin Wade and stocked with his fellow Radicals.

The committee researched and rushed into print massive reports after failed and sometimes catastrophic military engagements. The accounts totaled millions of words and accused officers and bureaucrats of horrifying lapses in military judgment and execution. Some of the accusations were implausible; others were all too real. Historically, the reports are invaluable. At the time, however, their primary effect was to second-guess generals disliked by the committee’s majority and to advance the generals with whom the majority was politically aligned. The committee’s “greatest purpose,” Lincoln told a friend, “seems to be to hamper my action and obstruct military operations.”

Shelby Foote, in his history of the Civil War, tells a story that illustrates why Lincoln and the Radicals were destined to be so often at odds. One evening Wade rushed to the White House to demand that Lincoln fire a weak-willed general who had failed to press the Union advantage. Lincoln asked Wade whom he should enlist to take the general’s place. “Anybody!” Wade cried. “Anybody will do for you,” Lincoln replied, “but I must have somebody.” Lincoln had to be serious.

As Bordewich concedes, the Radicals were as bloody-minded as the Wigfalls of the world. “Nothing but actual extermination or exile or starvation will ever induce [southern rebels] to surrender,” Stevens once said, in a speech Bordewich doesn’t quote. There can, of course, be no moral equivalence between Stevens and a slavery apologist like Wigfall. One of them was on the side of the angels, and it wasn’t Wigfall. But both were radicals.

Radicalism is more than a packet of views or policies. The contents of the packet will change with circumstances and over time. (One reason Bordewich admires the Radical Republicans is that their views on race are so close to current mainstream attitudes; today’s radicals, valorizing group identity above all else, will likely find both the views and the politicians who held them hopelessly retrograde.) Radicalism is a disposition. The same is true of its contrary, moderation. Lincoln’s moderation was so infuriating to the Radicals because it reflected a hierarchy of values different from theirs.

The ultimate concerns for Stevens and his fellows were the liberation of the enslaved, the punishment of the enslavers, and the reorganization of southern society. The ultimate concern for Lincoln was the survival of the Union, to which he had an almost mystical attachment. The old question—was the war fought to preserve the Union or to free the slaves?—underestimates how closely the two causes were entwined in his mind. Lincoln’s goal was to uphold the kind of government under which slavery could not in the end survive. This was a government, as Lincoln said, dedicated to a proposition.

From September 1999: Lincoln’s greatest speech

In a hectoring letter written at a low point in 1863, a Radical senator insisted that Lincoln “stand firm” against conservatives in his government. It was a common complaint of the Radical Republicans that Lincoln was hesitant, easily led, timid—weak. “I hope to ‘stand firm’ enough to not go backward,” Lincoln replied, “and yet not go forward fast enough to wreck the country’s cause.” Lincoln struck this balance with unmatched skill and sensitivity.

It was a feat of leadership peculiar to self-government, captured most famously by the only 19th-century American who could rival him as a prose artist and a statesman. Frederick Douglass was an enthusiastic admirer of Lincoln, once calling him, not long after the assassination, “emphatically the black man’s president: the first to show any respect for their rights as men.” Years later, Douglass’s enthusiasm had cooled—and ripened.

From December 1866: Frederick Douglass on Reconstruction

Lincoln “was preeminently the white man’s President, entirely devoted to the welfare of white men,” Douglass now said. “Viewed from the genuine abolition ground”—the ground, that is, from which Bordewich and many of today’s historians want to judge him, and the ground from which the Radicals did judge him—“Mr. Lincoln seemed tardy, cold, dull, and indifferent.” Douglass knew, though, that Lincoln never claimed to govern as an abolitionist, and Douglass knew why. “But measuring him by the sentiment of his country, a sentiment he was bound as a statesman to consult, he was swift, zealous, radical, and determined.”

The italics are mine, but the insight belongs to Douglass. Lincoln was radical without being a Radical—and never more radical than a leader can afford to be when he leads a government of, by, and for the people.


ANDREW FERGUSON is a staff writer at The Atlantic. He is the author of Fools’ Names, Fools’ Faces; Land of Lincoln; and Crazy U: One Dad’s Crash Course on Getting His Kid Into College.



The Questions Sex-Ed Students Always Ask

For 45 years, Deborah Roffman has let students’ curiosities guide her lessons on sexuality and relationships.



Editor's Note: In the next five years, most of America’s most experienced teachers will retire. The Baby Boomers are leaving behind a nation of more novice educators. In 1988, a teacher most commonly had 15 years of experience. Less than three decades later, that number had fallen to just three years leading a classroom. The Atlantic’s “On Teaching” project is crisscrossing the country to talk to veteran educators. This story is the eleventh in our series.

About 25 years ago, a public school in the Baltimore suburbs invited Deborah Roffman to teach a class on puberty to fifth graders. Roffman, who was known as the “Sex Lady” at the private Park School of Baltimore, where she had been teaching for two decades, was flattered. But she was troubled by the restrictions that the public school’s vice principal had given her: She couldn’t use the words fertilization, intercourse, or sex. And she couldn’t answer any student questions related to those subjects. That wasn’t going to work for the Sex Lady.

Eventually, Roffman reached a compromise with the public school: Students would get parental permission to attend her talk, and Roffman could answer any question they asked, even if it meant using the S-word.

Roffman’s title of human-sexuality educator has not changed since she arrived at the Park School in 1975, but the dimensions of her role there have steadily grown. So, too, has her outside work in consulting and teacher training: Over the years, she has advised at nearly 400 schools, most of them private.

Initially, Roffman taught elective classes in sexuality to the juniors and seniors at Park, but within two years, she had expanded to seventh and eighth graders. In the 1980s, she added fourth and fifth graders to her roster. She also meets annually with the parents of students as young as kindergartners, to coach them on how to talk with their children about sexuality, and she leads summer training for the Park’s elementary-school teachers on incorporating sexuality instruction into their classrooms. “There is this knowledge that we keep in a box about sexuality, waiting until kids are ‘old enough,’” Roffman told me. “My job is to change that.”



During her 45 years of teaching, Roffman has witnessed the evolution of the nation’s attitude toward sex education and, as her experience at the public school shows, how uneven that education can be.

Perhaps more than any other subject, sex education highlights the country’s fierce loyalty to local control of schools. Twenty-nine states require public schools to stress abstinence if they teach about sex, according to the latest count by the Guttmacher Institute, a think tank based in Washington, D.C., and New York that promotes reproductive rights. Some of the more outrageous abstinence lessons employ troubling metaphors, such as comparing sexually active, unmarried women to an old piece of tape: useless and unable to bond. Only 17 states require sex education to be medically accurate.

Most research has found that sex education for adolescents in the United States has declined in the past 20 years. Like art and music, the subject is typically not included on state standardized exams and, as the saying goes, “what gets tested gets taught.” In the case of sex education, waning fear about the spread of HIV and AIDS among heterosexual youths has contributed to the decline in instruction, says John Santelli, a professor at Columbia’s Mailman School of Public Health.

But some bright spots do exist, says Jennifer Driver, the vice president of policy and strategic partnerships at the Sexuality Information and Education Council of the United States. For example, in some parts of Mississippi and Texas, there has been a shift away from “abstinence only” to “abstinence plus” curricula, with the latter permitting at least some information about contraception.

Roffman remembers her own sex education while growing up in Baltimore as being limited to a short film in fifth grade about periods and puberty. She began working in sex ed in 1971—when access to birth control was rapidly expanding amid the sexual revolution—helping Planned Parenthood train health-care professionals who were setting up family-planning clinics in the region, and doing broader community outreach.

Four years later, she followed her Planned Parenthood supervisor to the progressive Park School, where students often address teachers by their first name and current tuition runs about $30,000 a year. When she arrived that spring, she heard that the senior-class adviser had recently rushed into the upper-school principal’s office, exclaiming that something had to be done before the seniors’ graduation, because “we forgot to talk to them about sex.”

Read: The case for comprehensive sex ed

During the next several years, Roffman not only made sure the school remembered to talk to students about sex but steadily built up the curriculum. At Park, students learn about standard fare like birth control and sexually transmitted diseases but also delve into issues such as the history of abortion rights, changing conceptions of gender roles, and how to build respectful, intimate relationships.

Students start by learning about the reproductive systems, the importance of open communication, and the fundamentals of puberty in their first classes with Roffman, in the fourth and fifth grades. In seventh grade, they take a deep-dive course on human sexuality, covering everything from pornography to the use of sex in advertising to gender identity and sexual orientation. They see her again for a shorter, related course in eighth grade. During the 2016 presidential campaign, Roffman’s seventh graders spent most of a semester researching the candidates’ differing views on sex, gender, and reproduction. “In the process of doing that, I got to teach about every topic I wanted to teach about,” she said.

In high school, students take a required sexuality-studies seminar. The specific content varies year to year, but it’s always based on what Roffman calls the “eight characteristics of a sexually healthy adult,” which include staying healthy, enjoying pleasure, and relating to others in caring, nonexploitative ways.

The through line of her approach, at any age, is letting students’ queries guide her instruction. So she asks her students to submit anonymous questions at the start of the semester, and makes sure that she answers them as the course progresses.

Regardless of whether they grew up in the ’80s or the aughts, kids of certain ages always ask versions of the same questions, Roffman has found. For instance, middle-school students, she said, want to know if their bodies and behaviors are “normal.” Many older students ask her at what age it’s normal to start masturbating.

High schoolers routinely ask about romantic communication, relationships, and the right time for intimacy: “Who makes the first move?” “How do you know if you or the other person is ready for the ‘next level’?” “How can you let someone down easy when you want to break up?”

But some contemporary questions, Roffman said, are very different from those she heard earlier in her career. Sometimes the questions change when the news does. (More than 30 years ago, Roffman started reading two newspapers a day to keep up with the rapid pace of news about HIV and AIDS; she’s maintained the habit since.)

She said she received a flood of questions about sexual harassment after the Senate confirmation hearings for Supreme Court Justice Clarence Thomas, in the early 1990s. The same decade ended with a spike in student interest in oral sex and behaviors that had previously been considered more taboo, such as anal sex.

Sometimes changing student questions signal broader cultural shifts, like the recent surge in student queries about gender identities. “There would have been questions 20 years ago about sexual orientation, but not about gender diversity,” Roffman said. But one recent eighth-grade cohort submitted questions like “How many genders are there?” “What does ‘gender roles’ mean?” “What is the plus sign for in LGBTQIA+?” and “Why is ‘gay’ called ‘gay’?” She finds a way to answer them all.

Read: What schools should teach kids about sex

Roffman’s students appreciate her blunt and holistic approach. As a sixth grader at a charter school several years ago, Maeve Thistel took a brief unit in sex education. The teacher seemed uncomfortable and nervous, she remembers. The condoms the teacher brought for a demonstration were expired, and split when she took them out of the package. Thistel came away from the class with the impression that sex was both “icky and disturbing.”

Thistel, now a college freshman, transferred to Park for high school, where she found that Roffman presented some of the same material quite differently: Her very first step in the lesson on condoms was to point out that all of them have an expiration date that should be noted and heeded.

Under Roffman’s guidance, sexuality at Park has come to be treated as something closer to social studies, science, or other core subjects. Sex ed is “just another part of the curriculum, not carved out as its own special thing,” says David Sachs, a 1988 graduate who studied with Roffman and whose son, Sebastian, is now in 11th grade at the school and has her as a teacher as well.

Like all Park students, Sebastian Sachs had to complete an eighth-grade project wherein he examined the root cause of a social-justice issue. His team picked sexual assault and, with Roffman as their adviser, focused on consent education and how to introduce it in the youngest grades. Sachs and his teammates created a curriculum for preschoolers that, among other things, encourages them to ask permission before hugging a classmate, borrowing a pencil, or swooping in for a high five.

In Roffman’s ideal world, the school would implement lessons like these, and other age-appropriate sex and relationship education, from the earliest grades. Several of her co-workers agree. “Fourth grade might be too late for us” to begin this kind of education, says Alejandro Hurtado, Park’s Spanish teacher for the lower grades. Last summer, Hurtado participated in a voluntary two-week workshop led by Roffman that aimed to create a sexuality-education curriculum for Park’s elementary-age kids. “It will be subtly woven in,” he says, noting that he plans to talk more explicitly about traditional gender roles and expectations in some Latino cultures as part of his own class.

In her teacher training, Roffman encourages colleagues to be scientifically accurate and use age-appropriate language when answering even the youngest children’s questions. Four-year-olds are beginning to understand place and geography, so they will frequently ask where they came from. “The proper answer is that there’s a place inside a female body called the uterus, and that’s where they grew,” Roffman said.

Sarah Shelton, a Park third-grade teacher who also participated in the summer workshop, says Roffman inspired her to not dodge students’ questions about bodies and sex. In the past she’s deflected sex-related inquiries, such as when a student asked about birth control last year.

“I told her, ‘Great question. Ask your parents,’” Shelton recalls. “If that were to occur again, I would say something like ‘When reproduction happens in the body, there is medication that you can take to stop it so you can have sexual intercourse without creating a baby.’”

Sarah Huss, the director of human development and parent education at the private Campbell Hall school in Los Angeles, says Roffman helped her rethink her school’s sexuality education. Huss reached out to Roffman after reading her book Talk to Me First: Everything You Need to Know to Become Your Kids’ “Go-To” Person About Sex. The ensuing dialogue prompted Campbell Hall to begin sexuality education in third grade and to significantly shore up its middle-school programming. Prior to meeting Roffman, “I had taught sex education as ‘Don’t get hurt, don’t get pregnant, don’t get a disease,’” Huss says. “That wasn’t a hopeful message for the kids.”

Huss admires her colleague’s patient tenacity. “She’s walking into schools where there is so much emotional baggage around a subject,” Huss says. “To suggest doing it differently, you have to confront years and years and years of thinking that talking with young kids about sex is dangerous.”

After decades of striving for change both within and beyond Park’s walls, Roffman is optimistic about the future of sexuality education at progressive private schools like Campbell Hall and Park. “I’ve always believed that independent schools have the responsibility to give back to the larger educational community,” she told me. “It’s up to us to demonstrate that, yes, this can be done well and successfully.”

By contrast, “I see very limited movement in the public sector,” she said. And in a country where only a minority of states require medically accurate sex-education classes, her dream of seamlessly integrating the subject from kindergarten up may be a long way off. But Roffman has lived through one sexual revolution, and she holds out hope for a second, in education.

This article is part of our project "On Teaching," which is supported by grants from the William and Flora Hewlett Foundation, the Spencer Foundation, the Bill & Melinda Gates Foundation, and the Panta Rhea Foundation.

SARAH CARR leads an investigative education reporting team at the Boston Globe and is the author of Hope Against Hope, about New Orleans schools.
The Problem With Telling Sick Workers to Stay Home

Even with the coronavirus spreading, lax labor laws and little sick leave mean that many people can’t afford to skip work.
As the coronavirus that has sickened tens of thousands in China spreads worldwide, it now seems like a virtual inevitability that millions of Americans are going to be infected with the flu-like illness known as COVID-19. Public-health officials in the United States have started preparing for what the Centers for Disease Control and Prevention is calling a “significant disruption” to daily life. Because more than 80 percent of cases are mild and many will show no symptoms at all, limiting the disease’s spread rests on the basics of prevention: Wash your hands well and frequently, cover your mouth when you cough, and stay home if you feel ill. But that last thing might prove to be among the biggest Achilles’ heels in efforts to stymie the spread of COVID-19. The culture of the American workplace puts everyone’s health at unnecessary risk.

For all but the independently wealthy in America, the best-case scenario for getting sick is being a person with good health insurance, paid time off, and a reasonable boss who won’t penalize you for taking a few sick days or working from home. For millions of the country’s workers, such a scenario is a nearly inconceivable luxury. “With more than a third of Americans in jobs that offer no sick leave at all, many unfortunately cannot afford to take any days off when they are feeling sick,” Robyn Gershon, an epidemiology professor at the NYU School of Global Public Health, wrote in an email. “People who do not (or cannot) stay home when ill do present a risk to others.” On this count, the United States is a global anomaly, one of only a handful of countries that doesn’t guarantee its workers paid leave of any kind. These jobs are also the kind least likely to supply workers with health insurance, making it difficult for millions of people to get medical proof that they can’t go to work.


They’re also concentrated in the service industry or gig economy, in which workers have contact, directly or indirectly, with large numbers of people. These are the workers who are stocking the shelves of America’s stores, preparing and serving food in its restaurants, driving its Ubers, and manning its checkout counters. Their jobs tend to fall outside the bounds of paid-leave laws, even in states or cities that have them. Gershon emphasizes that having what feels like a head cold or mild flu—which COVID-19 will feel like to most healthy people—often isn’t considered a good reason to miss a shift by those who hold these workers’ livelihood in their hands.

Read: You’ll likely get the coronavirus

Even if a person in one of these jobs is severely ill—coughing, sneezing, blowing her nose, and propelling droplets of virus-containing bodily fluids into the air and onto the surfaces around her—asking for time off means missing an hourly wage that might be necessary to pay rent or buy groceries. And even asking can be a risk in jobs with few labor protections, because in many states, there’s nothing to stop a company from firing you for being too much trouble. So workers with no good options end up going into work, interacting with customers, swiping the debit cards that go back into their wallets, making the sandwiches they eat for lunch, unpacking the boxes of cereal they take home for their kids, or driving them home from happy hour.

Even for people who have paid sick leave, Gershon noted, the choices are often only marginally better; seven days of sick leave is the American average, but many people get as few as three or four. “Many are hesitant to use [sick days] for something they think is minor just in case they need the days later for something serious,” she wrote. “Parents or other caregivers are also hesitant to use them because their loved ones might need them to stay home and care for them if they become ill.”

For workers with ample sick leave, getting it approved may still be difficult. America’s office culture often rewards those who appear to go above and beyond, even if that requires coughing on an endless stream of people. Some managers believe leadership means forcing their employees into the office at all costs, or at least making it clear that taking a sick day or working from home will be met with suspicion or contempt. In other places, employees bring their bug to work of their own volition, brown-nosing at the expense of their co-workers’ health.

Read: The gig economy has never been tested by a pandemic

Either way, the result is the same, especially in businesses that serve the public or offices with open plans and lots of communal spaces, which combine to form the majority of American workplaces. Even if your server at dinner isn’t sick, she might share a touch-screen workstation with a server who is. Everyone on your side of the office might be hale and healthy, but you might use a tiny phone booth to take a call right after someone whose throat is starting to feel a little sore. “Doorknobs, coffee makers, toilets, common-use refrigerators, sinks, phones, keyboards [can all] be a source of transmission if contaminated with the agent,” Gershon wrote. She advised that workers stay at least three to six feet away from anyone coughing or sneezing, but in office layouts that put desks directly next to one another with no partition in between—often to save money by giving workers less personal space—that can be impossible. No one knows how long COVID-19 can live on a dry surface, but in the case of SARS, another novel coronavirus, Gershon said it was found to survive for up to a week on inanimate objects.

Work culture isn’t the only structure of American life that might make a COVID-19 outbreak worse than it has to be—the inaccessible, precarious, unpredictable nature of the country’s health-care system could also play an important role. But tasking the workers who make up so much of the infrastructure of daily American life, often for low wages and with few resources, with the lion’s share of prevention in an effort to save thousands of lives is bound to fail, maybe spectacularly. It will certainly exact a cost on them, both mentally and physically, that the country has given them no way to bear.



AMANDA MULL is a staff writer at The Atlantic.

Nuclear Tests Marked Life on Earth With a Radioactive Spike

Even as it disappears, the “bomb spike” is revealing the ways humans have reshaped the planet.



Story by Carl Zimmer

MARCH 2, 2020

On the morning of March 1, 1954, a hydrogen bomb went off in the middle of the Pacific Ocean. John Clark was only 20 miles away when he issued the order, huddled with his crew inside a windowless concrete blockhouse on Bikini Atoll. But seconds went by, and all was silent. He wondered if the bomb had failed. Eventually, he radioed a Navy ship monitoring the test explosion.

“It’s a good one,” they told him.

Then the blockhouse began to lurch. At least one crew member got seasick—“landsick” might be the better descriptor. A minute later, when the bomb blast reached them, the walls creaked and water shot out of the bathroom pipes. And then, once more, nothing. Clark waited for another impact—perhaps a tidal wave—but after 15 minutes he decided it was safe for the crew to venture outside.

The mushroom cloud towered into the sky. The explosion, dubbed “Castle Bravo,” was the largest nuclear-weapons test up to that point. It was intended to try out the first hydrogen bomb ready to be dropped from a plane. Many in Washington felt that the future of the free world depended on it, and Clark was the natural pick to oversee such a vital blast. He was the deputy test director for the Atomic Energy Commission, and had already participated in more than 40 test shots. Now he gazed up at the cloud in awe. But then his Geiger counter began to crackle.

“It could mean only one thing,” Clark later wrote. “We were already getting fallout.”

That wasn’t supposed to happen. The Castle Bravo team had been sure that the radiation from the blast would go up to the stratosphere or get carried away by the winds safely out to sea. In fact, the chain reactions unleashed during the explosion produced a blast almost three times as big as predicted—1,000 times bigger than the Hiroshima bomb.

Within seconds, the fireball had lofted 10 million tons of pulverized coral reef, coated in radioactive material. And soon some of that deadly debris began dropping to Earth. If Clark and his crew had lingered outside, they would have died in the fallout.

Clark rushed his team back into the blockhouse, but even within the thick walls, the level of radiation was still climbing. Clark radioed for a rescue but was denied: It would be too dangerous for the helicopter pilots to come to the island. The team hunkered down, wondering if they were being poisoned to death. The generators failed, and the lights winked out.

“We were not a happy bunch,” Clark recalled.

They spent hours in the hot, radioactive darkness until the Navy dispatched helicopters their way. When the crew members heard the blades, they put on bedsheets to protect themselves from fallout. Throwing open the blockhouse door, they ran to nearby jeeps as though they were in a surreal Halloween parade, and drove half a mile to the landing pad. They clambered into the helicopters, and escaped over the sea.

Read: The people who built the atomic bomb

As Clark and his crew found shelter aboard a Navy ship, the debris from Castle Bravo rained down on the Pacific. Some landed on a Japanese fishing boat 70 miles away. The winds then carried it to three neighboring atolls. Children on the island of Rongelap played in the false snow. Five days later, Rongelap was evacuated, but not before its residents had received a near-lethal dose of radiation. Some people suffered burns, and a number of women later gave birth to severely deformed babies. Decades later, studies would indicate that the residents experienced elevated rates of cancer. 


The shocking power of Castle Bravo spurred the Soviet Union to build up its own nuclear arsenal, spurring the Americans in turn to push the arms race close to global annihilation. But the news reports of sick Japanese fishermen and Pacific islanders inspired a worldwide outcry against bomb tests. Nine years after Clark gave the go-ahead for Castle Bravo, the United States, Soviet Union, and Great Britain signed a treaty to ban aboveground nuclear-weapons testing. As for Clark, he returned to the United States and lived for another five decades, dying in 2002 at age 98.

Among the isotopes created by a thermonuclear blast is a rare, radioactive version of carbon, called carbon 14. Castle Bravo and the hydrogen-bomb tests that followed it created vast amounts of carbon 14, which have endured ever since. A little of this carbon 14 made its way into Clark’s body, into his blood, his fat, his gut, and his muscles. Clark carried a signature of the nuclear weapons he tested to his grave.

I can state this with confidence, even though I did not carry out an autopsy on Clark. I know this because the carbon 14 produced by hydrogen bombs spread over the entire world. It worked itself into the atmosphere, the oceans, and practically every living thing. As it spread, it exposed secrets. It can reveal when we were born. It tracks hidden changes to our hearts and brains. It lights up the cryptic channels that join the entire biosphere into a single network of chemical flux. This man-made burst of carbon 14 has been such a revelation that scientists refer to it as “the bomb spike.” Only now is the bomb spike close to disappearing, but as it vanishes, scientists have found a new use for it: to track global warming, the next self-inflicted threat to our survival.

Sixty-five years after Castle Bravo, I wanted to see its mark. So I drove to Cape Cod, in Massachusetts. I was 7,300 miles from Bikini Atoll, in a cozy patch of New England forest on a cool late-summer day, but Clark’s blast felt close to me in both space and time.

I made my way to the Woods Hole Oceanographic Institution, where I met Mary Gaylord, a senior research assistant. She led me to the lounge of Maclean Hall. Outside the window, dogwoods bloomed. Next to the Keurig coffee maker was a refrigerator with a sign that read “store only food in this refrigerator.” We had come to this ordinary spot to take a look at something extraordinary. Next to the refrigerator was a massive section of tree trunk, as wide as a dining-room table, resting on a pallet.

The beech tree from which this slab came was planted around 1870 by a Boston businessman named Joseph Story Fay near his summer house in Woods Hole. The seedling grew into a towering, beloved fixture in the village. Lovelorn initials scarred its broad base. And then, after nearly 150 years, it started to rot from bark disease and had to come down.

“They had to have a ceremony to say goodbye to it. It was a very sad day,” Gaylord said. “And I saw an opportunity.”

Gaylord is an expert at measuring carbon 14. Before the era of nuclear testing, carbon 14 was generated outside of labs only by cosmic rays falling from space. They crashed into nitrogen atoms, and out of the collision popped a carbon 14 atom. Just one in 1 trillion carbon atoms in the atmosphere was a carbon 14 isotope. Fay’s beech took carbon dioxide out of the atmosphere to build wood, and so it had the same one-in-a-trillion proportion.

When Gaylord got word that the tree was coming down in 2015, she asked for a cross-section of the trunk. Once it arrived at the institute, she and two college students carefully counted its rings. Looking at the tree, I could see a line of pinholes extending from the center to the edge of the trunk. Those were the places where Gaylord and her students used razor blades to carve out bits of wood. In each sample, they measured the level of radiocarbon.

“In the end, we got what I hoped for,” she said. What she’d hoped for was a history of our nuclear era.


For most of the tree’s life, they found, the level had remained steady from one year to the next. But in 1954, John Clark initiated an extraordinary climb. The new supply of radiocarbon atoms in the atmosphere over Bikini Atoll spread around the world. When it reached Woods Hole, Fay’s beech tree absorbed the bomb radiocarbon in its summer leaves and added it to its new ring of wood.

As nuclear testing accelerated, Fay’s beech took on more radiocarbon. A graph pinned to the wall above the beech slab charts the changes. In less than a decade, the level of radiocarbon in the tree’s outermost rings nearly doubled to almost two parts per trillion. But not long after the signing of the Partial Test Ban Treaty in 1963, that climb stopped. After a peak in 1964, each new ring of wood in Fay’s beech carried a little less radiocarbon. The fall was far slower than the climb. The level of radiocarbon in the last ring the beech grew before getting cut down was only 6 percent above the radiocarbon levels before Castle Bravo. Versions of the same sawtoothlike peak Gaylord drew had already been found in other parts of the world, including the rings of trees in New Zealand and the coral reefs of the Galapagos Islands. In October 2019, Gaylord unveiled an exquisitely clear version of the bomb spike in New England.

When scientists first discovered radiocarbon, in 1940, they did not find it in a tree or any other part of nature. They made it. Regular carbon has six protons and six neutrons. At UC Berkeley, Martin Kamen and Sam Ruben blasted carbon with a beam of neutrons and produced a new form, with eight neutrons instead of six. Unlike regular carbon, these new atoms turned out to be a source of radiation. Every second, a small portion of the carbon 14 atoms decayed into nitrogen, giving off radioactive particles. Kamen and Ruben used that rate of decay to estimate carbon 14’s half-life at 4,000 years. Later research would sharpen that estimate to 5,700 years.

Soon after Kamen and Ruben’s discovery, a University of Chicago physicist named Willard Libby determined that radiocarbon existed beyond the walls of Berkeley’s labs. Cosmic rays falling from space smashed into nitrogen atoms in the atmosphere every second of every day, transforming those atoms into carbon 14. And because plants and algae drew in carbon dioxide from the air, Libby realized, they should have radiocarbon in their tissue, as should the animals that eat those plants (and the animals that eat those animals, for that matter).

Libby reasoned that as long as an organism is alive and taking in carbon 14, the concentration of the isotope in its tissue should roughly match the concentration in the atmosphere. Once an organism dies, however, its radiocarbon should decay and eventually disappear completely.

To test this idea, Libby set out to measure carbon 14 in living organisms. He had colleagues go to a sewage-treatment plant in Baltimore, where they captured the methane given off by bacteria feeding on the sewage. When the methane samples arrived in Chicago, Libby extracted the carbon and put it in a radioactivity detector. It crackled as carbon 14 decayed to nitrogen.

Read: Global warming could make carbon dating impossible

To see what happens to carbon 14 in dead tissue, Libby ran another experiment, this one with methane from oil wells. He knew that oil is made up of algae and other organisms that fell to the ocean floor and were buried for millions of years. Just as he had predicted, the methane from ancient oil contained no carbon 14 at all.

Libby then had another insight, one that would win him the Nobel Prize: The decay of carbon 14 in dead tissues acts like an archaeological clock. As the isotope decays inside a piece of wood, a bone, or some other form of organic matter, it can tell scientists how long ago that matter was alive. Radiocarbon dating, which works as far back as about 50,000 years, has revealed to us when the Neanderthals became extinct, when farmers domesticated wheat, when the Dead Sea Scrolls were written. It has become the calendar of humanity.
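The arithmetic behind that clock is worth sketching, if only to show how little it takes. Using the modern half-life figure of 5,700 years mentioned above, the fraction of carbon 14 surviving in a sample fixes its age; the worked numbers below are illustrative, not drawn from the article:

\[
\frac{N}{N_0} = \left(\frac{1}{2}\right)^{t/5700}
\qquad\Longrightarrow\qquad
t = 5700 \cdot \log_2\frac{N_0}{N}
\]

A bone retaining half of the atmosphere’s one-in-a-trillion proportion of carbon 14 would date to roughly 5,700 years; one retaining about 0.2 percent of it works out to \(5700 \cdot \log_2(500) \approx 51{,}000\) years, which is why the method peters out at around the 50,000-year mark.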

Word of Libby’s breakthrough reached a New Zealand physicist named Athol Rafter. He began using radiocarbon dating on the bones of extinct flightless birds and ash from ancient eruptions. To make the clock more precise, Rafter measured the level of radiocarbon in the atmosphere. Every few weeks he climbed a hill outside the city of Wellington and set down a Pyrex tray filled with lye to trap carbon dioxide.

Rafter expected the level of radiocarbon to fluctuate. But he soon discovered that something else was happening: Month after month, the carbon dioxide in the atmosphere was getting more radioactive. He dunked barrels into the ocean, and he found that the amount of carbon 14 was rising in seawater as well. He could even measure extra carbon 14 in the young leaves growing on trees in New Zealand.

The Castle Bravo test and the ones that followed had to be the source. They were turning the atmosphere upside down. Instead of cosmic rays falling from space, they were sending neutrons up to the sky, creating a huge new supply of radiocarbon.

In 1957, Rafter published his results in the journal Science. The implications were immediately clear—and astonishing: Man-made carbon 14 was spreading across the planet from test sites in the Pacific and the Arctic. It was even passing from the air into the oceans and trees.

Other scientists began looking, and they saw the same pattern. In Texas, the carbon 14 levels in new tree rings were increasing each year. In Holland, the flesh of snails gained more of the isotope as well. In New York, scientists examined the lungs of a fresh human cadaver, and found that extra carbon 14 lurked in its cells. A living volunteer donated blood and an exhalation of air. Bomb radiocarbon was in those, too.

Bomb radiocarbon did not pose a significant threat to human health—certainly not compared with other elements released by bombs, such as plutonium and uranium. But its accumulation was deeply unsettling nonetheless. When Linus Pauling accepted the 1962 Nobel Peace Prize for his campaigning against hydrogen bombs, he said that carbon 14 “deserves our special concern” because it “shows the extent to which the earth is being changed by the tests of nuclear weapons.”

Photos: When we tested nuclear bombs

The following year, the signing of the Partial Test Ban Treaty stopped aboveground nuclear explosions, and ended the supply of bomb radiocarbon. All told, those tests produced about 60,000 trillion trillion new atoms of carbon 14. It would take cosmic rays 250 years to make that much. In 1964, Rafter quickly saw the treaty’s effect: His trays of lye had less carbon 14 than they had the year before.

Only a tiny fraction of the carbon 14 was decaying into nitrogen. For the most part, the atmosphere’s radiocarbon levels were dropping because the atoms were rushing out of the air. This exodus of radiocarbon gave scientists an unprecedented chance to observe how nature works.

Today scientists are still learning from these man-made atoms. “I feel a little bit bad about it,” says Kristie Boering, an atmospheric chemist at UC Berkeley who has studied radiocarbon for more than 20 years. “It’s a huge tragedy, the fact that we set off all these bombs to begin with. And then we get all this interesting scientific information from it for all these decades. It’s hard to know exactly how to pitch that when we’re giving talks. You can’t get too excited about the bombs that we set off, right?”

Yet the fact remains that for atmospheric scientists like Boering, bomb radiocarbon has lit up the sky like a tracer dye. When nuclear triggermen such as John Clark set off their bombs, most of the resulting carbon 14 shot up into the stratosphere directly above the impact sites. Each spring, parcels of stratospheric air gently fell down into the troposphere below, carrying with them a fresh load of carbon 14. It took a few months for these parcels to settle on weather stations on the ground. Only by following bomb radiocarbon did scientists discover this perpetual avalanche.


Once carbon 14 fell out of the stratosphere, it kept moving. The troposphere is made up of four great rings of circulating air. Inside each ring, warm air rises and flows through the sky away from the equator. Eventually it cools and sinks back to the ground, flowing toward the equator again before rising once more. At first, bomb radiocarbon remained trapped in the Northern Hemisphere rings, above where the tests had taken place. It took many years to leak through their invisible walls and move toward the tropics. After that, the annual monsoons sweeping through southern Asia pushed bomb radiocarbon over the equator and into the Southern Hemisphere. 
Zoe van Djik

Eventually, some of the bomb radiocarbon fell all the way to the surface of the planet. Some of it was absorbed by trees and other plants, which then died and delivered some of that radiocarbon to the soil. Other radiocarbon atoms settled into the ocean, to be carried along by its currents.

Carbon 14 “is inextricably linked to our understanding of how the water moves,” says Steve Beaupre, an oceanographer at Stony Brook University, in New York.

In the 1970s, marine scientists began carrying out the first major chemical surveys of the world’s oceans. They found that bomb radiocarbon had penetrated the top 1,000 meters of the ocean. Deeper than that, it became scarce. This pattern helped oceanographers figure out that the ocean, like the atmosphere above, is made up of layers of water that remain largely separate.

The warm, relatively fresh water on the surface of the ocean glides over the cold, salty depths. These surface currents become saltier as they evaporate, and eventually, at a few crucial spots on the planet, these streams get so dense that they fall to the bottom of the ocean. The bomb radiocarbon from Castle Bravo didn’t start plunging down into the depths of the North Atlantic until the 1980s, when John Clark was two decades into retirement. It’s still down there, where it will be carried along the seafloor by bottom-hugging ocean currents for hundreds of years before it rises to the light of day.

Some of the bomb radiocarbon that falls into the ocean makes its way into ocean life, too. Some corals grow by adding rings of calcium carbonate, and they have recorded their own version of the bomb spike. Their spike lagged well behind the one that Rafter recorded, thanks to the extra time the radiocarbon took to mix into the ocean. Algae and microbes on the surface of the ocean also take up carbon from the air, and they feed a huge food web in turn. The living things in the upper reaches of the ocean release organic carbon that falls gently to the seafloor—a jumble of protoplasmic goo, dolphin droppings, starfish eggs, and all manner of detritus that scientists call marine snow. In recent decades, that marine snow has become more radioactive.

In 2009, a team of Chinese researchers sailed across the Pacific and dropped traps 36,000 feet down to the bottom of the Mariana Trench. When they hauled the traps up, there were minnow-size, shrimplike creatures inside. These were Hirondellea gigas, a deep-sea invertebrate that forages on the seafloor for bits of organic carbon. The animals were flush with bomb radiocarbon—a puzzling discovery, because the organic carbon that sits on the floor of the Mariana Trench is thousands of years old. It was as if they had been dining at the surface of the ocean, not at its greatest depths. In a few of the Hirondellea, the researchers found undigested particles of organic carbon. These meals were also high in carbon 14.

Read: A troubling discovery in the deepest ocean trenches

The bomb radiocarbon could not have gotten there by riding the ocean’s conveyor belt, says Ellen Druffel, a scientist at UC Irvine who collaborated with the Chinese team. “The only way you can get bomb carbon by circulation down to the deep Pacific would take 500 years,” she says. Instead, Hirondellea must be dining on freshly fallen marine snow.

“I must admit, when I saw the data it was really amazing,” Druffel says. “These organisms were sifting out the very youngest material from the surface ocean. They were just leaving behind everything else that came down.”

More than 60 years have passed since the peak of the bomb spike, and yet bomb radiocarbon is telling us new stories about the world. That’s because experts like Mary Gaylord are getting better at gathering these rare atoms. At Woods Hole, Gaylord works at the National Ocean Sciences Accelerator Mass Spectrometry facility (NOSAMS for short). She prepares samples for analysis in a thicket of pipes, wires, glass tubes, and jars of frothing liquid nitrogen. “Our whole life is vacuum lines and vacuum pumps,” she told me.

At NOSAMS, Gaylord and her colleagues measure radiocarbon in all manner of things: sea spray, bat guano, typhoon-tossed trees. The day I visited, Gaylord was busy with fish eyes. Black-capped vials sat on a lab bench, each containing a bit of lens from a red snapper.

The wispy, pale tissue had come to NOSAMS from Florida. A biologist named Beverly Barnett had gotten hold of eyes from red snapper caught in the Gulf of Mexico and sliced out their lenses. Barnett then peeled away the layers of the lenses one at a time. When she describes this work, she makes it sound like woodworking or needlepoint—a hobby anyone would enjoy. “It’s like peeling off the layers of an onion,” she told me. “It’s really nifty to see.”

Eventually, Barnett made her way down to the tiny nub at the center of each lens. These bits of tissue developed when the red snapper were still in their eggs. And Barnett wanted to know exactly how much bomb radiocarbon is in these precious fragments. In a couple of days, Gaylord and her colleagues would be able to tell her.

Gaylord started by putting the lens pieces into an oven that slowly burned them away. The vapors and smoke flowed into a pipe, chased by helium and nitrogen. Gaylord separated the carbon dioxide from the other compounds, and then shunted it into chilled glass tubes. There it formed a frozen fog on the inside walls.

Later, the team at NOSAMS would transform the frozen carbon dioxide into chips of graphite, which they would then load into what looks like an enormous, crooked laser cannon. At one end of the cannon, graphite gets vaporized, and the liberated carbon atoms fly down the barrel. By controlling the magnetic field and other conditions inside the cannon, the researchers cause the carbon 14 atoms to veer away from the carbon 12 atoms and other elements. The carbon 14 atoms fly onward on their own until they strike a sensor.
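The trick behind that veering is ordinary physics: a charged particle moving through a magnetic field bends along an arc whose radius grows with the particle's mass. In the simplest textbook picture, setting aside the accelerator's other filters,

$$ r = \frac{mv}{qB}, $$

so for ions with the same charge $q$ and speed $v$ in the same field $B$, the slightly heavier carbon 14 sweeps a wider arc than carbon 12 and can be steered onto its own path toward the detector.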


Ultimately, all of this effort will come down to a single number: how many carbon 14 atoms are in the red-snapper lens. For Barnett, every one of those atoms counts. They can tell her the exact age of the red snapper when the fish were caught.

That’s because lenses are peculiar organs. Most of our cells keep making new proteins and destroying old ones. Cells in the lens, however, fill up with light-bending proteins and then die, their proteins locked in place for the rest of our life. The layers of cells at the core of the red-snapper lenses have the same carbon 14 levels that they did when the fish were in their eggs.

Using lenses to estimate the ages of animals is still a new undertaking. But it’s already delivered some surprises. In 2016, for example, a team of Danish researchers studied the lenses from Greenland sharks ranging in size from two and a half to 16 feet long. The lenses of the sharks up to seven feet long had high levels of radiocarbon in them. That meant the sharks had been born no earlier than the 1960s. The bigger sharks all had much lower levels of radiocarbon in their lenses—meaning that they had been born before Castle Bravo. By extrapolating from these results, the researchers estimated that Greenland sharks have a staggeringly long life span, reaching 390 years or perhaps even more.

Barnett has been developing an even more precise clock for her red snapper, taking advantage of the fact that the level of bomb radiocarbon peaked in the Gulf of Mexico in the 1970s and has been falling ever since. By measuring the level of bomb radiocarbon in the center of the snapper lenses, she can determine the year when the fish hatched.
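Conceptually, the matching step is a lookup against the declining limb of that regional curve. Here is a minimal sketch in Python of how such a lookup could work; the years, the radiocarbon values, and the hatch_year helper are illustrative placeholders, not Barnett's actual data or method.

```python
import numpy as np

# Illustrative stand-in for a regional bomb curve: radiocarbon in the Gulf of
# Mexico peaked in the 1970s and has declined steadily since. These numbers
# are placeholders, not real measurements.
years = np.arange(1975, 2020)
ref_delta14c = np.linspace(150.0, 20.0, len(years))  # per-mil, falling over time

def hatch_year(lens_core_delta14c):
    """Match a lens-core radiocarbon value to the declining limb of the curve.

    Because the post-peak curve falls monotonically, each value maps to a
    single year. np.interp wants ascending x-values, so reverse both arrays.
    """
    return float(np.interp(lens_core_delta14c, ref_delta14c[::-1], years[::-1]))

print(hatch_year(95.0))  # a value partway down the curve -> about 1994 here
```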


Knowing the age of fish with this kind of precision is powerful. Fishery managers can track the ages of the fish that are caught each year, information that they can then use to make sure their stocks don’t collapse. Barnett wants to study fish in the Gulf of Mexico to see how they were affected by the Deepwater Horizon oil spill of 2010. Their eyes can tell her how old they were when they were hit by that disaster.

When it comes to carbon, we are no different from red snapper or Greenland sharks. We use the carbon in the food we eat to build our bodies, and the level of bomb radiocarbon inside us reflects our age. People born in the early 1960s have more radiocarbon in their lenses than people born before that time. People born in the years since then have progressively less.

For forensic scientists who need to determine the age of skeletal remains, lenses aren’t much help. But teeth are. As children develop teeth, they incorporate carbon into the enamel. If people’s teeth have a very low level of radiocarbon, it means that they were born well before Castle Bravo. People born in the early 1960s have high levels of radiocarbon in their molars, which develop early, and lower levels in their wisdom teeth, which grow years later. By matching each tooth in a jaw to the bomb curve, forensic scientists can estimate the age of a skeleton to within one or two years.

Even after childhood, bomb radiocarbon chronicles the history of our body. When we build new cells, we make DNA strands out of the carbon in our food. Scientists have used bomb radiocarbon in people’s DNA to determine the age of their cells. In our brains, most of the cells form around the time we’re born. But many cells in our hearts and other organs are much younger.

We also build other molecules throughout our lives, including fat. In a September 2019 study, Kirsty Spalding of the Karolinska Institute, near Stockholm, used bomb radiocarbon to study why people put on weight. Researchers had long known that our level of fat is the result of how much new fat we add to our body relative to how much we burn. But they didn’t have direct evidence for exactly how that balance influences our weight over the course of our life.

Spalding and her colleagues found 54 people from whom doctors had taken fat biopsies and asked if they could follow up. The fat samples spanned up to 16 years. By measuring the age of the fat in each sample, the researchers could estimate the rate at which each person added and removed fat over their lives.

The reason we put on weight as we get older, the researchers concluded, is that we get worse at removing fat from our bodies. “Before, you could intuitively believe that the rate at which we burn fat decreases as we age,” Spalding says, “but we showed it for the first time scientifically.”

Unexpectedly, though, Spalding discovered that the people who lost weight and kept it off successfully were the ones who burned their fat slowly. “I was quite surprised by that data,” Spalding said. “It adds new and interesting biology to understanding how to help people maintain weight loss.”

Children who are just now going through teething pains will have only a little more bomb radiocarbon in their enamel than children born before Castle Bravo did. Over the past six decades, the land and ocean have removed much of what nuclear bombs put into the air. Heather Graven, a climate scientist at Imperial College London, is studying this decline. It helps her predict the future of the planet.

Graven and her colleagues build models of the world to study the climate. As we burn fossil fuels, the extra carbon dioxide traps heat. How much heat we’re facing in centuries to come depends in part on how much carbon dioxide the oceans and land can remove. Graven can use the rise and fall of bomb radiocarbon as a benchmark to test her models.

In a recent study, she and her colleagues unleashed a virtual burst of nuclear-weapons tests. Then they tracked the fate of her simulated bomb radiocarbon to the present day. Much to Graven’s relief, the radiocarbon in the atmosphere quickly rose and then gradually fell. The bomb spike in her virtual world looks much like the one recorded in Joseph Fay’s beech tree.
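For readers who want to see the logic in miniature, here is a toy two-box sketch in Python of the kind of benchmark a full carbon-cycle model performs with far more realism; the reservoir behavior, exchange rates, and pulse size are assumptions chosen only to reproduce the qualitative shape of the spike, not Graven's actual model.

```python
# A toy two-box carbon-14 model: an atmosphere box exchanging excess
# radiocarbon with a combined ocean-and-land box. All numbers are assumed,
# chosen only to show the qualitative rise and fall of the bomb spike.
atmosphere = 0.0      # excess carbon 14 above the pre-bomb baseline
surface_sink = 0.0    # excess carbon 14 stored in the ocean and biosphere

K_DOWN = 0.12         # fraction of the atmospheric excess drawn down per year
K_UP = 0.01           # fraction of the stored excess returned per year

spike = []
for year in range(1950, 2021):
    if 1954 <= year <= 1963:          # the era of big atmospheric tests
        atmosphere += 10.0            # arbitrary units of bomb radiocarbon
    net_flux = K_DOWN * atmosphere - K_UP * surface_sink
    atmosphere -= net_flux
    surface_sink += net_flux
    spike.append((year, round(atmosphere, 1)))

# The atmospheric excess climbs through the early 1960s, then decays toward a
# plateau set by the two-way exchange rather than all the way back to zero.
for year, excess in spike[::10]:
    print(year, excess)
```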

Graven can keep running her simulation beyond what Fay’s beech and other records tell us about the past. According to her model, the level of radiocarbon in the atmosphere should drop in 2020 to the level before Castle Bravo.

“It’s right around now that we’re crossing over,” Graven told me.

Graven will have to wait for scientists to analyze global measurements of radiocarbon in the air to see whether she’s right. That’s important to find out, because Graven’s model suggests that the bomb spike is falling faster than the oceans and land alone can account for. When the ocean and land draw down bomb radiocarbon, they also release some of it back into the air. That two-way movement of bomb radiocarbon ought to cause its concentration in the atmosphere to level off a little above the pre–Castle Bravo mark. Instead, Graven’s model suggests, it continues to fall. She suspects that the missing factor is us.


We mine coal, drill for oil and gas, and then burn all that fossil fuel to power our cars, cool our houses, and run our factories. In 1954, the year that John Clark set off Castle Bravo, humans emitted 6 billion tons of carbon dioxide into the air. In 2018, we emitted about 37 billion tons. As Willard Libby first discovered, fossil fuels have no radiocarbon left. By burning them, we are lowering the level of radiocarbon in the atmosphere, like a bartender watering down the top-shelf liquor.
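The scale of that watering-down is easy to rough out. The atmosphere holds on the order of 870 billion tons of carbon, and 37 billion tons of carbon dioxide contains roughly 10 billion tons of carbon (carbon makes up 12/44 of the molecule's mass). If all of that radiocarbon-free carbon stayed airborne for the year,

$$ \frac{\Delta\left(^{14}\mathrm{C}/\mathrm{C}\right)}{^{14}\mathrm{C}/\mathrm{C}} \approx -\frac{10}{870+10} \approx -1\%, $$

a dilution of about one percent in a single year. This is only a back-of-the-envelope figure; in reality the oceans and land take up roughly half of the new carbon within the year, so the observed effect on the atmosphere is smaller.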

If we keep burning fossil fuels at our accelerating rate, the planet will veer into climate chaos. And once more, radiocarbon will serve as a witness to our self-destructive actions. Unless we swiftly stop burning fossil fuels, we will push carbon 14 down far below the level it was at before the nuclear bombs began exploding.

To Graven, the coming radiocarbon crash is just as significant as the bomb spike has been. “We’re transitioning from a bomb signal to a fossil-fuel-dilution signal,” she said.

The author Jonathan Weiner once observed that we should think of burning fossil fuels as a disturbance on par with nuclear-weapon detonations. “It is a slow-motion explosion manufactured by every last man, woman and child on the planet,” he wrote. If we threw up our billions of tons of carbon into the air all at once, it would dwarf Castle Bravo. “A pillar of fire would seem to extend higher into the sky and farther into the future than the eye can see,” Weiner wrote.

Bomb radiocarbon showed us how nuclear weapons threatened the entire world. Today, everyone on Earth still carries that mark. Now our pulse of carbon 14 is turning into an inverted bomb spike, a new signal of the next great threat to human survival.
CARL ZIMMER is a columnist at The New York Times. His latest book is She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity.
HERSTORY
How Christina Koch Could Become a Spaceflight Legend

One of the astronauts in NASA’s current corps could be the first in a generation to walk on the moon—or the first to walk on Mars.

MARINA KOREN MARCH 2, 2020
NASA

When Christina Koch returned to Earth earlier this month, feeling the full force of the planet’s gravity for the first time in a long time, it was the middle of the night in the United States. Her capsule parachuted into the Kazakh desert, and by morning, her name was all over the news. After spending 328 days living on the International Space Station, Koch had set a new record for American women in space.

The volume of attention that morning, however warranted, was somewhat unusual for a modern astronaut. Missions to the space station are routine now, and the last astronaut to have his full name flashing across headlines, as if in marquee lights, was Scott Kelly, who nearly four years earlier broke the American record for long-duration spaceflight.

All of this is to say that, in this era of space travel, most astronauts don’t become household names. Asked to think of an astronaut, most people would probably default to Neil Armstrong, the first man on the moon—not to one of the dozens of astronauts who have flown to space in this century, or even one of the three who are there right now. The public today is more likely to be familiar with nonhuman explorers, like the Mars rover Curiosity and the New Horizons spacecraft, which photographed Pluto.


The Coming End of an Era at NASA MARINA KOREN

The Second Moon Landing Was Much Rowdier MARINA KOREN

One Small Controversy About Neil Armstrong’s Giant Leap JACOB STERN

But this century holds potential for new milestones in space exploration, the kind that can turn spacefarers into celebrities. The next Neil Armstrong could already be in NASA’s astronaut corps, which is more diverse now than ever before. This person will have charisma and steely resolve—and probably a very compelling Instagram account.

Read: The next big milestone in American spaceflight

There is no distinct formula that makes astronauts famous, but an obvious component is novelty, says Margaret Weitekamp, a curator in the space-history department at the Smithsonian’s National Air and Space Museum. Firsts—Armstrong stepping onto the lunar surface, delivering his famous line after he put his boot down—become indelible in public memory. Sally Ride, the first American woman in space, is probably the most well-known American female astronaut.

Other superlatives, especially of the Guinness World Records variety—the most, the longest, the oldest—can make astronauts, if not flat-out famous, at least memorable. Peggy Whitson, for example, holds the record for most spacewalks by a woman. Seconds can be even less sticky. Do you remember, for instance, what the commander of Apollo 12, the second moon-landing mission, said when he descended from the lander and touched the gray surface? Or what his name was? Twelve men have walked on the moon, and even those in the space community might struggle to name all of them. Many people don’t realize that there was a third astronaut on the Apollo 11 mission: Michael Collins, who stayed behind in the command module while Armstrong and Buzz Aldrin went to the surface.

Some firsts, of course, can be eclipsed by later, bigger firsts. Alan Shepard was heralded as a national hero when he became the first American to reach space in 1961, less than a month after Yuri Gagarin did it for the Soviet Union. When John Glenn flew a year later, he didn’t just pierce the boundary between Earth’s atmosphere and space; he circled the planet three times. It was a more intense mission, and Glenn came up with a memorable tagline for it, which he repeated for years to come: “Zero G and I feel fine.” Today, Glenn is arguably the more famous of the two. As NASA grew its astronaut corps in the 1960s, astronauts “needed slightly more extraordinary circumstances to break out of the pack and become that household name,” Weitekamp says. Even milestone “firsts” didn’t always make a lasting impression in the national imagination; the first NASA astronauts of color to travel to space—Guion Bluford, who flew on the shuttle in 1983, and Mae Jemison, who followed in 1992—are icons in the space community, but less well known to laypeople.

The first all-female spacewalk, conducted last fall by Koch and Jessica Meir, drew a great deal of attention, and if it ever materialized, so would the first all-female crew on the ISS. When NASA astronauts launch on a brand-new SpaceX transportation system sometime this year, the first endeavor of its kind, the passengers’ names will most certainly cut through the news cycle. But such milestones, on their own, are unlikely to endow astronauts with mythical status.

“When you start thinking about who’s going to be the next Neil Armstrong, you’re going to be looking for that combination of achievement and that personality that catches the public’s attention, the person who has the ‘it’ factor,” Weitekamp says.

Armstrong, she adds, had it. After he flew in Gemini, NASA’s pre-Apollo program, the agency sent him on a publicity tour through South America. Armstrong took a Spanish conversation class to prepare for the trip and name-dropped important South American figures, particularly in aviation, in his speeches, according to James R. Hansen’s biography of the astronaut. “He never failed to choose the right words,” recalled George Low, a NASA executive who traveled with Armstrong and was impressed.


Low would later manage the Apollo program and its crew assignments, including which astronaut should be the first one out of the lander. Armstrong had proved to NASA leadership not only that he could master the mission—he was one of the agency’s best pilots—but that he could handle the attention, too. Armstrong is famous in part because NASA chose him to be famous and, after he finished the mission, turned him into a spokesman for American spaceflight. Aldrin, meanwhile, may be better remembered for the persona he cultivated after visiting the moon, where he followed Armstrong onto the lunar surface. Whereas Armstrong, who died in 2012, is remembered for his stoic and amiable personality, Aldrin became known for a feisty attitude he has maintained into his 90s. (In recent years, he punched a moon-landing denier outside of a hotel and made a GIF-worthy range of facial expressions behind President Trump as he spoke about space exploration.)

In some cases, the “it” factor can outweigh a record-setting superlative. Chris Hadfield is the first Canadian to do a spacewalk, but he’s best known for his floating rendition of David Bowie’s “Space Oddity” on board the ISS, which has more than 45 million views on YouTube. Scott Kelly holds the American record for the most consecutive days in space, but he built his fan base through frequent Instagram posts of beautiful Earth shots. NASA does plenty of work to promote astronauts, especially those involved in the flashiest missions. But thanks to social media—which astronauts are encouraged to use—the spacefarers can take that much more ownership of their public image.

Read: The exquisite boredom of spacewalking

Fans have always been eager for such personal glimpses of astronauts’ personalities, Weitekamp says; in the 1950s and ’60s, Life magazine ran stories about the lives of the Mercury astronauts, ghostwritten but published under the men’s bylines. These days, every NASA astronaut has a professional Twitter account—a very different kind of launchpad for name recognition, but potentially nearly as effective. A tweet from Koch featuring a heartwarming video of the astronaut greeting her dog, adorably overjoyed after their long separation, quickly went viral.

To be a spaceflight legend, an astronaut will likely need, as Weitekamp puts it, extraordinary circumstances. Imagine the first woman on the moon, or the first people to set foot on Mars. It is not unrealistic to think that at the end of this century, the name of the first person to step onto the red planet will be more prominently woven into collective memory than the name Neil Armstrong. By the end of this century, 1969 will be 130 years in the past, as distant a memory as 1890 is now, when Nellie Bly made headlines by circumnavigating the globe, by ship and by rail, in just 72 days.

These explorers are probably already within NASA’s ranks. (Or, perhaps, working for a private company: The 21st century’s most famous spacefarer could end up being Elon Musk.) NASA recently added 11 new members to its active astronaut corps, bringing the total to 48. The new class, fresh off training, “may be assigned to missions destined for the International Space Station, the Moon, and ultimately, Mars,” the space agency said in a statement. These new astronauts can’t predict which among their ranks might be chosen for the next big feat in spaceflight history, but they can start daydreaming about what they might say as they take their own first step. Or they could go the Armstrong route and wait until the moment is near. Days before Apollo 11 launched, a reporter asked whether Armstrong, being “destined to become a historical personage of some consequence,” had come up with “something suitably historical and memorable” to say when he stepped onto the moon. “No, I haven’t,” Armstrong replied. Better to make history first.



MARINA KOREN is a staff writer at The Atlantic.


HERSTORY
The Daredevil Aviatrix Whom History Forgot
Mar 04, 2020 
Video by American Masters — Unladylike2020
Bessie Coleman wanted more out of life. Her parents were sharecroppers in rural Texas, and she had spent her childhood picking cotton and doing laundry for white people. It was 1915. Opportunities were scarce for African Americans—let alone women of color. If Coleman wanted more, she realized, she had to go north. She moved to Chicago as part of the Great Migration and took a job at a barbershop. In her free time, Coleman began to read about flying.

She read about Harriet Quimby, the first American woman to earn a pilot’s license. She learned of the European women who served as combat pilots during World War I. Inspired by their stories, Coleman resolved to become an aviator. She applied to every flying school in the United States, but, because of widespread race and gender discrimination, she was rejected from all of them.

Coleman refused to take no for an answer. She found sponsorship from the black-owned newspaper The Chicago Defender, taught herself French, and moved to France. She earned her license from France’s lauded Caudron Brothers’ School of Aviation in just seven months, specializing in stunt flying and parachuting. In 1921, Coleman became the first black woman to earn a pilot’s license.

A new short documentary from PBS’s American Masters series revives the story of the daredevil aviatrix whom history forgot. The film, part of a larger series about pioneering American women called Unladylike2020, illuminates Coleman’s achievements through interviews and colorful animation.

“Like many Americans, the only woman pilot I had ever heard of was Amelia Earhart,” Charlotte Mangin, who produced the film, told me. “I certainly never imagined that a woman of color was able to obtain a pilot’s license in the 1920s, let alone take the country by storm as an aviator.”

What surprised Mangin most about Coleman, however, was the spirit of activism that the pilot brought to her flying shows. “She refused to perform in air shows where African Americans were not allowed to use the front entrance and sit in the stadium with white spectators,” Mangin said. “I can only imagine the courage and determination it took to be an activist in this way, at a time when discrimination and violence against people of color were rampant across America.”

Coleman was 34 when her life was cut short in a plane crash caused by an engine malfunction. Ida B. Wells spoke at her funeral service. In 1929, Coleman’s dream of opening a flying school for African Americans became a reality when William J. Powell established the Bessie Coleman Aero Club in Los Angeles. The school educated and inspired many outstanding black pilots, including the Five Blackbirds and the Tuskegee Airmen of World War II.

Today, only 7 percent of all pilots in the U.S. are women; less than 1 percent are black women.

Author: Emily Buder

Ghost Stories Keep the Roma Alive


Video by Astra Zoldnere
For 500 years, the Latvian Roma people have been collecting berries in the Kurzeme forest. As one woman puts it, “a Roma without forest isn’t a Roma.”

The woman is part of a Roma family that Astra Zoldnere follows in her short documentary Blueberry Spirits. “It wasn’t easy to earn their trust,” she told me. “I had to live with them in the forest for a while.” Zoldnere traded in their currency—stories—by sharing some of her own. But she quickly realized that their tales were unlike hers, or any she’d ever heard before. They were ghost stories.

“I stepped out from the tent at night,” recounts a man in the film. “I was in a completely different place. One face appeared, and then another. I saw an old woman with a little girl in her arms … they’d been shot dead. Their anguished faces, cold eyes … how many people did the Germans shoot in the forest?”

Like this man’s nightmarish tale, which alludes to the mass murder of the Roma people by the Nazi regime during World War II, ghost stories are important elements of oral history in Roma culture. “Ghost stories help to maintain the community’s identity in the globalized world,” Zoldnere said. “Telling them brings together different generations.”

The tales are woven from the loose fabric of time that characterizes itinerant life in Roma communities. Blueberry Spirits, too, feels like a film out of time, existing somewhere in the space between reality and dreams. Zoldnere evokes this feeling through poetic, eerie imagery of thick fog seeping through the pine trees and the moon slowly rising above the clouds.

“At first, I was surprised that the Roma live in a world where past, present, and future are so connected,” Zoldnere said. “Different times, places, and faces entwine to form a more circular existence.”