It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Unenlightened thinking: Steven Pinker’s embarrassing new book is a feeble sermon for rattled liberals
To think of this book as any kind of scholarly exercise is a category mistake. The purpose of Pinker’s laborious work is to reassure liberals that they are on “the right side of history”.
BY JOHN GRAY
BOOKS
22 FEBRUARY 2018
“Opposing reason is, by definition, unreasonable.” Steven Pinker is fond of definitions. Early on in this monumental apologia for a currently fashionable version of Enlightenment thinking, he writes: “To take something on faith means to believe it without good reason, so by definition a faith in the existence of supernatural entities clashes with reason.” Well, it’s good to have that settled once and for all. There is no need to trouble yourself with the arguments of historians, anthropologists and evolutionary biologists, who treat religion as a highly complex phenomenon, serving a variety of human needs. All you need do is consult a dictionary, and you will find that religion is – by definition – irrational.
Similarly, you don’t need to bother about what the Enlightenment was actually like. By any standards, David Hume was one of the greatest Enlightenment thinkers. It was the sceptical Scottish philosopher who stirred Immanuel Kant – whose well-known essay on Enlightenment Pinker quotes reverently at the start of the book – from what Kant described as his “dogmatic slumber”. Pinker barely mentions Hume, and the omission is not accidental. He tells us that the Enlightenment is defined by a “non-negotiable” commitment to reason.
Yet in A Treatise of Human Nature (1738), Hume wrote: “Reason is, and ought only to be, the slave of the passions and can never pretend to any other office than to serve and obey them.” Hume believed being reasonable meant accepting the limits of reason, and so too, in quite different ways, did later Enlightenment rationalists such as Keynes and Freud. Pinker’s Enlightenment has little in common with the much more interesting intellectual movement that historically existed.
One of the consequences of this unhistorical approach is that Pinker repeats fallacies that have been exposed time and time again. He is an evangelist for science – or, to be more exact, an ideology of scientism. Along with reason, humanism and progress, science features as one of the core Enlightenment values that Pinker lists at the start of the book. But for him science is more than a bunch of methods that are useful in conjecturing how the world works: it provides the basis of ethics and politics.
He summarises this claim in a formula: “Entro, evo, info. These concepts define the narrative of human progress, the tragedy we were born into, and our means of eking out a better existence.” Here, “entro” denotes entropy, the process of increasing disorder that is identified in the second law of thermodynamics. “Evo” refers to the evolution of living organisms, which absorb energy and thereby resist entropy. “Info” is information, which when collected and processed in the nervous systems of these organisms enables them to wage their war against entropy.
For Pinker, the second law of thermodynamics doesn’t simply identify a universal regularity in the natural world, “it defines the fate of the universe and the ultimate purpose of life, mind, and human striving: to deploy energy and knowledge to fight back the tide of entropy and carve out refuges of beneficial order”.
There is nothing novel in scientism. The Victorian prophet of social evolution, Herbert Spencer, believed that the universe, life and society were moving from undifferentiated simplicity to a higher state of complex order. In politics, this meant a movement towards laissez-faire capitalism. In social contexts, “survival of the fittest” – an expression Spencer invented after reading Darwin’s On the Origin of Species – meant that anyone unable to stay afloat in such a society would struggle, sink and then disappear. Spencer welcomed this process, since for him it was evolution in action – the movement from lower to higher forms of life.
Pinker is an ardent enthusiast for free-market capitalism, which he believes produced most of the advance in living standards over the past few centuries. Unlike Spencer, he seems ready to accept that some provision should be made for those who have been left behind. Why he makes this concession is unclear. Nothing is said about human kindness, or fairness, in his formula. Indeed, the logic of his dictum points the other way.
Many early-20th-century Enlightenment thinkers supported eugenic policies because they believed “improving the quality of the population” – weeding out human beings they deemed unproductive or undesirable – would accelerate the course of human evolution. When Pinker touches on eugenics in a couple of paragraphs towards the end of the book, he blames it on socialism: “The most decisive repudiation of eugenics invokes classical liberal and libertarian principles: government is not an omnipotent ruler over human existence but an institution with circumscribed powers, and perfecting the genetic make-up of the human species is not among them.” But a theory of entropy provides no reason for limiting the powers of government any more than for helping the weak. Science cannot underwrite any political project, classical liberal or otherwise, because science cannot dictate human values.
Exponents of scientism in the past have used it to promote Fabian socialism, Marxism-Leninism, Nazism and more interventionist varieties of liberalism. In doing so, they were invoking the authority of science to legitimise the values of their time and place. Deploying his cod-scientific formula to bolster market liberalism, Pinker does the same. Scientism is one of the Enlightenment’s bad ideas. But bad ideas do not evolve into better ones. They keep on recurring, often in cruder and sillier forms than in the past. Pinker’s formula for human progress is a contemporary example.
To be sure, for Pinker there are no bad Enlightenment ideas. One of the features of the comic-book history of the Enlightenment he presents is that it is innocent of all evil. Accordingly, when despots such as Lenin repeatedly asserted that they engaged in mass killing in order to realise an Enlightenment project – in Lenin’s case, a more far-reaching version of the Jacobin project of re-educating society by the methodical use of terror – they must have been deluded or lying. How could a philosophy of reason possibly be implicated in murderous totalitarianism? Like the faithful who tell you Christianity is “a religion of love” that had nothing to do with the Inquisition, Pinker stipulates that the Enlightenment, by definition, is intrinsically liberal. Modern tyrannies must therefore be products of counter-Enlightenment ideologies – Romanticism, nationalism and the like. Enabling liberals to avoid asking difficult questions about why their values are in retreat, this is a popular view. Assessed in terms of historical evidence, it is also a myth.
Many Enlightenment thinkers have been avowedly or implicitly hostile to liberalism. One of the most influential, the 19th-century French positivist Auguste Comte – not discussed by Pinker – promoted a brand of scientism that was overtly anti-liberal. Human progress meant following the path of reason and moving from magical thinking to scientific inquiry. In a society based on science there will be no need for liberal values, since moral and political questions will be answered by experts.
Comte admired the Middle Ages as a time when society was healthily “organic” and unified by a single orthodoxy; but the organic society of the future would be ruled by science, not monotheism. The superstitious faith of earlier times would be supplanted by what he called “the Religion of Humanity” – a rationalist creed in which an imaginary version of the human species would occupy the place of the Supreme Being. Comte’s core ideas – reason, science, progress and humanism – are precisely those that Pinker lists at the start of this book as the central values of the Enlightenment. Interestingly, neither of them mentions freedom or toleration.
The link between the Enlightenment and liberal values, which Pinker and many others today assert as a universal truth, is actually rather tenuous. It is strongest in Enlightenment thinkers who were wedded to monotheism, such as Locke and indeed Kant. The more hostile the Enlightenment has been to monotheism, the more illiberal it has been. Comte’s anti-liberalism inspired Charles Maurras, a French collaborator with Nazism and the leading theorist of Action Française – a fascistic movement formed during the Dreyfus affair – in his defence of integral nationalism. Lenin continued the Jacobins’ campaign against religion as well as their pedagogy of terror.
Instead of acknowledging that the Enlightenment itself has often been illiberal, Pinker presents a Manichean vision in which “Enlightenment liberal values” are besieged on every side by dark forces. Amusingly, he is in no doubt as to the identity of the intellectual master-criminal behind this assault. The Professor Moriarty of modern irrationalism, the “enemy of humanism, the ideology behind resurgent authoritarianism, nationalism, populism, reactionary thinking, even fascism” can at last be revealed:
If one wanted to single out a thinker who represented the opposite of humanism (indeed of pretty much every argument in this book) one couldn’t do better than the German philologist Friedrich Nietzsche. Nietzsche helped to inspire the romantic militarism that led to the First World War and the fascism that led to the Second. The connections between Nietzsche’s ideas and the megadeath movements of the 20th century are obvious enough; a glorification of violence and power, an eagerness to raze the institutions of liberal democracy, a contempt for most of humanity, and a stone-hearted indifference to human life.
Searching for some intellectual authority for this wild diatribe, Pinker cites Bertrand Russell’s History of Western Philosophy, where Russell denounced Nietzsche as a Romantic enemy of reason who preached a life of instinct and emotion. Published immediately after the end of the Second World War, Russell’s assessment of Nietzsche was understandably crude. Today it would not pass muster in a first-year undergraduate’s essay.
A lifelong admirer of Voltaire, Nietzsche was a critic of the Enlightenment because he belonged in it. Far from being an enemy of humanism, he promoted humanism in the most radical form. In future, humankind would fashion its values and shape its destiny by its own unfettered will. True, he conferred this privilege only on a select few.
He recognised no principle of human equality. But where does concern with equality come from? Not from science, which can be used to promote many values. As Nietzsche never tired of pointing out, the ideal of equality is an inheritance from Judaism and Christianity. His hatred of equality is one reason he was such a vehement atheist.
The message of Pinker’s book is that the Enlightenment produced all of the progress of the modern era and none of its crimes. This is why he tries to explain 20th-century megadeaths by reference to Nietzsche’s supposedly anti-Enlightenment philosophy. Here he has shifted his ground. In The Better Angels of Our Nature (2011), Pinker represented the Hemoclysm – a term referring to the early 20th-century spasm of mass killing, which he uses to lump together the two world wars, the Soviet Gulag and the Holocaust – as not much more than a statistical fluke. What explains this change of view? Pinker cites no change in the historical evidence that is available on the subject.
Instead, there has been a shift in the mood of liberals. Less than a decade ago, they were confident that progress was ongoing. No doubt there would be periods of regression; we might be in one of those periods at the present time. Yet over the long haul of history, there could be no doubt that the forces of reason would continue to advance. Today, liberals have lost that always rather incredible faith. Faced with the political reversals of the past few years and the onward march of authoritarianism, they find their view of the world crumbling away. What they need at the present time, more than anything else, is some kind of intellectual anodyne that can soothe their nerves, still their doubts and stave off panic.
This is where Pinker comes in. Enlightenment Now is a rationalist sermon delivered to a congregation of wavering souls. To think of the book as any kind of scholarly exercise is a category mistake. Much of its more than 500 pages consists of figures aiming to show the progress that has been made under the aegis of Enlightenment ideals. Of course, these figures settle nothing. Like Pinker’s celebrated assertion that the world is becoming ever more peaceful – the statistical basis of which has been demolished by Nassim Nicholas Taleb – everything depends on what is included in them and how they are interpreted.
Are the millions incarcerated in the vast American prison system and the millions more who live on parole included in the calculus that says human freedom is increasing? If we are to congratulate ourselves on being less cruel to animals, how much weight should be given to the uncounted numbers that suffer in factory farming and hideous medical experiments – neither of which were practised on any comparable scale in the past?
It would be idle to pursue such questions. The purpose of Pinker’s laborious graphs and figures is to reassure his audience that they are on “the right side of history”. For many, no doubt, the exercise will be successful. But nagging questions will surely return. If an Enlightenment project survives, what reason is there for thinking it will be embodied in liberal democracy? What if the Enlightenment’s future is not in the liberal West, now almost ungovernable as a result of the culture wars in which it is mired, but Xi Jinping’s China, where an altogether tougher breed of rationalist is in charge? It is a prospect that Voltaire, Jeremy Bentham and other exponents of enlightened despotism would have heartily welcomed.
Judged as a contribution to thought, Enlightenment Now is embarrassingly feeble. With its primitive scientism and manga-style history of ideas, the book is a parody of Enlightenment thinking at its crudest. A more intellectually inquiring author would have conveyed something of the Enlightenment’s richness and diversity. Yet even if Pinker was capable of providing it, intellectual inquiry is not what his anxious flock demands. Only an anodyne, mythical Enlightenment can give them what they crave, which is relief from painful doubt.
Given this overriding emotional imperative, presenting them with the actual, conflict-ridden, often illiberal Enlightenment would be – by definition, one might say – unreasonable. Judged as a therapeutic manual for rattled rationalists, Enlightenment Now is a highly topical and much-needed book. In the end, after all, reason is only the slave of the passions.
John Gray’s new book, “Seven Types of Atheism” will be published in April by Allen Lane.
Enlightenment Now: the Case for Reason, Science, Humanism and Progress
Steven Pinker
Allen Lane, 576pp, £25
https://www.newstatesman.com/culture/books/2018/02/unenlightened-thinking-steven-pinker-s-embarrassing-new-book-feeble-sermon
Science Is Not Your Enemy
An impassioned plea to neglected novelists, embattled professors, and tenure-less historians
Steven Pinker/August 6, 2013
The great thinkers of the Age of Reason and the Enlightenment were scientists. Not only did many of them contribute to mathematics, physics, and physiology, but all of them were avid theorists in the sciences of human nature. They were cognitive neuroscientists, who tried to explain thought and emotion in terms of physical mechanisms of the nervous system. They were evolutionary psychologists, who speculated on life in a state of nature and on animal instincts that are “infused into our bosoms.” And they were social psychologists, who wrote of the moral sentiments that draw us together, the selfish passions that inflame us, and the foibles of shortsightedness that frustrate our best-laid plans.
These thinkers—Descartes, Spinoza, Hobbes, Locke, Hume, Rousseau, Leibniz, Kant, Smith—are all the more remarkable for having crafted their ideas in the absence of formal theory and empirical data. The mathematical theories of information, computation, and games had yet to be invented. The words “neuron,” “hormone,” and “gene” meant nothing to them. When reading these thinkers, I often long to travel back in time and offer them some bit of twenty-first-century freshman science that would fill a gap in their arguments or guide them around a stumbling block. What would these Fausts have given for such knowledge? What could they have done with it?
We don’t have to fantasize about this scenario, because we are living it. We have the works of the great thinkers and their heirs, and we have scientific knowledge they could not have dreamed of. This is an extraordinary time for the understanding of the human condition. Intellectual problems from antiquity are being illuminated by insights from the sciences of mind, brain, genes, and evolution. Powerful tools have been developed to explore them, from genetically engineered neurons that can be controlled with pinpoints of light to the mining of “big data” as a means of understanding how ideas propagate.
One would think that writers in the humanities would be delighted and energized by the efflorescence of new ideas from the sciences. But one would be wrong. Though everyone endorses science when it can cure disease, monitor the environment, or bash political opponents, the intrusion of science into the territories of the humanities has been deeply resented. Just as reviled is the application of scientific reasoning to religion; many writers without a trace of a belief in God maintain that there is something unseemly about scientists weighing in on the biggest questions. In the major journals of opinion, scientific carpetbaggers are regularly accused of determinism, reductionism, essentialism, positivism, and worst of all, something called “scientism.” The past couple years have seen four denunciations of scientism in this magazine alone, together with attacks in Bookforum, The Claremont Review of Books, The Huffington Post, The Nation, National Review Online, The New Atlantis, The New York Times, and Standpoint.
The eclectic politics of these publications reflects the bipartisan nature of the resentment. This passage, from a 2011 review in The Nation of three books by Sam Harris by the historian Jackson Lears, makes the standard case for the prosecution by the left:
Positivist assumptions provided the epistemological foundations for Social Darwinism and pop-evolutionary notions of progress, as well as for scientific racism and imperialism. These tendencies coalesced in eugenics, the doctrine that human well-being could be improved and eventually perfected through the selective breeding of the "fit" and the sterilization or elimination of the "unfit." ... Every schoolkid knows about what happened next: the catastrophic twentieth century. Two world wars, the systematic slaughter of innocents on an unprecedented scale, the proliferation of unimaginable destructive weapons, brushfire wars on the periphery of empire—all these events involved, in various degrees, the application of scientific research to advanced technology.
The case from the right, captured in this 2007 speech from Leon Kass, George W. Bush’s bioethics adviser, is just as measured:
Scientific ideas and discoveries about living nature and man, perfectly welcome and harmless in themselves, are being enlisted to do battle against our traditional religious and moral teachings, and even our self-understanding as creatures with freedom and dignity. A quasi-religious faith has sprung up among us—let me call it "soul-less scientism"—which believes that our new biology, eliminating all mystery, can give a complete account of human life, giving purely scientific explanations of human thought, love, creativity, moral judgment, and even why we believe in God. ... Make no mistake. The stakes in this contest are high: at issue are the moral and spiritual health of our nation, the continued vitality of science, and our own self-understanding as human beings and as children of the West.
These are zealous prosecutors indeed. But their cases are weak. The mindset of science cannot be blamed for genocide and war and does not threaten the moral and spiritual health of our nation. It is, rather, indispensable in all areas of human concern, including politics, the arts, and the search for meaning, purpose, and morality.
The term “scientism” is anything but clear, more of a boo-word than a label for any coherent doctrine. Sometimes it is equated with lunatic positions, such as that “science is all that matters” or that “scientists should be entrusted to solve all problems.” Sometimes it is clarified with adjectives like “simplistic,” “naïve,” and “vulgar.” The definitional vacuum allows me to replicate gay activists’ flaunting of “queer” and appropriate the pejorative for a position I am prepared to defend.
Scientism, in this good sense, is not the belief that members of the occupational guild called “science” are particularly wise or noble. On the contrary, the defining practices of science, including open debate, peer review, and double-blind methods, are explicitly designed to circumvent the errors and sins to which scientists, being human, are vulnerable. Scientism does not mean that all current scientific hypotheses are true; most new ones are not, since the cycle of conjecture and refutation is the lifeblood of science. It is not an imperialistic drive to occupy the humanities; the promise of science is to enrich and diversify the intellectual tools of humanistic scholarship, not to obliterate them. And it is not the dogma that physical stuff is the only thing that exists. Scientists themselves are immersed in the ethereal medium of information, including the truths of mathematics, the logic of their theories, and the values that guide their enterprise. In this conception, science is of a piece with philosophy, reason, and Enlightenment humanism. It is distinguished by an explicit commitment to two ideals, and it is these that scientism seeks to export to the rest of intellectual life.
The first is that the world is intelligible. The phenomena we experience may be explained by principles that are more general than the phenomena themselves. These principles may in turn be explained by more fundamental principles, and so on. In making sense of our world, there should be few occasions in which we are forced to concede “It just is” or “It’s magic” or “Because I said so.” The commitment to intelligibility is not a matter of brute faith, but gradually validates itself as more and more of the world becomes explicable in scientific terms. The processes of life, for example, used to be attributed to a mysterious élan vital; now we know they are powered by chemical and physical reactions among complex molecules.
Demonizers of scientism often confuse intelligibility with a sin called reductionism. But to explain a complex happening in terms of deeper principles is not to discard its richness. No sane thinker would try to explain World War I in the language of physics, chemistry, and biology as opposed to the more perspicuous language of the perceptions and goals of leaders in 1914 Europe. At the same time, a curious person can legitimately ask why human minds are apt to have such perceptions and goals, including the tribalism, overconfidence, and sense of honor that fell into a deadly combination at that historical moment.
The second ideal is that the acquisition of knowledge is hard. The world does not go out of its way to reveal its workings, and even if it did, our minds are prone to illusions, fallacies, and superstitions. Most of the traditional causes of belief—faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty—are generators of error and should be dismissed as sources of knowledge. To understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity. Any movement that calls itself “scientific” but fails to nurture opportunities for the falsification of its own beliefs (most obviously when it murders or imprisons the people who disagree with it) is not a scientific movement.
In which ways, then, does science illuminate human affairs? Let me start with the most ambitious: the deepest questions about who we are, where we came from, and how we define the meaning and purpose of our lives. This is the traditional territory of religion, and its defenders tend to be the most excitable critics of scientism. They are apt to endorse the partition plan proposed by Stephen Jay Gould in his worst book, Rocks of Ages, according to which the proper concerns of science and religion belong to “non-overlapping magisteria.” Science gets the empirical universe; religion gets the questions of moral meaning and value.
Unfortunately, this entente unravels as soon as you begin to examine it. The moral worldview of any scientifically literate person—one who is not blinkered by fundamentalism—requires a radical break from religious conceptions of meaning and value.
To begin with, the findings of science entail that the belief systems of all the world’s traditional religions and cultures—their theories of the origins of life, humans, and societies—are factually mistaken. We know, but our ancestors did not, that humans belong to a single species of African primate that developed agriculture, government, and writing late in its history. We know that our species is a tiny twig of a genealogical tree that embraces all living things and that emerged from prebiotic chemicals almost four billion years ago. We know that we live on a planet that revolves around one of a hundred billion stars in our galaxy, which is one of a hundred billion galaxies in a 13.8-billion-year-old universe, possibly one of a vast number of universes. We know that our intuitions about space, time, matter, and causation are incommensurable with the nature of reality on scales that are very large and very small. We know that the laws governing the physical world (including accidents, disease, and other misfortunes) have no goals that pertain to human well-being. There is no such thing as fate, providence, karma, spells, curses, augury, divine retribution, or answered prayers—though the discrepancy between the laws of probability and the workings of cognition may explain why people believe there are. And we know that we did not always know these things, that the beloved convictions of every time and culture may be decisively falsified, doubtless including some we hold today.
In other words, the worldview that guides the moral and spiritual values of an educated person today is the worldview given to us by science. Though the scientific facts do not by themselves dictate values, they certainly hem in the possibilities. By stripping ecclesiastical authority of its credibility on factual matters, they cast doubt on its claims to certitude in matters of morality. The scientific refutation of the theory of vengeful gods and occult forces undermines practices such as human sacrifice, witch hunts, faith healing, trial by ordeal, and the persecution of heretics. The facts of science, by exposing the absence of purpose in the laws governing the universe, force us to take responsibility for the welfare of ourselves, our species, and our planet. For the same reason, they undercut any moral or political system based on mystical forces, quests, destinies, dialectics, struggles, or messianic ages. And in combination with a few unexceptionable convictions— that all of us value our own welfare and that we are social beings who impinge on each other and can negotiate codes of conduct—the scientific facts militate toward a defensible morality, namely adhering to principles that maximize the flourishing of humans and other sentient beings. This humanism, which is inextricable from a scientific understanding of the world, is becoming the de facto morality of modern democracies, international organizations, and liberalizing religions, and its unfulfilled promises define the moral imperatives we face today.
Moreover, science has contributed—directly and enormously—to the fulfillment of these values. If one were to list the proudest accomplishments of our species (setting aside the removal of obstacles we set in our own path, such as the abolition of slavery and the defeat of fascism), many would be gifts bestowed by science.
The most obvious is the exhilarating achievement of scientific knowledge itself. We can say much about the history of the universe, the forces that make it tick, the stuff we’re made of, the origin of living things, and the machinery of life, including our own mental life. Better still, this understanding consists not in a mere listing of facts, but in deep and elegant principles, like the insight that life depends on a molecule that carries information, directs metabolism, and replicates itself.
Science has also provided the world with images of sublime beauty: stroboscopically frozen motion, exotic organisms, distant galaxies and outer planets, fluorescing neural circuitry, and a luminous planet Earth rising above the moon’s horizon into the blackness of space. Like great works of art, these are not just pretty pictures but prods to contemplation, which deepen our understanding of what it means to be human and of our place in nature.
And contrary to the widespread canard that technology has created a dystopia of deprivation and violence, every global measure of human flourishing is on the rise. The numbers show that after millennia of near-universal poverty, a steadily growing proportion of humanity is surviving the first year of life, going to school, voting in democracies, living in peace, communicating on cell phones, enjoying small luxuries, and surviving to old age. The Green Revolution in agronomy alone saved a billion people from starvation. And if you want examples of true moral greatness, go to Wikipedia and look up the entries for “smallpox” and “rinderpest” (cattle plague). The definitions are in the past tense, indicating that human ingenuity has eradicated two of the cruelest causes of suffering in the history of our kind.
Though science is beneficially embedded in our material, moral, and intellectual lives, many of our cultural institutions, including the liberal arts programs of many universities, cultivate a philistine indifference to science that shades into contempt. Students can graduate from elite colleges with a trifling exposure to science. They are commonly misinformed that scientists no longer care about truth but merely chase the fashions of shifting paradigms. A demonization campaign anachronistically impugns science for crimes that are as old as civilization, including racism, slavery, conquest, and genocide.
Just as common, and as historically illiterate, is the blaming of science for political movements with a pseudoscientific patina, particularly Social Darwinism and eugenics. Social Darwinism was the misnamed laissez-faire philosophy of Herbert Spencer. It was inspired not by Darwin’s theory of natural selection, but by Spencer’s Victorian-era conception of a mysterious natural force for progress, which was best left unimpeded. Today the term is often used to smear any application of evolution to the understanding of human beings. Eugenics was the campaign, popular among leftists and progressives in the early decades of the twentieth century, for the ultimate form of social progress, improving the genetic stock of humanity. Today the term is commonly used to assail behavioral genetics, the study of the genetic contributions to individual differences.
I can testify that this recrimination is not a relic of the 1990s science wars. When Harvard reformed its general education requirement in 2006 to 2007, the preliminary task force report introduced the teaching of science without any mention of its place in human knowledge: “Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment.” This strange equivocation between the utilitarian and the nefarious was not applied to other disciplines. (Just imagine motivating the study of classical music by noting that it both generates economic activity and inspired the Nazis.) And there was no acknowledgment that we might have good reasons to prefer science and know-how over ignorance and superstition.
At a 2011 conference, another colleague summed up what she thought was the mixed legacy of science: the eradication of smallpox on the one hand; the Tuskegee syphilis study on the other. (In that study, another bloody shirt in the standard narrative about the evils of science, public-health researchers beginning in 1932 tracked the progression of untreated, latent syphilis in a sample of impoverished African Americans.) The comparison is obtuse. It assumes that the study was the unavoidable dark side of scientific progress as opposed to a universally deplored breach, and it compares a one-time failure to prevent harm to a few dozen people with the prevention of hundreds of millions of deaths per century, in perpetuity.
A major goad for the recent denunciations of scientism has been the application of neuroscience, evolution, and genetics to human affairs. Certainly many of these applications are glib or wrong, and they are fair game for criticism: scanning the brains of voters as they look at politicians’ faces, attributing war to a gene for aggression, explaining religion as an evolutionary adaptation to bond the group. Yet it’s not unheard of for intellectuals who are innocent of science to advance ideas that are glib or wrong, and no one is calling for humanities scholars to go back to their carrels and stay out of discussions of things that matter. It is a mistake to use a few wrongheaded examples as an excuse to quarantine the sciences of human nature from our attempt to understand the human condition.
Take our understanding of politics. “What is government itself,” asked James Madison, “but the greatest of all reflections on human nature?” The new sciences of the mind are reexamining the connections between politics and human nature, which were avidly discussed in Madison’s time but submerged during a long interlude in which humans were assumed to be blank slates or rational actors. Humans, we are increasingly appreciating, are moralistic actors, guided by norms and taboos about authority, tribe, and purity, and driven by conflicting inclinations toward revenge and reconciliation. These impulses ordinarily operate beneath our conscious awareness, but in some circumstances they can be turned around by reason and debate. We are starting to grasp why these moralistic impulses evolved; how they are implemented in the brain; how they differ among individuals, cultures, and subcultures; and which conditions turn them on and off.
The application of science to politics not only enriches our stock of ideas, but also offers the means to ascertain which of them are likely to be correct. Political debates have traditionally been deliberated through case studies, rhetoric, and what software engineers call HiPPO (highest-paid person’s opinion). Not surprisingly, the controversies have careened without resolution. Do democracies fight each other? What about trading partners? Do neighboring ethnic groups inevitably play out ancient hatreds in bloody conflict? Do peacekeeping forces really keep the peace? Do terrorist organizations get what they want? How about Gandhian nonviolent movements? Are post-conflict reconciliation rituals effective at preventing the renewal of conflict?
History nerds can adduce examples that support either answer, but that does not mean the questions are irresolvable. Political events are buffeted by many forces, so it’s possible that a given force is potent in general but submerged in a particular instance. With the advent of data science—the analysis of large, open-access data sets of numbers or text—signals can be extracted from the noise and debates in history and political science resolved more objectively. As best we can tell at present, the answers to the questions listed above are (on average, and all things being equal) no, no, no, yes, no, yes, and yes.
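The statistical logic here, that a force can be potent on average yet invisible, or even reversed, in any single historical episode, can be made concrete with a toy simulation. The effect size, noise level, and framing below are invented for illustration, not drawn from any real conflict dataset:

```python
import random

random.seed(0)

# Toy model: each "case" is one historical episode whose outcome reflects
# a real but modest average effect (say, of a peacekeeping deployment)
# swamped by large case-specific noise from all the other forces at work.
TRUE_EFFECT = -0.5   # genuinely potent on average
NOISE_SD = 3.0       # but any single case is dominated by other factors

def simulate_case() -> float:
    """One episode: the true effect buried in idiosyncratic noise."""
    return TRUE_EFFECT + random.gauss(0, NOISE_SD)

cases = [simulate_case() for _ in range(10_000)]

# A single case frequently has the "wrong" sign, so dueling anecdotes
# resolve nothing; averaging over many cases recovers the signal.
single = cases[0]
mean_effect = sum(cases) / len(cases)

print(f"one case: {single:+.2f}, average of 10,000 cases: {mean_effect:+.2f}")
```

The single draw is essentially uninformative, while the mean converges on the true effect, which is why large datasets can adjudicate questions that case studies cannot.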
The humanities are the domain in which the intrusion of science has produced the strongest recoil. Yet it is just that domain that would seem to be most in need of an infusion of new ideas. By most accounts, the humanities are in trouble. University programs are downsizing, the next generation of scholars is un- or underemployed, morale is sinking, students are staying away in droves. No thinking person should be indifferent to our society’s disinvestment from the humanities, which are indispensable to a civilized democracy.
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of our universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness. And they have failed to define a progressive agenda. Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
Those ways do deserve respect, and there can be no replacement for the varieties of close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities countless possibilities for innovation in understanding. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, to say nothing of the kind of a progressive agenda that appeals to deans and donors. The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanists.
In some disciplines, this consilience is a fait accompli. Archeology has grown from a branch of art history to a high-tech science. Linguistics and the philosophy of mind shade into cognitive science and neuroscience.
Similar opportunities are there for the exploring. The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces and landscapes. Music scholars have much to discuss with the scientists who study the perception of speech and the brain’s analysis of the auditory world.
As for literary scholarship, where to begin? John Dryden wrote that a work of fiction is “a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind.” Linguistics can illuminate the resources of grammar and discourse that allow authors to manipulate a reader’s imaginary experience. Cognitive psychology can provide insight about readers’ ability to reconcile their own consciousness with those of the author and characters. Behavioral genetics can update folk theories of parental influence with discoveries about the effects of genes, peers, and chance, which have profound implications for the interpretation of biography and memoir—an endeavor that also has much to learn from the cognitive psychology of memory and the social psychology of self-presentation. Evolutionary psychologists can distinguish the obsessions that are universal from those that are exaggerated by a particular culture and can lay out the inherent conflicts and confluences of interest within families, couples, friendships, and rivalries that are the drivers of plot.
And as with politics, the advent of data science applied to books, periodicals, correspondence, and musical scores holds the promise for an expansive new “digital humanities.” The possibilities for theory and discovery are limited only by the imagination and include the origin and spread of ideas, networks of intellectual and artistic influence, the persistence of historical memory, the waxing and waning of themes in literature, and patterns of unofficial censorship and taboo.
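A minimal sketch of what such a "digital humanities" analysis looks like in practice: counting how often a theme's vocabulary appears in texts from successive periods. The mini-corpus, decades, and theme words below are all invented for illustration:

```python
import re

# Hypothetical mini-corpus: text snippets keyed by decade.
corpus = {
    1810: "the honour of the duel demanded satisfaction and honour again",
    1860: "the railway and the telegraph changed commerce and the railway grew",
    1910: "the machine age brought the motor car and the machine to every town",
}

def theme_frequency(text: str, theme_words: set[str]) -> float:
    """Share of a text's tokens that belong to the theme's vocabulary."""
    tokens = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for t in tokens if t in theme_words)
    return hits / len(tokens)

# Track the waxing and waning of a "technology" theme across decades.
tech = {"railway", "telegraph", "machine", "motor"}
trend = {year: theme_frequency(text, tech) for year, text in corpus.items()}
for year, freq in sorted(trend.items()):
    print(year, f"{freq:.2f}")
```

Real projects run the same idea over millions of digitized books with more careful tokenization and normalization, but the core move, turning texts into countable features and plotting them over time, is just this.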
Nonetheless, many humanities scholars have reacted to these opportunities like the protagonist of the grammar-book example of the volitional future tense: “I will drown; no one shall save me.” Noting that these analyses flatten the richness of individual works, they reach for the usual adjectives: simplistic, reductionist, naïve, vulgar, and of course, scientistic.
The complaint about simplification is misbegotten. To explain something is to subsume it under more general principles, which always entails a degree of simplification. Yet to simplify is not to be simplistic. An appreciation of the particulars of a work can co-exist with explanations at many other levels, from the personality of an author to the cultural milieu, the faculties of human nature, and the laws governing social beings. The rejection of a search for general trends and principles calls to mind Jorge Luis Borges’s fictitious empire in which “the Cartographers Guild drew a map of the Empire whose size was that of the Empire, coinciding point for point with it. The following Generations ... saw the vast Map to be Useless and permitted it to decay and fray under the Sun and winters.”
And the critics should be careful with the adjectives. If anything is naïve and simplistic, it is the conviction that the legacy silos of academia should be fortified and that we should be forever content with current ways of making sense of the world. Surely our conceptions of politics, culture, and morality have much to learn from our best understanding of the physical universe and of our makeup as a species.
Steven Pinker is a contributing editor at The New Republic, the Johnstone Family Professor of Psychology at Harvard University, and the author, most recently, of The Better Angels of our Nature: Why Violence Has Declined.
Science Is Not Your Enemy
An impassioned plea to neglected novelists, embattled professors, and tenure-less historians
Steven Pinker/August 6, 2013
The great thinkers of the Age of Reason and the Enlightenment were scientists. Not only did many of them contribute to mathematics, physics, and physiology, but all of them were avid theorists in the sciences of human nature. They were cognitive neuroscientists, who tried to explain thought and emotion in terms of physical mechanisms of the nervous system. They were evolutionary psychologists, who speculated on life in a state of nature and on animal instincts that are “infused into our bosoms.” And they were social psychologists, who wrote of the moral sentiments that draw us together, the selfish passions that inflame us, and the foibles of shortsightedness that frustrate our best-laid plans.
These thinkers—Descartes, Spinoza, Hobbes, Locke, Hume, Rousseau, Leibniz, Kant, Smith—are all the more remarkable for having crafted their ideas in the absence of formal theory and empirical data. The mathematical theories of information, computation, and games had yet to be invented. The words “neuron,” “hormone,” and “gene” meant nothing to them. When reading these thinkers, I often long to travel back in time and offer them some bit of twenty-first-century freshman science that would fill a gap in their arguments or guide them around a stumbling block. What would these Fausts have given for such knowledge? What could they have done with it?
We don’t have to fantasize about this scenario, because we are living it. We have the works of the great thinkers and their heirs, and we have scientific knowledge they could not have dreamed of. This is an extraordinary time for the understanding of the human condition. Intellectual problems from antiquity are being illuminated by insights from the sciences of mind, brain, genes, and evolution. Powerful tools have been developed to explore them, from genetically engineered neurons that can be controlled with pinpoints of light to the mining of “big data” as a means of understanding how ideas propagate.
One would think that writers in the humanities would be delighted and energized by the efflorescence of new ideas from the sciences. But one would be wrong. Though everyone endorses science when it can cure disease, monitor the environment, or bash political opponents, the intrusion of science into the territories of the humanities has been deeply resented. Just as reviled is the application of scientific reasoning to religion; many writers without a trace of a belief in God maintain that there is something unseemly about scientists weighing in on the biggest questions. In the major journals of opinion, scientific carpetbaggers are regularly accused of determinism, reductionism, essentialism, positivism, and worst of all, something called “scientism.” The past couple years have seen four denunciations of scientism in this magazine alone, together with attacks in Bookforum, The Claremont Review of Books, The Huffington Post, The Nation, National Review Online, The New Atlantis, The New York Times, and Standpoint.
The eclectic politics of these publications reflects the bipartisan nature of the resentment. This passage, from the historian Jackson Lears’s 2011 review in The Nation of three books by Sam Harris, makes the standard case for the prosecution by the left:
Positivist assumptions provided the epistemological foundations for Social Darwinism and pop-evolutionary notions of progress, as well as for scientific racism and imperialism. These tendencies coalesced in eugenics, the doctrine that human well-being could be improved and eventually perfected through the selective breeding of the "fit" and the sterilization or elimination of the "unfit." ... Every schoolkid knows about what happened next: the catastrophic twentieth century. Two world wars, the systematic slaughter of innocents on an unprecedented scale, the proliferation of unimaginable destructive weapons, brushfire wars on the periphery of empire—all these events involved, in various degrees, the application of scientific research to advanced technology.
The case from the right, captured in this 2007 speech from Leon Kass, George W. Bush’s bioethics adviser, is just as measured:
Scientific ideas and discoveries about living nature and man, perfectly welcome and harmless in themselves, are being enlisted to do battle against our traditional religious and moral teachings, and even our self-understanding as creatures with freedom and dignity. A quasi-religious faith has sprung up among us—let me call it "soul-less scientism"—which believes that our new biology, eliminating all mystery, can give a complete account of human life, giving purely scientific explanations of human thought, love, creativity, moral judgment, and even why we believe in God. ... Make no mistake. The stakes in this contest are high: at issue are the moral and spiritual health of our nation, the continued vitality of science, and our own self-understanding as human beings and as children of the West.
These are zealous prosecutors indeed. But their cases are weak. The mindset of science cannot be blamed for genocide and war and does not threaten the moral and spiritual health of our nation. It is, rather, indispensable in all areas of human concern, including politics, the arts, and the search for meaning, purpose, and morality.
The term “scientism” is anything but clear, more of a boo-word than a label for any coherent doctrine. Sometimes it is equated with lunatic positions, such as that “science is all that matters” or that “scientists should be entrusted to solve all problems.” Sometimes it is clarified with adjectives like “simplistic,” “naïve,” and “vulgar.” The definitional vacuum allows me to replicate gay activists’ flaunting of “queer” and appropriate the pejorative for a position I am prepared to defend.
Scientism, in this good sense, is not the belief that members of the occupational guild called “science” are particularly wise or noble. On the contrary, the defining practices of science, including open debate, peer review, and double-blind methods, are explicitly designed to circumvent the errors and sins to which scientists, being human, are vulnerable. Scientism does not mean that all current scientific hypotheses are true; most new ones are not, since the cycle of conjecture and refutation is the lifeblood of science. It is not an imperialistic drive to occupy the humanities; the promise of science is to enrich and diversify the intellectual tools of humanistic scholarship, not to obliterate them. And it is not the dogma that physical stuff is the only thing that exists. Scientists themselves are immersed in the ethereal medium of information, including the truths of mathematics, the logic of their theories, and the values that guide their enterprise. In this conception, science is of a piece with philosophy, reason, and Enlightenment humanism. It is distinguished by an explicit commitment to two ideals, and it is these that scientism seeks to export to the rest of intellectual life.
The first is that the world is intelligible. The phenomena we experience may be explained by principles that are more general than the phenomena themselves. These principles may in turn be explained by more fundamental principles, and so on. In making sense of our world, there should be few occasions in which we are forced to concede “It just is” or “It’s magic” or “Because I said so.” The commitment to intelligibility is not a matter of brute faith, but gradually validates itself as more and more of the world becomes explicable in scientific terms. The processes of life, for example, used to be attributed to a mysterious élan vital; now we know they are powered by chemical and physical reactions among complex molecules.
Demonizers of scientism often confuse intelligibility with a sin called reductionism. But to explain a complex happening in terms of deeper principles is not to discard its richness. No sane thinker would try to explain World War I in the language of physics, chemistry, and biology as opposed to the more perspicuous language of the perceptions and goals of leaders in 1914 Europe. At the same time, a curious person can legitimately ask why human minds are apt to have such perceptions and goals, including the tribalism, overconfidence, and sense of honor that fell into a deadly combination at that historical moment.
The second ideal is that the acquisition of knowledge is hard. The world does not go out of its way to reveal its workings, and even if it did, our minds are prone to illusions, fallacies, and superstitions. Most of the traditional causes of belief—faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty—are generators of error and should be dismissed as sources of knowledge. To understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity. Any movement that calls itself “scientific” but fails to nurture opportunities for the falsification of its own beliefs (most obviously when it murders or imprisons the people who disagree with it) is not a scientific movement.
In which ways, then, does science illuminate human affairs? Let me start with the most ambitious: the deepest questions about who we are, where we came from, and how we define the meaning and purpose of our lives. This is the traditional territory of religion, and its defenders tend to be the most excitable critics of scientism. They are apt to endorse the partition plan proposed by Stephen Jay Gould in his worst book, Rocks of Ages, according to which the proper concerns of science and religion belong to “non-overlapping magisteria.” Science gets the empirical universe; religion gets the questions of moral meaning and value.
Unfortunately, this entente unravels as soon as you begin to examine it. The moral worldview of any scientifically literate person—one who is not blinkered by fundamentalism—requires a radical break from religious conceptions of meaning and value.
To begin with, the findings of science entail that the belief systems of all the world’s traditional religions and cultures—their theories of the origins of life, humans, and societies—are factually mistaken. We know, but our ancestors did not, that humans belong to a single species of African primate that developed agriculture, government, and writing late in its history. We know that our species is a tiny twig of a genealogical tree that embraces all living things and that emerged from prebiotic chemicals almost four billion years ago. We know that we live on a planet that revolves around one of a hundred billion stars in our galaxy, which is one of a hundred billion galaxies in a 13.8-billion-year-old universe, possibly one of a vast number of universes. We know that our intuitions about space, time, matter, and causation are incommensurable with the nature of reality on scales that are very large and very small. We know that the laws governing the physical world (including accidents, disease, and other misfortunes) have no goals that pertain to human well-being. There is no such thing as fate, providence, karma, spells, curses, augury, divine retribution, or answered prayers—though the discrepancy between the laws of probability and the workings of cognition may explain why people believe there are. And we know that we did not always know these things, that the beloved convictions of every time and culture may be decisively falsified, doubtless including some we hold today.
In other words, the worldview that guides the moral and spiritual values of an educated person today is the worldview given to us by science. Though the scientific facts do not by themselves dictate values, they certainly hem in the possibilities. By stripping ecclesiastical authority of its credibility on factual matters, they cast doubt on its claims to certitude in matters of morality. The scientific refutation of the theory of vengeful gods and occult forces undermines practices such as human sacrifice, witch hunts, faith healing, trial by ordeal, and the persecution of heretics. The facts of science, by exposing the absence of purpose in the laws governing the universe, force us to take responsibility for the welfare of ourselves, our species, and our planet. For the same reason, they undercut any moral or political system based on mystical forces, quests, destinies, dialectics, struggles, or messianic ages. And in combination with a few unexceptionable convictions—that all of us value our own welfare and that we are social beings who impinge on each other and can negotiate codes of conduct—the scientific facts militate toward a defensible morality, namely adhering to principles that maximize the flourishing of humans and other sentient beings. This humanism, which is inextricable from a scientific understanding of the world, is becoming the de facto morality of modern democracies, international organizations, and liberalizing religions, and its unfulfilled promises define the moral imperatives we face today.
Moreover, science has contributed—directly and enormously—to the fulfillment of these values. If one were to list the proudest accomplishments of our species (setting aside the removal of obstacles we set in our own path, such as the abolition of slavery and the defeat of fascism), many would be gifts bestowed by science.
The most obvious is the exhilarating achievement of scientific knowledge itself. We can say much about the history of the universe, the forces that make it tick, the stuff we’re made of, the origin of living things, and the machinery of life, including our own mental life. Better still, this understanding consists not in a mere listing of facts, but in deep and elegant principles, like the insight that life depends on a molecule that carries information, directs metabolism, and replicates itself.
Science has also provided the world with images of sublime beauty: stroboscopically frozen motion, exotic organisms, distant galaxies and outer planets, fluorescing neural circuitry, and a luminous planet Earth rising above the moon’s horizon into the blackness of space. Like great works of art, these are not just pretty pictures but prods to contemplation, which deepen our understanding of what it means to be human and of our place in nature.
And contrary to the widespread canard that technology has created a dystopia of deprivation and violence, every global measure of human flourishing is on the rise. The numbers show that after millennia of near-universal poverty, a steadily growing proportion of humanity is surviving the first year of life, going to school, voting in democracies, living in peace, communicating on cell phones, enjoying small luxuries, and surviving to old age. The Green Revolution in agronomy alone saved a billion people from starvation. And if you want examples of true moral greatness, go to Wikipedia and look up the entries for “smallpox” and “rinderpest” (cattle plague). The definitions are in the past tense, indicating that human ingenuity has eradicated two of the cruelest causes of suffering in the history of our kind.
Though science is beneficially embedded in our material, moral, and intellectual lives, many of our cultural institutions, including the liberal arts programs of many universities, cultivate a philistine indifference to science that shades into contempt. Students can graduate from elite colleges with a trifling exposure to science. They are commonly misinformed that scientists no longer care about truth but merely chase the fashions of shifting paradigms. A demonization campaign anachronistically impugns science for crimes that are as old as civilization, including racism, slavery, conquest, and genocide.
Just as common, and as historically illiterate, is the blaming of science for political movements with a pseudoscientific patina, particularly Social Darwinism and eugenics. Social Darwinism was the misnamed laissez-faire philosophy of Herbert Spencer. It was inspired not by Darwin’s theory of natural selection, but by Spencer’s Victorian-era conception of a mysterious natural force for progress, which was best left unimpeded. Today the term is often used to smear any application of evolution to the understanding of human beings. Eugenics was the campaign, popular among leftists and progressives in the early decades of the twentieth century, for the ultimate form of social progress, improving the genetic stock of humanity. Today the term is commonly used to assail behavioral genetics, the study of the genetic contributions to individual differences.
I can testify that this recrimination is not a relic of the 1990s science wars. When Harvard reformed its general education requirement in 2006 to 2007, the preliminary task force report introduced the teaching of science without any mention of its place in human knowledge: “Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment.” This strange equivocation between the utilitarian and the nefarious was not applied to other disciplines. (Just imagine motivating the study of classical music by noting that it both generates economic activity and inspired the Nazis.) And there was no acknowledgment that we might have good reasons to prefer science and know-how over ignorance and superstition.
At a 2011 conference, another colleague summed up what she thought was the mixed legacy of science: the eradication of smallpox on the one hand; the Tuskegee syphilis study on the other. (In that study, another bloody shirt in the standard narrative about the evils of science, public-health researchers beginning in 1932 tracked the progression of untreated, latent syphilis in a sample of impoverished African Americans.) The comparison is obtuse. It assumes that the study was the unavoidable dark side of scientific progress as opposed to a universally deplored breach, and it compares a one-time failure to prevent harm to a few dozen people with the prevention of hundreds of millions of deaths per century, in perpetuity.
A major goad for the recent denunciations of scientism has been the application of neuroscience, evolution, and genetics to human affairs. Certainly many of these applications are glib or wrong, and they are fair game for criticism: scanning the brains of voters as they look at politicians’ faces, attributing war to a gene for aggression, explaining religion as an evolutionary adaptation to bond the group. Yet it’s not unheard of for intellectuals who are innocent of science to advance ideas that are glib or wrong, and no one is calling for humanities scholars to go back to their carrels and stay out of discussions of things that matter. It is a mistake to use a few wrongheaded examples as an excuse to quarantine the sciences of human nature from our attempt to understand the human condition.To simplify is not to be simplistic.
Take our understanding of politics. “What is government itself,” asked James Madison, “but the greatest of all reflections on human nature?” The new sciences of the mind are reexamining the connections between politics and human nature, which were avidly discussed in Madison’s time but submerged during a long interlude in which humans were assumed to be blank slates or rational actors. Humans, we are increasingly appreciating, are moralistic actors, guided by norms and taboos about authority, tribe, and purity, and driven by conflicting inclinations toward revenge and reconciliation. These impulses ordinarily operate beneath our conscious awareness, but in some circumstances they can be turned around by reason and debate. We are starting to grasp why these moralistic impulses evolved; how they are implemented in the brain; how they differ among individuals, cultures, and sub- cultures; and which conditions turn them on and off.
The application of science to politics not only enriches our stock of ideas, but also offers the means to ascertain which of them are likely to be correct. Political debates have traditionally been deliberated through case studies, rhetoric, and what software engineers call HiPPO (highest-paid person’s opinion). Not surprisingly, the controversies have careened without resolution. Do democracies fight each other? What about trading partners? Do neighboring ethnic groups inevitably play out ancient hatreds in bloody conflict? Do peacekeeping forces really keep the peace? Do terrorist organizations get what they want? How about Gandhian nonviolent movements? Are post-conflict reconciliation rituals effective at preventing the renewal of conflict?
History nerds can adduce examples that support either answer, but that does not mean the questions are irresolvable. Political events are buffeted by many forces, so it’s possible that a given force is potent in general but submerged in a particular instance. With the advent of data science—the analysis of large, open-access data sets of numbers or text—signals can be extracted from the noise and debates in history and political science resolved more objectively. As best we can tell at present, the answers to the questions listed above are (on average, and all things being equal) no, no, no, yes, no, yes, and yes.
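To see concretely how a signal can be extracted from noise, consider a toy simulation. Everything here is invented for illustration (the 50% and 40% recurrence rates are hypothetical, not drawn from any real conflict data set); the point is only that a modest effect invisible in a handful of cases becomes unmistakable in a large sample:

```python
import random

random.seed(42)

# Hypothetical assumption: peacekeeping cuts the chance that a conflict
# reignites from 50% to 40% -- a real but modest effect.
def recurrence_rate(n_conflicts, p_recur):
    """Fraction of simulated conflicts that reignite."""
    return sum(random.random() < p_recur for _ in range(n_conflicts)) / n_conflicts

# A historian's handful of cases: the comparison is dominated by noise.
few_kept = recurrence_rate(15, 0.40)
few_unkept = recurrence_rate(15, 0.50)

# A data-science-sized sample: the underlying effect emerges clearly.
many_kept = recurrence_rate(50_000, 0.40)
many_unkept = recurrence_rate(50_000, 0.50)

print(f"15 cases each:     with peacekeepers {few_kept:.2f}, without {few_unkept:.2f}")
print(f"50,000 cases each: with peacekeepers {many_kept:.2f}, without {many_unkept:.2f}")
```

With only fifteen cases per side, the two rates can easily come out reversed; with tens of thousands, the ten-point gap appears reliably, which is the sense in which large data sets allow such debates to be settled "on average, and all things being equal."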
The humanities are the domain in which the intrusion of science has produced the strongest recoil. Yet it is just that domain that would seem to be most in need of an infusion of new ideas. By most accounts, the humanities are in trouble. University programs are downsizing, the next generation of scholars is un- or underemployed, morale is sinking, students are staying away in droves. No thinking person should be indifferent to our society’s disinvestment from the humanities, which are indispensable to a civilized democracy.
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of our universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness. And they have failed to define a progressive agenda. Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
Those ways do deserve respect, and there can be no replacement for the varieties of close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities countless possibilities for innovation in understanding. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, to say nothing of the kind of progressive agenda that appeals to deans and donors. The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanists.
In some disciplines, this consilience is a fait accompli. Archeology has grown from a branch of art history to a high-tech science. Linguistics and the philosophy of mind shade into cognitive science and neuroscience.
Similar opportunities are there for the exploring. The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces and landscapes. Music scholars have much to discuss with the scientists who study the perception of speech and the brain’s analysis of the auditory world.
As for literary scholarship, where to begin? John Dryden wrote that a work of fiction is “a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind.” Linguistics can illuminate the resources of grammar and discourse that allow authors to manipulate a reader’s imaginary experience. Cognitive psychology can provide insight about readers’ ability to reconcile their own consciousness with those of the author and characters. Behavioral genetics can update folk theories of parental influence with discoveries about the effects of genes, peers, and chance, which have profound implications for the interpretation of biography and memoir—an endeavor that also has much to learn from the cognitive psychology of memory and the social psychology of self-presentation. Evolutionary psychologists can distinguish the obsessions that are universal from those that are exaggerated by a particular culture and can lay out the inherent conflicts and confluences of interest within families, couples, friendships, and rivalries that are the drivers of plot.
And as with politics, the advent of data science applied to books, periodicals, correspondence, and musical scores holds the promise for an expansive new “digital humanities.” The possibilities for theory and discovery are limited only by the imagination and include the origin and spread of ideas, networks of intellectual and artistic influence, the persistence of historical memory, the waxing and waning of themes in literature, and patterns of unofficial censorship and taboo.
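As a minimal sketch of what such analysis looks like in practice (the corpus, decades, and theme words below are entirely made up for illustration), one can chart the waxing and waning of a theme as its relative frequency across decades:

```python
from collections import Counter

# Hypothetical miniature corpus: (decade, text) pairs standing in for a
# digitized library; real projects would analyze millions of scanned pages.
corpus = [
    (1810, "honour duty providence honour"),
    (1850, "honour progress machine duty"),
    (1890, "progress machine evolution progress"),
    (1930, "anxiety machine progress progress"),
]

def theme_trajectory(theme):
    """Relative frequency of a theme word per decade: the kind of
    waxing-and-waning curve a digital-humanities study can chart."""
    trajectory = {}
    for decade, text in corpus:
        words = text.split()
        trajectory[decade] = Counter(words)[theme] / len(words)
    return trajectory

print(theme_trajectory("honour"))    # wanes: {1810: 0.5, 1850: 0.25, 1890: 0.0, 1930: 0.0}
print(theme_trajectory("progress"))  # waxes: {1810: 0.0, 1850: 0.25, 1890: 0.5, 1930: 0.5}
```

Real studies would add normalization across corpus sizes, synonym handling, and statistical controls, but the underlying move is the same: counting, at scale, what no individual reader could.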
Nonetheless, many humanities scholars have reacted to these opportunities like the protagonist of the grammar-book example of the volitional future tense: “I will drown; no one shall save me.” Noting that these analyses flatten the richness of individual works, they reach for the usual adjectives: simplistic, reductionist, naïve, vulgar, and of course, scientistic.
The complaint about simplification is misbegotten. To explain something is to subsume it under more general principles, which always entails a degree of simplification. Yet to simplify is not to be simplistic. An appreciation of the particulars of a work can co-exist with explanations at many other levels, from the personality of an author to the cultural milieu, the faculties of human nature, and the laws governing social beings. The rejection of a search for general trends and principles calls to mind Jorge Luis Borges’s fictitious empire in which “the Cartographers Guild drew a map of the Empire whose size was that of the Empire, coinciding point for point with it. The following Generations ... saw the vast Map to be Useless and permitted it to decay and fray under the Sun and winters.”
And the critics should be careful with the adjectives. If anything is naïve and simplistic, it is the conviction that the legacy silos of academia should be fortified and that we should be forever content with current ways of making sense of the world. Surely our conceptions of politics, culture, and morality have much to learn from our best understanding of the physical universe and of our makeup as a species.
Steven Pinker is a contributing editor at The New Republic, the Johnstone Family Professor of Psychology at Harvard University, and the author, most recently, of The Better Angels of our Nature: Why Violence Has Declined.
Science as Ideology: Scientism
The use of the label “scientism” typically implies a negative judgment about a problematic fidelity to science, but a few theorists have embraced the label as well. There is no simple relationship between science and scientism. Many scientists reject scientism, while some humanities scholars promote it. When humanists decide they ought to work within a metaphysics they imagine to be scientific, they may feel compelled to adopt a materialist or reductionist framework rejecting traditional categories of humanistic inquiry, such as person, will, freedom, judgment, or agency. Insofar as natural sciences might not recognize those categories, some humanistic scholarship has been transformed—some would say attenuated—by the loss of such concepts (Pfau 2013).
We can identify at least four challenges for scientism. First, an overweening loyalty to science and rejection of nonscience may presuppose that such categories have discrete boundaries. As noted in Section 1, however, the longstanding attempt to characterize science through a definition or definitive methods has been largely unsuccessful. It has proven incredibly difficult to specify exactly what makes an approach to the world scientific, which obviously problematizes the derogation of nonscience. Second, the appeal to science can obscure the question of which parts of science are being drawn upon. If science consists of a variety of distinctive practices, answering many different questions with many different methodological approaches, then appeals to science simpliciter can obfuscate important questions about which science is being included, which omitted, and how it is analyzed. This is important because different scientific studies and methods often do not align to provide straightforward results: Separate analyses even of the very same data can yield remarkably divergent conclusions (Stegenga 2011). Third, proponents of scientism sometimes marshal their own scientific credentials to back their claims. In a society that grants so much cultural authority to scientists, those credentials can easily bestow rhetorical power. Nonetheless, scientific expertise does not automatically entail expertise in other areas, and it has proved all too easy for, say, some biologists to make philosophical and theological pronouncements without training in, or even appreciation for, those other fields of study. A fourth challenge faces scientism as a replacement for politics; the problem is that political debates are typically not exhausted by their scientific dimensions. 
Issues like climate change or race relations, for example, involve more than scientific results; they also include conceptions of justice, freedom, economics, and even religion, which are each infused with ethical concerns. Politics cannot be reduced to technical scientific problems, and so the attempt to convert essentially ideological debates into straightforward scientific hypotheses can misconstrue what is at stake and overlook important issues under debate (Oakeshott 1962, Bernstein 1976, Seliger 1976).
Insofar as science’s powers are rooted in methods aimed at studying nature independent of any ideologies, this also represents a limit to its application. While scientific inquiry can contribute to nearly any problem we face, science typically cannot determine the solutions to those problems on its own; to think otherwise is to fall prey to scientism. Most real-world problem solving involves more than just applying scientific results; it also involves complex philosophical and ethical judgments, whether or not those are explicitly articulated.
Science and Ideology
This article illustrates some of the relationships between science and ideologies. It discusses how science has been enlisted to support particular ideologies and how ideologies have influenced the processes and interpretations of scientific inquiry.
An example from the biological sciences illustrates this. In the early 20th century, evolutionary theory was used to support socialism and laissez-faire capitalism. Those two competing ideologies were justified by appeal to biological claims about the nature of evolution.
Those justifications may seem puzzling. If science claims to generate only a limited set of facts about the world—say, the mechanisms of biological diversification—it is unclear how they could inform anything so far removed as economic theory. Part of the answer is that the process of interpreting and applying scientific theories can generate divergent results. Despite science’s capacities to render some exceedingly clear and well-verified central cases, its broader uses can become intertwined with separate knowledge claims, values, and ideologies. Thus, the apparently clear deliverances of natural sciences have been leveraged to endorse competing views.
Rightly or wrongly, this leveraging has long been part of the aims and practice of scientists. Many of the Early Modern progenitors of natural science hoped that science would apply to large swaths of human life. They believed that science could inform and improve politics, religion, education, the humanities, and more. One fictional version of this ideal, from Francis Bacon in the 17th century, imagined scientists as the political elites, ruling because they are best equipped to shape society. Such hopes live on today.
It is not only in its applications that science can become ideological; ideologies also can be part of the formation of sciences. If natural sciences are not hermetically sealed off from society, but instead are permeable to social values, power relations, or dominant norms of an era, then it is possible for science to reflect the ideologies of its practitioners. This can have a particularly pernicious effect when the ideologies that make their way into the science are then claimed to be results derived from the science. Those ideologies, now “naturalized,” have sometimes been granted added credibility because of their supposedly scientific derivation.
Not all sciences seem equally susceptible to ideological influence or appropriation. Ideologies seem to have closer connections to those sciences investigating topics nearer to human concerns. Sciences that claim to bear upon immigration restrictions, government, or human sexuality find wider audiences and wider disputes than scientific conclusions limited to barnacle morphology or quantum gravity.
The potential for science to become entwined with ideology does not necessarily undermine scientific claims or detract from science’s epistemic and cultural value. It hardly makes science trivial, or just one view among others. Science must be used well and taken seriously in order to solve real-world challenges. Part of taking science seriously involves judicious analysis of how ideologies might influence scientific processes and applications.
The topic is vast, and this article confines itself to some historical cases that exemplify significant interactions between science and ideologies.
Table of Contents
- Terminology
- Science and Political Economy
- Science and Race
- Science and Gender
- Science and Religion
- Science as Ideology: Scientism
- Conclusion
- References and Further Reading
THE ENCYCLOPEDIA OF LIBERTARIANISM
Spencer, Herbert (1820-1903)
https://www.libertarianism.org/encyclopedia/spencer-herbert
Although Herbert Spencer has been rightly regarded as the most influential libertarian theoretician of the 19th century, he was much more than that. He also was a founding father of modern sociology, a pioneer in the theory of evolution—his theories were developed prior to those of Charles Darwin—an important figure in progressive education—his name still adorns the education building at Stanford University—and a philosopher of distinction whose work in ethics and epistemology gained the respect of J. S. Mill and other notable contemporaries. He was, moreover, an early proponent of the rights of children, of equal rights for women, and of other civil liberties that have since gained widespread acceptance in Western democracies.
Spencer lived during a period that coincided with the rise and fall of political liberalism—or what Spencer called true liberalism, what today is known as classical liberalism. This political philosophy, which had made great strides in bringing about greater individual freedom in religion, commerce, speech, and other areas, had suffered a setback during the Napoleonic Wars. After this conflict ended in 1815, England experienced a revival of liberal ideas. Peace brought with it not only a resurgence of these views, but also the development of a form of social organization, “voluntary cooperation” founded on a “regime of contract,” that supplanted much of an older form of social organization based on “compulsory cooperation” that characterized a “regime of status.”
Unfortunately, things had changed for the worse by the time Spencer published The Man versus the State in 1884. In the first of four essays that comprise this work, “The New Toryism,” Spencer noted that “most of those who now pass themselves off as Liberals are Tories of a new type.” This new liberalism had abandoned its original central principle that “habitually stood for individual freedom versus State-coercion.” Instead, it embraced the Tory principle of unlimited state authority with a slight difference: Whereas the Tories prior to the Glorious Revolution had vested unconditional authority in the monarch, a theory that was opposed to the Whig doctrine of conditional authority, modern liberalism, riding the wave of democratic sentiments, vested unconditional authority in “the people.” Spencer viewed this disagreement as a distinction without an essential difference: “the real issue is whether the lives of citizens are more interfered with than they were; not the nature of the agency which interferes with them.” A person is no less coerced and his rights are no less violated merely because unjust restrictions on his liberties are imposed by a majority, rather than by a single ruler.
Spencer offered several reasons for the transformation of what had been a philosophy of individual liberty into a new type of statism. The most interesting of these reasons, from a philosophical point of view, is one that he had cautioned against decades earlier. While in his early 20s, Spencer published a series of 12 letters (1842–1843) in Edward Miall’s periodical, The Nonconformist. Collectively titled “The Proper Sphere of Government,” these articles address a central problem of political philosophy, viz: “Is there any boundary to the interference of government? and, if so, what is that boundary?”
Spencer responded with the classical Lockean doctrine that the fundamental purpose of government was “to defend the natural rights of man—to protect person and property—to prevent the aggressions of the powerful upon the weak—in a word, to administer justice.” He contrasted this conception with the common belief that the purpose of government is to promote the “general good.” The “general good,” Spencer noted, lacks a determinate meaning, so it cannot serve as a standard, or criterion, of legislation. Its vagueness gives government a blank check on power. Has not every law, no matter how tyrannical, been justified by appealing to the general good? Spencer concluded that the “general good” cannot serve to define the duties of government because the purpose of any definition is “to mark out the boundaries of the thing defined,” and “that cannot be a definition of the duty of a government, which will allow it to do anything and everything.”
Spencer expanded on this theme in his first book, Social Statics (1851). There he focused his criticism on the principle of utility defended by Jeremy Bentham and his followers, which holds that a government should promote the greatest happiness for the greatest number of people. Spencer noted that standards of happiness are “infinitely variable,” so the principle of utility, although it may serve as a general formulation of the purpose of government, cannot serve as a determinate standard of legislation; an appeal to social utility does not tell us which measures a government should, and should not, enact. Hence, doctrines of expediency—whether expressed in terms of utility or the general good—“afford not a solitary command of a practical character. Let but rulers think, or profess to think, that their measures will benefit the community, and your philosophy stands mute in the presence of the most egregious folly, or the blackest misconduct.” Social Statics contained Spencer’s first extended justification of his celebrated “law of equal freedom,” according to which “every man may claim the fullest liberty to exercise his faculties compatible with the possession of like liberty by every other man.” The young Spencer, having been raised in a tradition of Protestant dissent, which he once described as “an expression of antagonism to arbitrary control,” grounded this principle in a divinely ordained duty to pursue happiness, which in turn requires the freedom to exercise one’s faculties according to one’s own judgments. He further defended a version of the moral sense theory that had been developed by Francis Hutcheson and other luminaries of the Scottish Enlightenment.
Spencer later abandoned these doctrines, replacing them with an ethical theory that was thoroughly positivistic and more attuned to his theory of evolution. The “establishment of rules of right conduct on a scientific basis is a pressing need,” wrote Spencer in 1879, and he published his two-volume Principles of Ethics to fill this need. These volumes constitute the final volumes of his massive Synthetic Philosophy, a project that took 38 years to complete. Spencer’s efforts to deduce moral rules, including the law of equal freedom, from the “laws of life” and thereby achieve “results which follow … in the same necessary way as does the trajectory of a cannon-shot from the laws of motion and atmospheric resistance” had mixed results. Some critics, including those who were otherwise sympathetic to Spencer’s ideas, have claimed that this scientistic approach to ethics undermined the earlier humanistic tradition of natural rights. However one may appraise Spencer’s “scientific” system of ethics, there can be little doubt that it later became discredited as the Lamarckian theory of evolution on which it was based (which upheld the inheritability of acquired characteristics) fell into disfavor.
Spencer’s sociological insights almost certainly influenced later libertarian thinkers such as Albert J. Nock more than his ethical theories did. In The Study of Sociology, Spencer pointed to instances of short-sighted political thinking by persons who have but a rudimentary grasp of social causation and who accordingly propose simplistic political solutions for complex social problems. Many people are ignorant of physical causation, he observed, so it is perhaps no surprise that many more are ignorant of social causation, “which is so much more subtle and complex.” Where there is little or no appreciation of social causation, “political superstitions” flourish. Among these false notions is the belief that government has a special efficacy “beyond that naturally possessed by a certain group of citizens subsidized by the rest of the citizens.” In addition, the “ordinary political schemer is convinced that out of a legislative apparatus, properly devised and worked with due dexterity, may be had beneficial State-action, without any detrimental reaction.”
In addition to his other contributions to libertarian theory, such as his detailed typology of the militant and industrial forms of social organization, Spencer made seminal contributions to the theory of spontaneous order. In The Principles of Sociology, Spencer likened social development to a “rolling snowball or a spreading fire” where there is “compound accumulation and acceleration.” An intricate social network evolves as in a market economy that is so interdependent that any considerable change in one activity “sends reverberating changes among all the rest.” Society, in other words, is an unplanned spontaneous order, one that “grows” rather than is “made.” A major function of sociology—which in Spencer’s conception subsumed economics—is to explain the evolution of this order that is the result of human action, but not of human design. The difficulty of this task is why Spencer displayed such contempt for social planners: “A fly seated on the surface of the body has about as good a conception of its internal structure, as one of the schemers has of the social organization in which he is embedded.”
Further Readings
Carneiro, Robert L. “Herbert Spencer as an Anthropologist.” Journal of Libertarian Studies 5, no. 2 (Spring 1981): 153–210.
Francis, Mark. Herbert Spencer and the Invention of Modern Life. Ithaca, NY: Cornell University Press, 2007.
Smith, George H. “Herbert Spencer’s Theory of Causation.” Journal of Libertarian Studies 5, no. 2 (Spring 1981): 113–153.
Spencer, Herbert. The Man versus the State; with Six Essays on Government, Society, and Freedom. Indianapolis, IN: Liberty Classics, 1981.
———. “The Proper Sphere of Government.” Political Writings. John Offer, ed. Cambridge: Cambridge University Press, 1994.
———. Social Statics. New York: Robert Schalkenbach Foundation, 1954.
by George H. Smith
Originally published August 15, 2008.
See also Liberalism, Classical; Limited Government; Sociology and Libertarianism; Spontaneous Order.
tremendous popularity among his contemporaries is difficult to believe. Though he was an “independent scholar” in the truest sense of the word—refusing all institutional affiliation and almost all public recognition of any kind—he was famous not just among intellectuals, but among educated readers worldwide. He was, quite possibly, the only philosopher in history whose books sold more than a million copies during his lifetime.1 Spencer’s work was translated into more than a dozen languages, first into Russian, then to French, German, Italian, and Spanish, and eventually to Hungarian, Bohemian, Polish, Dutch, Swedish, Greek, Japanese, Chinese, and possibly even Mohawk.2 For many non-Western readers he stood for science, rationality, and progress against backwardness and mysticism; his anti-imperialism took some of the sting out of his affirmations of European superiority.3 Educated men and women all over the world considered Spencer the master thinker of the age, whether or not they agreed with him.

Spencer wrote on an incredible variety of topics, from physics to metaphysics, from biology to aesthetics. His greatest work, and the one which cemented his fame, was his ten-volume Synthetic Philosophy, which covered all scientific phenomena from the beginning of the cosmos to its end, embracing biology, psychology, sociology, and ethics. He also wrote on topics as diverse as style, manners, music, art, health, and parenting. Yet today his books sit in libraries unread. One brave explorer, reporting that he was the first to read the Royal Society’s copy of The Principles of Biology, described the volumes as “thicker and squarer than Gibbon’s, each bound in a cloth which has acquired with age a reptilian colour and texture, so putting one in mind of some great extinct monster of philosophic learning.”4 The metaphor is pervasive: in a book on Spencer’s sociology, J. D. Y. Peel referred to Spencer’s works as “the fossil remains of an extinct megasaur,” while Richard Hofstadter called them “a fossil specimen from which the intellectual body of the period may be reconstructed.”5 Spencer’s work no longer lives for modern readers. However, a study of its petrified remains provides a window into the intellectual culture of the Victorian era. This is particularly true of the American scene, for Spencer was more popular in the United States than in his home country, especially at the beginning of his career.
1 Michael W. Taylor, The Philosophy of Herbert Spencer (London: Continuum, 2007), 4. Taylor extrapolates based on the nearly 370,000 books sold by his American publisher alone.
2 John Offer, Herbert Spencer and Social Theory (Houndmills, Basingstoke, Hampshire: Palgrave Macmillan, 2010), 8–9; Naomi Beck, “The Diffusion of Spencerism and its Political Implications in France and Italy,” in Herbert Spencer: The Intellectual Legacy, Proceedings of a Conference Organised by the Galton Institute, London, 2003 (London: The Galton Institute, 2004), 41–57.
3 J. D. Y. Peel, Herbert Spencer: The Evolution of a Sociologist (New York: Basic Books, 1971),
Being Modern
The Cultural Impact of Science
in the Early Twentieth Century
Edited by Robert Bud
Paul Greenhalgh
Frank James
Morag Shiach
PDF E-BOOK
https://discovery.ucl.ac.uk/id/eprint/10057847/1/Being-Modern.pdf
Foreword
History of science lacks organising narratives for the twentieth century. This is especially true when we widen the lens to the discipline’s more-inclusive coterie: science, technology, engineering, mathematics and medicine. Mostly, we’ve chosen war as a narrative structure. Add imperialism. Add globalisation, though that seems simply to be imperialism by another name. We seek narratives that either describe or explain science’s growing presence, resonance and (dare we suggest) hegemony across a plenitude of landscapes. Try as we might, these continue to prove elusive.

One viable choice engages the century’s endlessly nuanced encounter with Modernity. Whatever Modernity is, or was, we seem certain science is somehow intimately associated. At once science seems causal for and caused by this thing, this philosophy, this miasma. Our quest to delineate precisely what and how has led us scholars towards ever more refined species of its genus. We seem to be getting somewhere, though the going is slow and the way is sometimes lost.

Being Modern shifts our perspective from observer to participant. The aim is to capture Modernity at work within mentalities, within cultural and biographical aesthetics, within the collisions between scientific and other things occurring in the lived experience of the people we study and from within their perspective. This anthology is a collective study of potency, infection and resistance.

The result is a refreshing alternative to scholastic delineations of movements seen from abstracting distances. This collection of original papers delivers richly researched, critical and thought-filled case studies of Modernity as an actor’s category, observed in situ. It ranges across familiar and new settings. It certainly will help us as we build a better conceptualisation of the Modern both as project and product.
Joe Cain
Professor of History and Philosophy of Biology
Head of Department of Science and Technology Studies
UCL
Technological parables and iconic illustrations: American Technocracy and the rhetoric of the technological fix
Sean F. Johnston
University of Glasgow, School of Interdisciplinary Studies
History and Technology 33 (2017)

Abstract

This paper traces the role of American technocrats in popularizing the notion later dubbed the “technological fix”. Channeled by their long-term “chief”, Howard Scott, their claim was that technology always provides the most effective solution to modern social, cultural and political problems. The account focuses on the expression of this technological faith, and how it was proselytized, from the era of high industrialism between the World Wars through, and beyond, the nuclear age. I argue that the packaging and promotion of these ideas relied on allegorical technological tales and readily absorbed graphic imagery. Combined with what Scott called “symbolization”, this seductive discourse preached beliefs about technology to broad audiences. The style and conviction of the messages were echoed by establishment figures such as national laboratory director Alvin Weinberg, who employed the techniques to convert mainstream and elite audiences through the end of the twentieth century.

Keywords: technocracy; technological fix; rhetoric; Howard Scott; Alvin M. Weinberg

Introduction

Confidence in societal progress via engineering solutions became a feature of industrial discourse from the early twentieth century.1 This paper addresses the popularization of such modernist faith over subsequent decades, focusing on the narrative techniques that underlay it. It argues that effective rhetoric about the problem-solving powers of technologies was developed and delivered by two key apostles: the technocrat Howard Scott and post-WWII national laboratory director Alvin Weinberg. Their evangelizing of the transformative social and political potential of technologies was unusually enduring, influencing broad audiences through the end of the century. The paper focuses on a specific but fertile article of their shared faith: the notion that technological solutions are superior to more traditional political, economic, educational and other social-science approaches to problem-solving. In the most radical form of the claim, its proponents argued that technological innovation could bypass or entirely replace these traditional approaches to human issues.

By tracing the idea through its networks of dissemination, and employing close textual analysis of newly available sources, the paper addresses how modern technological beliefs were packaged and spread for wider publics. It traces these discourses about technology, and the ways they were communicated, from the era of high industrialism between the World Wars through, and beyond, the nuclear age. Trust in the transformative social powers of technology was promoted most consistently in North America by a handful of self-identified “technocrats”, identified by John M. Jordan as the most radical of a wave of progressive technologists.2 Centered initially on groups associated with the autodidact engineer Howard Scott (1890–1970), the nascent concept was later refined and championed through the speech-making of the physicist-administrator Alvin Weinberg (1915–2006) from the 1960s, who dubbed it “the technological fix”.

The time frame, historical correlations and methodology of this study are noteworthy extensions of prior research. Scott and Weinberg are both well known to historians of the twentieth century in the distinctive contexts of interwar Technocracy and postwar nuclear power, respectively, but they and their organizations have previously been studied separately, over the periods of their greatest public prominence, and with attention to more diffuse themes.3 The present paper instead begins from such familiar but segregated accounts to trace the intersecting professional activities of these key promoters over some nine decades. Its focus is not the flowering and decline of a political movement, or of societal experimentation with novel energy supplies. Instead, the work specifically tracks the promotion of engineering solutions for societal problems, a notion that was condensed into popular faith in technological fixes.

The research is based on hitherto unavailable archival holdings that chronicle this broad timespan via a variety of unpublished correspondence, speeches, exhibition materials and limited-circulation texts. Importantly, the archives of regional Technocracy chapters extend some thirty years beyond the death of their founder to the end of the twentieth century, and document how narratives about the societal power of technology mutated during the post-Scott era for members and their targeted audiences. Similarly, the unpublished papers of Alvin Weinberg provide significant insights into how his private views and public addresses about technology altered over the latter decades of his career. The collections reveal how, in both contexts, their creators dedicated unusual attention to condensing and communicating their claims. These textual and illustrative materials consequently provide privileged access to evolving notions of technological fixes and to the development of influential rhetorical practices.4 Indeed, careful attention to the nature of this discourse, and its orientation toward wider culture, is at the methodological center of this piece.5

Focusing on a close reading of the speeches, articles and illustrations employed by both Scott and Weinberg, I argue that their techniques of popularization were markedly different from traditional engineering communications. This rhetorical interpretation illustrates how their style of dissemination, as much as the rationale of their arguments, promoted cultural confidence in technological fixes. The work argues that self-evident and simple examples were presented as easily absorbed tales that reshaped the radical discourse of interwar Technocracy into a style of communication amenable to post-Second World War policymaking and public understandings of science and technology. Key determinants in this transition were the characters of Howard Scott and Alvin Weinberg as energetic missionaries, and the form and content of their rhetoric, which supported a form of persuasion more akin to religious discourse.

The resonances between expressions of technological confidence, social progress and religious faith had been remarked upon as early as the 1920s, with Dora Russell, for example, linking American industrial zeal (“the dogmas of machine-worshippers”) with the social ideals of the Russian revolution.6 Both the ideological and theological connotations of this conviction, and more particularly the style of communication by which it was promoted, are threads interwoven through this paper. It focuses on how this deceptively discrete and simple claim was proselytized to influence wider cultural creeds. Scott and Weinberg preached tales of wise technological problem-solving to broad audiences. Their typical narrative structure resembled a parable, and iconic graphics replaced detailed illustrations. Recounting universalized tales of engineering authority, and honed by years of repetition, the sparse narratives and concrete examples attracted successive waves of receptive audiences. The timescale, comparative approach and attention to the style of dissemination to broad audiences together suggest that faith in the progressive nature of technologies was not limited to a naïve period of early engagement, but became a confidence embedded throughout modern culture by the late twentieth century.