Alcibiades—the famously handsome Athenian, ward of Pericles, friend and pupil of Socrates, and charismatic general infamous for serial disloyalty—was one of the most remarkable figures of the Golden Age of Athens. In Nemesis: Alcibiades and the Fall of Athens, David Stuttard gives us a riveting account of the man and his age, “bringing to singing life,” per Paul Cartledge, “the mercurial, magnetic, passionate, and persuasive personality of this still hugely controversial Athenian aristocrat of the fifth century BC.” In what follows below, a slightly condensed version of his recent British Museum presentation on the book, Stuttard introduces us to one of ancient Greece’s most fascinating and slippery characters.
I want to take you back just over 2,400 years to the high Anatolian plain of central Turkey. It’s the year 404 BC; it’s autumn; and it’s the dark hours of the night. Alcibiades, perhaps the most controversial Greek of his generation, is living in exile in a compound at Melissa—probably modern Afyonkarahisar—where strange rock formations erupt out of the rolling plain, near the fabled Royal Road that runs from Sardis in the west to Susa, capital of Persia’s Empire, in the east. For now, everyone inside is sleeping, but then something awakens them. Perhaps the barking of a dog. Or perhaps the acrid smell of burning creeping through the rooms, or the ever-louder crackling of fire as brown smoke pours in beneath the door, and through the cracks beside the doorposts.
Here, from my book, is what happened next:
Fully awake now, fully alert, Alcibiades leapt out of bed and threw the door wide open. Outside, stacks of dried wood had been piled high, and the flames were already tearing through them. Shouting to the people in the house to help, he dragged out rugs and mattresses and blankets, and flung them on the fire. The flames were smothered. At least for the time being. But now the smoke was bellying, and tongues of fire were licking at the edges of the blankets, and the orange heat was growing more intense. And then the arrows came. From all directions. Thudding into walls and roof and earth. Anonymous and deadly. The only warning of their approach a soft sighing of air.
The household was panic-stricken as Alcibiades, his instincts kicking in, reached for his weapons. But they had disappeared. Somehow in the night someone had taken them. All that he had now was a short knife, which a comrade pressed into his hand. But no shield. No armour. Just a blanket wrapped around his left arm as he stood, poised on the threshold. And then, calling to his friend to follow him, he bellowed his war cry and ran, naked and exposed, out into the darkness. Silhouetted against the burning house, he made an easy target. From all around him javelins rained and arrows thumped like hail as first one, then another, then another found its mark. All Alcibiades could do was run into the night, and run, and keep on running while he could, until the night engulfed him.
That description’s based meticulously on ancient sources, yet it all seems so theatrical, so filmic, so larger than life. But political assassinations can be larger than life, and Alcibiades’ life was every bit as complex and nail-biting as any modern political thriller.
Alcibiades was born bang in the middle of the fifth century BC, in 452 in Athens, a city that nearly sixty years before had expelled a hated ruler and established the most radical participatory democracy the world had ever seen. His mother’s family and his father’s were wealthy aristocrats from powerful families whose members numbered famous politicians, winners in Olympic chariot races, decorated war heroes—and, go back far enough, even Homeric heroes of the Trojan War. And when Alcibiades’ father, Cleinias, was killed in battle, the five-year-old boy was looked after by the most powerful politician of his day, the equally aristocratic Pericles, whose policy shaped democratic Athens, whose dream inspired the Parthenon, whose ambition was for the Athenian Empire to rule the waves.
Now, already we’ve uncovered some pretty big dichotomies: Athens, a democracy born out of a revolution against aristocrats, but now effectively governed by aristocrats. Athens, a democracy that was at the same time at the head of an empire—treating subject states, moreover, in a distinctly undemocratic way. To navigate the potential minefield of Athenian politics, people such as Pericles needed to be very astute indeed—in fact, it’s fascinating to see how far he went to seem to slough off his aristocratic roots. He was distinctly unostentatious; he made sure never to show emotion in public; and he never ever was so flamboyant as to enter a chariot to race at the Olympic Games.
How unlike his ward, Alcibiades! From the very start, Alcibiades embraced his bloodline with passionate enthusiasm. Even from childhood, he seems to have been motivated not by contemporary values of demokratia (literally ‘People Power’) but by the creed of the Homeric heroes whose blood pumped in his veins, the creed that urged them ‘aien aristeuein kai hupeirochon emmenai allon’, ‘always to be best and to surpass all others’. Now, the most famous of all the Greek heroes, whose father sent him off to Troy with these very words, was not in fact one of Alcibiades’ supposed ancestors. It was Achilles, and what I’d like to do in this talk is not simply give a potted biography of Alcibiades, but rather to explore how Alcibiades did everything he could to live up to Achilles’ creed, and how—in doing so—he set himself on a path that would see him not only mirroring or surpassing many of Achilles’ triumphs but repeating many of the Homeric hero’s mistakes.
Aien aristeuein. Even as a child, Alcibiades wanted to be best. In later years, he was accused of refusing to take part in athletic contests, because to do so meant competing with the low-born and ill-educated. In fact, it was probably because he was afraid of being defeated. He refused to learn the aulos, an instrument rather like an oboe, because (he claimed) it made his cheeks puff out and spoilt his beauty, but again it was probably because he knew he’d never be a virtuoso.
Beauty, by the way, was one thing he had no need to worry about. He was quite simply the most handsome youth in Athens. And—according to his biographer Plutarch, at least—remained handsome throughout his life. Sadly there are few, if any, reliable representations of him, so we need to take accounts of his good looks on trust, but they and his fiery character do seem to have made young Alcibiades a problematic pupil. But fortunately, just as legend tells us how the hero Achilles was educated by the wise centaur, Cheiron, Alcibiades too found a charismatic teacher. Not the old bumbling tutor Zopyrus that Pericles found for him, but the rapier-sharp, argumentative, and controversial Socrates.
Socrates’ and Alcibiades’ relationship shines through contemporary literature, but there was more to it than philosophy. Famously the two are said to have served on campaign together when the Athenians sent an army north to Potidaea, when that city tried to break away from Athens’ empire. Almost as soon as they arrived they were involved in a hard-fought battle:
A forest of spears jabbing, thrusting, breaking now; the clash of shield on shield; the dust, the shouting and the screams; the sudden impact of a heavy blow; a flash of swords; a blossoming of pain; a jet of blood; a jostling of bodies, before one side collapsed in disarray, and its hoplites fled in panic, while close at their heels the enemy, a pack of bronze men, masked in the anonymity of grim, gleaming helmets, ran in merciless pursuit. ‘In battle’, as one poet proclaimed, ‘it’s the sweetest thing to slice your running enemy full through the midriff.’
Which is how it all unfolded on that warm September day at Potidaea, on a narrow spit of land between two lazy seas. A battle like so many others. A skirmish, which so easily might be forgotten. Except for an image which seared itself into the minds of those Athenians who saw it: a young man plunging fearlessly into the heart of the melée; a young man fighting with ferocious bravery; a young man falling, wounded on the blood-red soil. And then, within a heartbeat, an older man stood over him, battling the enemy as they swarmed around him, scooping up the young man and supporting him, desperately slicing a path back to safety, an act of almost superhuman strength and fearlessness, an act which saved the young man’s life.
Almost certainly, in that moment Socrates (for the older man was he) rescued the injured Alcibiades from an all-too-early grave. But when the battle was over and the Athenians victorious, when the generals were discussing whom they should honour with the coveted award for bravery, the philosopher refused it, insisting instead that it should go to Alcibiades. And so in a ceremony held before the gathered troops, wounded but triumphant, his beauty not only undiminished but burnished by his brush with death, the son of Cleinias claimed his prize—a suit of armour and a victor’s wreath.
The prize was called the aristeia, the prize awarded to the best man in the army. ‘Aien aristeuein kai hupeirochon emmenai allon’: already in his first encounter with the enemy, Alcibiades had lived up to Achilles’ creed.
It was in his thirties, the age when a man could enter politics, that Alcibiades most avidly pursued his dream to be the best—not just in Athens but in the whole Greek world. And the setting in which he chose to realise this dream was the highest-profile gathering, where men from across the whole Greek world, and especially the great and good, came every four years to take part in a festival in honour of the great god Zeus, to sacrifice, to banquet, to indulge in top-level (often secret) diplomacy: the festival of the Olympic Games. Alcibiades, you will recall, refused to compete personally in athletic contests because it might mean being defeated by someone whom he classed as his inferior. But his ancestors had an enviable track record of winning chariot races, and for Alcibiades chariots were an obsession.
The horses that pulled the chariot of the hero Achilles were immortal and endowed with human speech. Alcibiades’ horses weren’t quite in the same league, but he did have enviable stud farms on his estates in Attica; he personally tracked down the best, most streamlined chariots; and come the Olympic Games of 416 BC, he was determined that nothing but nothing would stand in the way of his winning. That August, with dazzling self-confidence, magnetic poise and an unerring instinct for self-promotion, he entered into the Olympic Games not one but seven chariots. For the four-horse chariot race. Which meant that he brought with him no fewer than twenty-eight horses. The result was breathtaking, if unsurprising. As Alcibiades watched the chariots shoot past the post, he felt enormous satisfaction: first, second, third—they all belonged to him. It was a cause of great rejoicing, a well-earned pretext for an orgy of self-aggrandizement.
At Olympia Alcibiades held a lavish banquet, sponsored by friends and allies in Ionia and the east Aegean islands, a banquet to which he invited every single person attending the Olympic Games. It was a stunning demonstration of his power and popularity. But not everyone was as impressed with it as Alcibiades. For many Athenians it was simply anathema to see a pampered young aristocrat swanning around town with the kind of airs and graces normally associated with hateful oligarchs—or worse, the tyrants they had driven out a century before, in whose blessed absence Athens instigated her democracy. And when Alcibiades proposed his latest great idea, there were not a few who feared it represented just the next step on his path towards a power grab, towards tyranny. And what was that idea? To lead Athens to glory, as Achilles led the Greeks at Troy, by launching a military expedition against Sicily.
Now, this wasn’t actually anything particularly revolutionary. There had already been two expeditions sent to Sicily within the past fifteen years with the intention of helping allied cities, who felt threatened by Peloponnesian colonies. But what made this expedition controversial was in part the hostility of Alcibiades’ greatest political rival, Nicias, a hostility that ended up backfiring, because, rather than stopping the expedition, the Athenian Assembly voted to double the numbers of men and ships involved, and turned it into something Alcibiades never intended in the first place: an invasion force of such size that it seemed as if its purpose was nothing short of the annexation of the whole of Sicily. The other thing that made the expedition controversial was a scandal that erupted just before it sailed, a scandal that would almost ruin Alcibiades.
For some time before the expedition sailed the atmosphere in Athens was febrile as Alcibiades and Nicias traded insults like two Homeric heroes. What made things worse was that both were part of the expedition’s leadership. Then one morning, a dreadful sacrilege: throughout the city, statues had been systematically disfigured and defaced. But not just any statues. Whoever was behind it had chosen their targets carefully: the so-called Herms—squared pillars topped with the head of the god Hermes, and furnished, halfway up, with genitalia and an exuberant, erect phallus. Hermes was the god of travellers, the god who would protect the expedition as it sailed. A committee of enquiry was hurriedly set up offering not just immunity from prosecution but rewards to any who came forward with information about not just the mutilation of the Herms but any other irreligious act that might have been committed in the city.
And at once the floodgates opened. At a stroke, anyone wishing to make accusations of the most malicious kind against his enemies had effectively been granted carte blanche. And Alcibiades’ enemies jumped at the opportunity. At a meeting of the Assembly a slave was brought in who accused him of having desecrated what was arguably the most sacred religious ceremony in all Attica, the Mysteries celebrated at Eleusis, a ritual that promised life after death. The febrile atmosphere became positively incandescent, but rather than allow Alcibiades to stand trial (as he requested) and demonstrate his innocence, his enemies engineered for him to sail to Sicily with the charge still hanging over him. It was a clever move. Many of Alcibiades’ supporters were army men, and with them out of the city it would be easier to secure a prosecution. So just weeks later they recalled him to stand trial on a charge for which, if found guilty, Alcibiades would almost certainly face execution.
Arguably at the Olympic Games the year before Alcibiades had demonstrated that most disturbing quality found in so many tragic heroes: hubris, when a man crosses the dividing line between what is acceptable within the bounds of human behaviour and what is not. And there were not a few who saw in Alcibiades’ ostentatious displays at Olympia, which cannot but have eclipsed the ceremonies in honour of Zeus, or in his behaviour back home in Athens—more than a little evidence of hubris on a quite spectacular scale. And as everyone knew, man’s hubris attracts the anger of the gods and brings about their punishment, his nemesis.
Alcibiades knew what would happen if he returned to Athens. He knew he’d be killed. But not for him to be a man of sorrows. Not for him to be acquainted with grief. Oh, no. Alcibiades had once beaten up a teacher who couldn’t put his hand immediately on a copy of the Iliad, and now, imbued with the ethos of Achilles, he turned his back on his fellow generals, and on his army, and on Athens and for four years he courted Athens’ enemies—first the Spartans, then the Persians—while by their ships’ sterns and the salty sea the Athenians were slaughtered, first in Sicily where the bungling, disease-ridden Nicias allowed the expedition to stagnate into disaster, then in Ionia, where Alcibiades helped lead a joint force of Persians and Spartans to a string of victories. In many ways these years are years of romance and adventure, perhaps the most compelling in all Alcibiades’ career. If only we had time to explore them now!
But since we don’t, suffice it to say for now that partly as a result of Alcibiades’ defection and hostility, Athens was brought almost to her knees. With many men killed, and many ships lost in Sicily, and her democracy in such tatters that for uneasy months it was actually overthrown, she should have been easily defeated. But the reason she survived—and prospered—for seven more years was in part thanks to Alcibiades. The Athenian fleet, worn down by defeat and alienated by the newly installed oligarchic government in Athens, recalled Alcibiades. Whether it was his brilliant generalship or simply his morale-boosting charisma, almost immediately after Alcibiades rejoined Athens’ fleet and army, their fortunes changed. The Spartans and their Persian allies were beaten off time and again for the following five years as Alcibiades blazed in glory through the Hellespont and Bosporus and Sea of Marmara. At last he returned to Athens in triumph, all charges were dropped, and he was appointed strategos autocrator: Commander in Chief.
But even now the shadow of Achilles haunted him. You will recall how, in the Iliad, Achilles and the Greeks reached a compromise: rather than do battle himself, Achilles allowed his friend, Patroclus, to don his armour and go out to fight. But Patroclus is killed, Achilles is consumed by self-reproach, and so begins the sequence of events that leads to his own destruction. Well, when Alcibiades and the fleet returned east to confront the Spartan fleet at Ephesus, Alcibiades, forced to do all he could to raise funds to allow him to continue the war, left in charge of the ships at Notium his own close friend, Antiochus. It was disastrous. Somehow Antiochus and a handful of ships were intercepted by the Spartans. Antiochus was killed. And when the Athenian fleet came to the assistance of their colleagues, they did so in no order, and a significant number of ships were either holed or captured. Alcibiades’ enemies had a field day.
Again it seemed that he would be recalled to stand trial. And again he chose to flee, this time to Thrace, where he already had a private army and considerable estates. Again, I think it’s likely that he harboured hopes that his city would at last recall him—he even tried to engineer a recall when the Athenians were drawn up on the beaches of the Hellespont at Aegospotami, a disastrous location in Alcibiades’ opinion—and he rode into the camp where he argued with the generals, offered his help, and was inevitably rebuffed. The subsequent battle saw Athens’ fleet all but annihilated, and the next year Athens was forced to surrender. Once she had been the proudest of all cities. Now Nemesis had struck her, too.
As for Alcibiades, he fled back to the Persians, perhaps holding out the promise that he could help the Persian king. Just let him travel east to Susa, and they’d see how useful he could be! Which was why we find him in the compound at Melissa, in the company of his two travelling companions, the beautiful courtesans Theodote and Timandra, kicking his heels, waiting for the paperwork to come through to allow him to journey on the Royal Road. But what came instead were his assassins. Perhaps it was his enemies in Athens, perhaps it was the Spartans, maybe it was both—but the local Persian governor, Farnavaz (whom Greeks called Pharnabazus) received a request that in the current climate he simply could not refuse, and so he sent a death squad to Melissa. And so, just as Achilles had fallen in the dust of Troy, an arrow lodged deep in his heel, Alcibiades ran into the night to face the hail of arrows.
When the new day broke, Theodote and Timandra and all the household were already grieving, the women scouring their long nails across their lovely cheeks and screaming in their sorrow as they laid out the corpse. And then they washed him gently and wrapped him in their finest robes. But they could not comb his hair or close his eyes. For, as they galloped to Dascyleum, the assassins carried with them, tied tight to a saddle, a heavy, dripping sack, a trophy to present to Farnavaz as evidence that they had done their work. And, when he opened it and lifted out the blood-drained head to look upon the face that once had been the handsomest in Greece the satrap knew for certain: Alcibiades was dead.
I’ve only been able to scratch the surface of Alcibiades’ life today. But I hope I’ve given you a flavour. In so many ways Alcibiades is an enigma, an aristocrat with Homeric heroic intentions living in an age of People Power, for some a roguish hero, for others an unscrupulous traitor. I’ve spent several years of my own life tracking his, and I’ve enjoyed every minute of it. I hope that, if you read the book, you will enjoy it equally.
As expected, at this week’s Senate Intelligence Committee confirmation hearings for Gina Haspel, President Trump’s nominee to lead the Central Intelligence Agency, the C.I.A.’s post-9/11 “interrogation program” and Haspel’s own role in the program and its aftermath took center stage. Haspel sought to cast the program and its brutal techniques—many of which have since been outlawed—as an aberrant episode in the agency’s history, and vowed that under her leadership “C.I.A. will not restart such a detention and interrogation program.” Just as Haspel aimed to position that use of torture as an outlier in the history of the C.I.A., so too did the hearings propagate an understanding of the brutality of those years as a brief deviation from the long history of moral leadership demonstrated by the United States. Indeed, the majority of Americans would likely share such a sentiment: we do not torture.
In Civilizing Torture: An American Tradition, forthcoming this fall, decorated American historian W. Fitzhugh Brundage shows that alongside the long American lineage of denouncing torture there’s an equally enduring culture of both embracing and excusing barbarism. Brundage revisits a series of moments and practices—from the initial contact of Europeans with North America, to the early American republic, to slavery, to the American imperial project, to local law enforcement’s embrace of “the third degree,” through the Cold War, and up to the present—to demonstrate that behavior considered to have been torturous in its own time has been far more prevalent in U.S. history than we acknowledge. By threading this past into a cohesive story of debate and dismissal, Brundage reveals the ways in which the mythos of American exceptionalism has underwritten the narrative so evident this week on Capitol Hill.
The passages below come from the introduction to Civilizing Torture.
In April 1858 Harper’s Weekly, one of the most popular American magazines of the day, published a gruesome article entitled “Torture and Homicide in an American State Prison.” Accompanied by graphic illustrations, the article dwelled on the so-called “water cure,” a punishment during which an inmate was stripped and seated in a stall with his feet and arms fastened in stocks and his head extended up into a tank that fit snugly around his neck. The prisoner’s head was drenched with freezing cold water that cascaded down from a height of a foot or more for several minutes at a time. The tank that encircled the prisoner’s neck emptied slowly, inducing a sensation of drowning while the prisoner struggled to keep his mouth and nose above the pool of draining water.
Thirty years later an investigation of practices at Elmira Reformatory, the most acclaimed American penal institution of the day, revealed that staff there continued to douse prisoners with cold water, in addition to confining them in darkened cells for weeks on end, shackling and hoisting them until their toes barely touched the floor, and “paddling” them with specially made boards. In 1899 American soldiers occupying the Philippines after the Spanish-American War sent home letters boasting of their routine application of a variant of the “water cure” to coerce information from Filipino guerrillas. To apply the “cure,” soldiers pinned their victim to the ground by his legs and arms and partially raised his head “so as to make pouring in the water an easier matter.” If he refused to keep his mouth open, his tormentors pinched his nose closed and used a rifle barrel or bamboo stick to pry his jaws apart. Then they poured water into his overextended mouth until he looked like “a pregnant woman.” According to witnesses, a few applications of the “water cure” usually elicited a flood of information.
A century later, the outlines of the “enhanced interrogation” methods adopted by the Central Intelligence Agency and military interrogators during the “War on Terror” became public. Americans learned that between 2003 and 2006 at least eighty-nine Middle Eastern detainees in CIA custody had been slapped, slammed against walls, deprived of sleep, stuffed into coffins, and threatened with violent death. The most severe method was “waterboarding,” a modern-day variant of the technique applied a century and a half earlier in American prisons. Waterboarding entailed pouring water over a cloth covering the face and breathing passages of an immobilized detainee, which produced an acute sensation of drowning. One detainee endured more than 180 waterboarding sessions.
Torture in the United States has been in plain sight, at least for those who have looked for it. To acknowledge this history of torture in the United States is not to suggest that it is equivalent to that of societies in which state-sponsored torture and terror have been endemic, such as Germany during the Nazi regime, Argentina during its “dirty war,” Guatemala during the late twentieth century, or Zimbabwe since its independence. Only some Americans have been vulnerable to being subjected to torture, while most have been able to live in denial or ignorance of the practice of torture taking place in the nation.
The “American tradition” in this book’s subtitle is not a particular method of tormenting the body. It refers instead to the debates that Americans have waged regarding torture. Like a minuet in which the dancers change fashions over time yet the steps remain the same, these debates have unfolded in predictable fashion. When Americans have debated torture they invariably have invoked the nation’s utopian ambitions to serve as the exemplar of modern democratic civilization. Since the nation’s founding, Americans have boasted that the United States is a unique nation with uniquely humane laws and principles.
The roots of this certainty in American exceptionalism may be traced to the earliest days of the European conquest of North America. Champions of European ambitions there were insistent that they were transplanting the rule of law and civilization to a continent gripped by the savage and cruel violence of Indians. In order for civilization to survive and thrive in North America, savagery had to be eradicated. Thus, torture and violence of the New World became a mirror in which Europeans looked most often for confirmation of their righteousness and civility and less often for evidence of their regression. Europeans excused their complicity in the cycle of violence by insisting that their violence was retaliatory and necessary for the creation and preservation of civilization. In time, the generation of colonists who waged the American War of Independence and presided over the young republic would congratulate themselves for consigning torture and other relics of savagery to a past that no longer retained a hold over the new nation. Henceforth torture would be synonymous with tyrants and savages. Where torture persisted in the New World and beyond, it did so only because American institutions and civilization had yet to fulfill their destiny.
Undergirding this presumption of American exceptionalism is a widely held conviction that the nation’s founders bequeathed a constitution that codified the Enlightenment principle that rational laws, rather than superstition or tradition, would secure the greatest justice and best government. Enshrined in the nation’s constitution is a prohibition of torture and “cruel and unusual punishments.” Americans during the nineteenth century heaped scorn on autocratic regimes elsewhere in the world for their fealty to inhumane traditions, including torture. During the past century Americans found further confirmation of the singular virtues of their institutions and principles as Fascist and Communist regimes in Europe, Russia, and Asia employed torture as an essential tool of statecraft. Both defenders and opponents of torture have staked out their positions confident that they were citizens of a modern, civilized nation that was, in Abraham Lincoln’s words, mankind’s best hope.
The choreography of participants in debates regarding American torture follows a strikingly consistent pattern, reflecting the imperative for defenders and opponents alike to square themselves with the nation’s professed principles and with the dictates of modern civilization. When accusations of torture are first broached, implicated officials are certain to issue categorical denials of any systematic and inhumane violence. Intentional violations of sacred American principles are unthinkable, as Secretary of War Elihu Root implied in 1901 when he vouched that the occupation of the Philippines had been conducted “with scrupulous regard for the rules of civilized warfare” and “with self-restraint, and with humanity never surpassed.” Such categorical denials prompt opponents of the controversial practices to ferret out more evidence of wrongdoing, which in turn goads apologists to concede that a few lamentable transgressions may have occurred even if their incidence and character were grossly exaggerated by critics. A prominent national spokesman for police superintendents adopted this stance in 1910 in response to quiet grumblings about cruel police interrogation techniques; he assured Americans that “rough usage” of suspected criminals had once occurred but “such procedure does not obtain at large nowadays.”
Defenders of controversial practices are also likely to dismiss the victims of alleged torture as neither credible nor deserving of sympathy. Torture victims invariably are portrayed as loathsome, depraved, and violent. The lawyer representing a Chicago policeman accused of torturing scores of suspects during the 1980s, for example, exemplified this response when he denounced his client’s accusers as “the scum of the earth” while lauding his client for decorated service in the military and police. Anyone who takes up the defense of odious criminals, groups, or enemies of the nation risks guilt by association. When unassailable evidence surfaced in 1969 that American troops had committed torture, rape, and other atrocities in Vietnam, Governor Ronald Reagan of California raged that the exaggerated attention on atrocities gave “comfort and aid to our enemies.”
These denunciations of alleged torture victims and their champions are often accompanied by claims that the controversial practices are justifiable and effective in the specific circumstances in which they are used. Without the judicious application of stern measures, the argument goes, American civilization will succumb to attacks by vicious enemies unrestrained by respect for our civilization and institutions. Thus at the outset of the “War Against Terrorism,” while President Bush warned the nation that it would wage “a war unlike any other,” Vice President Dick Cheney explained, “it’s going to be vital for us to use any means at our disposal basically to achieve our objectives.” The “enhanced interrogation” methods that the administration subsequently adopted were wholly consistent with its stance that the president had carte blanche to respond to the extraordinary provocation of the terrorist attacks of September 2001.
In this and other instances, opponents are likely to counter that stern measures are at once counterproductive and immoral. Equally important, the nation’s principles should not be bent to circumstances. Any concession to expediency risks undermining institutions and principles that are themselves the nation’s greatest protection. Senator John McCain made precisely this argument in 2014 when he denounced the interrogation methods employed in Guantanamo and Iraq, and pledged that all Americans “are obliged by history, by our nation’s highest ideals and the many terrible sacrifices made to protect them, by our respect for human dignity to make clear we need not risk our national honor to prevail in this or any war. We need only remember in the worst of times, through the chaos and terror of war, when facing cruelty, suffering and loss, that we are always Americans, and different, stronger, and better than those who would destroy us.”
When the offending practices are halted, whether because the circumstances that gave rise to them have eased or because the controversy they aroused has forced an end to them, the substance of debate shifts to the significance that should be attached to the practices. Their defenders are sure to insist that the practices were effective and that no handwringing over them is warranted. And in those instances of exceptional violations carried out by a “few bad apples,” it is counterproductive to assign broader responsibility or to implicate the larger cause with which they were associated. In 1903, for instance, Senator Henry Cabot Lodge batted away demands for further congressional investigations of American atrocities in the Philippines by disparaging the accusations as hackneyed and baseless persecutions of soldiers in harm’s way who were defending the nation’s interests.
A common thread in these rival perspectives is the presumption that Americans should exist in a state of national innocence, with torture held at arm’s length. Americans have been at best complacent and at worst willful in presuming that torture is something that other people do elsewhere.
United States Senator from New York Kirsten Gillibrand has just introduced major new legislation to create a Postal Bank, which would establish a retail bank in all of the U.S. Postal Service’s 30,000 locations. As detailed by the Senator’s announcement, the Postal Bank “would effectively end predatory payday lending industry practices overnight by giving low-income Americans, particularly communities of color and rural communities, access to basic banking services that they currently don’t have. The lack of access to traditional banking services makes it nearly impossible for low-income Americans to escape the cycle of poverty because they are often forced to spend large percentages of their income to cash their paychecks or pay back high-interest predatory payday loans.”
Postal banking was America’s most successful experiment in financial inclusion. Today, there are many communities across the country that are banking deserts. The only financial service providers are fringe lenders and check cashers whose business model relies on the poor paying more for banking services than anyone else. This is a threat to our democracy. Yet post offices serve all of these communities regardless of cost and without exploitation. Postal banking can provide safe, accessible, and much-needed financial services to the most struggling communities in our country. It will make it less expensive to be poor.
Below, Baradaran details the predations of the fringe lending industry and the promise of postal banking:
In Exposed: Desire and Disobedience in the Digital Age, Bernard Harcourt assays the deeply troubling implications of pervasive surveillance in our age of lives lived online, and the degree to which we willingly trade our privacy for the fleeting rewards of digital affirmation. To Harcourt, a professor of law and political science at Columbia University and the author most recently of The Counterrevolution: How Our Government Went to War Against Its Own Citizens, Facebook founder Mark Zuckerberg’s appearance before Congress last week was a pageant that will do nothing to address the perilous dynamic of our society of exposure. Meaningful reform will come only when we recognize the libidinal allure of today’s digital platforms, as Harcourt explains below.
The Facebook hearings last week were quite the spectacle. Mark Zuckerberg deftly deflected his inquisitors and misled them, while Facebook’s share price rose 4.5% in a single day. Senators and representatives postured for their constituents and got free prime-time media exposure. Privacy experts crowed and gloated that they had always been right, but unfairly ignored. The media and the Internet harvested abundant costless content. And social media lit up, abuzz. Between the schadenfreude and the glee, and the plain-old gawking and goggling, everybody seemed to pleasure themselves. It was win-win—except, perhaps, for the ordinary digital subjects who were left high and dry: pleasantly entertained, but totally exposed.
In the end, the Facebook hearings were nothing more than another tantalizing but anxious digital distraction. The greatest paradox, perhaps, is how much personal data and digital exhaust we all emitted and how many digital traces we shed watching Zuckerberg and simultaneously fretting over our privacy.
If anything, the Facebook hearings confirm the dreadful bind in which we find ourselves: social media and the Internet companies have us all in the palms of their hands because the digital experience itself is so seductive, consuming, and self-gratifying. Their Faustian business model works because their platforms tap directly into our pleasure centers and trigger deep reward circuits. Seeing our selfies online, tracking our likes and shares, counting our followers and retweets—these stimuli are almost more reinforcing than food or sex. We find ourselves going from one digital platform or device to another, swiping and clicking, pressing the levers like a rat in Skinner’s box, desperately seeking more stimulus and gratification.
And unless and until we come to grips with the place of desire and of our libidinal and at times narcissistic urges in relation to these new digital technologies, we won’t make any progress; we won’t get anywhere. Yes, the #DropFacebook campaign just gained Susan Sarandon and Apple co-founder Steve Wozniak. But the vast majority of the users of Facebook—as well as Instagram, Twitter, YouTube, etc.—will stay put because these platforms satisfy their desires, provide the gratification, and remain the easiest way to enjoy social relations today, even when they do make us anxious about our privacy.
The fact is, power circulates differently in the digital age, and the social media powerhouses have tapped deep into our pleasure centers and egos. The dark analogies to George Orwell’s 1984 or to Foucault’s “panopticon” just do not capture the present moment, nor will they alone stop us from sharing and liking.
Today, we are no longer being coerced to give up our privacy, as Winston and Julia were by Big Brother. We are no longer confined to a panoptic cell, naked before the all-seeing guard tower. There is no telescreen forcibly anchored into our apartment walls. Instead, today we share our personal information jubilantly, out of love and desire, and for self-affirmation. We post selfies on Instagram, status updates on Facebook, screeds on Twitter. We invite Echo into our homes. We build personal websites open to all. And it feels so good, it’s so pleasurable, that even when we are warned about how much of our private information the social media and Internet companies have, we cringe but go on.
We are lustful and hooked on projecting ourselves onto the public screen. Even when we resist and try to tame that lust, it becomes obvious, so utterly obvious, that it is practically impossible to live an active life today without shedding our data and leaving traces everywhere. Searching the web, buying online, finding directions—the truth is, we are exposed even when we try to resist. Yes, you can start the search on DuckDuckGo, but pretty soon you’ll be on another site that installs cookies and culls your data, or you are inputting personal information into another service provider without any way to avoid it. There are, to be sure, ways to protect yourself, but you need the time, expertise, and resources to buy your own server or learn Tor (as if that were safe!). And most of us are already so distracted and stimulated by the next ping, alert, popup, or notification that we’ve forgotten what the problem was and ignore the greatest risk.
That risk is not just Big Brother or Foucauldian discipline, but the larger mode of governing that all the collected data feed into: the NSA surveillance that enables total information awareness as the first prong of a counterinsurgency warfare paradigm of governing our own citizens. We are so distracted by our digital screens and by the “attention merchants”—and now by a presidential Reality-TV management style that produces early morning Twitter screeds and daily TV episodes—that we cannot even see the new danger we face, our new form of governing through fictitious internal enemies.
Facebook’s business model works because we thrive on it. We love to see ourselves projected onto the screen. We like to be liked. We want to be shared. We thrive on that attention.
And now that we are locked into this digital pleasure circuit, there won’t likely be a way out, in terms of our privacy—or the privacy of the vast majority of us—unless and until we find a template that pleases us more, one that is more gratifying. There will not be the political will, still less the political space, to address the privacy issues until we discover a better design to leverage our pleasure centers.
That’s what we need to figure out now. That’s where we need to invest. We need better, more seductive platforms that satisfy our libido and protect our privacy. It’s not going to happen by putting ourselves on a diet—our digital lust is too virtual, unlike our weight. We need to invent entirely new possibilities.
One option might be to privatize personal data so that the digital subjects would be the ones making the profit and controlling the information flow—in effect, to increase the stimulus, but motivate it toward privacy. This was an approach pioneered by the 18th-century Scottish Enlightenment—the method of unleashing self-interest to tame the passions so artfully described by Albert Hirschman in The Passions and the Interests: Political Arguments for Capitalism Before Its Triumph. But I will confess, that feels too crass and base. It would involve embracing the worst of our natures in order to fix our addictions. And it’s also hard to imagine that the small monetary sums at the individual level—it’s the aggregate that really matters in this business—would be sufficient to motivate proper monitoring of our privacy on our part. There could be aggregators, to be sure; but we would have to study closely what business model they would operate under, too.
Another option would be to nationalize the social media and Internet companies, turn them into non-profits, and treat them as public utilities. Pull the plug on the financial drivers, set up a range of non-profit businesses, and give free rein to reputational competition and rewards. Here we might turn to other critical traditions to cultivate the commons. And here too, of course, there are endless possible objections. But that can’t stop us.
None of these options will be uncontested. None of them will be unopposed. But the first step, the essential place to begin, is by recognizing that we’re in this spot because of our desires and appetites. It’s because these digital platforms are so seductive and pleasurable. And until we recognize and address those libidinal dimensions and our own narcissism, we will not be able to interrupt the flow of digital pleasure and exposure.
Many compare the emergence of the blockchain to the arrival of the internet, and anticipate a corresponding transformation in communications, business, and individual freedoms. In Blockchain and the Law, new this month, Primavera De Filippi and Aaron Wright examine both the profound opportunities the technology presents and the legal and even ethical challenges it poses. In the brief excerpt below, they consider possible paths away from our current crossroads between the rule of code and the rule of law.
When Satoshi Nakamoto released Bitcoin to the world, he had a clear idea in mind, which was reflected in the message he included in Bitcoin’s genesis block:
The Times 03/Jan/2009 Chancellor on brink of second bailout for banks
Nakamoto released the Bitcoin network in the middle of a financial crisis, as a reaction to an unstable international banking system. In doing so, he gave birth to a new currency—one controlled not by any government or central bank but only by cryptography and code.
As a global and decentralized payment system that operates without centralized control, Bitcoin held out the hope of newfound economic freedom for those skeptical of governmental authority. Early Bitcoin adopters subscribed to the notion of vires in numeris (strength in numbers), a motto emphasizing that, when it comes to money, only math can be trusted.
But Bitcoin was only the first step in a much grander vision. Shortly after Bitcoin’s release, technologists began to realize that the true potential of Bitcoin—the real innovation—was its underlying data structure: a blockchain. While Bitcoin offered the ability to replace the role of central banks and eliminate the need for financial institutions, blockchain technology could be applied more generally to reduce the need for middlemen in many sectors of the economy. Whenever a trusted authority is necessary to coordinate social or economic activity, blockchain technology could provide the necessary infrastructure to replace that authority. The roles of banks, financial institutions, stock exchanges, clearinghouses, content providers, online operators, and even governmental systems could all be modeled by a set of protocols and code-based rules deployed on top of a blockchain-based network.
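The core of that data structure can be sketched in a few lines. The toy Python below (illustrative only; real Bitcoin blocks use a binary header format, Merkle trees, and proof-of-work, none of which appear here) shows the essential property the authors are pointing to: each block commits to the hash of its predecessor, so tampering with any earlier record invalidates every block after it.

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    """Create a block that commits to its predecessor via prev_hash."""
    return {"data": data, "prev_hash": prev_hash}

def chain_is_valid(chain):
    """Each block must reference the hash of the block before it."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

# The genesis block has no predecessor, so prev_hash is all zeros.
genesis = make_block(
    "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks",
    "0" * 64)
block1 = make_block("first batch of transactions", block_hash(genesis))
block2 = make_block("second batch of transactions", block_hash(block1))

print(chain_is_valid([genesis, block1, block2]))  # True

# Tampering with any earlier block breaks every later link:
genesis["data"] = "tampered"
print(chain_is_valid([genesis, block1, block2]))  # False
```

This hash-linking is what lets a decentralized network detect rewritten history without appealing to any central record-keeper.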
Blockchain technology presents some risks, however. The technology supports technological systems and decentralized applications that operate independently of any centralized institution or trusted authority. They implement their own internal systems of rules, which often ignore or attempt to circumvent traditional systems of control. Unlike other technological constructs currently deployed on the Internet, these decentralized systems and applications can be governed almost exclusively by the rules of code.
The Internet had already raised a fundamental tension between the rule of law, based on geographical boundaries, and the rule of code, based on topological constructs. The regulation of “cyberspace” lies at the intersection between these two normative systems—which can either cooperate or compete with one another, depending on the circumstances at hand.
At the outset, legal scholars thought that the rule of code would ultimately prevail on the Internet. With code, people could implement their own systems of rules, enforced by a technological construct that operates outside of any legal jurisdiction. This is what inspired a number of technology activists to believe that cyberspace was an unregulatable space that governments did not have the right or ability to control—as opposed to the “meat space,” which is mostly governed by the rule of law.
Eager to bypass the politics of enclosure and control enacted by governments and corporations, these groups believed that the Internet would foster new normative systems, which would facilitate the free flow of information and promote political and cultural autonomy. The Internet marked the beginning of a new paradigm for regulation—one where regulation would be applied through the rule of code, with power dynamics that differed significantly from those of the physical world. Over time, however, governments recognized and embraced the potential for the rule of code to maintain the rule of law on the Internet. Governments have extended their control by requiring that intermediaries change their code to maintain and respect jurisdictional laws.
With the advent of Bitcoin and blockchain technology more generally, we are poised to witness a new wave of decentralization and new calls that the world will—once again—be governed by the rule of code. Echoes of the first Internet wave permeate the discourse around blockchains, with claims that blockchain technology will lead to greater individual freedom and emancipation, as these early technology advocates initially aspired to. Blockchain technology is viewed as a new opportunity by many cypherpunks and decentralization advocates, who see it as a new means for people to liberate themselves from the tyranny of governments and corporations—in ways that are quite reminiscent of the Internet’s early days.
Blockchain technology facilitates the emergence of new self-contained and autonomous systems of rules that create order without law and implement what can be thought of as private regulatory frameworks, which we refer to as lex cryptographica. These systems enable people to communicate, organize, and exchange value on a peer-to-peer basis, with less of a need for intermediary operators. They provide individuals with the opportunity to create a new normative layer or a customized system of code-based rules that can be readily incorporated into the fabric of this new technological construct—thereby making it easier for people to circumvent the law.
Lex cryptographica shares certain similarities with the more traditional means of regulation by code. Both purport to regulate individuals by introducing a specific set of affordances and constraints embedded directly into the fabric of a technological system. Lex cryptographica, however, distinguishes itself from today’s code-based regimes in that it operates autonomously—independently of any government or other centralized authority.
If the vision of blockchain proponents edges toward reality, we may delegate power to technological constructs that could displace current bureaucratic systems, governed by hierarchy and laws, with algocratic systems, governed by deterministic rules dictated by silicon chips, computers, and those that program them. These systems could improve society in demonstrable ways, but they also could restrain rather than enhance individual freedom.
When it comes to freedom and autonomy, the assumption that the rule of code is superior to the rule of law is a delicate one—and one that has yet to be tested. As Lawrence Lessig has already warned, “When government disappears, it’s not as if paradise will take its place. When governments are gone, other interests will take their place.”
Those working to liberate individuals from the whims of governments and corporations could wind up surrendering themselves (and others) to the whims of a much more powerful entity: autonomous code. If blockchain technology matures, we will need to acquire a greater understanding of the impact that lex cryptographica could have on society, observing and analyzing the deployment of blockchain-based systems and carefully evaluating how to regulate the technology. As one might expect, the deployment of autonomous systems regulated only by code is likely to raise new challenges when it comes to establishing liability and responsibility, creating tensions between existing legal rules, focused on regulating intermediaries, and these newly established code-based rules.
In the end, however, blockchain technology does not spell the end of the rule of law as we know it. Even in a world with widespread use of blockchains, governments still retain their four regulatory levers—laws, code, market forces, and social norms—which could be used to either directly or indirectly regulate this new technology.
Blockchain-based systems can be controlled in areas where they intersect with regulated entities—such as individuals, network operators, and all those intermediaries who either develop or support the technology. New intermediaries servicing blockchain-based networks are already beginning to emerge, including hardware manufacturers, miners, virtual currency exchanges, and other commercial operators interacting with a blockchain-based system. So long as these intermediaries remain subject to the rule of law—because of their country of operation or incorporation—governments will be able to enforce their laws, either directly or indirectly impacting the way in which lex cryptographica will be defined and enforced.
Governments could, for instance, exert pressure on the intermediaries in charge of developing, deploying, or maintaining the technology. They could require software developers and hardware manufacturers of mining devices to implement specific features into their technology to ensure that governments can intervene, if necessary, to regulate autonomous blockchain-based systems. In the case of harm, they could demand that miners censor certain transactions or even revert the blockchain to its previous state to recover damages or remedy harm. Governments could also impose laws on commercial operators interacting with decentralized blockchain-based applications to regulate the use of these technologies indirectly.
Alternatively, or in addition to this, governments could intervene to regulate a blockchain’s underlying incentivization schemes and influence social norms. They could introduce a set of economic incentives aimed at shaping the activities of autonomous blockchain-based systems. Governments also could try to influence social norms, shaping the moral or ethical standards of the community of users and miners supporting a particular blockchain-based network. Indeed, because a blockchain operates through distributed consensus, all parties supporting the network have the power to intervene—through a coordinated action—to enforce the application of specific legal or community norms.
When combined, these different approaches could constrain the operations of lex cryptographica. However, it is far from apparent what combination will enable governments to regulate these emergent blockchain-based systems without excessively limiting the opportunities for innovation.
Given that blockchain technology is still largely immature, there is a danger that regulating the technology too early could preclude the emergence of new and unexpected applications that have not yet been fully explored or discovered. Permission-based regulations could prevent public and private parties from freely experimenting with this new technology, ultimately chilling innovation.
At the same time, a complete lack of regulation could also prove problematic. Given the lack of a well-defined regulatory framework for blockchain-based applications, parties seeking to deploy the technology could find themselves in a legal gray area, incapable of knowing whether what they are doing today is lawful and whether it will continue to be so further down the line. The lack of a proper regulatory framework for blockchain technology could dissuade entrepreneurs, start-ups, and incumbents from deploying these new technologies for fear of stepping too early into untested waters.
Only time will tell whether blockchains will transform and seep into the fabric of society, shaping an increasing range of social interactions and market transactions. If such a future comes to pass, the ideals of disintermediation, free markets, anarchy, and distributed collaboration could blur into each other, with lex cryptographica facilitating the emergence of new blockchain-based systems that are less dependent on the government, enabling capital and value to flow across the world in a more unconstrained manner.
Law and code are two important regulatory mechanisms, each of which comes with its own benefits and limitations. The main drawbacks of the law—in terms of ambiguity and uncertainty—are also its greatest strengths, in that they provide legal and contractual rules with an increased degree of flexibility and adaptability. Similarly, the main advantages of smart contracts—in terms of automation and guaranteed execution—also constitute their greatest limitation, which might lead to excessive rigidity and an inability to keep pace with changing circumstances.
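The rigidity the authors describe can be made concrete with a toy example. The hypothetical escrow below (all names are illustrative; real smart contracts run on a blockchain virtual machine, not in Python) settles deterministically: once the deadline passes, the refund rule fires regardless of any extenuating circumstances a court applying flexible legal rules might otherwise have weighed.

```python
class ToyEscrow:
    """A hypothetical self-enforcing escrow between a buyer and a seller."""

    def __init__(self, amount, deadline):
        self.amount = amount        # funds locked in escrow
        self.deadline = deadline    # delivery cutoff (abstract clock ticks)
        self.delivered = False

    def confirm_delivery(self, now):
        # Delivery only counts if confirmed on time; a late confirmation
        # is simply ignored. The code does not bend.
        if now <= self.deadline:
            self.delivered = True

    def settle(self, now):
        """Deterministic outcome: pay the seller or refund the buyer."""
        if self.delivered:
            return ("seller", self.amount)
        if now > self.deadline:
            return ("buyer", self.amount)   # automatic refund, no appeal
        return ("escrow", self.amount)      # still pending

escrow = ToyEscrow(amount=100, deadline=10)
escrow.confirm_delivery(now=12)   # one tick late: ignored
print(escrow.settle(now=12))      # ('buyer', 100)
```

Guaranteed execution is exactly what makes the outcome trustworthy without an intermediary, and exactly what prevents the kind of after-the-fact interpretation that legal rules allow.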
As Yochai Benkler puts it, “There are no spaces of perfect freedom from all constraints”—all we can do is choose between different types of constraints. While some people might be tempted to use blockchain technology to escape from the law, others might use it to establish an alternative or complementary system, made up of self-enforcing technical rules that are much more rigid and restraining than traditional legal rules.
If blockchain technology matures, we may need to ask ourselves whether we would rather live in a world where most of our economic transactions and social interactions are constrained by the rules of law—which are universal but also more flexible and ambiguous, and therefore not perfectly enforceable—or whether we would rather surrender ourselves to the rules of code. Decentralized blockchain-based applications may well liberate us from the tyranny of centralized intermediaries and trusted authorities, but this liberation could come at the price of a much larger threat—that of falling under the yoke of the tyranny of code.
In the space of less than a decade, Samuel Moyn has defined—and largely created—the field of the history of human rights. With 2010’s radically revisionist The Last Utopia: Human Rights in History, he revealed how our modern notion of human rights was birthed only in the 1970s, showing their rise to have been a precarious, contingent, and uneven development. Now, with Not Enough: Human Rights in an Unequal World, Moyn places the history of the human rights project alongside the precisely concurrent ascendance of neoliberalism to consider how the age of human rights has been a golden age for the rich. In the exchange below we pose a few questions to Moyn about this new work that George Soros says “breaks new ground in examining the relationship between human rights and economic fairness.”
Q: Your first book on human rights, The Last Utopia: Human Rights in History, was published in 2010. You begin Not Enough by noting that “no account of how the present emerged is definitive for long,” and point to the feeling that history is accelerating at a pace in our present age that makes that especially true. How would you characterize the ways in which the transformations of the years between these books have changed your perspective on the origins and achievements of the human rights movement?
Like so many others, I used to be primarily concerned about the viability of the liberal international order consecrated when the Cold War ended. I was curious how human rights became the moral lingua franca of international affairs at that moment, and my research for The Last Utopia took me into histories of ethics and politics as they led to an international order in which human rights matter and social movements pursue them. What was left out, however, was economics—even though I argued in The Last Utopia that, for most of modern times, socialism appealed to more people for far longer than international human rights principles. Already when I wrote up that first book in 2008-9, the global financial crisis was helping make its framing incomplete or even obsolete. Almost ten years later, in 2016-17, I wrote this sequel placing distributional justice and political economy much closer to the heart of the history of human rights than before. (I only hope the spike in populism that year has not made my new book incomplete or obsolete!)
Q: One way of describing Not Enough is that it’s a world history of how people have thought about the fair distribution of the good things in life—who should get what—and you track the shifting global focus from equality (a concern for the share of those good things enjoyed by the least fortunate with respect to the share going to the world’s most well off) to sufficiency (a concern for the share going to the least fortunate only with respect to some minimum level of provision). The philosopher Harry Frankfurt, among others, has argued that inequality shouldn’t concern us as long as sufficiency is satisfied, and Steven Pinker has of course just published another bestseller arguing that humankind has never been better off, in large part because of the steady improvements in absolute quality of life for many in the developing world. Make the case for equality.
Equality might matter as a moral principle in its own right, as many philosophers other than Frankfurt have contended. From the Jacobins in the French Revolution to John Rawls, some modicum of equality always has had a certain number of proponents, though they have debated what precisely requires equalization and how far to justify departures from perfect equality, as most did. But as a history book, instead of arguing directly on behalf of equality in response, I just try to show how many from the French Revolution through just a few decades ago have held out for a more ambitious equality in the face of less exacting obligations. Indeed, one of my goals in Not Enough is to locate the first thinker who contended that sufficient provision is all that morality requires of our institutions—I claim it was Thomas Paine.
I also track how early socialists slowly took on board distributional equality as a norm and, much later, postcolonial voices were the first to demand truly global equality. By the end, the goal is to show how unusual it is to have advocates of sufficiency alone as prominent as Frankfurt or Pinker, in the absence of egalitarian movements and politics, which were rife across modern history. It would have appalled many of our ancestors to see how many people today agree that it is morally acceptable how far the rich are allowed to tower over the rest as long as the poor are at least somewhat better off.
Q: In Not Enough you’re careful to note the real advances wrought by the human rights movement, but are ultimately critical of its low aims, and the abandoned ambitions of the era of decolonization. Despite your disappointment with its goals, though, you do largely grant the good faith of the human rights revolution, unlike some further to the left who’d say it’s been a sham all along. So, absent a conscious decision to set a deliberately low bar, what are the forces that have contributed to what you identify as the movement’s shortcomings?
All ethical movements exaggerate when they promote their relevance to potential affiliates, and the human rights movement is no exception to this rule. The trouble is that the audience of the human rights movement took it too frequently to be a cure-all or panacea, without realizing how limited its aims have been—how rarely and slowly it engaged the distribution of the good things in life, and how, when it did engage, it strove for sufficient distribution alone, even as inequality exploded in so many countries. You might put it by saying that the human rights movement has not been neoliberal, but that, in our enthusiasm for it as the morality of the end of history, we have been.
Q: The historical dynamic really driving the book is the collision of movements for global justice with the ascendance of the neoliberal project, and your interest in identifying the relationship between the two. You’re unequivocal about your judgment that “neoliberalism, not human rights, is to blame for neoliberalism.” And yet, you write that the conclusion that human rights did not abet neoliberalism “makes how they could so easily accompany it more pressing to consider.” Why is that?
I have always thought we should never grant the importance and fair successes of ethical movements that appeal to their audiences for affiliation without identifying their limitations, too. After all, there is no reason to accept the current menu of ethical choices or the existing list of social movements that struggle for them. And the lifespan of the human rights movement has coincided with that neoliberal ascendancy that has made material equality its chief casualty. Once we appraise human rights properly, we can see the present not as a moment for ethical self-congratulation, but as one of a vacuum to fill with some of our next efforts.
Q: You refer to human rights as our highest ideals, and, as you note above, the movement has certainly successfully lodged itself in the consciousness of millions of people around the world who are concerned with justice. If you could get a single, succinct message from Not Enough to, say, the many well-intentioned people who donate to organizations like Human Rights Watch, what would it be?
Well, the fact that the book has been endorsed by George Soros—who remains the chief funder of the human rights movement worldwide today—shows that the time is ripe for an enlargement of our sensibilities. If you care about human rights, whether as a member of the audience for the struggle, a person considering how to spend your career, or just as someone with a few extra dollars to donate, you still cannot neglect the aim of distributional fairness. As a matter of morality, equality may require a lot more of our generosity than human rights seem to demand. Even if you are merely thinking strategically, you should recognize that majorities do not seem willing to defend the rights of others at home or abroad if they do not feel they are living in a fair society themselves. For either reason or both, human rights are not enough if not connected to a broader egalitarian agenda.
Ministers denounced it from their pulpits. Sunday school teachers warned their classes of its demonic origins. Yet Christianity and rock ’n’ roll music were surprisingly intertwined from the time when the music first made national headlines in the mid-1950s, and have remained closely linked ever since. Historian Randall Stephens tells the tale of their relationship in The Devil’s Music: How Christians Inspired, Condemned, and Embraced Rock ’n’ Roll, new this month.
As Stephens shows, the long and complex affair between believers and pop culture has been a dynamic story of how the faithful encountered and reacted to the perceived moral chaos surrounding them. Here’s a bit from the book’s introduction:
Rock ’n’ roll’s exciting, unconventional mix of country, gospel, and rhythm and blues was first broadcast on the nation’s radio stations in the mid-1950s. What had seemed like an undeniable abomination would, after the 1960s countercultural revolution, appear less and less problematic. American Christians eventually made an uneasy peace with the pop music they had once battled so relentlessly; with each passing decade, it came to seem less threatening to the faithful. Pop genres would become key to American Christian identity and church growth, helping make evangelicalism the largest American religious tradition by 2008. Just as organs and pianos had dominated religious music in earlier eras, guitars, drums, keyboards, and bongos were now typical in evangelical churches. Stripped-down praise choruses became the order of the day. Much of what animates evangelical churches in the twenty-first century comes directly from the unlikely fusion of pentecostal religion, conservative politics, and rock and pop music.
A book about music naturally begs to be heard, so Stephens pulled together a playlist of songs and artists that he considers in The Devil’s Music. Have a listen via the player below.
And, from Stephens’s own archive comes the following string of endorsements from some of the book’s key figures.
Most people would likely claim a general understanding of neoliberalism as a movement of laissez-faire principles aimed at ensuring free markets and an “unfettered” economy. In Globalists: The End of Empire and the Birth of Neoliberalism, one of the first intellectual histories of the movement, Quinn Slobodian shows that, in the beginning, neoliberalism was actually about shielding the economic world from the political world—about protecting the global economy, not freeing it. For Slobodian, neoliberalism is ultimately less a theory of the market or economics than of law and state, and his work gives us a much clearer sense of how the old world of empire gave way in the twentieth century not to a quasi-libertarian world of markets but to international institutions that were highly active in prescribing trade policies and rules about competition. Below, Slobodian introduces his study, and recounts its origins in a late-1990s moment when passion could often outpace understanding.
If I had to give this book an origin point, it would be almost 20 years ago, at the 1999 protests against the World Trade Organization meeting in Seattle. For people who don’t know much about it, a famous coalition of “teamsters and turtles” (labor unions and environmentalists), joined by students, anarchists, old hippies, and assorted others, shut down the meeting and shook the organization to its core. Despite the director-general’s famous call to “re-brand” in the early 2000s, the WTO hasn’t completed a negotiating round since. So, the protest was a big deal.
I was a junior in college in Portland at the time, a few hours south of Seattle, and for reasons mostly to do with laziness and ennui, I didn’t go. I remember watching the protests on our small box TV with a coat hanger stuck in the back with the sinking feeling that “oh no, a world-historical event is taking place that I passed on to watch some Lars von Trier movies on VHS.”
My friends and classmates felt activated, painting huge papier-mâché fists and strapping them to their backpacks, filled with certainty. Myself, I felt demobilized and filled mostly with disorientation.
The 1990s were a weird time to come of age. Middle class North American white kids like me were profiting from everything that went under the decade’s buzzword of globalization but also saw it as something ominous, sometimes verging on evil. We were the Adbusters generation. My sixteen-year-old sister was printing small stickers attacking McDonald’s viscerally. I wrote and photocopied a zine that said on the cover in block letters: OPEN YOUR EYES.
But open them to what exactly? The world economy was like the Nothing from my favorite childhood film, The Neverending Story, an anonymous, faceless force that seemed to swallow everything in its path, extinguishing particularity; it Coca-Colonized, to use a term of the time, squashed dreams of global modernization and turned the Third World back into a labor colony to make our stuff.
Before there was fake news there were what I saw then as fake needs, created by advertising and the mind control of the blizzard of logos—the swooshes and stripes that now adorn the cool kids in Berlin and Brooklyn. Back then we were all little Frankfurt Schoolers, or Frankfurt Pre-Schoolers, distrustful of irony though we were steeped in it, seeking elusive authenticity, wanting to unveil, unmask, expose, and upend—in my case, preferably from my desk, typewriter, and the warm chairs of the library.
So this book came out of a simple question that germinated at that time: why would anyone defend the great Nothing of the world economy, the power that forced the hands of democratically elected governments, anonymously imposed new rules and strictures, and rewarded us only with what I saw in my adolescent mind as a disposable culture that would choke us all with plastic, garbage, and refuse before I even had a chance to die a natural death? (I am capturing some of the melodrama of the time here.)
I wanted to come as close as I could to a kind of deep logic of the moment. I knew, or thought I knew, what “we” wanted—but what did “they” want? How did “they” understand their own mission in the world?
The word that arose in the 90s to describe what “they” believed was “neoliberalism.” Although the term was coined by a group of intellectuals in the 1930s to describe themselves, as I recount in the book, in the 1990s and afterward it was used mostly by its enemies—as an academic curse word. As I write in the book’s introduction:
Neoliberals, we were told, believed in global laissez-faire: self-regulating markets, shrunken states, and the reduction of all human motivation to the one-dimensional rational self-interest of Homo economicus. The neoliberal globalists, it was claimed, conflated free-market capitalism with democracy and fantasized about a single world market without borders.
But why would anyone promote such a philosophy? There are a few obvious answers. The first is that the big Nothing was actually the big Everything: it was lifting the aggregate wealth and productivity of humanity as a whole. The number of people living on a dollar a day was dropping year to year. Even if inequality was also growing—and ecological problems were not going anywhere—the rising tide, globally speaking, was lifting all ships. Therefore there was nothing “evil” about the IMF, the World Bank, the economics profession, and the Financial Times op-ed page. They were simply watching the biggest line graph of all—world economic growth—creep ever higher. We, on the other hand, were all nostalgic brats of the global north, unable to apprehend the bigger picture.
The second explanation was that neoliberal globalization made a small number of people very rich, and it was in the interest of those people to promote a self-serving ideology using their substantial means: funding think tanks and academic departments, lobbying Congress, and fighting what the Heritage Foundation calls “the war of ideas.” Neoliberalism, then, was a restoration of class power after the odd, anomalous interval of the mid-century welfare state.
There is truth to both of these explanations. Both presuppose a kind of materialist explanation of history with which I have no problem. In my book, though, I take another approach. What I found is that we could not understand the inner logic of something like the WTO without considering the whole history of the twentieth century. What I also discovered is that some of the members of the neoliberal movement from the 1930s onward, including Friedrich Hayek and Ludwig von Mises, did not use either of the explanations I just mentioned. They actually didn’t say that economic growth excuses everything. One of the peculiar things about Hayek, in particular, is that he didn’t believe in using aggregates like GDP—the very measurements that we need to even say what growth is.
What I found is that neoliberalism as a philosophy is less a doctrine of economics than a doctrine of ordering—of creating the institutions that provide for the reproduction of the totality. At the core of the strain I describe is not the idea that we can quantify, count, price, buy, and sell every last aspect of human existence. Actually, here it gets quite mystical. The Austrian and German schools of neoliberals, in particular, believe in a kind of invisible world economy that cannot be captured in numbers and figures and that always escapes human comprehension.
After all, if you can see something, you can plan it. Because of the very limits to our knowledge, we have to default to ironclad rules and not try to pursue something as radical as social justice, redistribution, or collective transformation. In a globalized world, we must give ourselves over to the forces of the market, or the whole thing will stop working.
So this is quite a different version of neoliberal thought than the one we usually have, premised on the abstraction of individual liberty or the freedom to choose. Here one is free to choose, but only within the limited range of options left after responding to the global forces of the market.
One of the core arguments of my book is that we can only understand the internal coherence of neoliberalism if we see it as a doctrine as concerned with the whole as with the individual. Neoliberal globalism can be thought of, in its own terms, as a negative theology: it contends that the world economy is sublime and ineffable, with only a small number of people having the special insight and ability to craft institutions that will, as I put it, encase the sublime world economy.
To me, the metaphor of encasement makes much more sense than the usual idea of markets set free, liberated, or unfettered. How can it be that, in an era of proliferating third-party arbitration courts, international investment law, trade treaties, and regulation, we talk about “unfettered markets”? One of the big goals of my book is to show that neoliberalism is one form of regulation among many rather than the big Other of regulation as such.
What I explore in Globalists is how we can think of the WTO as the latest in a long series of institutional fixes proposed for the problem of emergent nationalism and for what neoliberals see as the confusion between sovereignty (ruling a country) and ownership (owning the property within it). I build here on the work of other historians, showing how the demands made in the United Nations by African, Asian, and Latin American nations for things like Permanent Sovereignty over Natural Resources, i.e. the right to nationalize foreign-owned companies, often dismissed as merely rhetorical, were actually existentially frightening to global businesspeople. Those businesspeople drafted neoliberal intellectuals to craft agreements that gave foreign corporations more rights than domestic actors, and to figure out how to lock what I call the “human right of capital flight” into binding international codes. I show how we can see the development of the WTO as largely a response to the fear of a planned—and equal—planet that many saw in the aspirations of the decolonizing world.
Perhaps the lasting image of globalization that the book leaves is that world capitalism has produced a doubled world—a world of imperium (the world of states) and a world of dominium (the world of property). The best way to understand neoliberal globalism as a project is that it sees its task as the never-ending maintenance of this division. The neoliberal insight of the 1930s was that the market would not take care of itself: what Wilhelm Röpke called a market police was an ongoing need in a world where people, whether out of atavistic drives or admirable humanitarian motives, kept trying to make the earth a more equal and just place.
The culmination of these processes by the 1990s is a world economy that is less like a laissez-faire marketplace and more like a fortress, as ever more of the world’s resources and ideas are regulated through transnational legal instruments. The book acts as a kind of field guide to these institutions and, in the process, hopefully recasts the 20th century that produced them.
The tiny, innocuous fruit fly has been a subject of research for more than a century, and is discussed in upwards of 100,000 scientific publications. Its surprising parallels with humans—the fruit fly has a beating heart, a brain, and other organs comparable to our own; exhibits complex behaviors including sleep and aggression; and slows down with age—have made it an ideal subject for research that’s furthered our understanding of biology, health, and disease. In First in Fly: Drosophila Research and Biological Discovery, Stephanie Elizabeth Mohr celebrates the power and importance of curiosity-driven pure research conducted with fruit flies, providing historic and contemporary examples of the profound knowledge such work has afforded. After First in Fly, readers will have a new appreciation for the beauty of the fruit fly, and the common genetic threads that connect us to other creatures. And yet, for many of us, that recognition won’t likely dislodge the lived experience of the fruit fly as a miraculously multiplying kitchen nuisance. For that, Mohr offers up the flytrap described below.
Although researchers take great care to contain fruit flies in a lab setting, some inevitably get free, and once free, the flies will follow enticing scents to places they are unwelcome, such as a lunchroom or a yeast research lab. Drosophila researchers are not likely to bring insecticides into the lab—we are for the most part trying very hard to keep fruit flies alive, after all, so that we can perform genetic crosses and do other types of studies. But even the most dedicated fly researchers recognize that rogue flies can be a nuisance. We have developed an effective strategy for capturing flies that have escaped from their culture vials: we place simple flytraps in strategic locations around the lab. The same approach can be used in a home to get rid of an unwanted infestation or to collect wild flies for study. In the 2014 BioSCAN project, volunteers in Los Angeles, California, placed flytraps in their backyards as part of a species survey. These backyard collections led to identification of several new species of phorid flies, as well as the observation that Drosophila flavohirta, a species not previously found in the Americas, has taken up residence in Los Angeles.
To make a trap, first gather the following: an empty bottle, a sheet of paper to roll into a cone, a rubber band or tape, and a tablespoon of something attractive to the flies, such as fruit juice, cider, champagne, wine, red wine vinegar, mushy banana, tired grapes, or overripe mango. Place the bait in the bottle, set the cone point-down in the bottle’s mouth, secure it with the rubber band or tape, and put the trap in areas frequented by the flies. The paper cone or a similar cover with a single small opening is essential: an open jar constitutes a fly feeder rather than a flytrap. The opening in the paper cone or cover should be just large enough for the small flies to fit through. Flies will be attracted by the bait and enter the chamber. Once inside, they are unlikely to try to escape, and even if they try, it will be next to impossible for them to find the tiny hole through which they entered.
This is a live trap method. To get rid of the flies, seal and throw away the trap within a few days. If forgotten, the trap is likely to become the birthplace of the next generation of flies. Before tossing the trap out, the curious might examine the flies to determine what species were caught and whether they have unusual or interesting attributes. Should you decide to culture the flies, sliced bananas with a sprinkle of baker’s yeast should suffice as a food source, and a piece of cloth secured with a rubber band or a plug of cotton can be used to stopper the top without depriving the flies of oxygen. Be careful not to let the food dry out or expose the flies to prolonged heat or cold. To anesthetize the flies for close examination, place the trap on ice for five to ten minutes. If the food is very mushy, put the trap on its side, so the anesthetized flies do not get stuck in the food, or use a funnel to transfer the flies to an empty bottle before placing them on ice. A small paintbrush, makeup brush, or crab pick can be used as a pusher, and a magnifying glass, macro lens attachment on a camera, or smartphone-compatible microscope can help you inspect your flies closely. Further information, including food recipes and video demonstrations, can be found online. Who knows? Isolation of a spontaneous mutation, observation of an interesting behavior, or some other chance finding might launch a groundbreaking study.
Over the course of the month, the store received dozens of outstanding entries from around the world, including drawings, photographs, videos, poems, essays, paintings, and more. A handful of submissions illustrating the fragment “I went in search of myself” are below; for a more complete roundup of highlights from the month, head on over to the Co-op’s blog.