The Atlantic - Shadi Hamid


Mohamed Morsi’s life, especially his later life, was the product of a series of accidents. When I first met him, he was a senior but relatively obscure and not particularly important official in the Muslim Brotherhood—and one could easily imagine him staying that way. He was a loyalist, a functionary, and an enforcer. Then he became something else: Egypt’s first democratically elected president—and also the last, at least for the foreseeable future. Visionary leaders sometimes emerge during moments of crisis and transition. But just as often, ordinary men and women find themselves in the midst of historical events, both shaping them and being shaped by them.  

Morsi, who died in a Cairo courtroom Monday, was elected in 2012 and deposed in a military coup a year later. He was many, but not all, of the things his critics derided him for. He wasn’t what you would call charismatic. He was not a strategic thinker. He seemed a man particularly unsuited for the responsibility bestowed upon him. In retrospect, knowing what they know now, many in the Brotherhood—in prison, in exile, in hiding—would wish that the organization’s leadership had never opted to field a presidential candidate. But this had little to do with Morsi. Morsi wasn’t meant to be president.

The Brotherhood’s original candidate for president was the businessman Khairat al-Shater, towering in his physical presence, preternaturally confident, and perhaps overwhelmed by ambition. Some called him Egypt’s most powerful man. He was disqualified from running based on a legal technicality. Like so many other things, this, for the group, seemed to confirm that the military sought to block the Brotherhood’s rise by any means necessary. And so Morsi, derided in the Egyptian media as Shater’s “spare tire,” became the accidental candidate and then the accidental president.  

When I sat down with Morsi back in May 2010, the longtime dictator Hosni Mubarak was still in office, and the kind of uprising that could force him out seemed implausible. At that point, Morsi insisted the Brotherhood had no interest in power and even objected to the use of the word opposition to describe the group. Repression was intensifying, and political space was closing years after the brief promise of the (first) Arab Spring in 2004 and 2005. The November 2010 parliamentary elections were arguably the most fraudulent in the country’s history, reducing the Brotherhood from 88 seats to 0. Members of the Muslim Brotherhood seemed deflated but not necessarily in despair. They were playing the long game, which is what the Brotherhood always preferred to play. To be tempted by power, on the other hand, led them, and ultimately Morsi himself, into a series of missteps and miscalculations.

The campaign for president in the spring of 2012 took place in a chaotic, uncertain Egypt. Though burdened by a weak candidate, and with only two months to campaign, Brotherhood activists fanned across the country, promoting Morsi’s so-called renaissance project (which had been Shater’s “renaissance project”). In one coordinated show of strength, they held 24 simultaneous mass rallies across the country in a single day. At one rally, I asked a young Brotherhood activist if he was enthusiastic about Morsi. He smiled and then laughed.

It was easy to dismiss Morsi then, and it will be easy to dismiss him now, as a footnote in history. Buried without fanfare and under the glare of a near-totalitarian state—the most repressive in Egypt’s history—he will be easy to forget. But the brief 12 months in which he found himself in power was an unusual time for Egypt. Morsi was incompetent and polarizing, and managed to alienate nearly everyone outside the Brotherhood. Ultimately, he and the Muslim Brotherhood failed. But he was not a fascist or a new pharaoh, as his opponents liked to claim. In a previous piece for The Atlantic, a colleague and I scored Morsi’s one year in power using the Polity IV index, one of the most widely used empirical measures of autocracy and democracy, and then compared it to other cases. We concluded that “decades of transitions show that Morsi, while inept and majoritarian, was no more autocratic than a typical transitional leader and was more democratic than other leaders during societal transitions.”

But to keep the focus narrowly on Morsi, as a person or as a president, is to miss something important, and that something has become clearer to me in the five years since we wrote that piece. That year may have witnessed unprecedented polarization, fear, and uncertainty, but for that time Egypt was the freest, in relative terms, that it had been since the revolution of 1952. Egyptians were shouting, protesting, striking, and hoping, both for and against Morsi. This, of course, is also what made the year frightening: the freewheeling intellectual combat, the seemingly endless sparring of ideas and individuals, but also the sheer sense of openness (and the insecurity that came with it). No other period, or even year, comes close. This was not because of Morsi, but because Egypt—with the help of millions of Egyptians—was trying to become a democracy, albeit a flawed one. And Morsi himself, also deeply flawed, was a product of that brief experiment. To remember Morsi, then, is to remember what was lost.


Is the Israeli-Palestinian conflict fundamentally about land and territory? It is certainly partly about that. But when you hear the objections and grievances of both sides, the issue of who has what part of which territory doesn’t necessarily figure all that prominently.

I recently took part in a study tour on religion and nationalism in Israel and the West Bank organized by the Philos Project. One Palestinian official whom we met told us, “I’m not going to compromise my dignity.”

The problem with what we know of the Trump administration’s “peace plan” is that it asks Palestinians to do precisely that. The entire Donald Trump approach seems to be premised on calling for unilateral surrender. It is premised on destroying the will of a people, and on hoping that despair might one day turn into acquiescence. This is the only way to interpret Trump’s senior adviser and son-in-law Jared Kushner’s insistence on prioritizing economic incentives over political progress, an insistence that misunderstands much of what we know about human motivation.

I have a bias: I don’t tend to think that people are primarily motivated by measurable, quantifiable things. To the extent that territory becomes a seemingly insurmountable obstacle, it matters, but it matters as a proxy for other, deeper issues. As my Brookings Institution colleague Shibley Telhami put it: “To assume that the promise of economic improvement would outweigh ordinary human aspirations of a people who have painfully struggled for decades is to miss the nature of the human condition.”

Our Palestinian interlocutor’s refusal to cede his dignity wasn’t a performance; it was despair. It felt to me like an epitaph. There have been conflicts in which leaders have made compromises that may have seemed like betrayals, only for history to view them as both bold and necessary. But those conflicts are not this conflict.

[Micah Goodman: Eight steps to shrink the Israeli-Palestinian conflict]

The Israelis’ narrative is quite different from the Palestinians’, and on its own terms, it’s not necessarily wrong. According to this perspective, Arabs, from the founding of Israel in 1948 onward, have either longed for the Jewish state to disappear or taken action to actually make it disappear. This relates to the Israeli refrain that there is no Palestinian partner for peace; the most moderate Palestinians may accept Israel’s existence as an unfortunate fact, this argument goes, but not even they believe in Israel’s right to exist as the national homeland for the Jewish people.

In their long history together, Muslims knew Jews less as an ethnic group than as adherents of another religion, different from Islam but also like it. In The Jews of Islam, Bernard Lewis noted that when Muslims expressed negative attitudes toward Jews, they were “usually expressed in religious and social terms, very rarely in ethnic or racial terms.” In conversation, many Palestinians express discomfort with the idea that Jews are both a people and a religion, and Israeli Jews tend to view this lack of recognition as sinister and evidence of Arab irreconcilability.

Many of the early Zionists were secular, so their vision for a State of Israel did not depend on a shared religious faith. It depended, instead, on being a people. The moniker “Jewish state” itself captures this, since a Jewish state can be a secular home for Jews, whereas an “Islamic state”—to borrow the parallel term from another legalistic religion—suggests a religious mission and theological premises.

Divergent histories and narratives shape the interpretation of otherwise factual questions about what actually happened and didn’t happen at key moments. For example, Israeli politicians attack Palestinians for squandering Prime Minister Ehud Barak’s “generous offer” of 2000, and so a story of Arab and Palestinian recalcitrance builds uninterrupted, with each new rejection confirming the previous one: First, Arabs rejected the 1947 United Nations partition plan. Then Arab nations waged war against the new Israeli state. Decades later, when they finally had their chance, Palestinians rejected Barak’s offer. Then they rejected Prime Minister Ehud Olmert’s offer, and so on.

[Einat Wilf: The fatal flaw that doomed the Oslo Accords]

To put it mildly, Palestinians do not share this interpretation of what went wrong. They believe the offer was far from generous, coming after six years of “more Israeli settlements, less freedom of movement, and worse economic conditions,” as the senior Clinton-administration adviser Rob Malley and Hussein Agha argue in one of the definitive accounts of the 2000 Camp David negotiations. In practice, Barak, the dove, wasn’t much of a dove. As Malley and Agha write: “Behind almost all of Barak’s moves, Arafat believed he could discern the objective of either forcing him to swallow an unconscionable deal or mobilizing the world to isolate and weaken the Palestinians if they refused to yield.”

Palestinian activists tend to speak in terms of justice. An injustice was done, so it must be undone. Christopher Hitchens, in his valediction for the Palestinian American author Edward Said, wrote that his friend’s “feeling for the injustice done to Palestine was, in the best sense of this overused term, a visceral one. He simply could not reconcile himself to the dispossession of a people or to the lies and evasions that were used to cover up this offense.”

Pro-Palestinian protesters often chant the mantra of “no justice, no peace.” One former Israeli official we spoke with in Jerusalem had a different view. He said, “If we make this about justice, there will not be peace.” Too many Palestinians celebrate victimhood—fueled by a profound sense of injustice—rather than overcome it, he suggested.

But then we return to the question of dignity. No one should be asked to overcome their victimhood by giving up their dignity, the one thing even an occupier shouldn’t be able to take away. That might sound naive and impractical, especially for those who would rather Palestinians just get on with it, but that doesn’t make it any less true.

If I were advising the Palestinians, I’d tell them to reject Kushner’s offer, but they don’t need anyone to tell them what’s already painfully obvious. If someone doesn’t understand anything about the history of the Palestinians, their grievances and their narratives, then what’s the point? The outgoing French ambassador to the United States, Gérard Araud, described Kushner this way: “He is so pro-Israeli also, that he may neglect the point that if you offer the Palestinians the choice between surrendering and committing suicide, they may decide the latter. Somebody like Kushner doesn’t understand that.”

Because the two sides are so far apart and are likely to remain so for the foreseeable future, the United States—if it’s unwilling to put serious pressure on Israel or take seriously Palestinian objections—is better off disengaging from an imaginary peace process, rather than lending legitimacy to Israel’s behavior or giving the illusion of progress without the substance. Otherwise we are all just wasting time, at least until a new president attempts to fundamentally rethink America’s sometimes well-intentioned but almost always tragic role in one of the world’s most enduring conflicts.


Robert Mueller’s inquiry into links between Russia and the president’s campaign could have turned out so much worse for Trump. That it would once seemed certain; it didn’t. Instead, the end of the Mueller investigation has hollowed out the maximalist charges of collusion against Trump and his team.

The collusion claim was an indirect—or direct—way of saying that Donald Trump was illegitimately elected. For Mueller’s team to stop short of concluding collusion had occurred, then, was the best possible result for American democracy. Citizens should be relieved, not disappointed, when election outcomes are upheld.

Conspiracy with Russia wasn’t the only thing that commentators—both liberals and Never Trump conservatives—got wrong. There was another, related charge that was graver and, on its face, more implausible: that Trump would (or could) destroy American democracy. And he would do so with the help of his Russian enablers. Here, the two claims came together, that the Russians wished to end the American experiment and that Trump provided the vehicle for their ambitious designs.

This was part of a grand narrative. But what if the narrative of American democracy under mortal threat—with or without Russian help—was fundamentally flawed from the very start?

Grand narratives are appealing because they help us comprehend the incomprehensible. In this case, they helped to make sense of the endless shock of Donald Trump’s victory. The democracy-is-doomed narrative is crumbling, and rarely do you hear it anymore—at least not with the full-throated zeal that became routine throughout 2017 and 2018.

It began before that, during the campaign. As the New Yorker’s Adam Gopnik wrote: “Hitler’s enablers in 1933—yes, we should go there, instantly and often, not to blacken our political opponents but as a reminder that evil happens insidiously, and most often with people on the same side telling each other, Well, he’s not so bad, not as bad as they are. We can control him.” This sort of thing continued for more than two years.

On January 4, 2018, despite the helpful information that America hadn’t become a dictatorship in 2017, Vox’s Matt Yglesias wrote in an article titled “2018 is the year that will decide if Trumpocracy replaces American democracy” that “Trump has been extremely long on demagogic bluster but rather conventional—if extremely right-wing in some respects—on policy. But … this is entirely typical. Even Adolf Hitler was dismissed by many as a buffoon.”

Preemptively suggesting that your ideological opponents won’t accept the results of elections if they lose isn’t nearly as bad as, well, not accepting the results of elections, but it is still bad. In the case of the 2018 midterm elections, it also happened to be wrong. New York Times columnist Paul Krugman wrote: “Remember, Donald Trump claimed—falsely, of course—that millions of immigrants voted illegally in an election he won. Imagine what he’ll say if he loses, and what his supporters will do in response.” Krugman went on, suggesting that those who voted for the other party were, in fact, voting for autocracy: “If we take one path, it will offer at least a chance for political redemption, for recovering America’s democratic values. If we take the other, we’ll be on the road to autocracy, with no obvious way to get off.”

Claims such as these weren’t just overblown rhetoric from pundits in the heat of the electoral moment. They came with the imprimatur of some of the country’s most respected political scientists. Harvard University’s Steven Levitsky and Daniel Ziblatt published How Democracies Die in 2018, and the book became an alarmist bible (even though the book itself is more nuanced than its enthusiasts let on). In New York, Jonathan Chait wrote, “It is hard to read this fine book without coming away terribly concerned about the possibility Trump might inflict a mortal wound on the health of the republic.”

How could so many get it wrong? Underlying these various accounts of doom is a major analytical flaw. In some sense, the flaw is so obvious that I wasn’t entirely aware of it until I started thinking about this article. If we exclude cases of military conquest or occupation, as occurred during World War II, there is no clear case of a longstanding, established democracy becoming an autocracy. Democracies backslide—it is a spectrum, after all. But democracies, or at least certain kinds of democracies, do not “die.”

Germany is a touchstone for any conversation over the fragility of democracy. But Germany, when Adolf Hitler entered politics, was a young democracy, and the particular democratic configuration known as the Weimar Republic was even younger, having only been established in 1918. Young democracies are fragile. Moreover, Germany was suffering from historical afflictions that the United States—and, for that matter, most other countries—is not likely to experience again. In an essay for The American Interest, and also the subject of his forthcoming book Democratic Stability in an Age of Crisis, Jørgen Møller lays out the case in convincing detail. Germany, Austria, and Italy, he writes, were “bedeviled by the legacy of the World War,” which “created revanchist yearnings in all three countries, which could be harnessed by undemocratic forces on the Right that had, in the first place, been brutalized by four years of fighting in the trenches.”

It is always possible to extract generalizable lessons from historical events. But that is different from thinking that the 1930s were in any meaningful way comparable to the current political moment. Yet some leading academics are rather unapologetic in their analogizing. In the bestselling On Tyranny, written before Trump even took office, the Yale historian Timothy Snyder focuses on Hitler’s rise almost immediately, and he is explicit that Americans, in thinking about the future of their own country, have much to learn from the death of German democracy. “If we worry today that the American experiment is threatened by tyranny,” he writes, “we can follow the example of the Founding Fathers and contemplate the history of other democracies and republics. The good news is that we can draw on more recent and relevant examples than ancient Greece and Rome.”

But what exactly does the fall of Weimar Germany have to do with Donald Trump? If there had been a third world war in which hundreds of thousands of Americans had perished in humiliating defeat, then it might be more appropriate to bring up the Weimar Republic. Until then, the election of a president, Donald Trump, however uniquely bad, is simply not enough to justify rather dramatic chapter titles like “be wary of paramilitaries,” “make eye contact and small talk,” “be reflective if you must be armed,” and “establish a private life.”

Analogies are useful for understanding the people who use them to understand events, not necessarily for understanding the events themselves. As Richard Fontaine and Vance Serchuk of the Center for a New American Security note, “Parallels from the past too often are put forward less to focus debate and discussion than to shut them down. That’s exactly why the invocation of dates like 1938 or 2003 are such political catnip.” History, after all, does not repeat itself. Anything resembling World War I will not happen again, mostly because it can’t. Too many variables have changed. (The broader and, by now, somewhat banal lesson that “small, seemingly trivial events can have tremendous, catastrophic consequences” still applies, however.) Jonathan Chait acknowledges that “the concern of serious democracy scholars is not a totalitarian state that murders its opposition en masse. It is ‘democratic backsliding’.” If the concern is democratic backsliding, however, it is unclear why Hitler or 1933 would be touchstones, since Hitler did, in fact, murder his opposition en masse.

To put it differently, that American politics feels existential (which the very use of the self-title “The Resistance” seems to imply) is different from saying that it actually is. Many, if not most, criticisms of Donald Trump revolve around norms and Trump’s propensity to break them or, more precisely, to act as if they never existed. Often, the refrain is that Trump’s sins may not be illegal or unconstitutional, but that they violate the “normal” conduct of presidents and statesmen. They most certainly do, but this, by itself, isn’t necessarily anti-democratic. The legal scholar Jedediah Purdy puts it this way in his critique of the move toward norm fetishism:

One problem with identifying the protection of political norms with the defense of democracy is that such norms are intrinsically conservative (in a small-c sense) because they achieve stability by maintaining unspoken habits—which institutions you defer to, which policies you do not question, and so on.

That something happens to be a norm does not necessarily mean it is a good norm or that it is inherently democratic. Sometimes, writes the political scientist Corey Robin, “Norm erosion is not antithetical to democracy but an ally of it.” If we think of democracies as constantly evolving—of needing to evolve at particular historical junctures—then rethinking the unspoken habits of political engagement and competition is simply a requirement of any truly progressive politics. All transformative figures are, by definition, norm breakers, whether that was the leaders of the civil rights movement in the 1960s or abolitionists in the 1800s.

In a previous piece, I pointed to Representative Alexandria Ocasio-Cortez’s endorsement of a 70 percent marginal tax rate as an example of a “radical” proposal that, irrespective of its substantive policy content, is important because it expands the window of the politically possible and encourages politicians and voters alike to consider creative ideas outside the norm. This makes democracies more, not less, responsive to a broader range of ideas and proposals, as democracies should be. The rise of right-wing populism in both the United States and nearly every major Western democracy is, itself, a product of the norm-centric and unimaginative center-left and center-right governing models that dominated in the 1990s and 2000s. A return to norms cannot be both the solution and the problem.

The strongest defense of alarmist politics and of fearing the worst—even in the absence of evidence that the worst is yet to come—is that it encourages the very constraints that prevent truly terrible outcomes. In this reading, the Russia investigation, even if it didn’t produce evidence of collusion, provided an important check on the Trump administration’s ability to do harm. Here, the fear of democracy dying motivates citizens to vote, to petition, and to organize.

This, though, is the job of activists and advocates who are, understandably, less interested in being accurate in their historical analogies and more interested in accomplishing specific political and partisan objectives. It is not the job of journalists, political scientists, or (especially) historians to take historical events and twist them beyond recognition. Even something as seemingly uncontroversial as the use of “the Resistance” to describe the anti-Trump opposition is odd when you think about it—unless, as journalist Jamie Kirchick reminds us, you happen to be “burying weapons in the forests of Poland or hiding in the basements of French country houses.”

To claim the mantle of resistance is also to suggest that your opponents are something akin to fascists or, more modestly, that they are authoritarians. But, unlike the 1930s, today’s right-wing populists do not generally condemn the idea of democracy. More likely, they salute it (or at least a majoritarian version of it). Rather than dispatching brownshirts in the streets, they call for referenda and plebiscites. As many observers have noted for years, “direct democracy” isn’t necessarily good, but direct democracy isn’t quite the first thing you think of when you think of Benito Mussolini or Adolf Hitler.

Being in a constant state of alarm, particularly when there’s little actual threat of being imprisoned for your beliefs, can be unusually thrilling. Carl Schmitt, the hugely influential jurist-philosopher who joined the Nazi Party in 1933, called this the romance of “the occasion.” Romantics, writes the political theorist David Runciman, “want something, anything, to happen, so that they can feel themselves to be at the heart of things.”

But, more than two years after Trump assumed power, there is the risk that they may no longer have an enemy worthy of the title. Democrats control the House of Representatives. More than norms, institutions—the courts, the media, and the machinery of government—have constrained the Trump presidency. In policy terms, outside of immigration, the Republican Party has coopted the Trump administration more than the other way around.

The conclusion of the Mueller investigation, which enjoyed bipartisan support, is important on its own terms, but it also allows us to do away with the romantic belief that the worst is always yet to come, however much we may have wanted it to. Anything and everything is, of course, possible, but this doesn’t justify treating it as a particularly likely outcome. For this reason, the investigation, however much the Trump administration objected and attacked, was necessary.

It is unfortunate that a result of no collusion had to come after two years of the purposeful delegitimization of a legitimate democratic outcome. Trump’s very real badness has no bearing on the question of legitimacy.

But these objections did not figure into the breathless doomsday scenarios. Instead, academic research, even when it was sound, was used not in the service of “truth” but in the service of a particular political agenda. As Corey Robin memorably described it: “Brooding on the bloodlands of Europe, meditating on the dark night of the populist soul, anxious media professionals find academic confirmation for their sense that they are exiles in their own land.” Robin points to the pitfalls of “explainers” and “news analysis,” which “unconstrained by the protocols of academe or journalism [draw] on the authority of the first for the sake of the second.”

Books, particularly books by academics, are usually more nuanced than the headlines they produce. So those focused on the unabashed anti-Trumpism in a book like How Democracies Die might miss an argument the authors hint at in passing but only explicitly state in the final pages. “Even if Democrats were to succeed in weakening or removing President Trump via hardball tactics,” they write, “their victory would be pyrrhic.” After all, it would only mean Republicans returning the favor in kind, perhaps with even more vehemence, after a new Democratic president takes office. Every time one party won, the other would try to impeach its president, claiming that he or she was both illegitimate and a threat to the very foundations of the republic.

Many will still make such claims, holding on to the notion that they are fighting a world-historical struggle against a would-be dictator. They can believe that they are. But they are not, or at least they aren’t any longer.


Most Americans—myself included—probably don’t have a well-thought-out position on whether a 70 percent marginal tax rate is a good idea. But it probably doesn’t matter whether it is, or whether it would “work.” To argue that “workability” is secondary might sound odd to many Democrats, particularly party leaders and experts who have long prided themselves on being a party of pragmatic problem-solvers. This, though, could be the most important contribution so far of Representative Alexandria Ocasio-Cortez and the new crop of progressive politicians—the realization that the technical merits of a particular policy aren’t the most relevant consideration. For these new Democrats, the purpose of politics (and elections) is quite different.   

Commentators often note that, when it comes to policy, the differences between the Democratic Party’s left-wing and center-left are minimal. Referring to young progressives like Ocasio-Cortez who challenged establishment politicians, David Freedlander writes, “The policy differences … are microscopically small. Nearly all Democrats favor tackling income inequality, raising taxes on the wealthy and the minimum wage, and reforming the criminal justice system.”

This misses the point in a rather fundamental way. Few people actually vote based on policy. As I recently argued in American Affairs, even the better educated don’t primarily vote based on policy. In fact, higher levels of education can increase polarization. (In other contexts, such as the Middle East, the advent of universal education and higher college attendance fueled ideological divides.) As the political scientist Lilliana Mason notes, “Political knowledge tends to increase the effects of identity as more knowledgeable people have more informational ammunition to counter argue any stories they don’t like.”

[Derek Thompson: Alexandria Ocasio-Cortez has the better tax argument]

People’s politics tend to determine their policy preferences, and not the other way around. In one example from the 1960s, as Christopher Achen and Larry Bartels write in Democracy for Realists, even southerners who supported racial integration left the Democratic Party. Once they became Republican, they then adjusted their views on race and affirmative action to fit more comfortably with their new partisan identity. Put another way, if a person with no prior partisan attachments decides to become a Republican, he is likely to become pro-life. If that same person, with the same genetics and life experience, decides to become a Democrat, he is likely to become pro-choice.

Ocasio-Cortez and other progressives appear to understand instinctually what this growing body of research on voter preferences suggests. And its implications are potentially far-reaching. Once you accept that voters are rationally irrational, you can’t help but change how you understand political competition. Incidentally, this is one reason that right-wing populists across Europe (and India and the Philippines and many other places) have been surprisingly—or unsurprisingly—successful: They seem to have relatively little interest in what works. Instead they are concerned with altering the political and ideological imagination of ordinary voters and elites alike, to make what once seemed impossible, possible. In their case, it often involves normalizing Islamophobia and other forms of bigotry. They have managed to drag the center-right further rightward on issues such as immigration and how to integrate (or not integrate) Muslim minorities. But this same approach can be used by left-wing and center-left parties for more constructive ends.

This focus on shifting the contours of the national debate is sometimes referred to as expanding the “Overton window.” It is altogether possible that Ocasio-Cortez doesn’t think that a 70 percent marginal tax rate is realistic in our lifetime—she might not even think it’s the best option from a narrow, technocratic perspective of economic performance—but it doesn’t need to be. As the Open Markets Institute’s Matt Stoller notes, “One thing that [Ocasio-Cortez] has shown is that political leadership matters. Just proposing a 70 percent marginal tax rate has restructured a debate over taxes. Obama’s presidency was defined by self-imposed limits.”

[Read: How Alexandria Ocasio-Cortez’s plain black jacket became a controversy]

Today, in a way that hasn’t been true for decades, more Americans are at least aware of something that might otherwise have been ignored as either overly wonky or, well, crazy. The 70 percent figure proposed by Ocasio-Cortez was a subject of debate—and derision—at Davos. But by joking about it, billionaires and aspiring billionaires, in effect, helped legitimate it. After all, if the richest people in the world are worried about it, it might just be a good idea. (And Dell CEO Michael Dell unintentionally helped remind the audience that the United States had a 70 percent marginal rate as recently as the 1970s.) The economist William Gale, my Brookings colleague, wrote a piece responding to Ocasio-Cortez’s proposal, taking issue with the 70 percent figure but agreeing with the underlying principle: “Ultimately, if we want more revenue from the rich, we should broaden the base and boost rates. Raising taxes on the rich is an idea whose time has come, receiving consistent support in polls over the last few decades.”

This new style of Democratic politics is a far cry from the technocratic “what works-ism” that has dominated in center-left parties since the 1990s. The incrementalist approach, by its very nature, preemptively accepts policy and ideological concessions in the name of prudence. It prioritizes being sensible and serious. But why is being sensible an end in itself? As Ocasio-Cortez’s chief of staff, the 32-year-old Saikat Chakrabarti, said regarding another seemingly unrealistic idea, the Green New Deal: “If it’s really not possible, then we can revisit. The idea is to set the most ambitious thing we can do and then make a plan for it. Why not try?”

I don’t feel strongly about a 70 percent marginal tax rate, but I don’t need to. I might even conclude that it simply “feels” too high. But that just means that if and when a Democratic candidate for president proposes a 50 percent tax rate on income that’s more than $10 million, I’ll be impressed with how “moderate,” reasonable, and sensible it sounds.


There is something corrupting not just about the struggle for power through politics but about politics itself. Philosophers and pundits have long condemned the political as both profane and belittling, the near opposite of the pure and higher spiritual pursuits. In his valediction for the great, controversial scholar Edward Said, Christopher Hitchens wrote: “Indeed, if it had not been for the irruption of abrupt force into the life of his extended family and the ripping apart of the region by partition and subpartition, I can easily imagine Edward evolving as an almost apolitical person, devoted to the loftier pursuits of music and literature.”

Recently, Andrew Sullivan argued that in losing religion, Americans have more and more sought to satisfy their search for meaning publicly rather than privately. In his response, The Atlantic’s Graeme Wood wonders how we might desacralize politics and points to Japan as an alternative, if still flawed, model of lower-stakes competition. Countries that have experienced fascist rule or military defeat, or both, are more likely to accept normal politics, Wood suggests, although even in these places the rise of right-wing populist parties, such as the Alternative for Germany (AfD), points to the limits of historical remembrance.

But neither returning to the Christianity of previous generations nor desacralizing American politics is likely to fix a public sphere that is simply too invested with meaning for anyone’s safety. Instead, Americans need to construct a different sort of public faith—one that borrows from religious sensibilities to infuse debate with a spirit of humility, instead of theological certainty. The problem with America’s public life isn’t that it has too much religion, or too little—but rather, that it has the wrong kind.

[Graeme Wood: The gods that will fail]

Both Sullivan and Wood draw a clear, almost idealized line between the private and public. It is certainly true that, in the Christian West, the line has always been there—in theory if not necessarily in practice—to a degree it never was in Muslim-majority contexts. As Sullivan writes: “Liberalism is a set of procedures, with an empty center, not a manifestation of truth, let alone a reconciliation to mortality. But, critically, it has long been complemented and supported in America by a religion distinctly separate from politics, a tamed Christianity that rests, in Jesus’ formulation, on a distinction between God and Caesar.” But this distinction between the religious and the political, which solidified itself with the rise of liberal, secular politics in the 19th and 20th centuries, has remained more porous than we might like to admit, particularly among intellectuals and philosophers who proved unable to content themselves with being merely that.

In The Reckless Mind, Mark Lilla looks at some of the most influential Western philosophers of the modern era to make sense of how men so brilliant could become so dangerous in political life. Martin Heidegger, who made common cause with the Nazis and remained a member of the party until the end of the war, told a German newspaper in the 1960s (after coming to see himself as a victim of the Nazis): “Only God can save us now.” Hannah Arendt, his lover for many years, tried to explain Heidegger’s Nazism by pointing to “a spiritual playfulness that stems in part from delusions of grandeur and in part from despair.” In Lilla’s reading, philosophical passion—or “intellectual sorcery,” as he also calls it—all too easily morphed into a kind of magical political thinking. Meanwhile, the hugely influential jurist-philosopher Carl Schmitt—who also joined the Nazis and helped provide legal justification for their ideas—wrote about the political as a form of divine struggle (for him, the political always seemed to be in a state of italicized agitation). Lilla calls Schmitt “a theologian marooned in the realm of the profane.”

In each of these cases, political fanaticism seems to draw from a sublimated religious impulse; God isn’t enough. But the notion that actual religion might temper worldly passions might sound odd to the modern ear. After all, among the more secular-minded, it is basically an article of faith that religious passion fuels extremism and intolerance (which it no doubt sometimes does). But there is also a private contentment, rooted in religious faith, that allows individuals to accept imperfection in this life in anticipation of the next.

[Alex Wagner: The church of Trump]

The question that both Sullivan and Wood are asking is how we might make politics more boring, after the interregnum of near-constant excitement known as Trump. Sullivan wants a return to a Christian cultural sensibility, if not a Christian religious faith, that allows us to live with a public politics that is more or less procedural. Wood asks that we consider how the Japanese have built in an expectation that politics isn’t and shouldn’t be especially interesting.

The difficulty with these proposals is that they ask something of people that, even in our secularizing age, is easier to achieve in principle than in practice: the separation of the personal and the political. The line will always be breached, particularly by the more passionate among us (a passion often amplified by technology). Sullivan is right to recognize that we are all religious even when we’re not members of any faith, that we desire not just meaning but ultimate meaning. For those who believe in God, this shouldn’t be surprising: If God exists, presumably he would instill such a desire into his creation. But perhaps the kind of religion that can be insulated from politics is itself becoming untenable, even within otherwise secularized Christian cultures.

In his masterwork City of God, Saint Augustine wrote that the city of man and the city of God, though they inevitably overlapped, were separate, and he sometimes even portrayed them as walled cities, standing in opposition to each other. The gap between them could not be erased, at least not entirely. This dualism in Christian theology sometimes led to passivity and fatalism. Such passivity is more difficult to sustain in an era of mass politics. Higher literacy rates, the spread of university education, and universal access to information (and the resulting sidelining of clerics as the protectors of knowledge) have been major drivers of ideological politics, in the form of socialism in the West and Islamism in the Muslim world.

[Read: Politics as the new religion for progressive Democrats]

The Dutch theologian Abraham Kuyper, who—unusually for a theologian—served as prime minister of the Netherlands from 1901 to 1905, is the major modern exponent of “Christian pluralism.” He believed that all ideas, when strongly held, were effectively faith-based. According to his intellectual biographer, the American theologian Matthew Kaemingk, Kuyper thought that although one can find some individuals who wish to keep their belief private, “the absence of an ultimate point of loyalty, meaning, or purpose cannot persist for long.”

If this is the case, then it becomes a question of where individuals find their “ultimate point of loyalty.” Is it in a nation, rationalism, truth, God, or some mix of these things? The inherent risk of finding ultimate loyalty in a charismatic leader or a sovereign state is that they are of this world. To claim them, then, requires seeking victory in this world, because they are of this world and this world alone. As the writer Kyle Orton remarks, “Tolerance might not be possible from the secular world, tinged as it is with utopianism and a drive for final victories.” The fundamental question becomes how to curb such a drive.

Kuyper and Kaemingk offer one potential answer. Christian pluralism sees the city of man as inherently broken and fallen from sin, which, in turn, means that politics must be acknowledged as a site of uncertainty, rather than certainty. The solution, then, wouldn’t be walling off one’s Christianity from the domain of Caesar, but rather applying it in a more self-conscious manner.

There is a corollary to this line of thought in Islam that receives perhaps even less attention. One Koranic passage declares: “No one can know the soldiers of God except God.” The “soldiers” part of this tends to attract notice, some of it negative. But some religious scholars, such as the American Islamic legal theorist Khaled Abou El Fadl, interpret this as an endorsement of suspending judgment: No one can know, in this life, who is in fact God’s soldier. In a famous prophetic hadith, if a mujtahid (an authority in Islamic law) strives for God’s truth and is “correct,” then he receives two good deeds; if he is wrong, he still receives one bounty. If the mujtahids disagree with one another, then only God knows which one of them is correct.

If only God knows, then we cannot know. The key idea in these somewhat lost traditions is not the suspension of judgment, so much as the postponement of judgment. For the believer, the judgment presumably comes, but it comes later. For those who do not believe in God, it simply wouldn’t come at all.

Regardless of their faith (it would be a practical challenge to transform a critical mass of Americans into theological pluralists), a small but growing number of citizens can make the conscious decision to resist making the political wholly theological. They can choose to abstain on the question of whether a policy matter—an immigration quota or a Supreme Court nomination—represents an absolute, incontrovertible truth. In practice, this would mean that very few citizens of any nation are outside the fold or beyond the pale. For Americans, it means that, save for a relative few on the fringes, there are no “good” or “bad” Americans in any ultimate sense—or at least not in any ultimate sense that mere humans might be privy to. This is what an American public faith could theoretically look like, and the good thing is that anyone can start believing in it.  


Donald Trump’s Middle East policy is many things, but it is not incoherent. At the core of the president’s approach has been a stark redrawing of the friend-enemy distinction: doubling down on support, often unquestioning, for allies like Saudi Arabia and Israel, while refocusing the near-entirety of American ire on Iran.

That Trump has bet big on the de facto leader of Saudi Arabia, Mohammed bin Salman, makes the Saudis’ disappearing and likely assassination of the dissident Jamal Khashoggi in their Istanbul consulate—“monstrous” on its own terms—a different sort of escalation. For Trump, this has been personal. His son-in-law and adviser, Jared Kushner, has worked to develop a close relationship with bin Salman, colloquially known as MbS, seeing the young crown prince as a strong partner in isolating Iran and softening Arab enmity toward Israel.

In Trump’s world, friends—particularly friends that are both Arab and authoritarian—are to be criticized as little as possible, especially on low priorities for the administration, like human rights. This hands-off approach has emboldened and empowered MbS to increasingly destructive effect over the past year and a half, offering a reminder that the prospect of U.S. pressure—if not actual U.S. pressure—serves as a constraint on allies that tend toward overreach.

What is both striking and telling is how halfhearted and generally uninterested the Saudis have been in countering evidence that they assassinated Khashoggi. (Take, for example, the Saudi ambassador’s suggestion to Senator Bob Corker that the consulate surveillance video only “live-streams.”) But this is precisely what makes Saudi Arabia’s behavior in this episode even more reckless than the ongoing crackdown on even its milder critics or its increasingly callous disregard for human life in the Yemen war. Trump has invested political capital and extended unprecedented goodwill to MbS, drawing considerable criticism in the process. In a sense, this has been Trump’s “big bet,” perhaps his biggest one in the Middle East. As Axios’s Jonathan Swan put it, “The Trump administration, led by Jared Kushner, made about as big a bet on MbS the visionary-reformer and the Saudis as it’s possible for a US admin to make.” The goodwill has not been reciprocated. Rather, MbS is, in effect, taunting Trump, gloating in his ability to get away with anything.

A supposedly close friend acting in such a manner could be—and perhaps should be—taken by Trump as a personal affront. This further and perhaps even conclusively invalidates Trump’s decision to orient America’s Middle East strategy around a new and changing Saudi Arabia. Trump’s recent comments were somewhat unclear, but he seemed bothered that he even had to talk about it: “I am concerned about that,” he said in response to a question about Khashoggi’s disappearance. “I don’t like hearing about it, and hopefully that will sort itself out. Right now, nobody knows anything about it. There’s some pretty bad stories about it. I do not like it.”

With Trump, however, it is always hard to tell where his passions may lead him. Trump may not care enough about the assassination of a prominent journalist, who was also a U.S. resident and a Washington Post columnist. But the events may also awaken the side of Trump that, first as a candidate and now as president, has suggested an impatience with Saudi Arabia and its military dependence on the United States. At an October 2 rally in Southaven, Mississippi, Trump riffed, saying something that received little attention at the time but in any other administration would likely have been front-page news: “We protect Saudi Arabia—would you say they’re rich? And I love the king, King Salman, but I said, ‘King, we’re protecting you. You might not be there for two weeks without us. You have to pay for your military. You have to pay.’”

It is little surprise that Khashoggi’s disappearance has provoked not just frustration from U.S. politicians, but anger, including from Senator Lindsey Graham, who said there would be “hell to pay” if Khashoggi was assassinated. This is the danger of transactional relationships with countries and rulers that not only don’t share our values, but often don’t share our interests either. Outsourcing our Middle East policy to a reckless leader who appears to have such disrespect for the United States—and its support for his country—was always a mistake. Now it is also an embarrassment.

Even for those who care little about human rights in the Middle East, the disappearance of Khashoggi calls into question the reliability of an ally that has insisted on acting in such brazen fashion. If Trump’s foreign policy really is about “America first,” then allies who show blatant disregard for America, American values, and American interests should incur significant costs. What might those costs be? This is what the conversation over the future of the U.S.-Saudi relationship must turn to. Trump is at least partly correct about Saudi dependence on the United States. As my colleague Bruce Riedel writes, the Saudi air force “is entirely dependent on American and British support for its air fleet of F15 fighter jets, Apache helicopters, and Tornado aircraft. If either Washington or London halts the flow of logistics, the [air force] will be grounded.”

Bad allies, particularly in the Middle East, where they abound, have been a recurring problem for successive U.S. administrations. U.S. policy makers need—or think they need—them, even when those allies go out of their way to undermine their relationship with the United States. Those allies believe—mostly correctly—that the United States will express concern and complain, but ultimately do little. The worst offenses will be forgotten in the name of national-security interests, as they have been so many times before. It is time to call Saudi Arabia’s bluff.


The election of Trump—and the populist upsurge he helped encourage—has confirmed that politics is no longer the art of the possible, but the improbable. If Trump can win the highest office in the land, then why can’t the rest of us run for something, too? Why shouldn’t a 33-year-old Egyptian American named Abdul run for Michigan governor? Why shouldn’t a 28-year-old, who was only a bartender a year ago, defeat a Democratic establishment stalwart? And why shouldn’t that person say, without shame or apology, that she’s a socialist?

Alexandria Ocasio-Cortez’s primary-election victory, coming on the heels of Bernie Sanders’s insurgent presidential campaign, has thrust “socialism” into the center of the American political conversation. Ideas once dismissed as radical are now gaining a hearing. Fights are raging within the Democratic Party, and on the political left. And that reinvigorated debate—and the other political conflicts Trump has inflamed—may be one of Trump’s more unlikely and ultimately positive contributions to American democracy.

Few people would say that conflict is a thing to be embraced. The usual assumption is that conflict and polarization undermine democracy. We hear paeans to civility, unity, and coming together as a nation. But conflict, or at least the threat of it, can be a powerful motivator.

If a government has no fear that the poor might one day revolt, then it will have few incentives to check the excesses of the rich. If elected leaders have no fear that they might lose the minority vote, they will have little reason to take racism as seriously as they should. If established parties have no fear that populist parties might take their place, they will have little reason to rethink their basic approach to politics. Without pressure from populist challengers, centrist parties will avoid addressing sensitive issues, instead postponing them until crisis hits. And crisis almost certainly does.

This confusion around the desirability of conflict makes it difficult to assess how well or poorly the world’s most established democracies are faring, now that nearly every one of them has been significantly affected by this populist wave (with Portugal being a notable exception). As some would have it, America, along with large chunks of Europe, is on the verge of dictatorship from which it may never recover.

If you view the very election of Trump—to say nothing of what he’s actually done in office—as an “extinction-level event,” then alarmism is precisely what’s called for: the more, the better. But I, for one, do not believe that Trump is anything more than damaging and destructive—as bad as that is. Two or six years from now, America will emerge with considerable damage, but intact. And by then, the experience of having lived under Trump will produce other consequences, some of them positive. In fact, it’s already producing them.

Trumpism—or some variation of the populist nationalism that has proved so compelling from Italy and Poland to Israel and India—will survive Trump. The ideas of this visceral but vague populism—obsessed with demographic change and trafficking in proposals that only four years ago would have been beyond the pale—are almost entirely unconcerned with the norms of what was, up until 2016, a somewhat narrow mainstream consensus.

Peter Pomerantsev’s book, Nothing Is True and Everything Is Possible, popularized a bleak aphorism that encompassed the surrealism and absurdity of living in Putin’s Russia. In the United States, though, that everything might be possible, when it wasn’t before, means that the range of acceptable opinions is being broadened, whether that means democratic socialism, unabashed Catholic integralism, post-liberalism, or even something as silly as the notion that billionaires are well-suited to run for office.

As Ben Judah wrote recently, a door has been opened: “Because by embracing everything about Donald Trump, [the Right] has embraced the idea that something is terribly wrong with America, and that the country needs big, beautiful solutions for terrible, awful problems. When the Right becomes populist, embraces deficits, dunks on free trade, and rails against elites, it suddenly becomes a lot tougher for it to ridicule a populist Left that is credibly offering more.”

Where Trump told voters that he (and only he) would “make America great again,” Hillary Clinton countered by saying “America was already great.” America is already great, but the problem with making that the theme of a national campaign is that it promises only minor variations of the status quo. Clinton—and so many of the center-left and center-right candidates hoping to forestall populist challengers—offered voters stability in a time of instability. Experiencing Trump on a daily basis tends to help one appreciate the prospect of once again being bored by politics. But stability, particularly in the long run, is an overrated political good that can actually forestall the kinds of deep changes that every society needs from time to time.

Another way of viewing it, and probably the easier way, is to see Trump as an accident of history and not something to ponder too deeply. Since the results could have easily been otherwise—had, say, James Comey not issued his letter in those final, critical days—there is no particular reason to shift our view of politics or democracy. To view Trump’s election as an extinction-level event is to argue, in effect, that the solution to Trump is self-evident: his removal from office. Politics can then return to at least some degree of normalcy. If Trump, however, is a product of a political order that is fundamentally broken, then the need for radical, unusual, or at least out-of-the-mainstream proposals becomes just as necessary if and when Trump loses—or even if he hadn’t won in the first place.  

Civility and consensus are only possible in homogeneous societies with a strong, shared national identity, something that the United States and most European countries can no longer claim. In diverse societies, where citizens no longer agree on the common good, conflict and polarization are unavoidable. Like conflict, the word radical is usually used pejoratively, signifying chaos and disorder. But like conflict, radicalism isn’t necessarily bad, particularly if it allows a larger number of citizens to feel they have a stake in their own society. It also leaves open the possibility that ideas that were once considered unacceptable can be accepted. Some unacceptable ideas are unacceptable for a reason. Some, though, are not.

Today, ideas that were once considered radical and even politically suicidal, like same-sex marriage, are now so culturally pervasive that it’s hard to remember that they were once held by only a small minority. (As recently as 2009, President Barack Obama, despite his seeming private openness to gay marriage, was unwilling to endorse it publicly.) It’s precisely through radical voices that the bounds of what’s politically and socially possible expand. At one point in American history, for example, the abolition of slavery was seen as outside the bounds of what was possible or acceptable. Through Bernie Sanders’s presidential candidacy, the idea of single-payer universal health care became normalized, shifting the entire debate around health-care provision onto what many Americans would consider a more moral foundation. (Of course, many other Americans see it as an unacceptable intrusion on the part of the state.)

To find a silver lining to this disruption of political complacency is not to excuse Trump. The families torn apart at the border; those who have lost their health care; the communities that will be polluted by environmental disaster; or the millions of people abroad who have suffered from Trump’s unashamedly pro-dictator foreign policy would have been better off had he never run for office. But even without Trump, disruption and conflict were coming; he was merely the catalyst. This—whatever this is exactly—is a universal phenomenon, emerging in dozens of incredibly different national contexts, across varying cultures, regions, religions, and levels of economic development. It may be hard to define, but what we are seeing is nothing less (or perhaps nothing more) than a rebirth of politics, with all the conflict that that entails.

The point about radical ideas is that some of them may be good, but there’s no way to know, definitively, whether they are, until they’re debated openly and freely. And, today, that’s precisely what’s happening. That’s a good thing, and we may have Trump to (partly) thank for that.


In January 2017, when President Donald Trump’s so-called Muslim ban was first announced, I was passionately against it. It was one of the most frightening texts I’ve read from U.S. government officials in my lifetime. The Supreme Court just upheld the third iteration of the travel ban in Trump v. Hawaii, and I find myself in the odd position of opposing the court’s ruling on personal and moral grounds, while also thinking it was a legally plausible interpretation.

Like most political developments of the Trump era, there is a tension between having the “right” position and having the “correct” position. A pure anti-Trump position would entail opposing the court’s ruling regardless of its substantive content. This feels morally right—and it may even be morally right—but that doesn’t necessarily make it correct. The Supreme Court, unlike Congress, is not tasked with making moral judgments about the law, at least not explicitly.

The first version of the travel ban, which, among other things, appears to have been intended to troll liberals, explicitly discriminated based on religion. The very fact of being Muslim was grounds for scrutiny. One clause, in particular, effectively imposed a religious test. Refugees facing religious persecution could be admitted but only if “the religion of the individual is a minority religion in the individual’s country of nationality.” The revised version, issued in September 2017, omits such language, and incorporates two non-Muslim countries, North Korea and Venezuela. Regarding Syrian refugees, this means that, in theory if not necessarily in practice, entry restrictions on Syrian refugees would apply equally to Muslims and Christians alike. Accordingly, Chief Justice John Roberts wrote that the president’s directive was “neutral on its face.”

Of course, the president’s directive is probably not neutral in intent. Trump and many of his senior aides bear an avowed animus toward Muslims or Islam, or both. Trump himself said during the 2016 campaign that he thinks “Islam hates us.” How much should intent matter? Constitutional law scholars—and of course the Supreme Court itself—are divided. In her dissent, Justice Sonia Sotomayor cited Trump’s rather long paper (or tweet) trail to argue that “taking all the relevant evidence together, a reasonable observer would conclude that the Proclamation was driven primarily by anti-Muslim animus.” But the extent to which certain motivations figure more than others is always difficult to divine. It’s also possible that someone’s intent changes over time, and it’s not necessarily the easiest task to discern what Trump’s “primary” versus “secondary” motivations might be on any specific matter.  What we do know is that the discriminatory nature of the text of the order itself is no longer self-evident, so what might have initially been an unequivocally discriminatory “Muslim ban” is now something else.

I am still deeply uncomfortable with the Supreme Court’s ruling. It contributes to the legitimization and mainstreaming of anti-Muslim bigotry. That’s certainly how it will be interpreted by millions of Americans. But that doesn’t mean the ruling itself, in narrow terms, rises to the level of one of the great moral questions of our time. The decision does not turn American Muslims like myself into “second-class citizens,” and to insist that it does will make it impossible for us to claim that we have actually become second-class citizens, if such a thing ever happens. To claim that Jim Crow or the Holocaust were similarly “legal” diminishes the moral seriousness of crimes that nearly all Americans, today, agree were unequivocally wrong. (No such consensus exists on Trump v. Hawaii, and to assume that it will exist at some point in the future is to assume that morality will always necessarily be progressive and retroactive.) To use a less incendiary comparison, it is also difficult to argue that Trump v. Hawaii is comparable to the 1944 Korematsu v. U.S. decision permitting the internment of Japanese Americans. Nation-states generally have wide latitude in determining which non-citizens can enter their borders, whereas Korematsu targeted U.S. citizens.

Voters should be able to debate which entry and immigration policies are most appropriate, effective, and, yes, moral as it relates to non-citizens outside of our borders. To insist that such questions should be decided outside the confines of normal democratic deliberation undermines democratic responsiveness and accountability. This is particularly a risk at a time when immigration, rightly or wrongly, has become a top concern of voters in most Western democracies. To think that such questions can be resolved by dismissing or bypassing the views of your fellow citizens is a “long term recipe for public disillusionment and alienation,” writes Tablet’s Yair Rosenberg. The courts may be great places to bend the arc of history toward justice, but they’re only great places for that when they agree with whatever we already think is just.

As The New York Times notes, those who feel uncomfortable (or disgusted) with recent Supreme Court decisions must “look somewhere else. That place is the ballot box.” Moral judgments on constitutionally and legally muddy debates can be rendered, but they’re best rendered by persuading as many of our fellow citizens as possible to stop voting for anti-Muslim presidents.


The election of Trump—and the populist upsurge he helped encourage—has confirmed that politics is no longer the art of the possible, but the improbable. If Trump can win the highest office in the land, then why can’t the rest of us run for something, too? Why shouldn’t a 33-year-old Egyptian-American named Abdul run for Michigan governor? Why shouldn’t a 28-year-old, who was only a bartender a year ago, defeat a Democratic establishment stalwart? And why shouldn’t that person say, without shame or apology, that she’s a socialist?

Alexandria Ocasio-Cortez’s primary-election victory, coming on the heels of Bernie Sanders’s insurgent presidential campaign, has thrust “socialism” into the center of the American political conversation. Ideas once dismissed as radical are now gaining a hearing. Fights are raging within the Democratic Party, and on the political left. And that reinvigorated debate—and the other political conflicts Trump has inflamed—may be one of Trump’s more unlikely and ultimately positive contributions to American democracy.


Few people would say that conflict is a thing to be embraced. The usual assumption is that conflict and polarization undermine democracy. We hear paeans to civility, unity, and coming together as a nation. But conflict, or at least the threat of it, can be a powerful motivator.

If a government has no fear that the poor might one day revolt, then it will have few incentives to check the excesses of the rich. If elected leaders have no fear that they might lose the minority vote, they will have little reason to take racism as seriously as they should. If established parties have no fear that populist parties might take their place, they will have little reason to rethink their basic approach to politics. Without pressure from populist challengers, centrist parties will avoid addressing sensitive issues, instead postponing them until crisis hits. And crisis almost certainly does.

This confusion around the desirability of conflict makes it difficult to assess how well or poorly the world’s most established democracies are faring, now that nearly every one of them has been significantly affected by the populist wave (with Portugal being a notable exception). As some would have it, America, along with large chunks of Europe, is on the verge of a dictatorship from which it may never recover.

If you view the very election of Trump—to say nothing of what he’s actually done in office—as an “extinction-level event,” then alarmism is precisely what’s called for: the more, the better. But I, for one, do not believe that Trump is anything more than damaging and destructive—as bad as that is. Two or six years from now, America will emerge with considerable damage, but intact. And by then, the experience of having lived under Trump will produce other consequences, some of them positive. In fact, it’s already producing them.

Trumpism—or some variation of the populist-nationalism that has proved so compelling from Italy and Poland to Israel and India—will survive Trump. The ideas of this visceral but vague populism—obsessed with demographic change and trafficking in proposals that only four years ago would have been beyond the pale—are almost entirely unconcerned with the norms of what was, up until 2016, a somewhat narrow mainstream consensus.

Peter Pomerantsev’s book, Nothing Is True and Everything Is Possible, popularized a bleak aphorism that captured the surrealism and absurdity of living in Putin’s Russia. In the United States, though, the fact that everything might be possible, when it wasn’t before, means that the range of acceptable opinions is being broadened, whether that means democratic socialism, unabashed Catholic integralism, post-liberalism, or even something as silly as the notion that billionaires are well-suited to run for office.


As Ben Judah wrote recently, a door has been opened: “Because by embracing everything about Donald Trump, [the Right] has embraced the idea that something is terribly wrong with America, and that the country needs big, beautiful solutions for terrible, awful problems. When the Right becomes populist, embraces deficits, dunks on free trade, and rails against elites, it suddenly becomes a lot tougher for it to ridicule a populist Left that is credibly offering more.”

Where Trump told voters that he (and only he) would “make America great again,” Hillary Clinton countered by saying “America was already great.” America is already great, but the problem with making that the theme of a national campaign is that it promises only minor variations of the status quo. Clinton—and so many of the center-left and center-right candidates hoping to forestall populist challengers—offered voters stability in a time of instability. Experiencing Trump on a daily basis tends to help one appreciate the prospect of once again being bored by politics. But stability, particularly in the long run, is an overrated political good that can actually forestall the kinds of deep changes that every society needs from time to time.

Another way of viewing it, and probably the easier way, is to see Trump as an accident of history and not something to ponder too deeply. Since the results could have easily been otherwise—had, say, James Comey not issued his letter in those final, critical days—there is no particular reason to shift our view of politics or democracy. To view Trump’s election as an extinction-level event is to argue, in effect, that the solution to Trump is self-evident: his removal from office. Politics can then return to at least some degree of normalcy. If Trump, however, is a product of a political order that is fundamentally broken, then the need for radical, unusual, or at least out-of-the-mainstream proposals becomes just as necessary if and when Trump loses—or even if he had never won in the first place.

Civility and consensus are only possible in homogeneous societies with a strong, shared national identity, something that the United States and most European countries can no longer claim. In diverse societies, where citizens no longer agree on the common good, conflict and polarization are unavoidable. Like conflict, the word radical is usually used pejoratively, signifying chaos and disorder. But like conflict, radicalism isn’t necessarily bad, particularly if it allows a larger number of citizens to feel they have a stake in their own society. It also leaves open the possibility that ideas that were once considered unacceptable can be accepted. Some unacceptable ideas are unacceptable for a reason. Some, though, are not.

Today, ideas that were once considered radical and even politically suicidal, like same-sex marriage, are now so culturally pervasive that it’s hard to remember that they were once only held by a small minority. (As recently as 2009, President Barack Obama, despite his seeming private openness to gay marriage, was unwilling to endorse it publicly.) It’s precisely through radical voices that the bounds of what’s politically and socially possible expand. At one point in American history, for example, the abolition of slavery was seen as outside the bounds of what was possible or acceptable. Through Bernie Sanders’s presidential candidacy, the idea of single-payer universal health care became normalized, shifting the entire debate around health-care provision onto what many Americans would consider a more moral foundation. (Of course, many other Americans see it as an unacceptable intrusion on the part of the state.)


To find a silver lining to this disruption of political complacency is not to excuse Trump. The families torn apart at the border; those who have lost their healthcare; the communities that will be polluted by environmental disaster; or the millions of people abroad who have suffered from Trump’s unashamedly pro-dictator foreign policy would have been better off had he never run for office. But even without Trump, disruption and conflict were coming; he was merely the catalyst. This—whatever this is exactly—is a universal phenomenon, emerging in dozens of incredibly different national contexts, across varying cultures, regions, religions, and levels of economic development. It may be hard to define, but what we are seeing is nothing less (or perhaps nothing more) than a rebirth of politics, with all the conflict that that entails.

The point about radical ideas is that some of them may be good, but there’s no way to know, definitively, whether they are, until they’re debated openly and freely. And, today, that’s precisely what’s happening. That’s a good thing, and we may have Trump to (partly) thank for that.


The perennial question of whether democracy can work in the Middle East isn’t always easy to answer. Generally, it hasn’t worked. But amid civil war in Yemen, Libya, and Syria, authoritarian resurgence in Saudi Arabia and Egypt, and economic instability in Jordan, there are at least three cases that challenge the notion that it can’t happen here. Tunisia, which held its first post-revolution municipal elections in May, continues to be a (relative) bright spot. Then there are the more unlikely cases of Iraq as well as Lebanon—probably the world’s most successful failed state. All three share two related features: first, largely without controversy, they include Islamist parties in their democratic processes; and, second, they feature some degree of power-sharing.

In Lebanon, these arrangements are deeply flawed, chaotic, and responsible for entrenching sectarianism. Parliamentary seats are still apportioned by religious affiliation. At the same time, as the Carnegie Endowment’s Joseph Bahout has noted: “There are typically no winners and no vanquished emerging from crises in Lebanon … Many Lebanese seem to believe their system is the least bad option compared with neighbors.” The Lebanese writer Michael Young has argued that while each sectarian grouping is illiberal and insular, by interacting, “they tend to cancel each other out, creating spaces that allow individuals to function with relative freedom.”

Iraq, like Lebanon, saw lower turnout in its recent election. But as the Brookings Institution’s Tamara Wittes testified to Congress, both polls offered an important lesson. “If Lebanon and Iraq can pull off free elections,” she wrote, “it’s harder for strongmen in other Arab states to argue that they can’t afford the risk to stability of allowing their own peoples a choice in who governs them.”

The very presence of Islamist parties can be inherently polarizing, particularly when they represent large, powerful, and conservative constituencies. Through successive administrations, the United States has regarded too much Islamist representation—or any Islamist representation—as a risky prospect. Yet it was the George W. Bush administration that, despite its discomfort with Islamism, ironically paved the way for Islamists to take power through democratic elections in Iraq—a first in the Arab world. After its January 2005 elections, Ibrahim al-Jaafari of the Shia Islamist Dawa party assumed the prime ministership. Interestingly, Iraqi Muslim Brotherhood members served in various cabinet positions, including as ministers of higher education and planning. In Lebanon, Hezbollah—however much the United States and Saudi Arabia oppose it—has become a fixture of coalition governments. The point here isn’t that these groups are good (Hezbollah is a designated terrorist organization as well as an active participant in the Syrian regime’s mass killing of civilians), but rather that Arab democracy, in practice, often coincides with the normalization of Islamist parties.

Even in Tunisia, where Islamists aren’t yet normalized since the democratic experience is still young, there are similar takeaways. The country’s transition since the ouster of Zine El Abidine Ben Ali in 2011 offers a reminder that democracy can not only survive but produce impressive results—but only if Islamist parties are incorporated into the process. From 2011 to 2014, the Islamist Ennahda-led government and constituent assembly, in partnership with two secular parties, ushered in what the Project on Middle East Democracy called “the most progressive and democratic legal framework for civil society in the Arab World.” These included some of the strongest associational freedoms and human-rights protections in the region. Surprisingly—or perhaps unsurprisingly, depending on your perspective—these gains are in danger of being undermined under the current secular-led government.

Some, like analyst Ibrahim al-Assil, might argue that Tunisia is exceptional because Ennahda is exceptional—an Islamist party that has diluted its Islamism, shed the “Islamist” label, and reconciled itself to a secular state. In my book Islamic Exceptionalism, I argued that these shifts are more the product of an imperative to survive, a fear of repression, and a determined pragmatism than they are the result of some deep ideological epiphany.

In the case of Tunisia, the irony is that Islamists’ willingness to play nice—something that would generally seem quite positive—has contributed to a troubling trend of democratic backsliding on things like police reform, an overly securitized counterterrorism strategy, and the lack of accountability for the crimes and corruption of old regime figures. As the largest party in parliament, Ennahda potentially has considerable power to challenge Prime Minister Youssef Chahed and President Beji Caid Essebsi’s priorities. Instead, they have emphasized caution, consensus, and stability, fearing that doing otherwise might summon the old days of polarization and repression. Embracing their role as junior partner in the government, they have, in effect, gained protected status. But this also means that Tunisia is deprived of a cohesive bloc that could serve as an effective lobby for strengthening the democratic transition. The desire for compromise, unchecked, can come at a cost.

These darker undercurrents present real cause for concern. But the bottom line, at least for now, is that the lived practice of democracy can still provide a counterpoint to an authoritarian status quo that often seems unyielding and overwhelming. And in each of these cases, democracy would simply be inconceivable without Islamist participation. That, by itself, should give us pause, particularly at a time when Western democracies appear uninterested in, or even hostile to, either democracy promotion or integrating Islamists, or perhaps even both.
