Our eroding norms

In late February 2017, the words “Democracy Dies in Darkness” appeared on the masthead of the Washington Post and atop every page of its website. Although Post officials said that this bleak, uncompromising motto had been under consideration for nearly a year, its presence in one of the nation’s leading newspapers just weeks after President Donald Trump’s inauguration was a clear sign of the overwhelming sense that America’s civic norms and democratic institutions face unusual danger as the second decade of the 21st century draws to a close.

This sense is understandable up to a point. To see someone violate a norm can be unnerving, because it often means that something previously unthinkable has suddenly happened. And it’s true that many previously unthinkable things have happened since Trump came on the political scene. For example, when candidate and then President Trump suggested prosecuting his political rivals, refused to say whether he would accept the results of the election, encouraged violence by his supporters, and claimed that illegal aliens somehow “rigged” elections, he violated norms. And when Trump’s opponents openly announced that they considered the president illegitimate despite his clear victory in the Electoral College, they did too. All the while, torch-wielding neo-Nazis and masked Antifa protesters have been advocating for and practicing violence against their political opponents. All of this leads to worries—deep ones—about the future of democracy.

But these worries could use some structure. If we are not clear about what we mean by norms in our democracy, we can easily misunderstand or misrepresent the risks we face and the options we have for addressing them. Generally speaking, political scientists point to four basic features essential to democratic culture, all of which involve both rules and norms. First is the belief that people are fundamentally equal and can manage their affairs freely. This is typically expressed at the nation-state level through free, fair, and competitive elections of responsive, accountable representatives. But this element of popular sovereignty expressed through elections is not the whole of democracy.

The second feature is the corresponding belief that democratic discourse calls for a certain kind of restraint, forbearance, and virtue such that significant political groupings (typically parties) can find mutual accommodations and compromises. This requires recognizing that political opponents can’t be made to go away, so that winners of elections understand that their victory is always temporary, and losers can act as a “loyal opposition” when they don’t get their way, even as they organize for their next opportunity to take power back.

The third feature is the idea that the rule of law matters: Peaceful citizens and groups of citizens receive “equal protection under the law,” as well as a package of civil rights that allows a diversity of political, religious, and artistic expressions, protects minority opinions, and provides a reasonable level of personal freedom. Political leaders, while granted special powers in extraordinary circumstances such as war or financial crises, are otherwise treated as ordinary citizens before the law and are subject to penalties should they act above it.

Fourth, and perhaps most important, democracy is a cultural habit that requires an active and engaged citizenry that values democracy itself, understands its civic responsibility, and cares about preserving democratic institutions.

Some actions of those in the Trump administration and of others in high places threaten all of these aspects of America’s civic culture and political institutions. But today’s political storms are assuredly not unique or even particularly severe relative to others that American constitutional government has weathered. Most notably, the 1860s saw a ruinous Civil War in which the Southern states rejected the results of a free and fair election, left the Union, raised their own army, and launched a war that killed hundreds of thousands of Americans on both sides—all instead of accepting a president who questioned slavery. The turn of the 20th century was brutal in its own way: An oft-intemperate populist movement gained steam, while President William McKinley was murdered amid a wave of anarchist assassinations and bombings throughout the world.

President Trump has not forcibly relocated a portion of the population based on race like Franklin Roosevelt did. He has not been shown to have engaged in a massive cover-up of malfeasance like Richard Nixon did, nor has he colluded with the Supreme Court to manipulate the outcome of a consequential case like James Buchanan did in Dred Scott v. Sandford. The horrors of government-mandated Jim Crow are in the past. And our nation has had far more dangerous anti-democratic movements than it does today. While the behavior of alt-right and Antifa groups is disturbing, their scattered, random acts of violence pale in comparison to the Weather Underground of the 1970s, not to mention the state-sanctioned lynch mobs that enforced Jim Crow for nearly a century. And, while past presidents did not have microphones and smartphone cameras recording their every move, men like Andrew Jackson and Lyndon Johnson said things every bit as bad as anything President Trump has uttered. Since standards of behavior have changed over time, generally in the direction of decreasing political violence, today’s events may be more disconcerting than similar events would have been in the past. Judged purely by the standards of history, however, it is difficult to find anything truly unprecedented about today’s threats to stability or even President Trump’s behavior.

Thus, most of the usual debates and arguments about the unprecedented nature of today’s events are not constructive. They are ways to channel legitimate worries, but they tend to misdirect those worries and mislead us about the nature of the problems we face. Despite the relative youth of the nation, the fundamental shape of American institutions and democratic culture has proven particularly and peculiarly durable relative to other modern forms of government. The primary problem—for which Trump is only a symptom—is the erosion of the institutions and democratic cultural sensibilities that have long undergirded the republic. It is this erosion, not any single action or person in office, that imperils democracy. Strengthening democracy will require efforts to reinforce both these institutions and this culture, and reacting to every presidential tweet is more of a distraction from that essential effort than a spur to it.

The Challenge of Erosion

The fundamental arrangements of American government have proven unusually resilient compared to those of any other sizeable, modern nation-state. According to the Comparative Constitutions Project, the United States has the longest-running national governing constitution in the world. Of the six G-7 democracies with written constitutions, the United States has the oldest by almost 80 years. By the single most important test of democratic governance—governments chosen through fair and free elections—the institutions established in 1789 have worked brilliantly. Two World Wars, a Civil War, depressions, recessions, and some deeply corrupt people in power have not once interrupted a line of 45 entirely peaceful transfers of power from one lawfully chosen civilian chief executive to another. All the while, a bicameral Congress has met, debated laws, and sent its members before voters to face generally free, fair, and competitive elections.

This unusual endurance of our institutions may be best explained by the features that our democracy does not share with most others. Every democracy has parties and elections and open arenas for decision-making. Nearly all countries with written constitutions, even dictatorships like Venezuela and North Korea, have on-paper guarantees of freedom analogous to the Bill of Rights. But there are three unusual features about the institutional structure of the United States government that particularly stand out as the keys to its endurance: a legislature that truly co-governs, a fully independent, life-tenured judiciary with the power to create precedents with the force of law, and a federal system that distributes important powers between the state and national governments. The first two are unique to America, and the third is unusual in the world. All three have evolved in the modern era in ways that have weakened their power and legitimacy.

On the rare occasions that Americans notice the uniqueness of our federal legislative branch, it’s usually to complain about it. But in fact, a Congress that is not only separate from the executive but, under the Constitution, is the chief institution responsible for governing the country, stands out as a source of our system’s strength. This is achieved in two ways: The Constitution grants a much longer list of enumerated powers to Congress than to the executive branch, and it imbues the legislature alone with power over all fiscal decisions and with the authority to introduce legislation. Unlike their peers in most other countries, Americans speak of presidential “administrations” but not presidential “governments.” This is because the government of the United States includes Congress as much as it does the president and his appointees.

Congress is the only legislature in the world that can design and advance complex legislation without help, or sometimes even input, from the executive. Congress also has extensive power to oversee and investigate the executive. While nearly all democratically elected national legislatures do have some oversight authority, no other country maintains a structure quite like Congress’s autonomous investigative arm, the Government Accountability Office, which is entirely independent of the executive. Nor does any other country maintain the extensive quantitative and qualitative analysis capacities possessed by the Congressional Budget Office and Congressional Research Service.

Indeed, for much of our history, Congress dominated government to the near-exclusion of the presidency. While a handful of chief executives like Andrew Jackson and Abraham Lincoln enacted major changes in the way American government worked, through the early 20th century, most legislation that steered the direction of the country was a product of Congress. Accordingly, chief executives—even powerful ones—often saw major plans thwarted by Congress. In the last century alone, Congress has rejected major presidential priorities ranging from climate-change legislation favored by Barack Obama to the League of Nations upon which Woodrow Wilson staked his career. The legislative branch also conducted extensive investigations into several presidents, including Bill Clinton, Ronald Reagan, and Nixon, eventually forcing Nixon from office.

When controlled by parties opposed to the president, Congress has often forced the executive branch to implement legislation with which it disagreed. In the past few decades, Reagan vetoed a comprehensive Clean Water Act that nevertheless was implemented by his administration after Congress overrode his veto, while Clinton signed and implemented welfare-reform legislation written by a Republican Congress. Of course, Congress has not always acted to defend democracy: While seven presidents from both parties worked for a federal anti-lynching law over several decades, the Senate continually blocked such efforts, thereby enabling what was arguably the worst single cruelty of Jim Crow. Still, the point is not that Congress has always been the pre-eminent guardian of democracy, but rather that the American system has been distinct for putting Congress first.

Since the Progressive era, however, the independent power of Congress has ebbed. As the executive branch grew larger (first in staff and more recently through contractors) and more powerful, Congress first reacted (as one might expect) by strengthening its own oversight capacities. But in trying to compete with the executive, Congress gradually lost some of its distinct character as a powerful legislature, and this in time led to declining interest even in oversight.

Furthermore, the increasingly polarized and partisan nature of political races has resulted in the growth of communications staffs to the detriment of everything else; according to the Congressional Management Foundation, members spend only about a third of their time doing actual legislative work. By most measures, simple lawmaking productivity declined significantly as well: The last three completed Congresses were the least productive of the modern era, and as of this writing, the current 115th Congress looks likely to produce even less legislation than its predecessors.

As our colleague Kevin Kosar at the R Street Institute has detailed in these pages, Congress has also ceded nearly all substantive war-making power to the executive and has passed bill after bill ceding enormous additional power to the agencies of the executive branch. The Hudson Institute’s Christopher DeMuth put it candidly in these pages: “[T]he agencies make the hard policy choices. They are the lawmakers.” The sheer complexity and range of services expected from modern government make it almost inevitable that some sort of administrative state will, and probably must, exist. But Congress has, time and again, ceded far too much power to the executive and its bureaucracy, while simultaneously restricting its own ability to even monitor executive activity.

Moreover, when recent presidents have disapproved of the actions Congress has taken, they have stretched and expanded the power of executive orders and signing statements so far that these instruments now function much the way laws enacted by Congress do. A lawmaking body that does not enact many laws, will not monitor the executive branch, and cannot guard its own power from a grasping executive does not fulfill its function.

Such weakening has been exacerbated further by a significant change in the way the judiciary exercises its power. Our government’s third branch plays a uniquely powerful role compared to judiciaries in other democracies. First, because the United States has a common-law system—shared with the U.K. and many of its other former colonies but not with the rest of the world—judges effectively make law by issuing opinions. So while many countries have some form of judicial review of legislative acts, the American tradition of judicial review is exceptionally robust and enduring.

American judges are also unusually independent because they enjoy a unique system of life tenure, and, by tradition, are only forced from office when they commit blatantly criminal or corrupt acts. Indeed, in the entire history of the republic, only eight out of roughly 3,600 federal judges have been removed from office by the Senate following impeachment in the House. While a few countries with professionalized civil-service judiciaries have something similar to life tenure for judges, no other country gives judges both the effective ability to make laws and life tenure.

Moreover, part of the power of the judiciary lies in a healthy and long-lasting cultural respect for the branch’s authority. For example, in 2016, when then-candidate Trump attacked federal judge Gonzalo Curiel for being biased because of his Mexican heritage, House speaker Paul Ryan directly and publicly rebuked him, not only because it was blatantly racist but also because it attacked the integrity of the independent judiciary.

In many cases, the judiciary has reined in anti-democratic activities carried out by other branches of government. For example, the judicial decision in Brown v. Board of Education played a leading role in accelerating the civil-rights movement that eventually ended Jim Crow, while measures that protected political speech and the coverage of politics in cases like Brandenburg v. Ohio and National Socialist Party of America v. Village of Skokie also resulted from judicial decisions.

Like the legislative branch, however, the judiciary has not always used its powers to uphold democracy. At times, it has even trampled it. But despite the courts’ sometimes coming to the wrong conclusion, their power and independence to check and limit the executive and legislative branches are highly unusual in world-historical terms and contribute to the longevity of the American system.

In some ways, the modern judiciary has grown more powerful than ever before. It has sectioned off important but ill-defined powers for its own use and has become the major decision-maker in many important areas, from the outcome of elections to the debate over same-sex marriage. Judicial supremacy over the meaning of the Constitution has become an implicit assumption of nearly everyone in both parties. This is why judicial nominations—once a more-or-less rote process in which the Senate simply scrutinized the qualifications and behavior of a nominee and then granted approval—have become a major political issue.

That said, even as it has become more powerful in many ways, the judiciary has weakened its own ability to protect key aspects of the constitutional order. Through its heavy use of “rational-basis” testing of government action and “Chevron deference” to executive authority, it has systematically weakened itself (and also hurt the legislature’s authority). Likewise, the increased politicization of judicial nominations has not only kept seats empty but has also kept many of the nation’s best and brightest legal minds off the bench.

The third feature contributing to the government’s endurance is federalism. Granted, this feature is not as distinct to the United States as are our powerful legislature and judiciary. Canada and Switzerland, for instance, both devolve more authority to regional and local subdivisions than the United States does. But in conjunction with the interlocking limits on federal power, American federalism has been an essential feature of our system’s core democratic ethic.

While much attention is paid to the ways states allow for programmatic experimentation, their independent authority has also enabled citizens to exercise democratic rights simply by moving to another state. For example, women could vote in Wyoming in 1869 and in several other western states by the mid-1890s, even though full equal suffrage for women did not become a part of the Constitution until 1920. Likewise, while far from equal citizens in every respect, African-Americans who moved north during the First Great Migration could and did vote, hold office, enter high-status occupations, and launch outspoken publications that drew attention to Jim Crow’s injustices — all rights that were denied to them in the South. Reviled and persecuted for their distinctive faith, Mormon pioneers also moved west and founded the state of Utah.

As Lee Drutman of New America has argued, federalism is also desirable partly because it lowers the stakes in other areas of politics. If states decide more, what happens in Washington and in presidential election years simply matters less.

Of course, federalism itself has often run against democratic institutions. Most prominently, the Jim Crow system—the greatest sustained American attack on the democratic principle of respect for minority rights—was almost entirely a product of states’ laws, and it ended only when the federal courts intervened to overturn them. There is a strong possibility, however, that things would be even worse without federalism: For example, rather than being mostly confined to a few states, Jim Crow might well have become formal public policy everywhere.

But American federalism, too, has been eroding in our time. As the federal government has increasingly used public dollars to entice the states to cede authority, and as many federal programs have come to be jointly run by Washington and the states, the distinctions between levels of government have broken down, and the states have grown more dependent and less willful.

Congress, the judiciary, and American federalism are hardly unadulterated protectors of democracy, but when considered properly, each has done more good than harm for democratic institutions writ large. And yet, each has eroded in ways more concrete and manageable than the familiar “failing institutions” label suggests. Indeed, the system has worked because each of its elements has its own powers and its own reasons to protect democratic ideals, norms, and institutions—often against the encroachments of the others.

Cultural Erosion

America’s democracy depends not only on constitutional checks, legal protections, and civic engagement, but also on its democratic culture: a willingness to trust, respect, and compromise with people with whom one has significant, ongoing disagreements about important matters of public policy, while also remaining committed to the idea of democracy itself. These norms, which are vital to all healthy democracies, have lost ground in the United States. In a 2017 Pew survey that measured global attitudes toward democracy, just 46% of Americans were satisfied with the way democracy is working in their country, compared with 70% of Canadians and 73% of Germans (although citizens of some other democracies, like France and Mexico, hold the institutions of democracy in even worse regard). Even more concerning, 53% of Americans would support or be willing to consider non-democratic options for government.

Two developments have contributed to the decline of political culture in the United States: the deterioration of its political parties and the hardening of American attitudes toward members of the “other team.” In a two-party system, compromises that allow for governance have regularly taken place both within political parties and between them. The U.S. party system endures because it helps a government of divided powers act and decide on behalf of a wide range of interests, beliefs, and faiths—all of which are spread over a vast country. Even when a party has a nominal majority, it will often need to reconcile its own various wings and attract at least a few members of the other side.

Between the end of the Civil War and the early 1960s, this process was often quite visible, as neither party had a single, distinct ideology. In the 20th century, Democrats were somewhat more sympathetic to spending, labor groups, and big government, and Republicans were somewhat more pro-business and inclined toward limited government. But these were only stereotypes, as rock-ribbed conservatives voted a straight Democratic ticket in parts of the South, while some liberals were proud Republicans in places like New York and New England.

Members of Congress had a variety of reasons to vote as they did: Sometimes they voted for ideological reasons, sometimes for narrow district interests (a practice made more common by the now mostly eliminated practice of earmarking), sometimes as an act of “logrolling” (in exchange for votes on other issues), and sometimes for corrupt personal gain. Compromises within political parties—liberal northeastern Republicans’ support of tax reduction policies in return for conservative western Republicans’ acquiescence to environmental legislation, for instance—were frequent.

Moreover, the relatively free-wheeling campaign-finance environment of the pre-Watergate era meant that parties were the main conduits for political funds and that party bosses could often discipline wayward and overly ideological members. Winning elections and holding a majority for its own sake would often take precedence over purely ideological goals. Broad parties needed to compromise internally to remain in power, and they disintegrated or splintered when they did not—something that may well be happening today. Further, party leaders like Sam Rayburn, Wilbur Mills, and Tip O’Neill relished the assertion of congressional prerogatives over presidents, even when they belonged to the same party. The work of political scientists Keith Poole and Howard Rosenthal tells the story of this more nuanced view of American politics; their charts of Congress’s ideology in the 1950s show many Republicans to the left of Democrats, and vice-versa.

This pattern has changed dramatically in our time, for two primary reasons. First, in the wake of the civil-rights movement, the parties gradually realigned and became far less ideologically diverse. While only a trivial percentage of all officeholders actually switched parties, it’s only a slight oversimplification to say that the traditional constituent groups of the conservative, southern wing of the Democratic Party became Republicans while the liberal but business-oriented Republicans, common particularly in the northeast, became Democrats. This deprived both parties of internal diversity and obviated the need to compromise internally, which created a self-intensifying cycle of ideological sorting. Today, every single Republican in Congress is more conservative than the most conservative Democrat. This trend of divergence is further exacerbated by generally falling turnouts in primary elections. Although a few unusual years (2008 and 2016) did see meaningfully higher-than-average turnout, the overall trend has been downward in years with a presidential contest and sharply downward for off-presidential years. Party nominees are thus decided by the most ideologically committed voters.

Second, from the Progressive era on, reforms and social changes have worked to personalize politics and empower individual members at the expense of party leaders. Furthermore, as Jonathan Rauch and Raymond La Raja have shown, activist groups have supplanted parties in selecting candidates and have gravitated toward ideologically extreme individuals with little governing experience. At the same time, campaign-finance laws implicitly encourage candidates to raise money and staff campaigns without relying on the formal party system. While this may have reduced some types of corruption, such laws have also made individual members less dependent on party hierarchies and more dependent on narrower political networks and interest groups—often not even in their state or district—to sustain their careers. As political scientist Michael Barber has observed, out-of-state donors to campaigns are the most ideologically extreme of all contributors. Further, many of the often-moderate, broad-based institutions that traditionally supported the parties, such as major labor unions, farm bureaus, and local chambers of commerce, have weakened. Meanwhile, the institutions that have gained strength—including trade associations, PACs, and campaign bundlers—have tended to speak for relatively narrow sectors of the economy and, accordingly, to support individual members rather than a particular party.

Districts are also becoming less diverse for several reasons. First, and most obvious, politicians who draw the districts in most states have used computer programs to build districts designed to protect whatever party is in power. Second, and probably more important, residential segregation by income and social class—a trend convincingly documented by social scientists ranging from the libertarian-conservative Charles Murray to the liberal Robert Putnam—has resulted in a high level of “self-gerrymandering.” As a result, no plausible electoral map can make a Republican candidate for Congress competitive in the city of San Francisco or provide a way for Democrats to win heavily white, rural areas in the South. Finally, statutory and court mandates designed to pay attention to the racial makeup of districts have made Congress more reflective of the country’s ethnic composition, but have also resulted in a significant number of districts where candidates of one party have no chance at all.

Further, partly because of party weakness, extremism, and the increasing unwillingness of their members to compromise, Americans are simply more hostile to members of the other political party than they have been in the past, and they are generally more disengaged from civic life. Polls taken in 1960 by scholars Gabriel Almond and Sidney Verba showed that almost nobody (only about 4% of the population) could seriously object to their son or daughter marrying outside of their political party. By contrast, a YouGov poll taken in 2010 showed that between a third and a half of the population would feel troubled by an inter-party marriage. The Pew survey of party attitudes likewise revealed that, beginning in 2016, most people engaged with one party—whether as a party member or a party-leaning independent—held very unfavorable views of the other party. Data from the American National Election Studies on people’s policy positions and group identification since the early 1970s show that the failure to compromise has less to do with opposing sides disagreeing on policy outcomes than with their refusal to support outgroup members. Most alarming, a stunning 70% of politically engaged Democrats told pollsters they were “afraid” of Republicans, and nearly as many Republicans felt the same way about Democrats. This makes the norm of loyal opposition far harder to maintain.

Such hostility—toward not only the opposing party itself, but the people affiliated with that party—diminishes Americans’ willingness to follow the laws enacted by the other party, and certainly to trust its words and intentions. Between the summers of 2016 and 2017, as the party in the White House changed, the percentage of Democrats who said that life was getting better for people like them over the past 50 years declined from 52% to 35%, while the percentage of Republicans who said the same thing more than doubled, from 18% to 44%. Since there were no major changes in socioeconomic conditions over this period, it is unlikely that this change in opinion resulted from any factor other than the party in the White House.

That is to say, for a non-trivial percentage of Americans, their entire views of history are shaped by whether or not their party is in power—rather than their everyday sense of well-being, national moral standards, or macroeconomic factors. To tie one’s well-being to partisanship also raises the stakes in every political matter. After all, if people truly believe that life will get significantly worse if the other party wins, then extraordinary and even unprecedented measures are justified and perhaps necessary to achieve political victories.

Rebuilding American Democracy

The decline of democratic institutions and culture did not happen quickly and cannot be reversed quickly. While a degree of institutional change can be accomplished through changes in laws and regulations, cultural change is often beyond the power of public policy. And, of course, any solution that essentially involves going back in time is highly suspect.

The stronger and more effective Congresses of the 1950s and 1960s were full of alcoholics, men on the take (they included almost no women), and corrupt backroom deal-making. A more constitutionally robust judiciary made some disastrously bad decisions about civil rights, economics, and much else. The powerful, diverse political parties of the early 20th century were also far more corrupt than today’s and, in the case of the southern wing of the Democratic Party, deeply bigoted as well. Some of the “social consensus” and apparent political amity of the 1940s and much of the 1950s was deeply harmful on issues of race, in particular. Even if it were possible to recreate the Eisenhower era, it would not be desirable. Instead, we should consider what limited and focused steps could help Congress, the judiciary, federalism, and the culture of democracy to work better, and thereby repair American democracy in our time.

For starters, Congress should strive to restore the ideal of congressional government. To do this, the first branch must not be shy about enhancing its powers and perquisites, and must be able to stand up to the executive branch in order to assume its rightful role in governing the country. As Kevin Kosar has detailed in this journal, Congress needs bigger, more experienced staffs, wider capacity to review regulations before they go into force, and greater efforts to reclaim both its war-making powers and its authority to approve, through up-or-down votes, the most economically significant regulations written by the executive. It can give itself all of these powers by a simple majority vote, and it should.

The changes in the nature of the judiciary have been more insidious and gradual, so strengthening it will be even harder than strengthening the legislature; it cannot be accomplished by spending money or changing policies alone. The weakness of the judiciary (in some, if not all, aspects of its influence) stems in part from the weakness of Congress, which makes it difficult to address unless and until Congress itself works better. Better laws will make for a stronger judiciary: A stronger Congress that writes laws in more specific terms and punts less to the executive branch would leave far less for the courts to decide in the first place. It may be impractical to eliminate Chevron deference and the reliance on rational-basis testing outright without involving judges in the second-guessing of policy decisions (which would be politically disastrous). But even a straightforward reinforcement of Congress’s power to oversee the regulatory state could do much to rein in the potential pitfalls of Chevron deference in particular. Judges should not be afraid to take a much greater role in checking both the executive and legislative branches of government when their actions violate statutes or constitutions.

The best way to strengthen federalism, meanwhile, is actually to use it: namely, to devolve more power and authority to the states. That said, a return to the ultra-minimal federal government of the founders is impossible, and probably not desirable anyway. The worst of federalism—a denial of rights by particular states to significant portions of their populations—has an ugly history. In view of this, a better strategy is one we refer to as “democracy-first federalism,” which would focus any federal interference in the prerogatives of states on efforts to enforce democratic constitutional rights, along with a corresponding retreat of federal power on most other fronts.

Democracy-first federalism would involve a greater, and in some ways broader, federal emphasis on supporting the essential political rights of participation and freedom. Such a shift in focus would offer things to both the left and the right. For example, if states restrict voting rights for certain groups, there is no reason for the federal government to avoid interfering. Likewise, existing anti-discrimination laws would also receive enhanced enforcement. After all, if it actually seeks to uphold the Constitution, the federal government should be vigilant against violations of constitutional rights by local police and other agents of the state. All of these things would almost certainly be welcomed by those on the left. On the other hand, there are also aspects of this approach that would appeal to groups on the right, as the federal government would be empowered to stand up for the rights of religious free exercise and to intervene against local laws and policies, on university campuses and elsewhere, that punish protected political speech.

Moreover, a responsible federal government is a self-regulating one, which means it must engage in willful restraint in key policy areas. To this end, much of the current interference with state and local control regarding issues like primary and secondary education, housing, health care, and mundane law enforcement should end, both because it exceeds the intended constitutional scope of the federal government and because there is little evidence that more federal control has resulted in better outcomes anyway. The federal government might even become more assertive in areas like scientific research and homeland security where it clearly has a dominant role.

States, however, will continue to be short on the money needed to carry out some of these functions locally, and, relatively speaking, the federal government is better at raising money than the states. This is because it has a more reliable revenue stream than smaller units of government, spends a smaller percentage of revenue to collect taxes, is not constrained by balanced-budget requirements, and issues debt at lower rates. But a superior ability to raise money does not automatically create a better ability to spend it. Urgent issues being dealt with by states and localities, such as shortages in housing, inadequate access to transportation, and subpar skills among workers, cannot be solved by Washington officeholders’ attempts to deduce the appropriate distribution of local budgets. On the contrary, these challenges are best met by people close to the problem, who can innovate more quickly as the situation demands. Congress might consider creating a bigger version of the Nixon-era revenue-sharing program, which gave no-strings-attached but carefully audited and publicly disclosed federal grants to local entities, in order to better fund localized problem-solving while maintaining revenue stability.

While deliberate acts of public policy can help rebuild institutions like Congress and the courts, fixing the parties is much more difficult. The best way to start is to make the parties more powerful, beginning with the institutions that engage and involve the most people: the grassroots mechanisms of state and local parties. A 2016 Brookings Institution report by La Raja, Rauch, and Samuel VanSant Stoddard proposes giving state political parties more freedom to raise and spend money and recruit candidates, as well as establishing a system of state-level tax credits for political-party donations (like the one that currently exists in Virginia). Broadening the base and participation of both parties would also make sense. A repeal of “sore loser” rules, so that primary losers can (with sufficient support) appear on general-election ballots, would help secure a more diverse group of candidates. Although far from a cure-all given the level of residential segregation by ideology and the statutory mandates meant to ensure racially diverse representation, efforts to encourage and expand nonpartisan redistricting commissions could also help.

But electoral reforms like these will not solve the deeper problems of culture and politics or the behavior of politicians. On that front, simple attention to democratic norms—loyal opposition, good-faith bargaining, and respect for the civil liberties of opponents—could go a long way, as would even a few politicians leading by example. Small gestures of the type taken by conservative Republican senator Ben Sasse and progressive Democratic senator Ron Wyden can make a big difference. Efforts to encourage better fact-based arguments, such as those sponsored by David Blankenhorn of Better Angels and Eric Liu of the Aspen Institute, also hold promise. Efforts by social-media platforms to review, note, and de-emphasize “fake news” also deserve expansion and plaudits. Culture, however, cannot change overnight and will probably not change through any central authority or law. Instead, we must hope that efforts to rebuild institutions will, in time, heal our democratic culture.

President Trump has shown an unusual disregard for democratic norms, and his actions have weakened American democratic culture. But his election and actions in office are neither the cause nor necessarily even the most important symptoms of the decline of American democracy. On the contrary, our democracy has eroded because the institutions and culture underlying it have eroded. Rebuilding those institutions and that culture will allow the nation to strengthen its long and proud democratic tradition and restore public faith in democracy itself.
