
Is There Too Much Democracy in the Presidential Nominating Process?

When party elites chose presidential candidates, they did a decent job.

September 14, 2015

The political world is reeling from the increasingly respectable notion that events have spun out of control—and This Time Is Different.

Donald Trump, who violates virtually every taboo in politics, is now supported by nearly one-third of all Republicans in the latest CNN/ORC national poll. Bernie Sanders, the once little-known Vermont senator who proudly calls himself a socialist, has a narrow edge over Hillary Clinton in surveys from the opening-gun states of Iowa and New Hampshire.

By all conventional standards, Trump and Sanders, as radically different as they are, would lead their respective parties to electoral drubbings on par with Walter Mondale in 1984 (carried only Minnesota and the District of Columbia), George McGovern in 1972 (won only Massachusetts and D.C.) and Barry Goldwater in 1964 (38 percent of the popular vote).

These may still prove to be, as I believe, shallow summer flings that will vanish when voters get serious and campaigns and Super PACs launch their ad wars. Amy Walter from the non-partisan Cook Political Report makes a strong case that it is far too early to “throw out all we know … about politics … [and] the fundamentals we’ve all taken as political Gospel.”

From the breathless commentary on cable TV, you get the sense that the orthodoxies of how to run for president date back to the days when Thomas Jefferson and Aaron Burr tussled for support in the election of 1800. In truth, ordinary voters across the country have dominated the selection of presidential nominees through primaries only since the 1970s.

Yes, the New Hampshire primary has been a quadrennial tradition since 1916. But primaries, which were usually non-binding beauty contests, were decorative flourishes of democracy on a process dominated by political insiders. Presidential nominees from well before Abraham Lincoln to after John Kennedy were selected at unpredictable conventions, often after multiple ballots and intense backroom negotiations.

In 1952, Tennessee Senator Estes Kefauver, campaigning in a coonskin cap, won twelve primaries and collected 3 million votes. Illinois Governor Adlai Stevenson disdained the primaries and was awarded the Democratic nomination on the third ballot. JFK in 1960 demonstrated that voters would support a Catholic by defeating Senator Hubert Humphrey in just two heavily contested primaries—Wisconsin and West Virginia.

(Much of the political history in this column comes courtesy of Elaine Kamarck’s invaluable book, Primary Politics).

Everything changed after the tempestuous 1968 Democratic Convention when the stench of tear gas accompanied the nomination of Humphrey, then Lyndon Johnson’s loyal vice president. Antiwar liberals discovered that politics was a fixed game—party bosses had already selected one-quarter of the convention delegates in 1967 before any challenge to LBJ was launched. Just like the coronation of Louis XVI in France, the bitterly contested nomination of Humphrey proved to be the last gasp of the Ancien Régime.

Out of the ashes of the Democrats’ 1968 debacle emerged the modern presidential selection system. By 1980, 33 states picked their convention delegates in binding primaries. While the political calendar is not yet fully locked in place for 2016, the rough estimate is that about 40 states will be holding presidential primaries.

Nothing better symbolizes the triumph of primaries than Hillary Clinton’s 2008 decision not to take her fight with Barack Obama to the Democratic Convention. Ronald Reagan in 1976, Ted Kennedy in 1980 and Gary Hart in 1984 all fought for the nomination to the last gasp, even though they all had a smaller percentage of the delegates than Clinton did after the 2008 primaries. But Clinton knew that the near universal belief in popular sovereignty meant that she couldn’t use unelected Democratic super-delegates (mostly members of Congress and party officials) to go against the will of the voters expressed in the primaries.

On the surface, the way we now choose our presidential nominees seems like another important step in the mid-20th century march towards greater democracy. It appears to rank right up there with the Supreme Court’s 1962 decision in Baker v. Carr, which paved the way for one-person-one-vote, and the 1965 passage of the Voting Rights Act. Who in 2015 could possibly want to go back to the days when party bosses chose Warren Harding for president in a suite thick with cigar smoke at the Blackstone Hotel in Chicago?

Yet Harding aside, party leaders did a pretty impressive job of picking 20th century presidential nominees. As Charlie Peters points out in his 2005 book, Five Days in Philadelphia, the Republican nomination of pro-British Wendell Willkie meant that FDR went into the pre-Pearl Harbor 1940 election without having to worry about isolationist attacks from the GOP. In similar fashion, it is hard to think of a more inspiring electoral choice than the 1952 and 1956 elections that pitted Dwight Eisenhower against Adlai Stevenson.

Needless to say, Willkie, Ike and Stevenson never campaigned in party primaries. Rather than being scrutinized before their nominations by casual voters, they were, in effect, vetted by party leaders who keenly understood the political costs of picking unqualified nominees.

But, still, how could anyone possibly justify shutting out most voters from picking presidential nominees today?

Part of the problem with presidential primaries (and Trump offers a prime illustration) is that many voters see them as an opportunity to cast a risk-free protest vote. Voters are smart enough to realize that the binding choice comes in November, while an early verdict in the primaries can still be reversed before the nomination is settled. So Republicans may be using Trump as a wedge to express their anger over immigration and Democrats may be turning to Sanders as a way to signal their uneasiness with rubber-stamping a Clinton nomination.

Primaries also deprive voters of their most useful cue in making an informed decision: party identification.

When confronted with a clotted field of 17 Republican candidates, it is easy to understand why voters might gravitate to figures they know from reality television, or to candidates with a famous family name or heavy television spending behind them. This is why campaign commercials (and the people who fund them) matter more in presidential primaries than in the general election: voters are more likely to make shallow decisions based on media imagery in the early going.

Beginning in the 1970s, the press believed that it was obligated to take on the role of vetting presidential candidates now that political bosses no longer tightly controlled the nominations. Aggressive reporting in the run-up to 1988 drove Joe Biden from the Democratic race over plagiarism and destroyed Gary Hart for taking a boat called Monkey Business to Bimini with a woman who was not his wife. These may have been unfair over-reactions (Matt Bai makes this case in his book on Hart, All the Truth Is Out). But few quibbled when the media took it upon itself in recent years to demonstrate why Sarah Palin and Michele Bachmann did not fit the model of thoughtful national candidates.

But in an era when the media has become obsessed with clicks and ratings points, this traditional vetting system is breaking down. As a result, Trump is on live television twice a day bragging about being a “winner.” At the same time, Sanders’ idiosyncratic positions (like his prior support from the National Rifle Association) tend to get lost in the media blur.

These concerns may seem overwrought in a few months when the presidential campaigns return to their natural state. But at the moment it is relevant to ask: Could too much democracy during the nomination process deprive voters of serious choices in the November presidential election? And do voters ever need protecting from their own shallow decisions?

My response to both questions is “no.”

But those answers are not offered lightly, especially since the summer of 2015 has highlighted the weakness of the way we pick our presidential nominees. It is hard to feel comfortable with a poll-driven campaign dominated by Trump, with the biggest avalanche of Super PAC spending in history waiting in the wings.

Also, the rise of presidential primaries was accompanied by near-universal use of the post-Watergate campaign reforms that provided federal matching funds to underdog candidates. Now with the breakdown of campaign regulation and the economic insecurity of the news media, we have entered into a different and far more troubling era.

In the end, the best defense for the presidential primaries is the fear that otherwise political elites would bottle up legitimate protest movements. The memory of Eugene McCarthy and Bobby Kennedy fighting a rigged system in 1968 still rankles. Without Howard Dean in 2004, the Democratic primaries would not have had a candidate who fully challenged the premises of the Iraq War. In similar fashion, the modern conservative movement was shut out of Republican politics until Barry Goldwater finally bucked the GOP establishment under the old caucus system in 1964.

It is easy to say that the cure for the ills of democracy is more democracy. But right now the fear lingers that this slogan may not be nearly as true as usual in the 2016 primaries.

The views expressed are the author’s own and not necessarily those of the Brennan Center for Justice.

Walter Shapiro is an award-winning political columnist who has covered the last nine presidential campaigns. Along the way, he has worked for The Washington Post, Newsweek, Time, Esquire, USA Today and, most recently, Yahoo News. He is also a lecturer in political science at Yale University. He can be reached by email at waltershapiro@ymail.com and followed on Twitter @MrWalterShapiro.
