A new paper from the National Bureau of Economic Research has confirmed an important but underappreciated fact of American history: in the 20th century, immigration shifted our politics permanently to the left. “European immigrants brought with them their preferences for the welfare state,” authors Paola Giuliano and Marco Tabellini argue. “This had a long-lasting effect on the ideology of U.S. born individuals.”
Long-lasting is right. The paper found that counties with high levels of immigration between 1900 and 1930 were more likely to be liberal today, nine decades later. Even controlling for factors like industrialization and urbanization, residents of these counties were more likely to favor redistribution, oppose spending cuts, and identify with the Democratic party.
This is old news to any student of history. The Democrats have been the immigrant party since before the Irish potato famine. The real puzzle is why. Why do immigrants so consistently favor the left, and why does that effect persist decades later, after they and their children have Americanized in every other respect from language to education to employment?
The sheer longevity of the effect refutes the most obvious theory, that immigrants are poor and poor people favor redistribution. In 1952, most American Catholics were immigrant-descended but far removed from any firsthand memory of tenement life, and Democrats still outnumbered Republicans among Catholics three to one. Today, South Asian immigrants are wealthier and more educated than natives on average, and they favor Democrats as lopsidedly as Hispanic immigrants do.
Giuliano and Tabellini hypothesize that immigrants import political preferences from their home countries. That theory has a long pedigree going back to Thomas Jefferson, who worried that immigrants from European monarchies “will bring with them the principles of the governments they leave, imbibed in their early youth… These principles, with their language, they will transmit to their children.”
But it doesn’t capture the whole picture. Giuliano and Tabellini cite the New Deal as the banner example of their thesis that more immigration means more redistribution, for obvious reasons. It was a massive episode of redistribution, and it happened when immigrant voting power was at its peak just after the Immigration Act of 1924. Their instincts are more right than they know. The closer we look at the New Deal, the more we see just how deeply entangled immigration is with the growth of the welfare state, in ways that go beyond demographics.
The groundwork for the New Deal was laid by two different political movements, both of which owed their existence to immigration: patronage and progressivism. The former was mainly an immigrant invention. Before 1850, patronage in America mostly meant replacing Whig postal clerks with Jackson men and vice versa. It was the Irish bosses of the big city machines that raised patronage from a tool into a system. Tammany Hall pioneered (in America at least) the mass distribution of favors and largesse as a basis for long-term political power.
The Progressives, by contrast, were generally not immigrants. Quite the opposite—and that is the point. Progressivism was a reaction to immigration by old stock Americans. The appalling conditions in slums and sweatshops led the progressives to favor a more active government, and they were more willing to tolerate state intrusion into new facets of life because their proposed beneficiaries were immigrants who, in their eyes, were powerless to help themselves. Jane Addams felt comfortable micromanaging the most intimate habits of Hull House residents because the ethnic difference between them made it easier for her to be patronizing.
Franklin Roosevelt had ample opportunity to observe both these traditions as governor of New York, and when he got to the White House he drew on both to staff his administration. It was not just that immigrant voters were a crucial part of the New Deal electoral coalition, as the NBER paper says. It was that the New Deal was invented and implemented by people whose politics were shaped by immigration, from Tammany veterans like Jim Farley to longtime progressives like Frances Perkins.
What lessons for today can be drawn from Giuliano and Tabellini’s paper? The first is that immigrant attachment to the left has very little to do with anything the opposition party does. Immigrants and their descendants consistently favor Democrats, regardless of whether the GOP candidate has courted them or spent his campaign railing against popery. Both Goldwater and Nixon believed that Catholic voters were natural conservatives, and both were snubbed by Catholics on election day despite their best efforts at outreach.
The second lesson is that immigration’s effect on domestic politics is not talked about enough. Extensive research has been done into how immigrants affect American labor markets, wages, culture, and cuisine, but “to the best of our knowledge,” Giuliano and Tabellini write, “we are the first to systematically document a similar impact on economic preferences and on political ideology.”
It’s no wonder that the topic is under-researched, since it raises doubts about the partisan political advantage that Democrats derive from immigration. The NBER paper suggests that immigrant bias in favor of Democrats has historically been due to factors that preceded their arrival on American shores. If that’s the case, then the problem isn’t anything Republicans have done or failed to do. The liberal advantage is simply built in.
California today is what New York State was a century ago: a portent of the future. Many assume that, as immigrants and their children assimilate into American culture in language and other externals, their politics, too, will come to resemble the American average. Giuliano and Tabellini’s paper suggests that this is wrong. They found that Ellis Island era immigration was still boosting Democrats a century later. If California’s Democratic advantage is similarly durable, it will remain a one-party state for a long time no matter what Republicans do, and so will any other state that follows California’s demographic path.
At a campaign rally in Madison Square Garden in 1936, Franklin Roosevelt told the crowd that opponents of the New Deal were “already aliens to the spirit of American democracy.” Doubters should get out, he thundered. “Let them emigrate and try their lot under some foreign flag in which they have more confidence.” Ironically, the same factors that brought Roosevelt victory have ended up making his rhetorical flourish come true. After a century of immigration pushing America’s political center of gravity to the left, it is big-government advocates who dominate the political spectrum and skeptics of the welfare state who are aliens in their own country.
The post How Early Immigration Shifted Our Politics Permanently To The Left appeared first on The American Conservative.
COVID-19 has ravaged many industries, forcing them into dramatic reinventions. Now, as we gradually re-open, American businesses that have successfully adapted to the restrictions are better prepared to weather the new circumstances. The $150 billion cruise industry, which is heavily dependent on a middle-class customer base, has paradoxically emerged as one of the winners in this new economy, as 2021 bookings surge.
Some 30 million people went on a cruise in 2019. Yet as early as March, analysts were ringing the death knell for ocean liner travel after 700 individuals contracted COVID on the Diamond Princess and an additional 25 ships reported cases. Dr. Anthony Fauci of the National Institutes of Health warned the public in March not to “get on a cruise ship” during a Meet the Press appearance. Surgeon General Jerome Adams on CNN that same month advised older Americans and those with medical conditions to “rethink taking a cruise.”
The industry suffered another black eye in mid-May when reports surfaced that 100,000 crew members were stranded at sea, some without pay, due to the complicated legal guidelines about releasing and repatriating individuals from ships with COVID occurrences.
While the Centers for Disease Control and Prevention (CDC) has docked cruise travel until at least late July, the top three cruise lines, Royal Caribbean, Carnival, and Norwegian Cruise Line, stated that although bookings for the balance of 2020 are down, and some itineraries have been canceled, bookings for 2021 and 2022 are at parity with historical levels. The volume of advance purchases is especially encouraging. As Mark Kempa, Norwegian’s chief financial officer commented in a recent earnings call, “we have taken a significant amount of new cash…during a period where we had a horrific news flow and we had essentially zero marketing in the market.”
Some analysts say that many are comfortable booking cruises because the cruise companies have loosened their cancellation policies to allow straight refunds. Others offer generous 125 percent credit packages for cancellations.
The cruise industry’s amazing resiliency may seem counterintuitive but it’s not all that surprising when you consider its mass appeal and ability—proven through history—to reinvent itself.
Today, the cruise business caters to all age groups, socio-economic levels, and lifestyles, as well as a plethora of niche special interests. However, this was not always the case. From the late 1800s into the turn of the century, the large steamship lines were predominantly used to transport millions of immigrants to the United States and a handful of wealthy Americans and Europeans on transatlantic voyages.
The most famous of these ships was the ill-fated RMS Titanic, which on April 14, 1912 struck an iceberg and sank within three hours, killing 1,500 of its 2,240 passengers and crew. The 1920s brought new U.S. immigration laws, which slowed the flow of immigrants and eroded the shipping lines’ profits. Consequently, the industry found a new revenue stream by packaging and promoting ocean steam liners as floating luxury hotels with first-class dining in opulent settings. The SS United States, which in 1952 crossed the Atlantic in a record-setting three days, 10 hours, and 40 minutes, became the ship of choice.
While ocean travel between the 1920s and 1960s was marketed as a vacation option for the middle and upper classes, most passengers were affluent. This all changed in the late 1960s with the rise of commercial air travel. A three-day cruise to Europe could not compete with an eight-hour flight. Suddenly, the industry was forced not only to redefine ocean liner travel but to broaden its customer base. Cruises became more about the journey than the destination, with a greater focus on the shipboard experience and the expansion of amenities and entertainment.
The industry also expanded value propositions and customized prices to appeal to middle-class American families, young marrieds, retired couples, and solo passengers. The paradigm shift started taking place in 1966 with the founding of Norwegian Cruise Line by Knut Kloster, a Norwegian-born shipping magnate, and his partner, Israel-born businessman Ted Arison. Norwegian positioned cruising as a luxury vacation that was a “good value for the money.” Arison ultimately parted company with Kloster in 1972 to establish Carnival Cruise Line, which built a legacy of catering to the middle class.
Popular culture took note of the cruise industry’s changing customer base. In 1969, Paul Gallico published The Poseidon Adventure, a novel about a fictional ocean liner, the S.S. Poseidon, which capsizes when it is overturned by a 90-foot wave touched off by an underwater earthquake. The ship, a single-class combination cargo-cruise vessel, is reflective of the industry’s then-recent courtship of everyday Americans.
Since the fictional Poseidon was making the lion’s share of its revenue from cargo shipment, it was able to discount pricing so that a luxury one-month cruise would be accessible. The passenger list was reflective of this sea change. Gallico’s book was adapted for the screen as Irwin Allen’s 1972 film The Poseidon Adventure, featuring college all-American football player-turned-minister Dr. Frank Scott (Gene Hackman), New York City delicatessen owner Manny Rosen and his wife Belle (Jack Albertson and Shelley Winters), New York City police officer Mike Rogo and his former actress wife Linda (Ernest Borgnine and Stella Stevens), and the Michigan-based Shelby family’s 17-year-old daughter Susan (Pamela Sue Martin).
In the late 1970s, the cruise industry was further buoyed with the advent of a new television series The Love Boat, which ran from 1977 to 1986. The top-rated Aaron Spelling series, which focused on the adventures of the fictional passengers and crew of a real ship, the original Pacific Princess, has been credited with mass-marketing the cruise vacation as an affordable luxury for main street America. As Bob Dickinson, a longtime Carnival Cruise Lines executive, said in an interview, “The Love Boat was the tipping point, the fulcrum that transformed the entire cruise industry.”
The series, which showcased weekly guest stars from old and new Hollywood, not only made cruising accessible for the middle class, it also glamorized sea employment.
Forty-plus years later, the cruise industry, with 278 ships in operation, a payroll of $50.24 billion, and 1,177,000 global employees, offers enough diverse experiences to satisfy virtually every segment of the population. Or as Jack Jones’ iconic theme claimed, “there’s something for everyone.” The value-based family demographic continues to be critical, with 32 percent of cruisers with children under 18 surveyed saying that they are likely to bring their children with them on vacation, versus 25 percent for non-cruisers. Disney’s Magic (1998) and its sister ships were launched to capitalize on the family market by offering child-centric amenities along with programs for their parents.
Like other industries, the cruise business has been forced to implement new guidelines to service a post-COVID world. Norwegian is the first major cruise line to announce its new health protocols, which include an on-board health officer, pre-boarding health screenings, staggered embarkation and check-in procedures, and the installation of medical-grade air filters. On-ship activities will also be limited to smaller groups for social distancing. And the legendary buffet will be full-service to minimize crowding. Furthermore, as ships are projected to sail initially at 60 percent capacity, new revenue streams will be created.
The cruise industry is poised for a successful return to the waters this summer given its historical resiliency and loyal customer base. It turns out not even COVID can overcome the lure of the sea (and an affordable luxury vacation). As President John F. Kennedy famously said, “We are tied to the ocean. And when we go back to the sea, whether it is to sail or to watch it we are going back from whence we came.”
In a new official strategy of confrontation against the People’s Republic of China, the Trump administration has announced its intention “to compel Beijing to cease or reduce actions harmful to the United States’ vital, national interests and those of our allies and partners.”
Explains the strategy paper:
Given Beijing’s increasing use of economic leverage to extract political concessions from or exact retribution against other countries, the United States judges that Beijing will attempt to convert [One Belt One Road] projects into undue political influence and military access. Beijing uses a combination of threat and inducement to pressure governments, elites, corporations, think tanks, and others—often in an opaque manner—to toe the CCP line and censor free expression. Beijing has restricted trade and tourism with Australia, Canada, South Korea, Japan, Norway, the Philippines, and others, and has detained Canadian citizens, in an effort to interfere in these countries’ internal political and judicial processes.
All true. But which government pioneered the use of economic resources to reward and punish other nations? Hint: it was not China.
The U.S. has long used foreign aid as walking around money for the secretary of state. Countries with American bases have always gotten more cash, as have nations that have made peace with American allies, such as Egypt and Jordan.
In contrast, governments that have crossed Washington have lost money. In 1956, the Eisenhower administration punished Egypt’s Nasser government by revoking its offer to finance the Aswan High Dam. In 1990, Secretary of State James Baker told Yemen’s UN ambassador, “that was the most expensive no vote you ever cast,” after he voted against the UN Security Council resolution authorizing war against Iraq.
Washington has also used trade barriers to reward and punish other states. The U.S. embargoed Cuba six decades ago, and has since applied secondary sanctions that have hit other nations as well. The use of financial sanctions has become Washington’s modus operandi.
Indeed, the Trump administration has dramatically escalated economic warfare, applying “maximum pressure” to Iran, North Korea, and Venezuela, hitting Cuba, Russia, and Syria with multiple new penalties, threatening to sanction Europeans if they try to evade the restrictions on Iran, and targeting Germany’s Nord Stream 2 natural gas pipeline to Russia. The White House treats sanctions as the default response to governments that resist Washington’s dictates.
All of these measures were imposed “in an effort to interfere in [other] countries’ internal political and judicial processes.” In fact, despite Washington’s fervent objections to Russian election meddling in 2016, the U.S. has intervened in more than 80 democratic elections in other nations, including the 1996 presidential contest in Russia.
Yet although America remains number one, China’s economic clout is significant, including with important countries such as South Korea. Indeed, without any sense of irony, Matthew Ha of the Foundation for Defense of Democracies recently expressed concern that China was thwarting U.S. pressure on Seoul to follow Washington’s policies. For instance, Beijing “launched an economic warfare campaign that cost South Korean companies operating in China at least $15.6 billion in losses” because the Republic of Korea deployed the THAAD missile defense system.
Complained Ha: “To placate China, Seoul eventually agreed not to deploy further THAAD systems, not to join a U.S.-led regional missile defense architecture, and not to form a trilateral U.S.-Japan-ROK alliance.” Moreover, claimed Ha, “due in part to concerns over Chinese retaliation, Seoul has not completely divested its telecommunications infrastructure from the Chinese company Huawei.” Further, “China’s hand is also evident in Seoul’s aversion to the U.S.-and Japan-led ‘Free and Open Indo-Pacific’ (FOIP) initiative,” instead favoring its own policy directed at Southeast Asia.
If all this is due to a $15.6 billion hit, then Washington should take lessons. The Trump administration has caused economic damage to many countries, yet its wrecking-ball sanctions have so far failed in every case: Cuba, Iran, North Korea, Russia, Syria, and Venezuela all have refused to give in to U.S. demands.
The president has been reduced to begging Tehran to negotiate, promising a better deal if it surrenders before November 3 to help his reelection prospects. Iran and Venezuela ridiculed Washington’s threats to interdict Tehran’s tankers. The communists still rule Cuba. Despite two summits, North Korea’s Kim Jong-un is strengthening his country’s nuclear deterrent. No one believes that Russia will give up Crimea.
No doubt, South Korea worries about China’s clout, since China trades more with the South than America and Japan combined. But Beijing is also a convenient excuse to resist U.S. demands seen as unreasonable, especially given that the current president is Moon Jae-in, a man of the left who has no natural affinity for President Trump.
China sees THAAD as part of a U.S.-directed containment system. And South Korea is not the only ally less than enthused by the administration’s demand to displace Huawei. These issues are about more than money. China will always be South Korea’s neighbor and has a long memory. The U.S. national government, effectively bankrupt and beset with manifold other challenges, is not likely to stick around Korea forever.
The point is, contra Washington’s delusions, South Korean officials do not believe that taking part in an anti-China campaign serves South Korea’s interests. Ha writes: “Beijing’s sway over this key U.S. ally is especially risky amid growing Chinese aggression and competition with the United States. Most recently, Beijing pushed Seoul to bless China’s new national security law designed to crack down on pro-democracy protesters in Hong Kong. Seeking to avoid conflict, Seoul took a neutral position, thereby undermining the protesters and revealing an alarming inability to support the liberal democratic values that underpin the ROK-U.S. alliance.”
What evidence does Ha have that Seoul wanted to join the complaint? Most of America’s European allies and Asian friends took similarly cautious positions. Even Tokyo ostentatiously refused to join America’s statement on Hong Kong, though Japan now says it wants to take the lead on the issue at the next G-7 meeting, to uncertain effect.
Moreover, the U.S. routinely sacrifices other people’s democratic aspirations and human rights for policy ends. Without shame, the administration is assisting the brutally totalitarian and aggressive Saudi dictatorship as it slaughters Yemeni civilians and denies its own people political and religious liberty. Washington stands by as the Egyptian and Bahraini dictatorships brutally crush democracy activists and protesters.
Yet Ha demands action to push—or is that force?—South Korea onto the battlefield against China. He writes: “If its China strategy is to succeed, the Trump administration must counter Beijing’s attempts to undermine U.S. alliances.” Which requires that Washington “assuage ROK concerns about Chinese coercion by committing to proportionately punish China for any attempted coercion and to provide South Korea with immediate economic support to cope with Beijing’s retaliation.”
So Washington, the world’s chief practitioner of economic warfare, is going to sanction China because it organizes a boycott, cuts investment, or restricts trade with another country? And Washington, with a skyrocketing national debt, is going to create a new dole for wealthy countries like South Korea? Imagine the long line of claimants that will develop demanding compensation for following America! But what if Washington’s friends still balk at following U.S. dictates? Will America then sanction them, making them pay for their perfidy?
This bizarre strategy is doomed to fail. Despite Washington’s presumption that it speaks for the world, its allies often disagree. Seoul currently disputes American policy toward North Korea. Unsurprisingly, South Korean policymakers want to preserve peaceful, stable relations with both the U.S. and China.
“If we antagonize China,” observed Moon Chung-in, an adviser to South Korea’s president, “China can pose a military threat to us. Plus, China can support North Korea. Then, we will really have a new Cold War on the Korean Peninsula and in Northeast Asia.” Of course, some Americans don’t care about the possibility of war “over there,” as Senator Lindsey Graham famously put it. South Koreans understandably see it very differently.
When I ask South Korean diplomats whether they are prepared to allow the U.S. military to use their bases against China in a war over Taiwan, they blanch. There ain’t no way their country is going to be turned into a battleground and made an enemy of the Chinese at Washington’s command.
Washington has enough problems dealing with China without creating a new battleground with little practical benefit to America. The U.S. already is running a trade war, seeking to force compensation for the COVID-19 outbreak, and threatening Chinese concerns with sanctions tied to Iran and North Korea.
America will be sorely disappointed if it believes it can convince—or compel with money and threats—its allies into following whatever policies it promulgates. Joining an American campaign against China looks suicidal to Seoul. Demanding that South Korea choose between Washington and Beijing could wreck the alliance. Right now, hubris poses a bigger threat than China to U.S. foreign policy.
Doug Bandow is a senior fellow at the Cato Institute. A former special assistant to President Ronald Reagan, he is author of several books, including Tripwire: Korea and U.S. Foreign Policy in a Changed World and is co-author of The Korean Conundrum: America’s Troubled Relations with North and South Korea.
The post Washington Complains: China is Doing What We Always Do! appeared first on The American Conservative.
This week I’m setting off, like I normally do every few months, on a road trip documenting rural America and the little pieces of Americana I find along the way. I typically think of it as modern-day exploration. With exploring, however, you’re supposed to discover something new. I know what I’m going to find, and it hasn’t changed for almost three decades now.
My journey this time is to follow the original 1913 route of the Lincoln Highway from its easternmost point in New York City’s Times Square to roughly where it passes just south of Chicago. While Route 66 seems to get all the fame, the Lincoln Highway has a much older and much longer story behind it, and I’m excited to travel it.
However, what really fascinates me about the Lincoln Highway, beyond the fact it was America’s first coast-to-coast highway, is that despite starting off in New York City and then going through Philadelphia, it pretty much bypasses or altogether avoids some of the major cities of then and now. Along the exact route west of Pittsburgh, you won’t find a city with a major professional sports team until you hit Denver.
The Lincoln Highway is, for all intents and purposes, a quintessential rural route. It almost purposefully avoided the cities that had regular rail service at the time, which makes it an absolute must-travel destination for me. Yet I know as I set out to complete the eastern third of this roughly 3,300-mile journey that I’m only going to reaffirm a lot of what I have been writing about and photographing for the past 25 years. Small towns and rural areas are dying.
I’ll carry on like I normally do, making a few dozen posts on different social media platforms as I travel along and in the weeks after, filling them with little tidbits about the things I see, the people I talk to, and the history of the places, hoping I’ll find some compelling story of growth and resilience. Yet I can already begin to write some of these posts before I’ve covered mile one or finished the research on the places I’ll be visiting.
There will inevitably be the county seat whose population has stagnated and the surrounding towns and villages that have witnessed population decline. More than a few folks will have moved from those towns to the biggest town in the county because it has held on to the most employment opportunities.
I’ll photograph a school building that has been abandoned, or is no longer functioning as a school. It was most likely built when the area was booming in the 1910s or ’20s. Chances are it closed down sometime in the 1980s or ’90s and consolidated with one or more neighboring towns’ schools in the county. The town it was in will for sure have seen population loss, or at least zero growth, in the time since.
One aspect that is ever present in most of my posts and writings is that there is both the trend of serious population decline and of the hollowing out of independent retail businesses in the majority of these areas across America.
I’m going to see that the Lincoln Highway on the ground today is vastly different than it was in 1913. And by that I do not mean that it is now paved, as opposed to being the dirt roads it started as, but that it has completely bypassed city centers and business districts. No longer is it going to have stoplights and the ability to turn down any side street as you pass through town. No, that way of traveling was abandoned in favor of bypasses, exit ramps, and overpasses. Ironically, realignment of the Lincoln Highway to improve travel distances and times started just a decade after it was built.
Now with parts of the Lincoln Highway looking more like an interstate, in some cases actually being part of the interstate, you can filler-up, drive-thru to get food and be back on the road in minutes.
We’ve given up that way of traveling all so we can save a few minutes here and there. We’ve bypassed the journey for the sake of having more time at the destination. One could even say as air travel has increased, America has figuratively built one giant overpass for most of the country.
There for sure will be a post featuring a bunch of vacant storefronts on the main street in some of these towns that have had the Lincoln Highway rerouted around their business district.
While seeing a business district shift from a central location to a road that has exit ramps on a highway may still keep tax revenue and retail options in a given town, don’t underestimate the drastic effects being bypassed can have. Farther west along the original Lincoln Highway sits Medicine Bow, Wyoming. Once a fairly steadily growing town of almost a thousand residents, Medicine Bow saw its fate tumble when the main highway through town didn’t just shift the route around the town’s center, but skipped it altogether and moved 20 miles south with the creation of Interstate 80.
While the Lincoln Highway still passes through Medicine Bow, the traffic doesn’t. In just a decade or so after I-80 was extended through Wyoming in 1970, the town shrank by 59 percent. Currently it has about 260 people living there.
I’ll be sure to stop in some lovely diners and restaurants that are holdovers from the golden days of automobile travel, still holding on in towns of a few thousand people. But sadly I know I’ll be writing about a few places that just recently closed down. They’ll fit a script: run by the same person or family for years, a staple of the community, missed by travelers and locals alike. The real news is that the fast-food chain off the exit ramp is pulling in more profit in a week than those places pulled in in a month.
That story will repeat itself with the clothing store, the gas station, and the corner grocery store. They’re all going by the wayside as more of us choose to bypass them to save a little. As a culture we have not only succumbed to the idea of saving a few minutes on travel, but we’ve completely given up on taking time to explore.
When traveling we see the same golden arches and yellow discount smiley faces as we do in our home towns, and that familiarity saves us from the hassle of exploring to find something different. The irony of it all is that as our parents and grandparents traveled, they most certainly didn’t have mini computers in their pockets that would help them search out new and interesting places. They drove a road through a town and took a chance on a sign painted on the side of a building, and later on a bright neon sign advertising AC and a pool. They’d come home with stories about the best brisket or the cutest little motel, places you’d look for when you drove through but could never find because they were gone now.
Yeah, I’ll for sure be writing about the death of the mom-and-pop-owned business and how for every ten dollars they earned, seven of them went back into the community. But maybe what I should really be writing about is how to save five minutes and a buck or two, and we’ve been silently killing the places in-between while also lamenting their disappearance.
The people who traveled through many of these towns were an essential part of their economies. In a way, our former way of traveling, for work or for pleasure, was as much an economic driver of small towns as the mouse is to Orlando. Today we spend our money while traveling at businesses that funnel those profits back to headquarters in the large cities.
Just remember: when we return from our trips this summer and talk about how things are changing and nothing is like it used to be, chances are we’ll be complaining about the quality of the food at the same chain fast-food places we are so thankful to see off the highway exit, and going on about the time we lost stuck in the drive-thru. Yet on our next trip, we’ll be right back there instead of taking a slower route and exploring the little bits of Americana that are still there.
Vincent David Johnson is a Chicago-based photojournalist, filmmaker, and the person behind the Lost Americana documentary project. When he’s not working on Michigan Ave., chances are you can find him in rural America, checking out a town you’re not going to see on a travel show.
In 1988, a certain congressman from Texas ran for president on a platform of bringing home our troops from around the world. Even then, more than 30 years ago, U.S. troops were in over 100 countries, and tens of thousands were still in Europe.
That Texas congressman was my father, Ron Paul, who 20 years later ran again for the presidency and was still calling “to begin bringing American troops home from around the world—an absolute necessity if the budget is ever to be brought under control. We’re going broke and we still have 75,000 troops in Germany?”
In his best-selling book The Revolution, my father wrote: “We can either withdraw gracefully, as I propose, or we can stay in our fantasy world and wait until bankruptcy forces us to scale back our foreign commitments.”
This week, President Trump called for a modest reduction of American troops in Germany, reducing them from 34,500 to 25,000 (a great start that will hopefully lead to further reductions there). The Republican neocon caucus responded exactly as you would expect. You’d think the Berlin Wall was still in place and two million Russians were about to invade Germany. Utter nonsense.
With the Cold War over for 30 years, the hysteria over removing troops is ludicrous. Meanwhile, the very real threat of bankruptcy and menacing debt grows each day. Just this year, the United States will add $4 trillion to the national debt. Can the Germans afford to defend themselves? Without question. Germany balances its budget every year.
Yet the U.S. still has about 170,000 troops in about 150 countries at great expense in both lives and treasure. Often that puts our soldiers on the front lines of civil wars whose origins we barely even comprehend. The U.S. also becomes allied with governments, such as Saudi Arabia, that are barbaric, despotic, and anti-American. And yet the cycle continues because the war caucus vows to never, ever let our troops come home.
President Trump is also advocating ending our nation’s longest war in Afghanistan. It couldn’t happen soon enough. The American taxpayer is paying $50 billion a year to build roads and bridges in that country, while our own nation’s infrastructure crumbles.
President Trump has also discussed having fewer troops in South Korea, and has actually forced Seoul to pay more for our presence. Possibly the best aspect of the Trump presidency, though, has been his willingness to challenge the bipartisan neoconservative consensus on forever war.
Yet critics, myself included, will admit the Trump presidency has not always practiced what it’s preached here. While Trump has consistently advocated fewer troops in Europe, he has re-introduced U.S. troops into Saudi Arabia, a mistake that will eventually lead to more war or terrorism or both.
But today, give credit where credit is due. Trump, the disrupter, is right to bring the troops home. And I say don’t stop until we once again have a military whose primary job is to defend America.
Rand Paul is a Republican senator from Kentucky.
On Gen. George Washington’s orders, the Declaration of Independence, signed in Philadelphia, was read aloud to his army. On hearing it, the troops marched to Bowling Green, pulled down and decapitated the statue of George III, and sent the remnants to be melted down into musket balls.
It was a revolutionary act, a symbolic statement. These once-loyal American subjects were now rebels and no longer owed allegiance to the king. They would fight to end his rule in America.
During the recent demonstrations and disorders here, similar acts had about them an aspect of societal rebellion and a repudiation of a heritage.
In Richmond, Virginia, a statue of Christopher Columbus, who generations of American children were raised to revere as the intrepid Italian explorer who discovered the New World, was pulled down and thrown into a lake.
In Boston, the Columbus statue was beheaded.
In a half-dozen states, statues of Confederate generals and soldiers were pulled down. Gov. Ralph Northam promises to remove the huge statues of Robert E. Lee and Stonewall Jackson from their century-old places of honor on Richmond’s Monument Avenue.
In Philadelphia, the statue of fabled Italian American cop, police commissioner and mayor, Frank Rizzo, was desecrated and hauled away.
Retired Gen. David Petraeus has written to urge that all army bases bearing the names of Confederate generals, such as Forts Benning, Bragg and Hood, be renamed. Robert E. Lee, who is everywhere at West Point, says Petraeus, was a U.S. soldier who “committed treason.”
Nancy Pelosi wants 11 statues, including those of Confederate President Jefferson Davis, Confederate Vice President Alexander Stephens, and Sen. and U.S. Vice President John Calhoun, removed from the Capitol.
The purge of historical figures has spread to Europe.
In Brussels, a giant statue of King Leopold II, who was enriched by the brutal plundering of his Congo colony, has been taken down.
In Bristol, England, a statue of Edward Colston, philanthropist and patron of the city but also a slave trader, was thrown into the harbor.
At Oxford, students are moving to take down the statue of Cecil Rhodes, the archimperialist and founding father of Rhodesia who created as his legacy the Rhodes scholarships for British and American students.
Résumés of all the once-admired great men who discovered, explored and colonized the New World, as well as all those who created and first led the United States, are being investigated to determine how egregiously these men violated the egalitarian and democratist dogmas of modernity.
The list of malefactors seems impressive.
Who are we talking about?
Nearly half of the signers of the Declaration of Independence and the Constitution were slave owners. So, too, were five of our first seven presidents and two of the four men on Mount Rushmore.
George Washington won the war for independence. Thomas Jefferson doubled the size of the nation with the Louisiana Purchase. Andrew Jackson saved the nation from defeat by the British at the Battle of New Orleans and seized Florida. James Polk took us to war with Mexico and relieved it of what is now the American Southwest and California.
All four of these nation-builder presidents were slave owners.
The systematic dishonoring and disgracing of men once revered has only just begun. But it represents a spreading revolution in thought and belief about the origins and history of America.
How far is this going?
During the London protests in solidarity with Black Lives Matter, the word “racist” was painted on the Parliament Square statue of Winston Churchill, whom historians voted “the greatest man of the 20th century” for his role in leading Britain against Nazi Germany. The mob wanted Churchill’s statue down.
And was Churchill a racist?
Surely, he was an archimperialist, a lifelong defender of the British Empire who believed in the supremacy of the Anglo-Saxon race and its right to rule what poet Rudyard Kipling called “the lesser breeds without the law.”
Churchill disparaged people of color whom the British ruled, from the Caribbean to Africa, to the Middle and Near East, to South Asia and the Far East, in terms that would instantly end the career of any American or British politician who used them today.
Historian Andrew Roberts writes of Churchill that he was a “white … supremacist (who) thought in terms of race to a degree that was remarkable even by the standards of his own time. He spoke of certain races with a virulent Anglo-Saxon triumphalism.”
Many Americans, especially among the young, view the history of the European exploration, the colonization of the New World, and the creation of Western empires not with pride but with shame and guilt. And they want to make expiation by canceling out all the honors accorded such men, be it in statues or the names of cities, towns, parks and streets.
And their numbers and militancy are growing. The left has the bit in its teeth and is dragging the panicked elites along.
How this ends without permanent division in the country escapes me.
Patrick J. Buchanan is the author of Nixon’s White House Wars: The Battles That Made and Broke a President and Divided America Forever.
“This is not who we are,” President Obama used to say when something unbecoming to his progressivism occurred. Few caught the statement’s colossal presumptuousness, casually arrogating progressivism’s pieties to America’s larger sense of self. “So diffuse and pervasive is the progressive outlook,” wrote the critic George Scialabba in 1991, “that merely to articulate it is an achievement.”
In 2020, progressivism appears hale. Will the hordes elect a revanchist president? Per Martin Luther King’s formulation—also invoked by Obama—the justice-bound “arc of the moral universe is long.” In the meantime, let a million lawn-signs bloom, proclaiming fidelity to progressive catechisms and injunctions to “Resist!” (as if Emma Goldman and not some account executive or corporate VP resides within).
Yet it’s also showing signs of wear. Progressivism is increasingly unhinged in its policing of discourse, confounded by the recrudescence of forces like nationalism—supposedly consigned to the garbage can marked “wrong side of history”—and estranged from working-class constituents. The ideology itself has become tangled in conflicting moral imperatives and its confused jumble of causes, both in pursuit of chimerical goals and mired in glum introspection. The highest state to which many progressives aspire seems to be self-awareness of their own privilege (though they’re conveniently obtuse to the status conferred by flaunting their exquisitely modulated penitence).
“Late capitalism” is a phrase du jour, but what about “late progressivism”? Another Brahmin gloss on our times is the Trump administration as “hyperreal” spectacle—a Kremlin/Fox News-inflected gilded simulacrum of reality. But how does some variant of this not also apply to contemporary progressivism, with its conspiratorial claims of Russian skullduggery and unfalsifiable assertions of pervasive discrimination? Or the histrionics of media impeachment coverage, played out before a bored, listless public gallery?
Then there’s a resurgent interest in the works of Christopher Lasch with their astringent critique of progressivism and disinterring of “communitarian” traditions.
All of this is converging on a sense of progressivism as one among, as the English philosopher John Gray put it recently, “plural and contending” value systems, subject to its own folkways, mythos, weltanschauung, and prejudices.
Bradley C. S. Watson’s Progressivism: The Strange History of a Radical Idea had me with the word “strange.” Progressivism today is strange. Meanwhile, Trump’s election has spawned a shelf of histories and ethnographies about the white working class: how refreshing to see progressivism come in for similar treatment. And presumably Watson, a political science professor at Pennsylvania’s Saint Vincent College, didn’t have to repair to Appalachian Ohio to conduct his fieldwork.
Wrong meeting. Actually, Watson’s Progressivism is a history of the histories—refracted through the exigencies of the presents in which they were written—by which received wisdom about early 20th-century progressivism came down to us, and of the revisionism underway since the 1980s. That revisionism is the work of acolytes of the German émigré scholar Leo Strauss associated with the “Claremont School,” a colony of constitutional conservative political scientists coalesced at California’s Claremont Colleges, Watson among them.
Historical depictions of progressivism served to domesticate the movement, he writes, emphasizing, variously, its congruence with prior U.S. history, diffuse non-doctrinaire populist character, small-bore nature (rooted in the “status anxiety” of its supposed middle-class tribunes) and—mediated by the New Left—essentially conservative cast as a tool of big business.
The conservative counter-narrative holds that these accounts, oblivious to their own editorializing, resoundingly undersold progressivism. It posits that progressivism—imbued with social Darwinism, pragmatism, Hegel’s exaltation of the state and “social gospel” Christianity—was deeply transgressive of the founders’ Constitution. The older tradition was recast from transcendent holy writ to historical artifact belonging to an earlier, and thus less-evolved, era—a dead letter straitjacketing the Prometheus of government amid the imperative to reform the social ills attending industrialization and urbanization. Extolling an infinitely extensible “living Constitution” and conceiving of man as “morally perfectible” within a Whiggish teleology trending toward ever more “freedom, justice, and truth,” progressivism represented a “pivot point” in U.S. history. It sanctioned the projection of state authority into what had hitherto been considered the preserve of civil society (recast as a redoubt of corruption) and private conscience, elevating a proto-administrative state of technocrats. At the same time, the progressives ushered in today’s heroic conception of the presidency as a seat of enlightened moral agency, as it judiciously marshals “popular will” and the forces of history.
Fixated on the figure of Woodrow Wilson (with his glinting pince-nez, priggish Victorian Dad mien, and anti-suffrage segregationist views, a suitably unambiguous villain), this is the wrong-turn narrative espoused in the Tea Party-era pedagogy of Glenn Beck. And Watson’s Progressivism is in part an account of the academics working upstream of Beck and his chalkboard. But it’s also a chronicle of the Straussian reckoning with progressivism: a cadre of scholars, governed by the conviction that “moral-political understandings” can transcend “time and place,” who accorded progressivism’s architects the dignity of taking them at their word, rather than reflexively discounting this as a product of self-interested historical actors’ “false consciousness.” It’s a reminder of one of progressivism’s blind spots—in English soccer parlance, its inclination to play the man, not the ball.
Many of Watson’s historical observations about germinal-stage progressivism could have been written of its current form. He remarks on the juxtaposition between its eyes-on-the-prize goal orientation and disdain for attaining popular assent to its reform agenda, witnessed in Wilson’s withering condescension toward “public criticism” as a “clumsy nuisance, a rustic handling delicate machinery.” And he draws a throughline from the God-bothering messianism of early progressives like Walter Rauschenbusch to sanctimonious social-justice activists.
But how far is today’s progressivism really descended from the 1900s version? University of Virginia political scientist James Ceaser has described the contemporary variety as a compound of original progressivism, multiculturalism, and postmodernism, with an admixture of countercultural emphasis on personal growth. Still, Watson crystallizes an inalienable aspect of progressivism past and present: its protean, remorselessly acquisitive nature, ever on the lookout for the next moral improvement project (and the political clients this yields).
Progressivism is an uneven book. Claremont Review of Books editor Charles R. Kesler contributes a foreword and figures in an exploration of the intellectual genealogy of the conservative challenge to the liberal consensus on progressivism, but excerpts from Kesler’s book, I Am the Change, materialize in the text as if delivered from on high, sending the reader to the endnotes for their provenance. One learns much from Watson’s survey of the literature about the historiography of progressivism, but soon wises up to his modus operandi of arraigning its works—finding each in error for slighting progressivism’s subversion of the Constitution. And Watson’s otherwise felicitous prose is marred by occasional archaic locutions. The obscure Latinate “in fine” is preferred to “in short,” and I thought “desuetude” had passed into…desuetude. The Dwight Macdonald line about a work having “enriched my vocabulary, or, more accurately, added to it,” comes to mind.
But ultimately Progressivism is insightful and rewarding. And Watson owns the prejudices of his cohort, referring to the “deep attachment to the Constitution and to the regime that is experienced by the revisionists.”
This is more than can be said for progressives with their avowals that their creed is reality itself. “[I]n truth,” Watson writes, “liberalism was all about theory from the very beginning.”
Stephen Phillips has reviewed numerous books for The Spectator, Economist, Weekly Standard, Wall Street Journal, and Times Literary Supplement.
I stopped my car in the middle of the street and cried at the sight of what lay before me. Joy usually marked the moment when my tires touched onto Adams Street and my childhood home came into view. Only, this time, I reacted the way I did when, as a boy, I neared my grandma’s casket and caught sight of her sunken face—her familiar beauty marred by the sting of strangeness.
Weeks earlier, Hurricane Michael had hacked through my hometown of Sneads, a rural farming community in the Florida panhandle. For miles outside of town, the woods that flanked the roads were once so dense with slash pine and mossy oak that deer crossings were a frequent danger to drivers. Now, the woods are awkwardly exposed as thousands of trees lie snapped in half, their jagged bottoms thrust heavenward like pikes on some ancient battlefield.
As I drove into town, a sea of blue FEMA tarps stretched out before me, covering the homes and businesses that had been pummeled by the Category 4 winds. Though some homes were hit harder than others, none were spared—certainly not my childhood home.
After I regained my composure and pulled into my parents’ driveway, the full extent of the damage became clear. The sight of trees littering the yard affected me more than anything else. Countless times, I had conquered the heights of those trees, and now all but one of them lay forever conquered by the storm. All throughout the neighborhood, near every bend and hollow I once explored, mounds of debris were cobbled together like funeral pyres for my memories. Like the sight of my grandma in her casket, seeing my hometown in such a foreign condition left me feeling disillusioned and out of place.
The term “place” carries at least three meanings. First, at a shallow level, we can think of place as the site where a person or thing can be found. Every physical thing that exists can be found some-where. It is “placed” in the sense that it presently occupies a particular, physical location. In the case of my childhood home, its site could be represented in a number of ways, such as its street address or its latitudinal and longitudinal position on a map.
Though sites are individual, they are not isolated—they either overlap or exist within concentric circles of one another. While my childhood home is an individual place, it’s situated within a larger place—Sneads—which itself is situated within a yet larger place—Jackson County. In this sense, my childhood home is a place-within-place. It’s simultaneously distinct from and united to other places.
Second and more intimately, place has to do with a person’s or thing’s setting—the features that give a place its particular character. Like threads to a tapestry, the historical, cultural, ethnic, social, economic, religious, political, and other features of a place are woven together to give each place a setting that is absolutely and indissolubly unique.
Yet “unique” is not how some would choose to describe the setting of my childhood home. Like dozens of other so-called “drive-by” towns in the Florida panhandle, Sneads is virtually unknown to those outside of Jackson County. Many Florida tourists know it only as an anonymous name on a green sign marked “Exit 15” as they flock down the interstate toward the beach. Were a tourist to take that exit in search of gasoline, he would see cow pastures and crop fields peppered with a few homes before arriving at a small stretch of town that’s not immediately distinguishable from similar-sized towns with their farm stands, hardware stores, baseball parks, and churches.
But to me and others, Exit 15 represents home. Pulling into town, I see that it’s not just any farm stand, but Buddy’s—the place that provided the watermelon for my family’s afternoons at the lake. It’s not just any hardware store, but Beauchamp’s—the place where my father taught me the meaning of “Phillips-head” and the value of work. It’s not just any baseball park; it’s the place where my best friends and I chased girls on the playground and grounders on the ball field, blistering and sweating for years until we mysteriously grew into men. It’s not just any church; it’s the place where the God of my fathers became my Father too.
To someone like me who has been privileged to live there, Sneads isn’t just an anonymous name on a green interstate sign, but a humble, one-syllable description of the unique place that has served as the setting for my life, the soil where the seeds of my experiences and dreams have germinated and grown to make me into who I am today. Though similar towns have similar features, no other place has the precise collection and configuration of features that Sneads has. The features (or lack thereof) that cause tourists to drive by Sneads are the very features that tell me I’m where I belong—I’m home.
The third and deepest, most intimate meaning of place has to do with this sense of place—a person’s sense of belonging to a particular site and setting. If having a site is like having an address (“I am somewhere”) and having a setting is like having a unique address (“I am here”), then having a sense of place is like belonging to that unique address (“I belong here”).
Moreover, the strength of a person’s sense of place is directly related to their familiarity with and commitment to that place. A strong sense of place describes a person who is intimately familiar with and perpetually committed to that place. Conversely, a person has a weak sense of place when they are unfamiliar with or uncommitted to it.
Yet even those who are at home can feel out of place when that home’s features are altered enough to render them unfamiliar. In my case, when Hurricane Michael literally ripped many of Sneads’ features out of place, I was left feeling out of place. I was exactly where I belonged, but my sense of place had dramatically weakened as the familiar gave way to the foreign.
Though Hurricane Michael was a tipping point for me, the reality is that I began to feel out of place in Sneads years earlier. It began when I left home and moved hundreds of miles away to attend an out-of-state university. Each time I returned home, I found that I had forgotten yet another street name, the directions to somewhere, or the name of a cashier at McDaniel’s grocery. The longer I was away, the more I seemed to forget and to be forgotten, becoming something like a tourist in my own hometown. Yet I still had my childhood home, my family who lives there, and a trove of memories embedded in the physical features of the town itself—until the hurricane came to challenge my final claims to that place.
In hindsight, I realize that the hurricane affected me so deeply not simply because it damaged my home, but because that home was the only one I had ever truly known. Since the day I left for college, my life had been too transient for me to develop a strong sense of place anywhere else. University life was stereotypically frenetic and, after graduating, I lived in three different states over the course of three years. So, my sense of no longer belonging in Sneads was exacerbated by my sense of not belonging anywhere. The hurricane left me “placeless,” with no place where I could go to feel at home.
My experience is similar to one recounted by Gertrude Stein in her 1937 memoir, Everybody’s Autobiography. Stein describes returning to her childhood neighborhood in Oakland, California, only to be dismayed by the transformations that had rendered it virtually unrecognizable. She summarizes her thoughts in her famous epigram: “there is no there there.” When the features that had anchored Stein’s memories eroded, so too did her sense of place—her sense of belonging. She goes on to compare the loss of her home to the loss of her very name. Her moral is clear: to lose one’s place is, in a way, to lose oneself.
It should come as no surprise that our identities as humans are somehow intertwined with the places we inhabit. We are earthy people, enrobed in fragile flesh that’s composed of borrowed soil. Before we return our bodies to the ground, we offer thanks to our Maker by cultivating the ground upon which we stand. At least, this was once the standard view of the self. Throughout history, people had always lived in place-centered communities where familiarity with and cultivation of one’s place was considered a basic rite of civilization and survival.
Today, however, a growing number of people’s lives are characterized by a loss of place-identity and the corresponding pain of placelessness. If feeling out of place describes having a weak sense of place somewhere, then placelessness describes having a weak sense of place everywhere. In other words, a placeless person is one who feels as though there’s nowhere she truly belongs.
Placelessness often occurs for reasons that are outside of a person’s control. Natural reasons might include the death of loved ones or natural disasters such as hurricanes, wildfires, and extreme droughts that can ravage places, forcing people to find homes elsewhere. Unnatural (i.e., man-made) reasons might include crime, war, genocide, discrimination, economic changes, or a host of other reasons that might erase much of what’s familiar about a place.
Yet, too often, placelessness is self-inflicted by our gluttonous taste for mobility. One form, “physical mobility,” refers to that quintessentially American notion of leaving one’s old place in search of better opportunities someplace new. Because the focus is on physical places, physical mobility can paradoxically cause a person to become more place-oriented when their goal is to plant deep roots in their new place. The problem arises—as it did with me—when such mobility becomes transience, a state of perpetual movement that makes it impossible to cultivate a strong sense of place.
Another form of mobility refers to the relentless connectivity that we experience across places through the use of various technologies. In contrast to physical mobility, this is a “virtual mobility” that sees disassociation from one’s place as the goal. Virtual mobility offers many benefits, of course, like the ability for a traveler to video chat with family or keep up with news from back home. The danger lies in its abuse: using technologies not to connect with home but to get away from it.
Examples of abuses are as numerous as they are commonplace, such as paying more attention to our phones than our surroundings, habitually preferring headphones to nearby sounds, or following national events to the neglect of local ones. Though we’re here at this site with this setting and these people, we prefer not to be. So we use myriad technologies to achieve virtual distance from our physical realities.
According to the French philosopher Paul Virilio, this distance from reality results in “action-at-a-distance.” In an interview for CTheory, Virilio explains: “Action-at-a-distance is a phenomenon of absolute disorientation. We now have the possibility of seeing at a distance, of hearing at a distance, and of acting at a distance, and this results in a process of de-localization, of the unrooting of the being….Our contemporaries will henceforth need two watches: one to watch the time, the other to watch the place where one actually is.”
Virilio’s description, given back in 1996, now pales in comparison to the virtual mobility that we experience today. Mere action-at-a-distance has given way to a techno-utopian vision of relationship-at-a-distance, as seen in our dependence on social media. When Mark Zuckerberg, the founder and CEO of Facebook, was honored as Time magazine’s 2010 “Person of the Year,” Lev Grossman penned the following words about the company:
Facebook wants to populate the wilderness, tame the howling mob and turn the lonely, antisocial world of random chance into a friendly world, a serendipitous world. You’ll be working and living inside a network of people, and you’ll never have to be alone again. The Internet, and the whole world, will feel more like a family, or a college dorm, or an office where your co-workers are also your best friends.
Facebook’s eschatological vision of relationship-at-a-distance is a microcosmic example of what is promised by today’s religion of mobility: intimacy without proximity—a sense of place without a corresponding commitment to that place. We want a place to belong to us without us having to belong to it. The assumption is that our physical settings ultimately hinder us from living the good life, so we must be liberated from the constraints of physical proximity to a place and its people.
Far from liberating us, a loss of physical proximity inevitably leads to a loss of place-identity. When we view ourselves and our happiness as perhaps related to but ultimately separate from the places where we live, the effect is that we treat our places as exchangeable commodities—locales to be consumed as we’re passing through them.
Those who live in tourist destinations like the Florida Gulf Coast know that a “passing through” mentality is the hallmark of a tourist. As litter-strewn beaches and other messes show, the goal of many tourists is to get what they can while they can. Locals tolerate this because of the benefits that tourism brings to local economies, but no local wants a tourist for a neighbor. Likewise, when our lust for mobility causes us to adopt a “passing through” mentality, we not only tend to treat places like commodities, but we risk being treated as commodities in return: exchangeable consumers who are valued for what can be extracted from us.
In this cycle of commodification, we pass through places—apartments, schools, workplaces, coffee shops—without fully being there. Then, having gotten what we wanted, we leave these places with few people noticing—or caring—that we’re no longer there. By living as though we don’t belong to a place, we make it impossible for a place to belong to us in return and we inevitably suffer the pain of placelessness.
Stopping the cycle of commodification requires that we see our places with new eyes—not as consumers but as cultivators of place. For a cultivator, place has less to do with external features—though still important—and more to do with the internal relationship between a place and its people. This is a relationship born out of familiarity, nurtured by commitment, and resulting in a life of mutual belonging that says, “I am part of my place and my place is part of me.”
As Wilfred McClay puts it in Why Place Matters, “‘place’ is not just a physical quality obtained by mechanical means. You can spell out every one of the objective and structural aspects of place, and never get to the heart of the matter. It is at bottom a quality of spirit, existing more in the eyes and hearts of the beholders than in the permanence of glass and stone and asphalt.”
Though I once saw Sneads with this quality of spirit, it’s no longer possible for me because I don’t live there. As the farmer-poet Wendell Berry writes, “a house for sale is not a home.” By choosing to sell my hometown for some “better” place, I eventually began to see it through a tourist’s eyes, thinking of Sneads less as my place of mutual belonging and more as the sum of its physical qualities. So, when Hurricane Michael made landfall and tore apart the town’s glass and stone and asphalt, it was able to tear apart my sense of place as well. For me, there was no longer any “there there.”
But for the people of Sneads, something paradoxical happened: Sneads became more there. Because Sneads is primarily a quality of spirit for them, the hurricane was unable to touch their sense of place. Rather than causing them to flee, the hurricane stirred them up to care for Sneads and each other in unprecedented ways.
The people of Sneads are cultivators who know in their bones what G.K. Chesterton writes in Orthodoxy: “the world is not a lodging-house at Brighton, which we are to leave because it is miserable. It is the fortress of our family, with the flag flying on the turret, and the more miserable it is the less we should leave it. The point is not that this world is too sad to love or too glad not to love; the point is that when you do love a thing, its gladness is a reason for loving it, and its sadness a reason for loving it more.”
For communities throughout the Florida panhandle, the suffering caused by the hurricane will continue in the form of economic decline as tourists are repelled by the sad physical conditions of these towns. For these tourists, there is no longer any there there because they were never truly there—they were only ever passing through.
But the people who live in these communities are not just passing through. The sad physical conditions compel them to love their towns more. They are cultivators who belong to their places and whose places belong to them in return. And they’ll weather yet more hurricanes, wearing their places on their bodies until their bodies are buried there.
Since leaving Sneads, I haven’t found another place like it. But I’ve learned that a strong sense of place is not something found but something made. It’s made by familiarity and commitment, by seeing and loving one’s place the way that the people of Sneads do. One day, if I belong to a place long enough, perhaps that place will belong to me too.
Timothy Kleiser is a teacher and writer from Louisville, Kentucky. His writing has appeared in The American Conservative, Modern Age, The Boston Globe, Fathom, and elsewhere. This New Urbanism series is supported by the Richard H. Driehaus Foundation.
At a park in New York City, I witnessed something odd. A group of women silently formed a circle in the middle of a large lawn. Their all-black outfits contrasted with the surrounding summer pastels, and they ignored the adjacent sunbathers as they began to kneel and slowly chant. They repeated a three-word matin. The most striking feature of this scene was its familiarity. Any half-decent anthropologist would label this a religious ritual.
Yet, few are willing to explicitly describe these events as part of a religion. The women may have been kneeling in a circle while chanting, but they repeated the words “black lives matter.” Politics obscures the obvious. Wokeness is a religion, and conservatives must act as if large parts of our institutions are run by this cult.
Americans are united in their disgust at what happened to George Floyd. Everyone agrees: A minor run-in with the police should never lead to death. Yet, the past two weeks do not actually seem connected to the events in Minneapolis. Most East Coast yuppies would have trouble placing Minneapolis on a map. Does it really make sense to gather in a mass crowd during a pandemic because of something that happened a half-continent away? It does when you recognize that it’s a religious movement.
Wokeness has been identified as a religion by several writers and commentators. Linguist John McWhorter wrote an article on “Antiracism, Our Flawed New Religion” several years ago. Harvard professor Adrian Vermeule wrote a must-read analysis of the liturgical nature of liberalism in 2019. And all the way back in 2004, historian Paul Gottfried wrote a prescient book on the topic with the subtitle “Toward a Secular Theocracy.” The increasing intensity of woke culture suggests that this is no longer just a curiosity or a point of ridicule. It is the most clear-eyed way of viewing current politics, and this is most obvious when viewing the protests.
The nationwide protests are best understood as religious ceremonies, and this can be seen in the way they keep engaging in off-brand Christianity. In Portland, Maine, protestors lay stomach down on the sidewalk in order to ritualistically reenact Floyd’s arrest. They prostrated themselves in the exact way Catholic priests do in their ordination ceremony. Journalist Michael Tracey noted the religious feeling in New Jersey protests. Protestors knelt and held up their hands in a mirror image of how Evangelicals pray over each other at revivals. The Guardian ran an article on how people must keep repeating the names of police victims, and protestors routinely chant a list of names as if it is a litany of the saints. It is a transparent attempt to transform the victims into martyrs. And while Floyd’s killing is a tragedy and an outrage, he had no agency over his death.
Perhaps the appropriation of Christian liturgy is just coincidental, and not evidence that the woke have become a cult. It’s not like they’re trafficking in classic cult behavior, like trying to separate devotees from their families, right? Wrong: Taking a cue from the Scientologists, The New York Times ran an op-ed encouraging readers to stop visiting or speaking to family members until they pledge to “take significant action in supporting black lives either through protest or financial contributions.” Very normal! Shaking down family members for money by threatening not to talk to them is classic cult behavior, not how well-adjusted adults voice political opinions. The insidious engine of this religious impulse can be seen in the most egregious ripoff from Christianity so far.
In North Carolina, a pastor organized an event where white police officers knelt before her and washed her feet. She claimed God told her directly to do this. Only the most delusional would try to call this a protest. This is a pathetic perversion of Christian liturgy. To state the obvious: washing feet is a Christian tradition with Biblical origins. Washing feet was a chore reserved for the lowest servants. Jesus, God himself incarnate as man, washed the feet of his disciples at the Last Supper. The disciple Peter objects to this and doesn’t want Jesus to lower himself. Jesus replies “if I don’t wash you, you don’t really belong to me.”
The white people washing feet are only pretending to lower themselves. In reality, they’re symbolically placing themselves in the role of God. For white people, woke anti-racism offers a way to worship themselves. “White privilege” is a purely subjective concept that allows unremarkable white people to recast their own ordinary lives in a flattering light. It’s not enough to simply point this out and laugh at it. The religious nature of the woke has real policy implications.
The woke make policy decisions in reference to the values of their religion. Back in January, it was considered racist to be concerned about the coronavirus. CNN ran headlines about how racism was spreading faster than COVID, Al Jazeera ran an op-ed with a headline suggesting racism was the more dangerous epidemic, and New York City politicians encouraged people to join crowds in Chinatown. Now, after months of stringent social distancing, suddenly the “experts” are telling us that massive crowds gathering in every city around the globe won’t impact the ongoing pandemic. A certain type of person pretends to be above all culture war topics, and always wants to get back to the “real issues.” Yet it should be clear that in any long and protracted economic struggle with China, the woke cult has the ability to distort priorities and jettison all good sense. You may not be interested in the culture war, but the culture war is interested in you.
In 2014 and 2015, many conservative pundits made a name for themselves laughing at the “SJW” phenomenon on college campuses. Older conservatives loved to make jabs about “snowflakes” who they predicted wouldn’t be able to tough it out in the real world. This was a complete misreading of the situation. Woke Yale graduates do just fine in their careers, and these extremist students are now rising through institutions of power. Ivy League-educated lawyers are throwing Molotov cocktails in New York. Just as the scholastics grew out of an institutional arrangement in which Christianity was the official religion of the university, wokeness is the scholastic form of anti-racism. It is enshrined in our institutions because the Civil Rights movement coincided with the formation of our new upper class.
In the 20th century, corporations and government grew to unforeseen scale. Experts, managers, bureaucrats, and new types of lawyers were required to run these organizations, and this changed the nature of the middle class and how people achieved power. As Fred Siegel argued in his book “The Revolt Against the Masses,” this new class became conscious of itself as a distinct class through the Civil Rights movement. The South was a poor and backwards place, and the new class of experts could use their position to correct a grave injustice.
Civil Rights legislation then needed more lawyers, managers, and bureaucrats to enforce it. The concrete forms of discrimination in the Jim Crow South slowly disappeared as racism was openly confronted, but we are left with a class structure that still defines itself around these issues. Those with power have a vested interest in finding ever new forms of racism because this allows them to create new instruments to fight racism. Universities and corporations create more and more administrative jobs that produce a brahmin class whose only purpose is to keep vigilant for bigotry. This is why the woke capital phenomenon cannot be dismissed as posturing. One implication of this is that striving political leaders who seek to enter the upper class must prove their anti-racism bona fides again and again. Another, much darker, implication is that we may live in a theocracy.
Wokeness is a gnostic cult that asks its sectaries to adopt a platform of national self-loathing. These are not protests. They are religious celebrations. The cult needs to be consistently classified as a religion, and conservatives must resist the temptation to view it as merely a silly sideshow distraction. Its bizarro liturgy is increasingly enshrined in all of our institutions, and conservatives must act as if a cult has hijacked the nation.
James McElroy is a New York City-based novelist and essayist, who also works in finance.
The post These Aren’t Protests, They’re Religious Ceremonies appeared first on The American Conservative.
The aftermath of COVID will present its own grievous set of problems, some of which we are already seeing in a newly declared U.S. economic recession, much as the Spanish Flu outbreak of 1918 was the prologue to the Great Depression, which in turn led to World War II. Once the wheels of history begin to turn, events tend to feed off one another.
The economic devastation of the COVID-19 pandemic will no doubt aggravate existing long-term trends and set the stage for instances of collapse. To appreciate this one need only study the Middle East and India, parts of which are becoming a literal tinder box.
India, for example, is home to well over a billion people. By 2027 it is expected to overtake China as the world’s most populous country. Yet at the same time government officials report that half of the country—about 600 million people—suffer from “high to extreme water stress.” Likewise during the next decade, the demand for water in India will grow to over twice the available supply. To make matters worse, by the end of the century temperatures in the region are expected to reach levels which are “intolerable to humans.” As in somewhere around 140 degrees Fahrenheit.
Dry, crispy. Toast.
At this point optimists usually broach the topic of quick-fix solutions like solar-powered air conditioners. What’s left unsaid is that most people in India still need to go outside to make a living. Over half of India’s population works in agriculture, and all those solar-powered heat exchangers are only going to make toiling in the fields more unpleasant.
But heat and water scarcity aren’t the only existential threats. Indeed, there’s no guarantee that India will even survive long enough to witness Mother Nature’s blast furnace at the end of the century. That’s because India’s next door neighbor, Pakistan, also faces high levels of water stress. Pakistan gets much of its fresh water by means of the Indus System of rivers, which flows into the country from—you guessed it—India. Roughly 90 percent of Pakistan’s agricultural production depends upon Indus System waterways.
As the Himalayan glaciers that feed the Indus System shrink, the corresponding increase in demand will ensure that access to potable water becomes a vital issue. If India were to cut off Pakistan’s supply, as Indian officials have already threatened to do, the outcome would be disastrous.
India and Pakistan both possess nuclear arsenals. At least a couple hundred warheads each. Peer-reviewed scientific research indicates that if these weapons were used to target highly populated urban centers, it would send fallout into the atmosphere and result in a nuclear winter. Scientists calculate that surface sunlight would decline by 20 percent to 35 percent and precipitation by 15 percent to 30 percent.
What might seem like a limited regional conflict would end up being a global incident as the planet becomes enveloped in high altitude streams of fine radioactive soot. Given insufficient sunlight and rainfall, starvation would almost certainly kill more people than the initial nuclear exchange. Pakistan is already dealing with the specter of famine. And we’ve had a taste of what minor supply chain disruption can do to fragile distribution networks here in the United States. Imagine what would happen to the Middle East if all of the major cities in Pakistan and India were consumed by fireballs.
It goes without saying that there would be an exodus from affected areas that would dwarf what happened during the Syrian war. Tensions would mount as surrounding governments wrestled with how to manage wave after wave of refugees. And while the military collision of Pakistan and India wouldn’t be an extinction-level event for the human race, it would be traumatic enough to produce severe social turbulence.
Saudi Arabia would be particularly susceptible. This Gulf state is the canonical example of a regime living on borrowed time. Sooner or later the kingdom will implode regardless of whether Pakistan and India annihilate each other, largely because its economy depends on oil revenue that pays for unsustainable government programs. Moreover, the kingdom’s leaders have made limited progress in transitioning to a different economic model.
On the home front, approximately 70 percent of all workers in Saudi Arabia are employed by the government, and 90 percent of the government’s revenues come from oil. Then there are the thousands of members of the Saud royal family, whose yearly stipends amount to five percent of all public spending, approximately $2 billion, making the House of Windsor, which costs the United Kingdom less than $100 million per year, seem downright thrifty.
Saudi Arabia is also the single largest arms importer in the world, straining its treasury with purchases of well over $15 billion between 2014 and 2018. In theory this investment in weaponry affords Saudi Arabia “iron clad” security guarantees from the United States. In practice the Saudi monarchy uses its military to destabilize the surrounding region. With regard to its foray into Yemen, the kingdom clearly bit off more than it could chew, spending hundreds of billions before decision makers realized their mistake and frantically began petitioning for a ceasefire.
The Saudi approach to international relations is based heavily on its checkbook, which is leveraged to play an elaborate double game. As one journalist aptly put it, Saudi Arabia is both the arsonist and firefighter. On one hand a State Department memo refers to Saudi Arabia as “the most significant source of funding to Sunni terrorist groups worldwide.” The kingdom’s role in supporting insurgents in Syria is well documented. These groups were dominated by radical Islamists. On the other hand, the CIA has formally recognized Saudi Arabia for its counterterror efforts. Hence it should come as no surprise that Saudi money in Afghanistan concurrently funded both the Taliban and the U.S.-supported government.
All of these stratagems rely on identifying pliable stakeholders and discreetly paying them off. When the oil money finally starts to peter out, which is expected to happen in a matter of decades, the House of Saud will no longer be able to buy its way out of trouble. The generous government subsidies, the free healthcare, the cushy public sector jobs, the political donations, the arms purchases, the proxy wars: the party will be over. Just in time for climate change to cook the region sunny side up.
Suffice it to say tourism will be a hard sell.
The Saudi elders know where the status quo is headed and have tried to reinvent the kingdom by proposing sweeping reforms that include adopting a private sector model as well as selling shares in the state’s oil monopoly, Saudi Aramco. These initiatives have been collectively branded as “Saudi Vision 2030.” However, with the cost of oil dropping thanks to a price war with Russia and the spread of COVID-19, there probably won’t be enough funding to complete the makeover in time.
This may explain why the Saudi elites have been snapping up property in other countries. It’s their royal exit strategy.
The cycle of history rises and falls with a rhythm that emerges on different levels of granularity. Sometimes major realignments take only a decade or two. Other times they take centuries. But there’s always a return to equilibrium. Thousands of years ago the Middle East was a cradle for early human civilization. Perhaps it’s only fitting that the Middle East should ultimately serve as a casket.
The plight of nations like India, Pakistan, and Saudi Arabia underscores the extent of the terrestrial changes underway. Broad swaths of the planet are about to become sweltering, hungry, and desperate. You don’t have to be Thomas Malthus to guess how this story ends. Take away enough seats during a game of musical chairs and things quickly degenerate into a brutal zero-sum affair. The narrative of civilization can be defined in terms of nations competing over resources. War, plague, famine, and death are the recurring themes of this narrative.
Bill Blunden is an independent investigator focusing on information security, anti-forensics, and institutional analysis. He is the author of several books, including The Rootkit Arsenal and Behold a Pale Farce: Cyberwar, Threat Inflation, and the Malware-Industrial Complex. Bill is the lead investigator at Below Gotham Labs.
Perhaps the most outrageous police killing of the year continues to be almost completely ignored by the American media.
Montgomery County, Maryland politicians and government officials have loudly lamented police killings in Minneapolis and elsewhere while continuing to cover up a no-knock raid that is difficult to distinguish from an extrajudicial killing. Since banning so-called no-knock raids has been included in the new House Democratic proposal for police reforms, let’s take a look at a recent one that ended in the death of a 21-year-old man.
At 4:30 a.m. on March 12, a Montgomery County police SWAT team commenced a no-knock raid by firing into a bedroom window and fatally wounding Duncan Lemp as he lay in bed next to his pregnant girlfriend. Police then stormed the house, using flash bangs to intimidate Lemp’s mother and other relatives living in the house. Lemp bled to death while family members were handcuffed on the floor nearby.
Lemp was a savvy I.T. guy who was volunteering to assist gun rights groups in setting up secure websites and communications systems. But Lemp had no security to protect himself against police bullets coming through his bedroom window before dawn that morning.
During the raid, police officers repeatedly shouted at family members that everything they said and did was being recorded. However, Montgomery police may have either destroyed any videos or never made a recording. On June 5, lawyer Rene Sandler, representing the Lemp family, sent a letter to Montgomery County prosecutor Haley Roberts: “We have been advised that Police Chief Marcus Jones made an ‘on the record’ statement that no body cameras existed for the raid of the Lemp home and the killing of Duncan Lemp.” Sandler sought confirmation that the raid video footage existed and requested its immediate release. She received no response.
After seeing the Sandler letter, I emailed Montgomery County chief executive Marc Elrich and Montgomery County Police Chief Marcus Jones asking: “Can you confirm or deny that there is no body cam footage of the Lemp shooting?” I received no reply. I sent the same question multiple times to county prosecutor Roberts, the same lawyer who threatened Lemp’s parents if they attended a protest over his killing at County police headquarters in April. Roberts replied on June 12: “This matter is an open criminal investigation being handled by the Howard County State’s Attorney’s Office, and as such any inquiries should be directed to that office.”
The coverup of the Lemp killing is being aided and abetted by the Orwellian-named “Law Enforcement Trust and Transparency Act” which the county council enacted last year. Montgomery County and Howard County have an agreement to conduct reciprocal investigations of police shootings. Individuals I have spoken to involved in this case have zero confidence in the independence of the Howard County investigation—which conveniently permits Montgomery County officials to shirk all questions. Perhaps some months or a year or two from now, an “official report” will reveal the following: “We investigated our law enforcement friends and neighbors and found out that they did nothing wrong except for a glitch where one policeman’s finger accidentally bumped a trigger and inadvertently killed a dastardly gun owner who was also guilty of tweeting ‘The Constitution is Dead.’”
Montgomery County officials are offering endless dollops of piety in lieu of revealing how and why Duncan Lemp was killed, while the state government is perpetuating a “stay-at-home” dictate that is one of the nation’s strictest and has helped destroy tens of thousands of jobs. But county officials have nonetheless cheered mass rallies protesting the Floyd killing and racial injustice. The county police shut down a major road to assist Black Lives Matter protest marches in the heart of Rockville, Maryland, right outside of D.C.
In a June 4 Washington Post op-ed, county chief executive Marc Elrich declared, “The killings committed by members of the police force are truly horrible, without justification, often explained away and seldom punished appropriately.” But his fervor on this issue does not extend to revealing facts about killings by police under his command. Elrich has said nothing on the Lemp case.
On June 8, Chief Jones and the police chiefs of Rockville, Gaithersburg, and the chief of the National Park Service local division, issued a joint statement: “We…are angry and outraged over the killing of George Floyd by police officers in Minneapolis, Minnesota….We realize that we must work toward greater transparency and accountability in order to hold the public trust.”
The police chiefs then declared that they “hereby commit” to a set of reforms, including a pledge to “improve training in cultural competency for our officers.” “Transparency” was nowhere in the reforms.
Two days later, Chief Jones bewailed: “Over the past couple of weeks, I have been beyond angry. I’m sick to the core of my soul” over Floyd’s killing. But there is no evidence he has lost a moment’s sleep over a killing by his own SWAT team. While Jones has had plenty of time to publicly condemn the actions of the Minneapolis police, he has refused to meet with the mother and father of Duncan Lemp, whom his own officers killed.
Selective outrage extends to the top law enforcement official in the state. On June 10, Maryland Attorney General Brian Frosh, speaking on a Montgomery County panel organized by Communities United Against Hate, lamented that “these past few weeks have been awful,” referring to the deaths of George Floyd in Minneapolis and Breonna Taylor in Louisville. Frosh declared, “It’s not a surprise that many members of our community have lost trust in law enforcement when they see live, on videos, these events occurring.”
Floyd was brutally killed by an eight-minute knee on the neck after police sought to arrest him for passing a counterfeit $20 bill. Breonna Taylor was killed during an unjustified no-knock raid in the middle of the night. Police charged into her apartment seeking a drug suspect whom she had dated years earlier but who was nowhere near the scene. Taylor’s boyfriend fired at the police, hitting one officer in the leg. Police fired a volley of shots that left Breonna dead.
People are justifiably outraged by Taylor’s killing. But in the Lemp case, there were no shots fired at police who apparently began their assault by shooting into a bedroom window. On June 12, I emailed an inquiry to Frosh’s press office, asking whether he had made any public comments on the Lemp case and why he would “publicly comment on a Kentucky case that sounds similar to a case under his own jurisdiction?” Frosh’s office did not respond.
Conservatives sometimes accuse liberals of being virtue signaling zealots who are more outraged by prejudice than by wrongful killings. The Montgomery County Council could serve as Exhibit A.
In May 2019, a Montgomery County police officer used a racial slur during the trespassing arrest of four men who refused to leave a McDonald’s plagued by loitering and drug dealing. (Two arrestees had marijuana.) The policewoman, who was caught on tape, told the suspects that she was using the same term to describe them that they had just used themselves. Less than a week later, all nine members of the Montgomery County Council demanded the release of all body cam footage from the incident. MyMCMedia reported that the council also sought “recordings of calls related to this stop… the number and locations of all trespassing citations issued in the last two years… as well as demographic statistics about the residents stopped and frisked.” Three hours of police video on the incident were posted on the police YouTube channel two months later amidst much wailing and gnashing of teeth.
For the Lemp case, not a single County Council member has requested body cam footage. In fact, not one Council member has shown any interest in the case. The Council has a Public Safety Committee, but its members appear never to have heard of Duncan Lemp. The Council did, however, take decisive action on June 11 to declare racism a “public health emergency.”
Who actually killed Duncan Lemp? Nobody in Montgomery County appears to care. The police have not even disclosed the name of the officer or officers who killed Lemp. In a season when vast protests have occurred alleging racial bias by police, Montgomery County has gotten away with refusing to disclose whether Lemp’s killer(s) was white, Black, Hispanic, Asian, or Native American. Montgomery County preens over its progressiveness, but its police department’s procedures on disclosing the names of officers who kill are worse than those of Philadelphia, a city long renowned for police brutality.
Even more important than the name of the cop who killed Lemp is the question: Did the SWAT team intentionally turn a search warrant into a death warrant? If not, then why did they start the raid by firing into Lemp’s bedroom? Will we ever learn the facts?
We hope that liberal Maryland hasn’t adopted some kind of warped double standard in which the color of one’s skin determines whether there is a fierce investigation into a police killing or officers get away with murder. Duncan Lemp deserves more. As American citizens, we all do.
The neoliberal era of global governance has given us a centrist establishment in favor of free-market fundamentalism, austerity, open borders, and political correctness. It is far right on economic policy while often displaying the worst aspects of woke virtue signaling. Time after time, economic crises throughout the past three decades have resulted in bigger corporate bailouts, more jobs being shipped off to China, accelerated destruction of American communities, and the continued enrichment of both America’s and China’s ruling class at the expense of workers.
These issues have come into clearer focus with the COVID-19 crisis. Millions of Americans have filed for unemployment in a country where the vast majority of people already lived paycheck to paycheck. Amidst the chaos, a bipartisan voice vote on the Hill secured the largest corporate bailout in U.S. history. Many of the corporations that applied for and received bailouts have, at one point or another, closed their American factories and shipped production to China.
It is clear that the problem lies not just with the American establishment, which is obviously selling out its people, but also with the ultra-authoritarian and hyper-capitalist People’s Republic of China.
Donald Trump was the first U.S. president to so explicitly criticize America’s trade disadvantages with China. Many in the Republican party have since begun to support more hard-line measures against China, with Senators such as Josh Hawley, Tom Cotton, and Marco Rubio leading the way. But well before Trump began his trade war with China, it was Bernie Sanders in the nineties who vehemently opposed normalizing trade relations with China in the first place, on the grounds that it would destroy American jobs and communities. He was right then, and he is right now.
As right-wing populists lead the way in Congress to challenge the dominance of China, populists on the left should work with them hand in hand; only a bipartisan opposition can defeat a bankrupt bipartisan establishment.
China is, as mentioned earlier, an authoritarian capitalist giant. There is nothing progressive, and nothing conservative, in supporting a surveillance state that bans trade unions and forces Uighur Muslims into concentration camps. Securing the interests of American workers should be the utmost priority for every populist, left and right.
However, this should be done carefully. Napoleon once said of China: “[She] is a sleeping lion. Let her sleep, for when she wakes she will shake the world.”
The Thucydides Trap, a concept proposed by Graham Allison, holds that when a rising power threatens to displace an established one, war is the most likely result. Because China is an authoritarian state, democratic peace theory, the idea that democracies tend not to go to war with each other, does not apply. A new Cold War with China could be on the horizon.
Hence, it is essential to put the interests of the American worker first, and not the military-industrial complex that may seek to exploit this situation.
There are many things that the populist right and left can agree on. First of all, left populists must work with their right-wing counterparts to bring back good-paying union jobs for local communities. These jobs must ensure labor rights, good pay, and quality production. Bringing them back is essential not only for a post-COVID financial recovery but also for national security and the strength of American families. Josh Hawley’s “Rehire America” plan is the closest this country has come to a jobs guarantee, an idea the left loves. Pramila Jayapal, a prominent progressive congresswoman, came out with a similar program.
While these plans were drafted in response to the need to protect workers during COVID-19, they show that it is very much possible for both sides to agree on a new consensus that brings good jobs back home. This can further allow populists on both sides to tax and regulate predatory monopolies and uber-capitalist firms that often place workers in less-than-desirable conditions, to say the least.
Second, America must bring its essential supply chains back home to ensure that no other country can threaten American lives. With China barring American companies based there from selling masks to the United States, and with its state media threatening a blockade of medical supplies, populists should treat this policy as a top priority. Doing so will also help the populist coalition regulate big pharma more effectively at home, and it could prove essential in pursuing the popular Medicare for All policy that the populist Left champions.
Lastly, America must either reform or outright leave international organizations like the World Trade Organization. Such Bretton Woods–style organizations have upheld the global neoliberal order at the expense of the American worker and to the benefit of both American and Chinese oligarchies. While this is something one would expect to hear in anti-capitalist leftist circles, it also has appeal on the populist Right. Josh Hawley (R) wrote an enlightening piece on the issue fairly recently. Such a forceful attack on the global neoliberal order deserves the support of the populist left.
But whether such populist unity will be possible remains unclear. Many on the left label any criticism of China as “sinophobia,” while many on the right veer into outright racism on the same issue; such infantile attitudes must be put aside for the benefit of America’s workers and its security. Organizations such as the Democratic Socialists of America have attacked China’s economic and cultural imperialism in the past, and it is important now more than ever that the DSA encourage its members to cooperate with the right-leaning groups that are today more vocal in their opposition to China. Organizations such as the DSA and Justice Democrats must also encourage progressive legislators to adopt a more openly tough stance on China.
Similarly, conservative organizations that are sympathetic to the economic arguments motivating Trump’s voters must recognize the valid left-wing critiques of China and acknowledge that some of their aims overlap with the DSA’s. Nothing substantial can be done if populists don’t work together. It’s important now, more than ever, to prioritize the American people over corporate profits. The protection of American workers, jobs, supply chains, and national security is immensely important at this point in time. The centrist neoliberal order sold out the working class of this country, and it’s time for left and right populists to work together to take it back.
The post A Left-Right Populist Agenda To Take Jobs Back From China appeared first on The American Conservative.
Few ancient curses were more heartrending than the one Apollo is said to have visited on Cassandra. He gave her the power to predict future disasters, but decreed that no one would believe her predictions. American history is full of Cassandras. Time and again, prophets have warned that our social and political fabric was fraying because of injustices we have perpetrated at home and abroad. They urged the United States to change course. Victims of Apollo’s curse, they were dismissed or outvoted.
The most obvious of our unheeded Cassandras are civil rights advocates who have warned that the United States will remain forever hobbled if it does not confront the legacy of its founding covenant with slavery. Others are soothsayers who foresaw that oligarchs would seize hold of our political system—that “malefactors of great wealth” would squeeze the essence out of our democratic institutions and turn them into servants of a “military-industrial complex.” None have proven more prescient, though, than those who warned that pursuing empire abroad would ultimately bring grief at home.
For nearly two centuries, Cassandras in the United States have warned that lording over the weak in faraway lands would serve as a rehearsal for doing the same at home. If we take every distant challenge as a threat, and respond with bristling shows of force, we condition ourselves to react the same way when our own people challenge official power. If we care little about “collateral damage” that results from our operations abroad, it’s logical not to care much about it at home either. American power has often been a knee on the neck of foreign countries.
In 1898 the United States had the chance to take control of Puerto Rico, Cuba, Guam and the Philippines. Senator George Frisbie Hoar of Massachusetts, who like most Cassandras has since been lost to history, passionately warned Congress against succumbing to the imperial temptation. If the United States began projecting military power overseas, he warned, it would be “transformed from a Republic founded on the Declaration of Independence, guided by the counsels of Washington, the hope of the poor, the refuge of the oppressed, into a vulgar, commonplace empire founded on physical force, controlling subject races and vassal states, in which one class must forever rule and the other classes must forever obey.”
Another of that era’s now-forgotten Cassandras, the former senator and interior secretary Carl Schurz, warned Americans that if they began seizing foreign lands that they had promised to liberate, they would sacrifice their country’s moral authority. “What could our answer be,” he asked, “if the world would say of the American people that they are wolves in sheep’s clothing, rapacious land-grabbers posing as unselfish champions of freedom and humanity, false pretenders who have proved the truth of all that has been said by their detractors as to their hypocrisy and greed, and whose word can never again be trusted?”
The United States ignored those warnings. It set out on a long century of seeking, often quite violently, to shape the fate of peoples around the world. The result has been much as the Cassandras predicted. Many people in other countries have indeed come to see the United States as a “commonplace empire” and an exemplar of “hypocrisy and greed…whose word cannot be trusted.”
After World War II, Americans were encouraged to believe that an “American century” was dawning, and that other nations would have to yield to our superior power and wisdom. Among the Cassandras who protested was Vice President Henry Wallace. “We ourselves in the United States are no more a master race than the Nazis,” Wallace insisted. “And we cannot perpetuate economic warfare without planting the seeds of military warfare.” As punishment for advocating cooperation with the Soviet Union, Wallace was dumped from the presidential ticket in 1944 and replaced by the more reliable Harry Truman. He was another victim of Apollo’s ancient curse.
Around the same time that the Democratic Party was cleansing itself of dissenters from the Cold War catechism, the Republicans were doing the same. In 1949 Senator Robert Taft, who unsuccessfully sought the Republican presidential nomination three times, voted against creation of the North Atlantic Treaty Organization, which he called “an undertaking by the most powerful nation in the world to arm half the world against the other half.” He foresaw that “the building up of a great army around Russia” would divide the world “into two armed camps” and set off “an inevitable arms race.”
More than a century ago, one of the bitterest American Cassandras, Mark Twain, disgusted by our first wars of overseas conquest, wrote what today reads like an advance obituary for the United States. “It was impossible to save the great Republic,” Twain lamented. “She was rotten to the heart. Lust of conquest had long ago done its work. Trampling on the helpless abroad had taught her, by a natural process, to endure with apathy the like at home.”
America’s political system now seems less able than ever to address urgent concerns that grip millions of citizens. Governing institutions that were established in another age have proven unable to withstand assaults from faction and private interests. Our current political crisis is not an aberration or the result of a single election gone wrong. It is the product of forces that have been building in American society for generations. By ignoring our Cassandras, we have allowed our foreign-policy mentality of conquest and domination to shape our approach to our own people. Now the reaction is unfolding not just far away, but ever closer to home.
Stephen Kinzer was a foreign correspondent and bureau chief for The New York Times for more than 20 years, reporting from over 50 countries on five continents. He is the author of several books, including Overthrow: America’s Century of Regime Change from Hawaii to Iraq (2006). His latest, Poisoner in Chief: Sidney Gottlieb and the CIA Search for Mind Control, was published in 2019.
WASHINGTON—It’s summer in a presidential election year, which means it’s time for a proper veepstakes.
Whatever else may be lights out this season—Camden Yards, Coachella or much of the Carnival Cruise Line—we know at least one show will go on. As Barack Obama said on the eve of the 2016 election, “the sun will rise in the morning.” And the presumptive Democratic nominee for president will need a running mate.
Vice presidential selections are in some ways the papal conclaves of secular politics. The deliberations are done in secret (though with a generous sprinkling of leaks, of course), and the selection is not the choice of the laity or the polity but the upper echelon. In Rome, the guiding force is the Holy Spirit. In Washington, it’s the Electoral College.
For former Vice President Joe Biden, the choice just got a lot easier. He has said, prudently, that he will wait until at least August to announce his pick. But the supernova of outrage that has spilled out onto American streets in recent weeks has considerably narrowed Biden’s room for maneuver.
He has already pledged to pick a woman. But for Biden, failing to select an African-American woman is to risk forfeiting something his often staid campaign has at last picked up: enthusiasm. Biden hopes to be swept into power as a statesman at the helm of a nation hungry for fresh reflection—a place in dire need of a reckoning with its racial sins following the slaying of Minnesota man George Floyd—even as it looks for the light at the end of the pandemic tunnel.
The Associated Press reports that Biden’s camp is narrowing the contenders. The top tier: California Senator Kamala Harris, Massachusetts Senator Elizabeth Warren, and former National Security Advisor Susan Rice. Lagging behind: Minnesota Senator Amy Klobuchar, Atlanta Mayor Keisha Lance Bottoms, former Georgia House Minority Leader Stacey Abrams, Florida Representative Val Demings, and New Mexico Governor Michelle Lujan Grisham.
As the first half of 2020 has made unmistakably clear, much can change in three months. But Biden’s commitment to pick a woman, the fierce urgency of now to elevate an African-American, and his own perspective on his old job mean that only one prospective vice president meets the criteria for selection.
Consider first that Biden has emphasized the need for his lieutenant to be able to fill his shoes without hesitation. As he said in Iowa during the Democratic primary, whoever he picks must “be capable of immediately being a president because I’m an old guy.”
The importance of national security, both to the executive generally and to Biden personally, provides another clue. The former Vice President cares deeply about the issue—he played an outsized role in shaping the Obama administration’s foreign policy (particularly when compared to previous Democratic Vice Presidents, like Al Gore or Walter Mondale). He rivaled old Senate pals Hillary Clinton and John Kerry, Obama’s secretaries of state, in influence. And before 2008, he headed the Senate’s influential Foreign Relations Committee.
And given that his national security advisors from his days at the Naval Observatory, Antony Blinken and Jake Sullivan, are now arguably the campaign’s most powerful advisors, it’s clear he still cares about it. This is bad news for Gov. Lujan Grisham and gubernatorial runner-up Abrams, both state-level politicians with barren foreign policy credentials. The Sarah Palin fiasco in 2008, in which questions about foreign policy revealed to many her unsuitability for high office, also weighs heavily on the minds of Biden advisors keen not to repeat history.
It’s also true that Biden, having spent 36 years in the upper chamber, prefers Senators, although that alone should not rule out Congresswoman Demings, whose résumé includes time on the Subcommittee on Defense Intelligence and Warfighter Support; the Subcommittee on Intelligence Modernization and Readiness; the Subcommittee on Crime, Terrorism, Homeland Security, and Investigations; and—perhaps crucially, when combatting immigration hawk Donald Trump—the Subcommittee on Border and Maritime Security. However, Demings likely has another problem.
This is because Biden also favors figures who have run in national campaigns before, figures like himself in 2008. The enormous exposure, so the thinking goes, reveals battle-readiness (or lack thereof)—free vetting, in essence. Demings may have attracted attention as an impeachment manager, but that pales in comparison to the celebrity and political experience that come from a serious attempt at the White House. Moreover, that time working on impeachment could plausibly be portrayed by the Trump campaign as a distraction, a striking example of Congress frittering away time on partisan games when it could have been preparing America for the coming pandemic.
Susan Rice may pass the foreign policy test, but she’s even more vulnerable to national scrutiny than Demings. A career foreign policy official, she has never faced election, and her previous time in the national spotlight was far from positive. Whatever one thinks of her actual conduct during the Benghazi affair, the political fallout cost her the job of secretary of state when Obama won a second term. That she’s back in the news as a party in the “ObamaGate” imbroglio is also unhelpful. Even though Biden is running, to some extent, as a restorationist (painting the 45th president as firmly outside the American tradition), running with Rice may fuel perceptions of the campaign as a stale Obama-era remix, even as the Democratic Party—and, perhaps, the nation—has moved sharply left.
Since the dawn of the crisis this spring, Biden has said he thinks the country needs a presidency in the style of Franklin Delano Roosevelt. But running with one of Barack Obama’s most controversial subordinates would signal to the left, as well as to disaffected independents who voted for Obama, but then gave Trump a chance, that he’s unlikely to deliver such a transformative administration.
Before May, Amy Klobuchar was probably the front-runner for this role. But the tide of history has flowed with rapid and unforgiving speed, and her chances are now essentially nil. Not only is Sen. Klobuchar a longstanding member of the political establishment of Minnesota, the state at the center of America’s burning racial crisis, but she’s a former prosecutor with past cases flagged by activists. For Biden, she’s gone from safest bet to largest liability. The first rule of vice presidential selections is do no harm. He will not pick her.
In all likelihood, this now narrows the field to two principal front-runners: Elizabeth Warren and Kamala Harris. When Biden contemplated a run for the 2016 nomination, Warren was reputed to have been his first pick for VP—and given the bad feelings festering from the Bernie-Biden showdown, selecting her might help heal the wounds between the party’s moderate establishment and its radical (and ever-louder) counter-establishment. It would also signal new bonhomie between Warren and Biden on a personal level, after a more acrimonious relationship in the 2000s.
Somewhat ironically, Warren’s most salient feud is now with Bernie Sanders, the only candidate in 2020 more left-wing than she, and the target of her charges of sexism. Depending on the memories of Bernie supporters, this may depreciate the conciliatory value of a Warren selection, though likely only slightly. Warren would still be a valuable deputy, a signal to the progressive wing that Biden means business about all that FDR stuff.
And that’s why he probably won’t choose her. Wall Street has signaled ad infinitum its revulsion toward Warren. While Biden has (by all accounts sincerely) moved to the left over the last year, this is still the man who a year ago told a New York fundraiser that he would refuse to “demonize” the rich as president and that “nothing would fundamentally change.”
And besides, there is now a third, compelling criterion for Biden’s vice president. With almost 60 percent of Americans holding a favorable view of the movement, Black Lives Matter is a brand powerful enough to rival Make America Great Again, and perhaps to overwhelm it. The protests in America’s streets are effectively shock troops for the removal of Donald Trump.
Failing to capitalize on this energy and enthusiasm would be a rookie’s political mistake—and Biden, whatever else he may be, is no rookie. Significantly for a business-friendly nominee, corporate America has also endorsed BLM without equivocation. And of all the candidates for the deputy spot, Kamala Harris would be the best-positioned to convert this into winning political momentum.
Biden’s quiet base in the primary was arguably Hollywood—he’s a long-standing champion of the American motion picture industry, including aiding studios with their work in China. As the primary campaign reached its apex, celebrities helped him seal the deal against Sanders. But if Biden’s a made man in SoCal, Harris completes the circle by bringing in NorCal. There and elsewhere, she’s a fundraising juggernaut. Although the state’s electoral votes may not be in contention, its cash is—and San Francisco tech giants have been somewhat cool to Biden.
But Biden picked up $4 million earlier this month through the fundraising of Tom Steyer, the San Francisco hedge funder and Biden’s former presidential rival. LinkedIn founder Reid Hoffman and Laurene Powell Jobs are also reputedly helping Biden close his financial gap with Trump. And the Los Angeles Times reports that previously recalcitrant California donors have grown increasingly alarmed with the president and are finally ponying up.
An attendee at tech billionaire Sean Parker’s wedding in Big Sur seven years ago told me that on a political level, the event was a spectacle, with then State Attorney General Kamala Harris and future Gov. Gavin Newsom competing for donor attention. Harris won, the story goes, presaging the arrangement with Newsom that Harris would go to Washington and Newsom would reign in Sacramento. Harris would also get the first bite at the presidency. Though a distant memory now, it’s significant that Obama included Harris, and not Biden, in the list of future party standard-bearers he offered the New Yorker’s David Remnick in the immediate aftermath of Trump’s election.
To critics who complain that California, with its insane inequality, rampant homelessness, and politically correct culture, is a liability—the failed archetype of liberals’ vision for America—the Democratic Party could counter as Newsom does: California is a successful nation-state, “the fifth largest economy in the world, 40 million strong… as diverse a state as exists in this country.” They think they’re sitting pretty. And they probably are. And in this campaign, at this moment, so is Biden.
Expect a Kamala Harris selection come August.
Since the pandemic began, I’ve been described as a so-called “COVID warrior,” which makes some sense. After all, I’ve defended the shutdowns of large gatherings. I’ve insisted that it’s wise to temporarily close churches and postpone funerals and other ceremonies. I’ve argued that extreme caution is necessary—that to do anything else would be to blatantly and selfishly ignore the scientific information at our disposal. I’ve held the opinion that business owners who close up shop for fear of spreading contagion are in the right, even though the shutdowns have caused irrevocable harm to the economy and made millions of people suffer.
Now I feel like a fool.
By no means am I a coronavirus denier—more than 100,000 people and counting have died from the COVID. But with conflicting reports about everything from wearing masks to the spread of the virus through surfaces coming out of the World Health Organization and the CDC almost weekly, my head is spinning. Nothing seems to make sense anymore.
For fear of spreading the virus, health experts have consistently recommended shutting down and avoiding public spaces, including schools, playgrounds, public pools, and public transportation. They’ve also advocated for limiting large gatherings and closing anything that might draw crowds. It’s advice that’s been repeated for months—to the point that those ignoring it have been reviled and accused of experimenting with “human sacrifice.”
That’s because asymptomatic carriers of the virus, though they may feel all right themselves, can become mass spreaders of the deadly contagion, especially in large groups. This is why Michigan residents protesting their state’s lockdown in Lansing were deserving of shame—they likely caused mass immiseration and sickness, right?
Wrong. Turns out, health officials didn’t really believe any of that.
Just last week, the WHO announced that it’s extremely rare for asymptomatic spreading of the coronavirus to occur. If you feel fine, then you’re probably not a grave threat to anyone, especially if you’re wearing a mask and gloves. Then the WHO backtracked on that statement, ultimately arriving at the completely unhelpful determination that “this is a major unknown.” Health experts simply don’t know to what extent the disease is transmitted by asymptomatic carriers—yet they still feel confident that the risks of the coronavirus shouldn’t impact our protesting of police brutality.
One rightly wonders how, within a span of weeks, we went from shaming people for being out in the streets to shaming those who won’t join the crowd.
What’s more, contact with infected animals and surfaces is unlikely to cause COVID-19 to spread, and chlorine kills the virus upon contact, so clean pools are also safe. But of course, many schools, playgrounds, pools, and businesses were forced to close.
And now some journalists from prominent publications—the same ones that have been demanding oh-so-extreme caution—are performing breathtaking gymnastics in an effort to backtrack, explaining that there’s no evidence of outdoor coronavirus spread. Now, it’s “prolonged indoor close contact” that we have to worry about.
They may be right. Maybe protesters really shouldn’t worry (though they probably should). But that doesn’t excuse what seems to be a disgusting hypocrisy that trampled on the livelihoods of more than 30 million Americans. Understandably, many are outraged and have lost all faith in the experts.
Health advice can’t shift with politics—COVID-19, cancer, and the flu don’t know party lines. The virus is either unmanageable or manageable. That’s it.
Now, with Trump aiming to restart his so-called “MAGA rallies,” we’ll inevitably have—and already have had—another round of tut-tutting from the media about how horribly irresponsible it is to gather in crowds. But who can possibly blame those who shrug these warnings off? MAGA rallies very well could spread COVID-19, but in the event they do, the George Floyd protests will be equally culpable. Expert credibility has been lost.
Maybe we should, as many of my more classically liberal friends have been saying all along, allow people to make their own choices, take their own risks, open their own businesses back up, hold their own protests against injustice.
Whatever the case, given the whiplash the public has experienced over these past few weeks, we certainly won’t be running to health experts as readily as before. Certainly, social distancing practices have helped flatten the curve, but living your life based on the inconsistent messaging of the WHO and the CDC is a recipe for disaster. If a second wave does appear, it will be cautious individuals and community innovation that provide the solutions—not those who have done nothing to earn our trust.
Anthony DiMauro is a freelance writer based in New York City. His work has appeared in The National Interest, Real Clear Media, and elsewhere. You can follow him on Twitter @AnthonyMDiMauro.
The post I Warned About the COVID and Now I Feel Like a Fool appeared first on The American Conservative.
Six months ago, when journalist Christopher Caldwell published a book asserting that the Civil Rights Act of 1964 had grown into a “rival constitution” that superseded the old Constitution, everyone laughed. The New York Times review accused Caldwell of rehashing old segregationist arguments. Singled out for particular ridicule was the sentence on the penultimate page of The Age of Entitlement where Caldwell advises conservatives that “the only way back to the free country of their ideals was through the repeal of the civil rights laws.” How could the fate of American freedom depend on something so radical, and so unlikely, as the repeal of the Civil Rights Act?
No one is laughing now.
Justice Neil Gorsuch, in his majority opinion in Bostock v. Clayton County, has decreed that the anti-discrimination protections afforded to women under Title VII of the CRA must be extended to gays, lesbians, and the transgendered, because all of these are discrimination “on the basis of sex.”
This is not a narrow ruling that just means you can’t fire a person for being gay. Extending civil rights law to protect a whole new category carries with it a host of ancillary protections.
Harassment is a form of workplace discrimination. An employee can’t be subjected to a “hostile work environment” because of their membership in a protected class. Under Bostock, an LGBT employee could allege a hostile work environment if a coworker expressed the wrong opinion about Prop 8 or said he believed a person’s sex is determined at birth. Some employers are already justifying firing workers who won’t use someone’s preferred pronouns because discrimination law requires it. Misgendering, they say, is harassment.
Diversity training is a multi-billion dollar industry because of Title VII. Companies hire consultants to give seminars on “white fragility” not because they are progressive but because it protects them from lawsuits. They have a better chance of prevailing in an employment discrimination case if they can point to diversity training programs as evidence of their commitment to civil rights.
De facto hiring quotas are another inevitable consequence of civil rights law as it has been interpreted. If a company doesn’t employ a minority roughly in proportion to its share of the population, someone from an underrepresented group can use that disparity as evidence that the company discriminated against them. (Gallup estimates that 4.5% of the population is gay.)
It is no use protesting that the text of Title VII doesn’t mandate any of this, or that the Bostock opinion limits itself to outlawing explicit policies against hiring LGBT workers. The whole story of employment discrimination law, from 1964 to today, is an endless parade of new mandates not specified in the statute being hatched by human resources departments, adopted by companies eager to fend off lawsuits, and ultimately incorporated into case law.
Anti-discrimination law is kept vague for precisely this reason. It gives the activists more room to get creative. In the 1970s, the federal Justice Department begged the Equal Employment Opportunity Commission to issue a specific rule on how closely a company’s workforce had to match broader community demographics to avoid a discrimination charge (they suggested a cutoff of 80 percent). The EEOC preferred to keep its rule vague.
Title VII doesn’t require performance evaluations, grievance procedures, written job descriptions, speech codes, minority hiring targets, or diversity bonuses—yet all of these have been extrapolated from it. More than 80,000 charges of discrimination are filed with the EEOC in an average year, and tens of thousands of those eventually become lawsuits or five- or six-figure settlements. Employers have good reason to want to act defensively.
And of course the Bostock ruling won’t stay confined to employment law. The majority opinion protests, disingenuously, that “sex-segregated bathrooms, locker rooms, and dress codes” are “questions for future cases.” But federal law is full of prohibitions on sex discrimination (Justice Alito’s dissent lists over 100 such statutes), and every one of those will have to be reconsidered in light of today’s ruling.
Gorsuch claims that the ruling was grounded in judicial modesty. It doesn’t matter that sodomy was illegal in 49 states when the Civil Rights Act was passed, he says. If you can’t fire a woman for marrying a man, you can’t fire a man for doing the same without discriminating on the basis of sex, simple as that. Alito’s dissent accuses such rigid textualism of treating laws “as if they were messages picked up by a powerful radio telescope from a distant and utterly unknown civilization.” Both justices invoke Antonin Scalia to support their arguments.
Conservatives are split on the question of which justice is the real judicial activist, but both sides agree that in this case the solution is obvious: amend the law. It would be a curious silver lining to this massive defeat for conservatives if its ultimate effect were to make the rollback of the Civil Rights Act, which seemed so unthinkable when Caldwell’s book came out in January, a real political possibility. Once conservatives start thinking about what changes would have to be made to civil rights law before the left’s grip on our country’s institutions can begin to be loosened, it won’t stop with clarifications to the definition of sex.
On April 4, 1865, in the dying days of the American Civil War, President Abraham Lincoln wandered the streets of burnt-out Richmond, the former Confederate capital. All of a sudden, Lincoln found himself surrounded by scores of emancipated men and women. Here’s how the historian James McPherson describes the moving episode in his magisterial book Battle Cry of Freedom:
Several freed slaves touched Lincoln to make sure he was real. “I know I am free,” shouted an old woman, “for I have seen Father Abraham and felt him.” Overwhelmed by rare emotions, Lincoln said to one black man who fell on his knees in front of him: “Don’t kneel to me. That is not right. You must kneel to God only, and thank Him for the liberty you will enjoy hereafter.”
Lincoln’s legacy as the Great Emancipator has survived the century and a half since then largely intact. But cracks have appeared in that image, put there mostly by skeptical academics who decried him as an overt white supremacist. This view eventually entered the mainstream when Nikole Hannah-Jones wrote misleadingly in her lead essay for the “1619 Project” that Lincoln “opposed black equality.”
Today, we find Lincoln statues desecrated. Nor has the memorial to the 54th Massachusetts Infantry, one of the first all-black units in the Civil War, survived the recent protests unscathed. To many on the left, history seems like a succession of one cruelty after another, and so justice can only be served if we scrap the past and start from a blank slate. As a result, Lincoln’s appeal that we stand upright and enjoy our liberty gets lost to time.
Ironically, this will only help the cause of Robert E. Lee—and of the modern corporations that rely on cheap, inhumane labor to keep themselves going.
The main idea driving the “1619 Project” and so much of recent scholarship is that the United States of America originated in slavery and white supremacy. These were its true founding ideals. Racism, Hannah-Jones writes, is in our DNA.
Such arguments don’t make any sense, as the historian Barbara Fields presciently argued in a groundbreaking 1990 essay. Why would Virginia planters in the 17th century import black people purely out of hate? No, Fields countered: the planters were driven by a real need for dependable workers who would toil in their tobacco fields for little to no pay. Before black slaves did this work, white indentured servants had. (An indentured servant is bound to his master for a set number of years; he can’t pack up and leave to find a new opportunity elsewhere.)
After 1776, everything changed. Suddenly the new republic claimed that “all men are created equal”—and yet hundreds of thousands of slaves still couldn’t enjoy this equality. Racism helped to square our founding ideals with the brute reality of continued chattel slavery: black people simply weren’t men.
But in the eyes of the Southern slavocracy, the white laboring poor of the North also weren’t truly human. Such unholy antebellum figures as the social theorist George Fitzhugh or South Carolina Senator James Henry Hammond urged that the condition of slavery be expanded to include poor whites, too. Their hunger for a cheap, subservient labor source did not stop at black people, after all.
Always remember Barbara Fields’s formula: The need for cheap labor comes first; ideologies like white supremacy only give this bleak reality a spiritual gloss.
The true cause of the Civil War—and it bears constant repeating for all the doubters—was the question of whether slavery would expand its reach or whether “free labor” would reign supreme. The latter was the dominant ideology of the North: free laborers are self-reliant and eventually achieve economic security and independence by the sweat of their brow. It’s the American Dream.
But if that is so, then the Civil War ended in a tie—and its underlying conflict was never really settled.
Michael Lind argues in his new book The New Class War that many powerful businesses in America today continue to rely on the work of quasi-indentured servants. Hungry for unfree, cheap workers, corporations in Silicon Valley and beyond employ tens of thousands of foreign workers through guest-worker visa programs like the H-1B. These workers are bound to the company that sponsored their visa: if they find conditions at their jobs unbearable, they can’t switch employers without risking deportation. In turn, this source of cheap labor effectively underbids American workers who could do the same job, except that they would ask for higher pay.
America’s wealth rests on this mutual competition between workers—some nominally “free,” others basically indentured—whether it be through unjust visa schemes or other unfair managerial practices.
Remember that the next time you read a public announcement by the Amazons of this world that they remain committed to “black lives matter” and similar identitarian causes.
Fortunately, very few Americans hold the racial resentments in their hearts that their ancestors did even just half a century ago. Rarely have we agreed on as much as when the nation near-unanimously condemned the death of George Floyd at the hands of a few Minneapolis police officers. This is in keeping with another fortunate trend: over the last 40 years, the rate of police killings of young black men has declined by 79 percent.
But anti-racism as an ideology serves our corporations perfectly, despite the evidence that people in this country have grown much less bigoted than they once were: as a management tool, anti-racism sows constant suspicion among workers, who are encouraged to detect white supremacist sentiments in everything their fellow workers say or do.
We’re being turned into rats. Naturally, this is no fertile soil for solidarity. And with so many jobs precarious and subcontracted out on a temporary basis, there is precious little that most workers can do to fight back against this insidious managerial control. Free labor looks different.
And so, through a surprising back door, the true cause for which Robert E. Lee chose to betray his country might still be coming out on top, whether we remove his statues or not—namely, the steady supply to our ruling corporations of unfree workers willing to hustle for scraps.
It’s time to follow Abraham Lincoln’s urging and get off our knees again. We should assert our rights as American citizens to live free from economic insecurity and mutual resentment. The vast majority of us harbor no white supremacist views, period. Instead, we have so many more things in common, and we know it.
Another anecdote from the last days of the Civil War, also taken from Battle Cry of Freedom, might prove instructive here. The surrender of Lee’s Army of Northern Virginia to Ulysses S. Grant at Appomattox Court House on April 9, 1865 essentially ended the war. The ceremony was conducted with solemn respect for Lee, though one of Grant’s adjutants couldn’t resist a subtle dig at Lee’s expense:
After signing the papers, Grant introduced Lee to his staff. As he shook hands with Grant’s military secretary Ely Parker, a Seneca Indian, Lee stared a moment at Parker’s dark features and said, “I am glad to see one real American here.” Parker responded, “We are all Americans.”
Gregor Baszak is a PhD Candidate in English at the University of Illinois at Chicago and a writer. His articles have appeared in Los Angeles Review of Books, Public Books, Spectator USA, Spiked, and elsewhere. Follow Gregor on Twitter at @gregorbas1.
Meet Pastors Corey Brooks and Jasper Williams. In the Middle Ages, they might have been called “fools for Christ.” The Sermon on the Mount assured such men that they were storing up “treasures in heaven, where moth and rust do not destroy, and where thieves do not break in and steal.” Equal parts pastors, diplomats, and prophets, they are men of action who have turned the conventional narrative about race and poverty on its head in order to strengthen black America. Some might even call them saints.
Pastor Corey Brooks stood at the altar of New Beginnings Church on the South Side of Chicago. The sanctuary sits at 6620 S. King Drive, just south of “O-Block,” known as the most dangerous neighborhood in Chicago. It was the summer of 2011, and Brooks was tasked with burying yet another victim of gang violence—the city averages over 2,000 shootings per year. This time, Brooks had had enough.
As he surveyed the congregation, he decided to do what many pastors do all the time: issue an altar call. But this time, instead of asking people to bend a knee, he pleaded with them to lay down the illegal firearms he sensed some were carrying in the sanctuary. The room fell silent. For a moment, he feared for his life. Then the silence was broken when a young man came forward and placed his gun on the altar. After the memorial, he discovered several other guns left beneath the pews by men who had been too ashamed to stand up before the community, yet clearly wanted to amend their lives.
As he left the church, he looked down the street toward a motel whose presence weighed heavily upon the members of his congregation. The Super Motel was an epicenter of sex trafficking, drug use, and prostitution. For the previous 18 months, he had led protests of over 100 people on Friday and Saturday evenings, calling on law enforcement to shut down the den of iniquity. Eventually, Chicago police heeded his request, but the abandoned property still attracted illicit behavior even after its closing.
That’s when Brooks felt another nudge on his heart similar to the one he felt at the funeral. The situation required action, and he felt that God was knocking on the door. He responded by taking to the rooftop of the motel in prayer and protest. For the next 94 days, during a blisteringly cold Chicago winter, Brooks camped out and refused to come down until he raised $450,000 to purchase and demolish the structure. His efforts attracted national attention, and he was successful in accomplishing his goal.
Fast forward to 2020: Brooks now oversees both his church and a vibrant community center on the site of the old motel, Project H.O.O.D., whose mission is to empower community members with the “tools necessary to become peacemakers, problem solvers, leaders, and entrepreneurs.” Serving over 2,000 people, his team has launched countless start-ups, provided job training for certified construction workers, started a violence prevention and conflict mediation program, offered financial coaching to thousands, and run many other programs, from summer camps to a safe place for teens to gather on Friday evenings.
O-Block is safer on account of his efforts, and he’s developed street cred with gang members who respect the investments he’s made in the community. When asked what drives his seemingly endless ambition, Brooks responded with two things: “I really believe in God and I really do believe that one day I’ll have to stand before him and give an account for my time on earth…And then secondly…I want people to see that you can be conservative, and you can be black, and you can be in the hood, and make these principles still work.”
Pastor Jasper Williams took the podium at Greater Grace Temple in Detroit and ignited a powder keg. It was August of 2018 and the Queen of Soul, Aretha Franklin, was being laid to rest. The marathon, eight-hour service was attended by an A-list of celebrities and politicians including Ariana Grande, Jennifer Hudson, Whoopi Goldberg, Stevie Wonder, Bill and Hillary Clinton, Eric Holder, Jesse Jackson, and Al Sharpton. They did not know what Williams had in store.
Williams was a longtime friend of the family, having eulogized Aretha’s father, Rev. C. L. Franklin, a civil rights leader and organizer for Dr. Martin Luther King, Jr. Continuing that family tradition, Williams was invited back to pay his respects to Aretha, which he did with aplomb. What the mourners didn’t anticipate, however, was that he’d take the occasion—in front of a national audience—to issue a stirring wake-up call to black America.
“Black America has lost its soul,” he declared. It was time for the community to “come back home to God.” Now Williams is an eloquent orator—and an even better singer—so he seasoned his fiery sermon with sweetness and fatherly affection to build support from his audience, at least up to the point when he revealed his core concerns.
There was a time, he said, when black Americans had their own booming economy. Segregation was a grave injustice and positive evil, yet it forced the community to depend on each other. Black Americans owned their own grocery stores, barber shops, banks, and hotels. The community was thick. While integration was the chief accomplishment of the civil rights movement, Williams lamented that it was accompanied by “the loss of the black economy and the loss of the black man’s soul.”
Williams went on to lament that there are no fathers left in black homes and no men around “to raise a black boy to be a black man.” A revival of the family, specifically the home, said Williams, would mean more to the black community than any house that “Big Government” or “Big Business” wants to give them. In one of his most celebrated and often repeated lines, he proclaimed, “As the home goes, so goes the street. As the street goes, so goes the neighborhood. As the neighborhood goes, so goes the city. As the city goes, so goes the county. As the county goes, so goes the state. As the state goes, so goes the nation. As the nation goes, so goes the world.”
At the pinnacle of his sermon, Williams pushed the boundaries further by grabbing another third rail of American politics: black-on-black crime. He lamented that nearly 6,000 black people are killed by each other every year, according to a study from the Tuskegee Institute. “Do black lives matter?” he asked. “No, black lives do not, will not, ought not, should not, must not matter until black people start respecting black lives.” We can hear Aretha’s voice speaking to us today, he concluded, “It’s time now that my race turns around and comes back to God.”
The backlash was fierce. “Aretha Franklin’s family slams pastor’s ‘very, very distasteful’ funeral eulogy” read a headline at USA Today. NBC wrote that he “stirs controversy.” NewsOne called his sermon “a disgrace.” And Essence said, “Many were not happy with the eulogy.” According to AP News, “Williams was blasted on social media for misogyny, bigotry and the perpetuation of false science on race.” Unflinching, Williams stood behind his remarks.
Were these the words of a madman or a prophet? Had Williams himself lost touch with the soul of black America? Outsiders might ponder these questions, but his actions tell a different story. Known as “a man with a heart for the people” and the “Son of Thunder” for his service to the local black community and his magnetic preaching, Williams and his son pastor a congregation of approximately 10,000 people at two locations in Atlanta. Williams told me that when he returned from Detroit, his congregation heralded him as a hero.
If anyone is in touch with the needs of the black community, it’s Williams. Born in Memphis, Tennessee, he attended Morehouse College in Atlanta on account of Dr. Martin Luther King Jr., who would later become a friend and even solicit his advice on how to become a more effective preacher. For the past 70 years, Williams has been preaching and serving his community week in and week out.
In 2014, after the shooting of Michael Brown in Ferguson, Missouri, Williams said that “I heard the voice of God: Something needed to be done…and…[it] needed to be done through the church.” He considered protests and boycotts of white businesses, but after seeking the counsel of a close friend, he chose to call a meeting at the Commerce Club in Atlanta for civic leaders from every county in the metro area.
His goal was to “turn black America around.” After hearing from community leaders and conducting a listening tour during the next six months, he collected his data and met with Dr. Mary Langley at the Morehouse School of Medicine, who prepared a qualitative analysis of his findings. The results showed that the three greatest needs in the African-American community were family, home, and parenting.
Over the past six years, he’s developed a parenting curriculum that’s widely taught in churches and schools in the Atlanta area through his non-profit organization, A.A.C.T.S. (African-American Churches Transforming Society). Furthermore, his church also provides a wide range of services including welfare assistance programs, childcare, counseling, eldercare, and support for single-parent households.
If Williams emphasizes the reforms that can be made at the local level in African-American families, he also acknowledges the legacy of racism, and its impact on the black community. In his autobiography, It Ain’t But One, when discussing the issue of black-on-black crime—the lightning rod that sparked most of the criticism for his eulogy of Aretha Franklin—Williams cites “covert racism,” more than “overt racism” as the source of the problem. He explains that “the spirit of slavery still lingers” today. He notes that “white slave owners pitted their slaves against each other as a way of keeping them from uniting in opposition to slave holders.” This “atmosphere of suspicion,” he notes, “was a system of divide and conquer, and its terrible effect can be seen in our culture to this day.” Nevertheless, when looking for solutions to the grievances voiced by protest movements today, he believes that the most fruitful place to start is in the home.
What motivates Williams? Where does he get the courage to preach with such boldness as he did at Aretha’s funeral? “All you got to do is drive through our community and see how destitute our community is,” he said. He described the ethos as “lackadaisical,” “apathetic,” “nonchalant,” and lamented that there’s “no desire for true life.” Their pain is evident, and Williams understands it firsthand. In the 1980s, Williams found himself addicted to cocaine while simultaneously trying to pastor his church. He hid his habit from the congregation, but his life began to spiral out of control. After battling the addiction for years, he finally checked himself into rehab. He’s been clean ever since, and his testimony serves as a powerful example to those in his community struggling with the same issue. The situation on the ground is intolerable. “That’s drive enough for me,” he concluded.
There is a stigma in the black community against being conservative or Republican. Presidential candidate Joe Biden, who later apologized for his comments, provided a case in point when he recently stated: “If you have a problem figuring out whether you’re for me or Trump, then you ain’t black.”
Pastor Corey Brooks is all too familiar with this presumptuous mindset. When he surprised his community by endorsing a Republican candidate, Bruce Rauner, for governor of Illinois in 2014, the costs were beyond anything he could have ever imagined. He received death threats and had to hire private security. His family had to go into hiding, and his church was broken into and burgled. Seventy-five percent of his 2,500-member congregation left (taking their tithes with them), and unions were bussed into his neighborhood to protest day after day.
While he was taken aback by the reaction, he can personally relate to their feelings. When he was a freshman in college, he recalled, a professor wrote a list of political positions on the board without any labels, and each student had to identify where they stood. Brooks found the set of principles that most closely aligned with his values and waited for the professor to reveal the labels. He almost fainted when he saw that his views were conservative and Republican. His family had always been liberal and Democratic. How could this be?
Pastor Jasper Williams, on the other hand, resists labels altogether. When I asked about his political affiliation during our video interview, he said that “I’m not a Democrat or a Republican,” and prefers to call himself a “theo-crat-con based on what God says about the issues.” He elaborated on this point by explaining his frustration with how both parties treat African Americans: “The Democrats take us for granted, and the Republicans don’t think enough of us to ask us.” While Williams eschews partisanship, there’s no question that his views about the family and personal responsibility align more closely with beliefs conventionally held by conservatives than by liberals.
When I pressed Brooks to define conservatism, he gave the following principles: “fiscal responsibility,” “less governmental involvement,” “family focused,” “entrepreneurship and free enterprise,” and “not need[ing] all these social programs…to succeed.” Brooks notes that “Blacks embody conservatism…we have conservative views and thoughts, but it doesn’t translate into our political party affiliation.” For example, the church is a recognized source of authority in black neighborhoods, even for those who do not attend services regularly. And the importance of family, and intergenerational bonds, is widely esteemed in their communities. In short, black conservatism is a way of life more than an ideology.
As I spoke with Brooks, I was curious about his thoughts on the ideological shifts that have taken place on the Right since Trump’s election in 2016. Many have talked about a conservative realignment on issues like the Iraq war, immigration, and globalization. Mitt Romney’s platform looked a lot different than Donald Trump’s. The party of hedge fund managers became the Republican Workers Party, or so many have claimed. Did he identify with one brand of conservatism more than the other?
Brooks told me that the “workingman’s message is more appealing…[but] Trump himself makes it hard for the message to be received.” Reflecting on the issue of globalization, he said that it became hard for families to take care of themselves when they lost the strong economic bases in their communities. He also noted that “a lot of African Americans do believe that…[on account of] being so loose with immigration…a lot of jobs that young African Americans could have, they’re not available.” He discussed the pervasive myth among American elites that “nobody wants those jobs anyway.” He rejects that notion emphatically, saying “that’s not true,” and personally knows many people who would do these types of jobs if they had the opportunity.
When I asked Williams about the shifts in conservatism, he was more measured. While he agreed with Brooks’s analysis about the economic impact of immigration, he thought that the GOP was “too harsh and hard on keeping people out,” and that we should encourage immigration provided it’s done legally. And he noted that, when it comes to healthcare, it seems like Republicans “don’t have a plan.” On foreign policy, however, he acknowledged that Trump may have been right to a degree: “we have been too involved in other nations’ business.”
Where did it all go wrong? Both pastors pointed to the period immediately following the civil rights era. Speaking of Martin Luther King Jr., Williams said, “You can’t judge a man out of his times. Martin Luther King contributed a lot to the aggrandizement of the African-American community. No question about that. But with us looking back, we have 20/20 vision.” The civil rights movement accomplished many good things; however, the loss of the black economy and black-owned businesses that followed had far-reaching consequences.
Brooks points to liberal policy: “Government got too involved…in our families, and they started to fall apart.” He acknowledged that there are “underlying racial issues” that disparately affect blacks, “but even more so it’s about poverty.” He complained that all of America’s largest cities have the same issues affecting African Americans, and they are all run by Democrats: “High unemployment,” “high incarceration,” “lack of businesses,” “high single-parent households,” and “high abortion” rates.
He was very excited to see Republicans pick up the criminal justice reform issue, saying that it’s “real important…we have a lot of people who were incarcerated for way too much time for non-violent offenses.” He blames Clinton and Biden-era liberals for locking up a lot of men. “Hillary Clinton called them predators,” he said. Their policies “led to them [black men] being locked up more than anybody.” Williams had a slightly different take on the criminal justice issue. “If we change the culture of black America through parenting it would eliminate the justice system from being tilted the way it is,” he says. Politics is downstream of culture. Focus on the home and the politics will take care of itself.
When asked about the upcoming election, Brooks said that Joe Biden is popular “only because of one person: Obama…He gets the black card because [of] Obama.” Despite Biden’s success in the primaries with black voters, however, he thinks that he could lose the black vote the more that the incarceration issue is discussed on the campaign trail or at the debates. But don’t count on them pulling the lever for Republicans. While Brooks said that “I wish they would go out and vote for Trump and conservative values,” disenchantment with Biden over the crime issue is more likely to lead to them staying home, driving low voter turnout in black neighborhoods.
What about Kanye West? Perhaps the most influential black entertainer on the planet, and now, a Trump supporter. Brooks said, “I like Kanye,” specifically, “I appreciate his transformation. I like what he’s trying to do, that he’s being vocal about it. Speaking out about some of the ills of the music industry…I like that he’s talking about some conservative principles, especially as it relates to mass incarceration.” Are Brooks’s views about Kanye widespread in his local community? Not so fast. “They think Kanye is crazy, but [he] gets a pass because he’s a major entertainer, and he has great wealth.”
If someone with Kanye’s status and cultural capital is nevertheless written off for his rightward turn, Brooks and Williams face nearly insurmountable odds—at least as far as politics is concerned. Thankfully for their communities, and the country, their message is bigger than politics.
What does the Gospel mean to you? The answer that both men gave surprised me. Ask that question in other circles and you’re bound to get a lengthy explanation about different theories of the atonement. For them, the answer was simple: action.
Pastor Brooks responded with enthusiasm: “Living it out in the flesh…It’s about loving God but it’s also about loving your neighbor. The Gospel has to be translated into action.” Similarly, Pastor Williams noted, “Seeing the need and then doing something to fulfill the need that you see.”
Regardless of their politics, this is what makes their ministries so powerful. Both men possess seemingly boundless energy and unflinching courage. They’ve set out to change the world family by family and block by block. Their goal is to build a nationwide movement to provide resources for black communities to fight poverty by strengthening families, teaching job skills and entrepreneurship, and providing a place of refuge for those seeking to escape gang violence.
Conservatives like to retreat to seminars and conferences to debate ideology. While Brooks and Williams could no doubt hold their own in such a setting, they’ve taken a different path. Being men of action, they chose to spend their time building institutions that will continue to meet the spiritual, material, and intellectual needs of their communities long after they’re gone.
Those in politics like to think in terms of election cycles, be it two years or four years. Brooks and Williams take the long view. What they’ve built will pay dividends for a century. In light of the ephemerality of our politics, we could learn from their example.
This article was originally published in the July/August 2020 issue of The American Conservative.
John A. Burtka IV is the executive director and acting editor of The American Conservative.
American diplomatic and military support for Taiwan has grown dramatically during the Trump years. The administration has taken steps to boost that support, but Congress also has pushed its own initiatives. One key measure was the passage of the Taiwan Travel Act in 2018, which not only authorized but encouraged high-level defense and foreign policy officials to interact with their Taiwanese counterparts.
That was a dramatic change from the policy adopted when the United States shifted diplomatic relations from the Republic of China (Taiwan) to the People’s Republic of China (PRC) in 1979. U.S. policy thereafter had confined all contacts to low-level officials only. More recent congressional measures have sought to emphasize that the United States is firmly in Taiwan’s camp. The trend is not merely a matter of academic interest, since under the 1979 Taiwan Relations Act (TRA), the United States is obligated to regard any attempt by Beijing to coerce Taiwan as a “grave breach of the peace” in East Asia.
The U.S. determination to resist China’s attempts to exert its power in the Western Pacific has grown still stronger after Beijing imposed a new national security law on Hong Kong in May, greatly diluting (if not negating) that territory’s guaranteed political autonomy. The Trump administration, with bipartisan congressional support, rescinded Hong Kong’s special trade status and adopted other punitive measures.
U.S. leaders also sought solidarity from America’s allies in both Europe and East Asia for a joint statement of condemnation and the imposition of sanctions in response to the PRC’s erosion of Hong Kong’s autonomy. The lack of support from European capitals creates serious doubts about how much assistance Washington could expect if a showdown with China emerges at some point over Taiwan’s de facto independence. Allied backing on the Hong Kong issue was tepid and grudging, at best.
Among the European powers, only Britain (Hong Kong’s former colonial ruler) joined the United States in embracing a hardline approach. Receptivity to a confrontational policy was noticeably lacking among Washington’s other European allies. The German government’s reaction was typical. German Foreign Minister Heiko Maas contended that the best way for the European Union to influence China on the Hong Kong dispute was merely to maintain a dialogue with Beijing. That stance fell far short of being an endorsement of the U.S. strategy.
France appeared to be even less eager to join Washington in trying to pressure Beijing. The South China Morning Post reported that in a telephone call to PRC Foreign Minister Wang Yi, Emmanuel Bonne, diplomatic counselor to French President Emmanuel Macron, stressed that France respected China’s national sovereignty and had no intention to interfere in its internal affairs about Hong Kong.
The European Union itself adopted an anemic response to the PRC’s passage of the national security law. Anxious not to become entangled in America’s escalating rivalry with China, EU foreign ministers on May 29 echoed Germany’s preference and emphasized the need for dialogue about Hong Kong. After a videoconference among the bloc’s 27 foreign ministers, EU foreign-policy chief Josep Borrell said that only one country bothered to raise the subject of sanctions. Borrell added that the EU was not planning even to cancel or postpone diplomatic meetings with China in the coming months. So much for Washington’s goal of a common diplomatic front by the Western allies against Beijing’s actions in Hong Kong.
Washington did receive one apparent endorsement of its effort to gain allied cooperation for a stronger stance against the PRC. In early June, NATO Secretary General Jens Stoltenberg insisted that alliance members needed to adopt a more global approach to security issues, unlike the Europe- and North America-centric tack that he contended had usually shaped the alliance’s agenda. With an implicit reference to China, Stoltenberg stated that “as we look to 2030, we need to work even more closely with like-minded countries, like Australia, Japan, New Zealand and [South] Korea, to defend the global rules and institutions that have kept us safe for decades.” Highlighting those nations for special mention was hardly coincidental. And in an unsubtle slap at Beijing, he contended that the greater cooperation with the noncommunist Pacific nations aimed to create an international environment based on “freedom and democracy, not on bullying and coercion.”
Stoltenberg is swimming upstream, given the strong indications from leaders of the EU and such key EU powers as France, Germany, and Italy that they have no wish to adopt a confrontational policy toward China. And even Stoltenberg emphasized that NATO cooperation with China’s East Asian neighbors would not be primarily military in nature. However, nonmilitary support will be of small comfort to the United States if a showdown over Taiwan materializes.
The reaction of key Asian allies to Beijing’s new restrictions on Hong Kong was not measurably better than the level of support Washington received from its European allies. Japan’s response likely disappointed Washington the most. After more than a week of internal debate, Prime Minister Shinzo Abe’s government declined to join the United States, Britain, Australia, and Canada in issuing a statement condemning the PRC’s actions in Hong Kong. Press reports indicated that the decision “dismayed” U.S. leaders. South Korea seemed even more determined than Japan to avoid taking sides in the dispute between the United States and China.
The bottom line was that with the exception of Australia, the United States could not count on its East Asian allies for even diplomatic and economic support against the PRC in response to its actions regarding Hong Kong. Such an outcome does not bode well if Washington seeks stronger backing—especially military backing—in the event of PRC aggression against Taiwan.
Unfortunately, the prospect of such aggression is increasing rapidly. Beijing has explicitly removed the word “peaceful” from its stated goal of inducing Taiwan to accept unification with the mainland. Equally troubling, PRC military exercises in and near the Taiwan Strait are becoming ever more numerous and menacing. On June 9, Chinese fighter planes once again violated Taiwan’s airspace, causing Taipei to send its own planes to intercept the intruders. The overall level of animosity and tension between Beijing and Taipei is at its worst level in decades.
Washington faces the prospect of being called upon to fulfill its implicit commitment under the TRA to defend Taiwan’s security. The trigger could come in the form of a PRC attack on some of Taipei’s small, outlying island holdings directly off the mainland or in the South China Sea. Even a frontal assault on Taiwan itself cannot be ruled out. Such developments would immediately test the seriousness and credibility of the U.S. defense commitment.
Worse, the United States might well be waging the military struggle alone. The European allies almost certainly would not embroil themselves in a U.S.-China war. The reaction of Australia, South Korea, and Japan is somewhat less certain. PRC coercion against Taiwan would constitute a far more serious disruption of East Asia’s security environment than Beijing’s decision to tighten its grip on Hong Kong. All three countries would face an agonizing dilemma. If they joined a U.S.-led military defense of Taiwan, they would face severe retaliation from Beijing. However, if they left the United States hanging, U.S. leaders, enraged at such a betrayal, would likely terminate Washington’s security alliances with those countries.
In any case, the United States cannot count on military support from its allies in a showdown with the PRC over Taiwan. It is yet another risk factor that Washington needs to take into account as it does a badly needed, long overdue, risk-benefit calculation regarding America’s commitment to Taiwan’s defense.
Ted Galen Carpenter, a senior fellow in security studies at the Cato Institute and a contributing editor at The American Conservative, is the author of 12 books and more than 850 articles on international affairs.
Rod Dreher, TAC senior editor: Last summer, on a vacation in England, I found myself in a bookstore skimming through The Order of Time, by the physicist Carlo Rovelli. It’s a dense but lovely book about the fundamental nature of Time—lovely because Rovelli writes about scientific abstractions with a poet’s gift. I bought a number of books on that trip—if you plan to visit London this year, bring an extra bag for your haul from Daunt Books—which meant that back home, Rovelli’s slim volume quietly took a place on my bookshelves, in the “Unread, But I’m Going To Get To It, I Swear” section.
It took a television show to make it happen. I’ve lately become addicted to the Netflix series “Dark,” a German-language sci-fi drama that’s like what you would get if David Lynch had directed “Stranger Things.” The series is about Time, and what the past and the future have to do with the present. This is a venerable theme in science fiction, of course, but “Dark” explores it in ways that have surprising moral and metaphysical weight. After one episode introduced the character of an elderly physicist who muses on the nature of Time, I found myself asking, “Is what he says real, or is it just something they made up for TV?” And so I dug out the Rovelli, and began to read.
As it turns out, the book is more difficult than I expected, especially in the second half—but also thrilling at times. And yes, Rovelli confirms at least some of the TV series’ metaphysical conceits. For example, an investigator looking into the disappearance of a boy switches from wondering where the boy is to when he is. According to Rovelli, that is a more accurate way of thinking about the universe.
“We cannot think of the physical world as if it were made of things, or entities. It simply doesn’t work,” he writes. “What works instead is thinking about the world as a network of events.”
Rovelli says that scientists now believe that the causal structure of the world—that is, past, present, and future—describes reality, but only in a limited way. In one chapter, Rovelli discusses how our grammar is simply inadequate to convey the true complexity of reality: that the past and the future do not have universal meaning. This is a fundamental insight of Einstein’s, but over a century later, its implications still have the power to overwhelm our capacity for reckoning with them.
In one of his most moving chapters, Rovelli muses on the intersection of time with a person’s identity—that is, how the nature of our individual personhood depends on memories. A person who wakes up in a new world every day remains a human being, but who is that person? There is no “who” without a series of “whens”—and this is something the series “Dark” takes up, though in an exaggerated form that makes for good dramatic television. It’s not surprising that a sci-fi drama requires a leap of faith beyond what is scientifically credible. What is surprising about “Dark”—and I’ve confirmed this with sources beyond Rovelli’s wonderful little book—is how close to scientific truth the series sticks.
Reality is genuinely that bizarre. It was a delightful surprise to have an eerie sci-fi TV drama send me to a book on the real-life mysteries of time and personhood, and to have the show open up the book to me in a new way. Having just finished both the existing two seasons of “Dark” and Rovelli’s book, I have been thinking about the recurrence of certain themes across generations in the life of my family and my hometown, and reflecting on the metaphysical implications of William Faulkner’s well-known observation, “The past is never dead. It’s not even past.” Now, about all those Faulkner novels on my Unread, But shelf…
If there is one thing I have learned in making a career out of studying North Korea, it is this: the Kim regime loves to tell those who pay attention to its unique brand of propaganda, many times over, what it is going to do before it does it—sometimes even telegraphing for us in clear language the next crisis it intends to start.
If that pattern holds now, then Pyongyang seems once again hellbent on sparking a crisis with the U.S. North Korea—angry that multiple summits with Seoul and Washington have failed to produce sanctions relief despite what it feels were big concessions on its part—could very well bring us back to the days of “fire and fury,” and to a crisis perhaps worse than the dark days of 2017.
The only questions in my mind are the time, the place, and the type of escalation it could choose. But again, North Korea has handed us its plans well in advance; all we must do is listen and prepare for what is coming.
And, at least for now, the latest challenge with North Korea is a squabble between Seoul and Pyongyang that could get quite serious. The Kim regime, using the excuse of anti-regime leaflets that have been sent by South Korean activists on and off for decades, has been warning for several weeks that such action could spark a response. Kim Yo-jong, Chairman Kim Jong-un’s sister, has been the lead figure in putting out press release after press release attacking South Korea. And, with a warning that such action was coming, North Korea on Tuesday blew up the Inter-Korean liaison office, sparking another crisis on the Korean Peninsula.
And from here, things are bound to get worse, as North Korea clearly has plans for President Trump—which could very well mean a test of an ICBM, a long-range missile that, in theory, could hit the U.S. homeland. In fact, true to form, North Korea keeps dropping hints that this could happen—and soon.
Since October of last year, and especially after failed talks in Stockholm last fall, Pyongyang has steadily escalated its threats, signaling that some sort of aggressive act was coming.
Recall the North Korean threat to give the U.S. a “Christmas gift” late last year, perceived by many to be an ICBM test, which thankfully never occurred. Then, at the end of the year, Pyongyang vowed to show the world a “new strategic weapon,” which many believed was also a reference to a new ICBM test. And in statements late last year and early this year, Pyongyang has said it no longer feels bound by any pledges made to America regarding halting ICBM and nuclear weapons tests.
Frankly, until the last few days I did not think North Korea would dare test an ICBM this year, and I dismissed such talk, reasoning that Pyongyang would be recreating the very conditions that sparked a nuclear showdown with the U.S. back in 2017. However, in a recent press statement timed to the second anniversary of the Singapore Summit, North Korea explained that “[T]he secure strategic goal of the DPRK is to build up more reliable force to cope with the long-term military threats from the U.S.”
While that can clearly mean many things, I would argue that North Korea has now threatened the U.S. once again with testing ICBMs. At present, the Kim regime’s nuclear and missile forces are clearly not “reliable” enough to ensure a successful nuclear strike on America. Pyongyang, at least according to open-source data, has never tested a dummy warhead shown to survive re-entry through Earth’s atmosphere. So far, Pyongyang has only demonstrated the range of its Hwasong-14 and Hwasong-15 ICBMs, back in mid-to-late 2017. The only way North Korea can develop a “reliable” ICBM is to keep testing—and that means testing a working warhead design in the field, not confined to a lab.
And with reports last week that North Korea has built and displayed more ICBMs, Pyongyang could very well spark a crisis later this year with a fresh round of testing. One cannot even rule out a July 4 launch—the date, three years ago, on which North Korea tested its very first ICBM—or something even sooner, say on June 25, the 70th anniversary of the start of the Korean War.
All of this foretells a very dangerous few weeks or months to come. If North Korea were to test an ICBM, Trump would lose one of the signature diplomatic accomplishments he loves to brag about time and time again. It stands to reason that Trump would take great offense at such a move, knowing the Biden campaign would use the launch as ammunition in the campaign. And that can only mean one thing: Trump getting ultra-aggressive in trying to bring North Korea to heel. In the pressure cooker that is a U.S. presidential election, there is no telling what Trump will say or do to push back against North Korea’s act of defiance.
Welcome to the North Korea crisis of 2020. North Korea warned us it was coming.
The post Kim Jong Un’s Well-Timed ‘Surprise’ Portends Another Nuke Crisis appeared first on The American Conservative.
A grievous blow to international security has just been dealt. President Donald Trump has announced that American troops are to be withdrawn from…Germany. Yes, Germany. Why are American troops in Germany? Because we have to fight them over there so we don’t fight them here, you see, and there are few generators of terrorism and chaos in the world today quite like the Berlin club circuit.
The real surprise isn’t that we’re pulling troops out of Germany; it’s that they’re still there 75 years after World War II ended. And according to the White House, they aren’t even all being withdrawn, just a quarter of them, a reduction of 9,500 that will leave 25,000 American boots on German soil. And of the 9,500, a senior administration official tells Reuters that at least some will be redeployed elsewhere. Poland, with its closer proximity to Russia and plans to build a Fort Trump panderopolis, is a likely destination.
Which is to say: this is less a withdrawal than another Trumpian “withdrawal,” a stagey bit of realist theater that ultimately leaves America just as militarily overextended as she was before. It’s similar in that way to Trump’s “withdrawal” from Syria last year, which saw American forces leave only to boomerang back in on a cynical mission to protect that country’s oil fields. Yet that hasn’t stopped the usual suspects from shrieking about how Trump is playing demolition man to the postwar order. Critics have accused him of initiating the Germany pullout to spite Angela Merkel. Twitter this week was hot with speculation that he was acting at Putin’s behest. House Republicans, channeling the Hans Blix puppet in Team America, sent Trump a letter.
Hans Binnendijk, a fellow at the Atlantic Council, warned at DefenseNews that pulling American troops out of Germany could undermine NATO attempts to protect Europe against Russia. The alliance’s “deterrent posture,” he intones, “is already fragile.” Worse, “a withdrawal would be a clear signal that Trump is not serious about defending Europe. It would undercut the very deterrent strategy that both the Obama and Trump administrations have put in place to contain an aggressive Russia.”
But who’s really undermining deterrence here? Who’s really unserious about defending against spooky Russian imperialism? Is it the United States, which is burying itself in debt to maintain tens of thousands of troops on the European continent? Is it Donald Trump, who has beefed up America’s military presence in Poland and sent an additional 20,000 soldiers to Europe for anti-Russian military exercises? Or is it Germany, which since the Cold War has slashed its armed forces to fund its benevolent welfare state and balance its budgets? Is it the Bundeswehr, the German military, which internal reports have found to be plagued by deterioration and dysfunction? Is it Angela Merkel, whose government announced two years ago that not only would it not meet NATO’s target of spending 2 percent of GDP on defense, it wouldn’t even clear its own downscaled goal of 1.5 percent?
Germany now says it intends to hit the 2 percent mark by the lickety-split deadline of 2031. If Russia really is the primed-to-blow menace that the foreign policy establishment claims, then such foot-dragging ought to have elicited outrage from Arlington to Foggy Bottom. Instead the response was mostly muted. The elite narrative still holds: Donald Trump is steering America towards ruin and Angela Merkel is the new leader of the global sisterhood of the traveling pants. That the reality, at least on military spending, looks like the molecular opposite matters little. That America’s troop presence has clearly enabled the problem, entitling the Germans to protection without ever making them pay for it, is rarely acknowledged.
Elsewhere, at the Free Beacon, the hawkish writer Matthew Continetti has his own dire assessment of the “withdrawal.” Most of what he writes is the usual Kagan-esque noisemaking: as the U.S. pulls out, chaos moves in; neurotic “host governments” are in constant need of reassurance; et cetera. But Continetti also makes a more striking claim: pulling troops out of Germany, he says, is of a kind with the protests and riots that followed the killing of George Floyd. The reason? Both are symptoms of “a loss of national self-confidence, an outbreak of intellectual and moral uncertainty, and an unpredictable, erratic, and easily piqued chief executive.”
Continetti is certainly correct that Trump is erratic. And I suppose he’s right about “intellectual and moral uncertainty,” if only because international relations done right rarely offers up absolute certainties. But the baton twirler in his parade of horribles, “a loss of national self-confidence,” now that’s interesting. It, too, is technically correct—hefty majorities of Americans tell pollsters we’re headed in the wrong direction—but Continetti skips over the reasons why. He says nothing about our invasion of Iraq and subsequent failures there, which brought to an end the credibility and swagger that America enjoyed internationally after the Cold War. He’s mum, too, on how our wars in the Middle East have served to distract us from simmering problems closer to home, which have lately come to a boil.
It isn’t a marginal downsizing of America’s empire that’s shown a crisis of confidence; it’s the fact that the empire is still there, long after it stopped being a net positive. That the United States is still supplying boots and bases to the most powerful country in Europe is preposterous. That it’s still trying to mend the Middle East more than a decade after Iraq fell apart is lunacy. That it’s still covering the defense of South Korea, another wealthy powerhouse, is self-defeating. It isn’t that Americans don’t want to bring the troops home; Trump was elected on a platform to do just that. It’s that Washington lacks the mettle to change course. It doesn’t want to accept that change is necessary; it certainly doesn’t want to undertake the discomfiting business of shuttering bases and ruffling allies. Far easier to let the thing run on autopilot, flying drones on borrowed money, relegating it all to faint background noise.
If we really had our national confidence about us, we wouldn’t be afraid to respond to shifting circumstances. We would follow Dwight Eisenhower’s example after the Korean War and set about bringing the military-industrial complex in line with our needs. We would demand that other countries step up, content that multilateralism need not be incompatible with leadership and even a little nationalism. We would stop pretending to be the global savior. Instead the establishment seethes because Trump has acknowledged the Third Reich is no longer a threat. May God have mercy on the cabarets.
Speaking to a national television audience on June 14, French president Emmanuel Macron was crisp and to the point about the possibility of vandalizing riots coming to his country: “The republic will erase no trace or names of its history, it will forget none of its works, it will tear down none of its statues. We must instead lucidly look together at our history, and in particular our relationship with Africa.”
In those two sentences, Macron covered the concerns of both the right and the left: To the right, he offered law and order, and to the left, he offered an examination of racism. Yet Macron was careful to add, too, that any such examination of the past would not lead to a “hateful” rewriting of French history. In selling this message, it helps that Macron’s press secretary is the Senegal-born Sibeth Ndiaye, whom France 24 describes as daily “earning a reputation for blunt speech and a willingness to put the press in its place.”
Yet undeniably, Macron was putting more of his political chips down on the side of law enforcement, adding, “Without republican order, there is no security or freedom, and this order is ensured by police officers and gendarmes.”
Macron is right, of course; more questions are, indeed, settled by blood and iron than by papers and speeches.
Still, Macron’s political artfulness—he was elected as a centrist in 2017, leading a party that had come into existence only the year before—is to combine concepts dear to both the left and the right in the same message.
And yet Macron’s political effectiveness is based on much more than just the man himself; it’s based on the credibility of the French state. Indeed, as we think about how that state keeps order, we might better appreciate why France has existed as an identifiable nation for the past 1,500 years—the French must know something about statecraft.
So at a time when America seems to be coming apart, drifting maybe even toward civil war, we might seek to learn from a political model that enshrines solidarité.
After all, these days, we’re getting a lesson in the impact of non-solidarity. And we might conclude that there’s something about Trumpism that is so provocative—so much of a red flag to so many—that it’s painful to imagine what would happen to the country if this president is re-elected. And yet at the same time, we can observe that the rise of the Woken is so fearsome, to so many, that a second Trump term, as a purported antidote, is still conceivable.
So maybe Americans can learn something from a country whose president can make a pronouncement about public safety, and the public mostly goes along—with confident cops there to make sure everything stays in place.
Unsurprisingly, the roots of French order run deep. In 2017, this author recalled Cardinal Richelieu, who dominated France in the early 17th century. To be sure, even the slightest brush with Richelieu’s biography reveals a man whom wokesters today would seek to cancel, for any number of reasons—although as we are learning, such cancellation could apply to just about any historical figure. The difference is that in France today, the statues of Richelieu are safe.
Of course, in between Richelieu’s time and ours, France has been rocked by its share of civil wars, revolutions, riots, and desecrations. And yet in each instance, France has been renewed, by figures such as Jean-Baptiste Colbert, Adolphe Thiers, and, of course, Charles de Gaulle. And so to walk around Paris, or any city or town in France, is to see plenty of intact history, watched over, as need be, by les flics. (There’s no such thing as a French Civil Liberties Union.)
Of these renewing French historical figures, the most proximate is de Gaulle (1890-1970). A hero of both world wars, de Gaulle led the country briefly from 1944 to 1946 and then, again, from 1958 to 1969. And even now, a half-century after his death, his memory still looms over French politics.
De Gaulle’s life and influence were ably captured by historian Julian Jackson in his 2018 biography; Jackson portrays his subject as cold and imperious, befitting a 6’4” man of ramrod military bearing and intense ambition. Yet at the same time, de Gaulle was well-read and oft-published. And he always possessed, as he himself wrote, “a certain idea of France.” He was emphatically on the right, and yet de Gaulle was no ancien régime monarchist, nor was he an anti-Semite. Instead, he was inspired by the French poet-turned-war hero Charles Péguy, who extolled a Catholic-influenced “patriotic faith.”
It was this love of country that made de Gaulle flexible: He knew that a nation could survive only by adjusting, as required by events and circumstances.
As Jackson details, de Gaulle led the French government-in-exile in London during World War Two, working closely with Resistance groups in Nazi-occupied France. Indeed, in 1943, his agent, Jean Moulin, pulled eight different groups into a Conseil National de la Résistance. Its goal was not only to coordinate a sabotage campaign against the Germans during the war, but also to outline a new social contract for the post-war nation. That accord combined both market forces and strategic nationalizations, thus outlining the sort of mixed-economy balance that enabled all groups to feel invested in the system. The result, starting in 1945, was the trente glorieuses, the thirty glorious years of unparalleled prosperity.
De Gaulle proved flexible again in 1968, when student unrest threatened to metastasize into outright rebellion. In response, the president first reinforced his governing coalition by bringing in law-abiding left-wingers. Next, he arranged for a counter-demonstration on the Champs-Élysées—spearheaded by Gaullist labor unions—that outnumbered the protestors. And then he held a snap election, giving him an even larger majority.
Indeed, like Macron a half-century later, de Gaulle played the protestors two ways, brandishing a stick and offering a carrot. On the one hand, he called the rebels in the streets a “totalitarian enterprise,” a not-so-sly way of calling them communists—which, of course, many of them were. And yet at the same time, he co-opted the term “revolution,” declaring, “If a revolution consists of fundamentally changing what exists, and notably the dignity and conditions of the working class . . . I am not at all upset to be called in that sense a revolutionary.”
As The New Yorker’s Adam Gopnik wrote two years ago of de Gaulle and his enduring influence, “With his love of honor and pageantry, de Gaulle might seem to offer a very dated model of politics. And yet in an odd way there’s an urgent, living lesson for the twenty-first century in what de Gaulle accomplished, one that can’t be overlooked—indeed, President Macron spends every day trying not to overlook it.”
Continuing, Gopnik added that de Gaulle’s sense of nationhood transcended the familiar ideological and partisan differences: “What de Gaulle’s example reminds us is how valuable an insistence on the shared symbols of a common fate can be if carried out with integrity and a residual deposit of democratic values. The politics of grandeur, he shows, need not be the exclusive province of bullies and gangsters and crooks and clowns. It’s a fine French lesson.”
Today, Macron would agree that one specific lesson of de Gaulle’s life is that, yes, sometimes, power must be used to protect the state. As de Gaulle wrote in 1938, “All the virtue in the world is powerless against firepower.” Indeed, just this week, after a moment of indecision, Macron’s government announced that the police would continue to use chokeholds as a tactic to subdue miscreants. The French may sometimes talk a good game about free expression, but for them, always, the bottom line is hard-nosed raison d’état.
Still, the French model isn’t only about force majeure; it’s also about liberté, égalité, fraternité. That is, France has a comprehensive government, and so the face of the state is more likely to be that of a school teacher, or a social worker, than that of a police officer. Of course, this thickness of governance doesn’t come cheap: The French state consumes some 55.6 percent of GDP, which is higher, even, than the European Union average of 45.8 percent—and far higher than the U.S. share of the economy going to government, 37.8 percent.
Most Americans would likely be appalled to think of a government as big as France’s. But then, again, they are likely to be more appalled—if not downright horrified—by the strife we’re seeing today. So if there’s a better model for law, order, and justice, made in the U.S.A., this would be a good time to hear about it.
Culture war is often derided as an American innovation, but as with so many things, the French got there first. From 1897 to 1899, France was torn apart by the affair of Alfred Dreyfus, a Jewish staff officer who had been wrongly convicted of spying for the Germans. Now a minor historical footnote, the Dreyfus Affair was once an all-consuming controversy that divided a country and captivated politicians, journalists, and intellectuals across Europe.
Historical analogies can be crude and imprecise, but there are some irresistible parallels between Dreyfus’s ordeal and the eruptions that followed the killing of George Floyd by a Minneapolis police officer. The United States finds itself in the position of France, a declining power that still has immense cultural sway. Our domestic culture war has gone global, inspiring protests as far afield as Dublin, Berlin, and Mexico City, just as foreign observers once obsessed over the spectacle of France at war with itself.
But the most striking parallel is the depth and intensity of the cultural animosities on both sides, which are fighting not over policy proposals but over profoundly different political visions. In The Proud Tower, Barbara Tuchman writes that Dreyfus, never a particularly notable personality to begin with, became an “abstraction” to his supporters and detractors alike. She summarized the stakes of the Affair as follows:
Each side fought for an idea, its idea of France: one the France of Counter-Revolution, the other the France of 1789, one for its last chance to arrest progressive social tendencies and restore the old values; the other to cleanse the honor of the Republic and preserve it from the clutches of reaction.
It is hard to think of a more apt comparison to the current moment. The language of existential conflict was mainstreamed on the American right by the 2016 election. A now-infamous essay, “The Flight 93 Election,” compared voting for Donald Trump to a desperate attempt to retake a hijacked plane from the 9/11 terrorists. On the left, the incremental liberalism of the Obama administration has given way to something more radical, a thoroughgoing critique of American institutions and history that suggests—and sometimes says outright—that revolutionary change is the only path forward. From health care, where plans for universal coverage have supplanted the modest reformism of Obamacare, to the American War for Independence, which has been reimagined by The New York Times as a struggle to preserve slavery, the trend is unmistakable.
The killing of George Floyd has thrown this existential conflict into stark relief. On the left, the slogan du jour is “Defund the Police,” a call to action that would have been dismissed as crankish or absurd just a few years ago (indeed, a common complaint prior to Floyd’s death was that minority communities suffer from a toxic combination of intermittent, state-sanctioned brutality and under-policing). But abolishing police departments isn’t a platform—it’s a provocation. There could be no peace between the France of the Counter-Revolution and the France of 1789, just as there is no reconciling conservatives who still revere America’s history and institutions and a newly ascendant left that questions their very legitimacy. The liberalism of yesteryear, which emphasized fealty to the country’s ideals while pressing the urgency of reform, seems antiquated and obsolete.
In 1897, the Affair was fueled by the popular press, creating a feverish environment that anticipated our own obsession with social media. The claims and counter-claims of the Dreyfusards and their opponents were amplified until the institutional legitimacy of the French army, which compounded its wrongful conviction of Dreyfus by refusing to admit error, was on trial.
“It was the press which created the Affair and made truce impossible,” writes Tuchman. From reactionary monarchists to revolutionary anarchists, every fringe political faction had its own public bullhorn. This journalistic free-for-all has since been replaced by the insanity-inducing algorithms of Twitter, Facebook, and Instagram. Outrageous video clips and inflammatory quotes, often edited or ripped out of context, are circulated online in the same manner that Frenchmen once passed around broadsheets and editorials.
There is one final parallel between the Dreyfus Affair and our current moment that is particularly frightening. “Give me combat” was the battle cry of a prominent Dreyfusard, a sentiment surely felt by many quarantine-weary Americans, who have been primed for confrontation by a toxic combination of social media, unevenly distributed prosperity, and a certain disenchantment with bloodless late-period liberalism. A Frenchman of La Belle Époque, another era characterized by inequality and ideological upheaval, would have understood our predicament.
But that glamorous and uneasy period eventually came to an end on the killing fields of Flanders. In 1914, the French Army marched to war in its traditional red trousers, confident that Plan 17, a mad dash into the teeth of German fortifications in Alsace-Lorraine, would carry the day. The terrible slaughter that followed was far more consequential than any of the petty disputes of the pre-war era. Dreyfus was a harbinger of an impending catastrophe. It is hard to escape the feeling that our own culture war is merely a prelude to something far more destructive.
Will Collins is an English teacher who lives and works in Eger, Hungary.
The post The Killing of George Floyd is America’s Dreyfus Affair appeared first on The American Conservative.
When Prime Minister David Cameron led the largest ever British trade delegation to China, he launched a decade-long push from Conservative governments to establish a “Golden Era” in relations between China and Britain. This era might be coming to an end in the aftermath of the Coronavirus pandemic.
David Cameron’s efforts culminated in President Xi Jinping’s state visit to Britain in 2015 and his declaring Britain to be “China’s best partner in the west.” After a state banquet with the Queen, where she described China and Britain as “stewards of the rules-based international system,” a slew of deals were struck. They covered key parts of the British economy such as nuclear power, car manufacturing, and higher education.
Conservatives across the spectrum saw China as a source of untapped economic and financial potential for Britain. Brexiteers became enamored with the idea of a sovereign Britain cutting a trade deal with China among other growing economies, succeeding where Brussels had failed. The theory went that opening up China to the world would ensure its rise as a benign superpower in the global order. Scant attention was paid to the advance of mass surveillance and human rights abuses perpetrated by the Chinese government in Tibet and Xinjiang.
The election of President Donald Trump and his trenchant criticism of Beijing delivered the first shock to this complacent attitude. In the early months of her tenure in office, Prime Minister Theresa May appeared to share this skepticism of China and questioned Chinese involvement in sensitive parts of Britain’s infrastructure. This led to a review in 2016 of China’s role in building the Hinkley Point C nuclear power plant, but the project was ultimately still approved. Two years later, Theresa May would go on to also agree a controversial deal with Huawei to help build non-core parts of Britain’s 5G network.
Prime Minister Boris Johnson’s election as Conservative leader marked a break with his predecessors in a number of ways, but China policy was an important exception. Boris Johnson spent the first months of his premiership emphasizing his “pro-China” credentials, even going so far as to praise the Belt and Road Initiative, as part of his brinkmanship with Brussels in the EU withdrawal negotiations. When Britain finally left the European Union earlier this year, China appeared to be a promising alternative source of trade and investment.
Although Conservative governments have largely maintained a friendly stance toward China, skepticism of Beijing’s behavior has entered Britain’s political mainstream. Boris Johnson’s decision to give the green light for the 5G deal with Huawei, albeit with some restrictions, has been met with significant opposition from some Conservative MPs. A former leader of the Conservative party, Iain Duncan Smith MP, led a group of MPs opposed to the Huawei deal, and has joined the Inter-Parliamentary Alliance on China. When Parliament votes later this year on the legislation needed to build the 5G network, there could be a larger rebellion against Boris Johnson if he pushes ahead with the current plans.
While British Conservatives remain pro-free trade by and large, they have become more aware of how opening the Chinese economy has not led to a liberalization of China’s politics. China’s economic retaliation against Australia after its calls for an international inquiry into the origins of the Coronavirus pandemic has reinforced this shift in perceptions of Beijing’s motives. Some Conservative MPs have formed the China Research Group to push for a change in Britain’s China policy and to scrutinize China’s behavior, specifically its industrial policy, use of technology, and foreign policy.
The protests in Hong Kong over the past year have accelerated this trend. Under the May Government, Britain adopted a critical but restrained approach towards China’s treatment of the former colony. British diplomats asked Beijing to respect the Sino-British Joint Declaration, which secured Hong Kong’s rights when sovereignty transferred from Britain to China in 1997. While police brutality and restrictions on civil liberties continued and were condemned by all major political parties in Britain, London took no retaliatory action.
There are limited choices for Britain in this situation. However, this does not mean the British government has to stand idly by as China tears up the agreement that protects Hong Kong’s autonomy. With the recent passage of a new national security law, China appears to have crossed the line. Boris Johnson has now promised that if the national security law is implemented, Britain will respond by providing a visa extension and path to citizenship for British National Overseas (BNO) passport holders in Hong Kong. As China’s grip on Hong Kong continues to tighten, this could become an essential escape route for almost three million Hongkongers.
China-sceptic Conservatives have spent the past year making the case for unilaterally granting British citizenship to Hongkongers with BNO passports. For months, the May and Johnson Governments refused to follow this path, believing it would only provoke China into further undermining the Sino-British Joint Declaration. By changing course so dramatically, Boris Johnson has shown the first sign of a new direction in Britain’s dealings with China.
At this point, this China skepticism has not turned into outright hostility, but it is gaining strength in response to the Coronavirus pandemic. China’s attempts to cover up the initial outbreak in Wuhan have made Conservatives more aware of the dangers posed by Beijing’s behavior. The immense pressure put on the National Health Service (NHS) also exposed how vulnerable British supply chains have become due to dependency on imported Chinese goods. This has required a significant scaling-up of the domestic manufacture of ventilators and other medical devices to support the NHS.
The Foreign Secretary is now leading work within government to review the extent of Britain’s reliance on China and the vulnerabilities this has created within the national security infrastructure. After a decade of growing ties with China, Boris Johnson is preparing to reset Britain’s economic and diplomatic relations with Beijing on a more balanced basis. Boris Johnson might revert to his predecessors’ point of view, but China skepticism within the Conservative party and the broader political mainstream looks set to only grow.
Free from the European Union, Britain has an opportunity to redefine its role in the world. With a strong majority in Parliament, Boris Johnson has the power to ensure that taking back control from Brussels does not lead to simply handing it over to Beijing.
David A. Cowan is a writer based in London, UK, and is a graduate of the University of Cambridge.
Broad and sweeping sanctions inevitably harm the entire population of a targeted country, and in many cases that is exactly what they are meant to do.
When they are joined to maximalist policy goals, they are guaranteed to fail according to the standards of their supporters. The ongoing failure of sanctions is then cited as a reason to expand them and make them even more obnoxious. A piece of sanctions legislation targeting Syria is a case in point. The Caesar Syria Civilian Protection Act has greatly expanded the scope and reach of U.S. sanctions on the Syrian economy, and the first sanctions authorized by the law come into force this week. That practically guarantees imposing further hardship and deprivation on a country that has already been ravaged by nine years of conflict. It is just the latest piece of evidence that the U.S. needs to renounce its use of broad sanctions.
In their recent analysis of the legislation, Basma Alloush and Alex Simon explain how the Caesar Act will likely stifle Syria’s economic recovery, interfere with humanitarian relief and reconstruction efforts, and drive away businesses that might be willing to invest in the country. They emphasize the legislation’s “vast scope” as a reason to fear that it will simply add to the burdens that the civilian population has had to bear:
Within that continuum, the Caesar Act’s novelty lies in its vast scope. Previous measures have targeted a mix of individual actors and selected sectors, and have applied almost exclusively to Syrian and American entities. By contrast, the Caesar Act promises to slap so-called “secondary sanctions” onto businesses of any nationality that are found transacting with sanctioned actors in multiple sectors of Syria’s economy — notably energy and construction. As such, the bill aims to deepen Damascus’ isolation by deterring investment by any businesses from Beirut to Dubai to Beijing.
Sanctions are not the primary cause of Syrians’ hardships, and the Syrian government bears significant responsibility for the wreckage of the economy. Even so, further strangling the Syrian economy now will succeed only in starving the country of investment and commerce for no real purpose. Sanctions will fuel inflation and make even basic necessities unaffordable for millions of people. The U.S. can choose to assist the people of Syria, or it can choose to grind them down even more. The Caesar Act is the latter. The people of Syria are being made to suffer more in the vain attempt to weaken the Syrian government.
The Caesar Act’s destructive effects won’t be limited just to Syria, but are already spilling over into Lebanon:
The ramifications of Caesar are rippling through Beirut, where traders retain lucrative ties to Syrian officials that are barely keeping Lebanese state revenues ticking over.
“This is a disaster for the [Lebanese] government,” said one Lebanese banker. “They will sanction Lebanese traders and banks. Our currency will plunge as far as theirs. One of the few places we can trade is Damascus. If that’s shut down, we’re doomed.”
Like any other coercive intervention, sanctions have destabilizing, negative consequences for the targeted country and all of its neighbors.
The Syria example is a reminder that sanctions are easy to apply but remarkably difficult to remove later. It is politically advantageous for politicians to endorse sanctions bills because it allows them to claim that they are being “tough” on some despised foreign leader, and no one will hold them accountable for the destructive effects of sanctions in the years that follow. There is usually much more political risk in opposing sanctions or calling for their removal, because this is wrongly cast as “rewarding” another government’s abuses. It is also often the case that sanctions legislation includes conditions for sanctions relief that are so ambitious and far-fetched that they will never be met. Alloush and Simon comment on some of the unrealistic conditions contained in the Caesar Act:
As a result, the Caesar Act’s true force may lie less in its immediate impact and more in its long-term implications. The law’s five-year sunset clause means that these measures are likely to stick until 2025 — possibly longer. In principle, the president could suspend the sanctions sooner if Damascus and its allies fulfill a set of seven criteria. However, several requirements — including “releasing all political prisoners” and “taking verifiable steps to establish meaningful accountability” — are so unrealistic as to render this stipulation meaningless.
The U.S. tends to impose many overlapping and reinforcing sets of sanctions on the same governments, and that makes it even less likely that all sanctions on a government will ever be lifted. As a result, sanctions on another country become a permanent fixture of its economy, and the targeted government has no incentive to make any concessions on any issue. Writing at the Lawfare website, Edward Fishman makes an excellent observation about how sanctions pile up and then harden into de facto policies of regime change:
The static nature of sanctions not only makes them toothless; it also produces harmful effects on U.S. policy. Because sanctions are rarely lifted, they tend to accumulate over time at a steady, if intermittent, pace. As sanctions snowball, so do their objectives, worsening the convoluted problem outlined above. The net result is that, almost by default, nearly every sanctions program eventually aims for regime change. (It’s hardly surprising that one of the only times America has ended a sanctions program in recent history—when President Obama did so with respect to Burma in 2016—came after Aung San Suu Kyi’s National League for Democracy won a majority of seats in Burma’s parliament.) With a tortuous web of sanctions and policy objectives, most adversary regimes rightly assess that the only way out of sanctions is to call it quits. But no government will commit political suicide to undo sanctions.
When the U.S. seeks major changes in regime behavior or the overthrow of the regime through sanctions, the policy is most likely to fail. But it will also necessarily harm the civilian population in the meantime. Fishman cuts to the heart of the matter:
Policymakers and experts need to disabuse themselves of shibboleths that sanctions are precisely targeted at government officials and spare civilian populations and accept that America’s most ambitious sanctions programs aim to cause systemic economic damage—which, by definition, is felt by most if not all members of society.
Sanctions advocates often cast themselves as supporters and allies of the people in the country whose economy they want to destroy. This has never been credible, and it is long past time that we stop tolerating these deceptions. If you seek to ruin another country’s economy, you seek the ruin of the people living there. Sanctions advocates should be held responsible for the results of the policies they promote.
We have seen this story unfold many times over the last three decades. First, the U.S. imposes sanctions to punish a government for its behavior. Then the government’s leadership and its cronies use the economic difficulties created by the sanctions to enrich themselves and buy loyalty by controlling access to limited goods. Legitimate commerce is strangled, smuggling flourishes, and the government and its cronies exploit that to their advantage as well. Meanwhile humanitarian organizations that try to help the people find themselves bogged down in paperwork and struggling to get the simplest items approved, and humanitarian relief ends up being delayed or blocked altogether. Financial transactions with the outside world become all but impossible, and essential humanitarian goods can’t be brought into the country. Collective punishment strikes down the poor and infirm, and it leaves the well-connected and corrupt to prosper. The Caesar Act sanctions seem very likely to repeat the same pattern. Alloush and Simon add:
The impact will go far beyond deterring individual companies, trickling down to ordinary Syrians seeking to get on with their lives. For instance, the Caesar Act targets Syria’s construction sector, which has sparked concerns among aid organizations working to support small-scale infrastructural rehabilitation — from fixing up damaged water networks to helping rebuild bombed-out schools or apartments.
The U.S. increasingly relies on a coercive policy that does a terrible job of advancing American interests, but it excels at impoverishing and killing ordinary people in many countries around the world. Economic sanctions have been a favorite tool for politicians and policymakers to use against many governments in response to a range of undesirable activities, because they seem to offer a low-cost option that allows the U.S. to “do something.” The record clearly shows that they fail on their own terms, and they end up costing much more than their advocates will ever admit. It would be bad enough if this were simply a matter of repeating the same error over and over and never learning anything, but the consequences of sanctions have been devastating for millions and fatal for tens of thousands of people.
Hurting the weakest and most vulnerable people is what sanctions usually do. The broader the sanctions are, the more harm they do to innocent people. Instead of trying to “fix” or reform how the U.S. uses tools of economic warfare, our government should abandon the use of broad, sectoral sanctions entirely. Just as we have sought to limit and restrict the use of force to reduce the harm to civilians in warfare, we need to limit and restrict the use of economic coercion when it comes to sanctioning other governments. Rather than refining tools of collective punishment, the U.S. should stop trying to police the behavior of other states.
The post The Imperious Caesar Act Will Crush the Syrian People appeared first on The American Conservative.
It is sometimes said that there is no authentic tradition of American nationalism. Indeed, as nationalism has gained strength in the United States in recent years, some writers have gone so far as to say that nationalism is “un-American”—a claim we’ve heard from Bret Stephens of the New York Times, Kim Holmes of the Heritage Foundation, and Elan Journo of the Ayn Rand Institute, among others.
Nevertheless, this view of nationalism in America is mistaken. The truth is that America produced a great, home-grown nationalist political tradition, and indeed a ruling nationalist party: the Federalist Party, which advanced a set of principles and policies that were obviously nationalist, and in fact can serve as a model and an inspiration to nationalists today. First forged into a distinct political grouping with a set of common ideas during the 1780s by the failure of the Articles of Confederation, the American nationalists were headed by figures like George Washington, John Adams, John Jay, Alexander Hamilton, Robert Morris, Gouverneur Morris, James Wilson, Oliver Ellsworth, Rufus King, John Marshall and Noah Webster. They regarded America as one nation characterized by a single political and cultural inheritance. In 1787 they spearheaded the constitutional convention in Philadelphia, the adoption of a new constitution, and its subsequent ratification. They then went on to lead the American government for its first twelve years under the new Constitution. In this period of ascendancy, the nationalists established the principal executive, economic, and judicial institutions of the nation, and shaped the leading judicial interpretation of the national Constitution until the 1830s. In fact, we may say that to a great degree, the Federalists founded America as we know it.
Our purpose in this essay is to reacquaint readers with America’s founding nationalists. We’ll retell the story of the nationalist side of the American founding, and then describe the principles that made the Federalist Party one of the most important and successful nationalist movements in history—and a relevant model for American and other nationalists today.
I. The Federalists, the American Nationalist Party
The thirteen British colonies declared independence from Britain in 1776. But for most of the 1780s, the newly formed United States were prevented from addressing the many political challenges they faced due to the weakness of the first American constitution, the “Articles of Confederation and Perpetual Union.” Adopted by the Continental Congress in November of 1777, this constitution regarded the United States as an alliance of thirteen independent republics, under which “each state retains its sovereignty, freedom and independence.” The confederation had no unified executive or judiciary; the only national institution was the Congress, which required a unanimous vote of all state delegations to take any action at all. Although nominally responsible for overseeing the war effort against Britain, Congress lacked the ability to conscript soldiers for the Continental forces fighting the British under George Washington, or even to raise the taxes needed to arm them or pay them. Indeed, when the moment came to land the decisive blow at Yorktown in 1781, Congress was broke and Robert Morris, the newly hired Superintendent of Finance, had to write personal checks to cover the costs of moving the army into battle.
The nationalist party in American politics was born out of these experiences, with much of its leadership consisting of soldiers, businessmen, and lawyers who had witnessed firsthand the inability of the American national government to act in a decisive fashion in matters of war, diplomacy, and finance. Even before the Treaty of Paris formally ended the war, both Washington and Morris, as well as their young protégé, Alexander Hamilton, then a member of Congress, had gone on record calling for a revision of the first American constitution, which they blamed for having needlessly prolonged the war and almost lost it. In this view they were joined by John Jay, the celebrated architect of the peace with Britain, who had discovered that in the absence of a national government with appropriate coercive powers, the terms of the treaty could not be enforced on the American states. These nationalists urged a unification of the American nation under a government with the authority to conduct national finances and diplomatic and military affairs. But they found themselves opposed by a large anti-nationalist or confederationist majority, which regarded proposals to establish a national government possessing significant coercive powers, a standing national army, national taxes, and a national bank, as a betrayal of the ideals of the revolution and a return to the “monarchical” government of Great Britain.
Although the divide between nationalists and confederationists seemed, at first, to be a disagreement over practical proposals for how best to govern post-revolutionary America, it quickly became clear that the argument was far deeper than that. In fact, the sides in this argument were inspired by competing visions of American identity and citizenship, which drove the formation of two clearly opposed political parties. The first of these, which came to be called the Federalist Party, wished to see America become a unified nation and an industrial, commercial, and military power—in effect a republican version of Britain. Nationalist and conservative, the Federalists admired British constitutional structures, including the British political tradition of a strong executive and judiciary alongside the elected legislature; the common law heritage that had governed Americans in the 150 years preceding independence, upholding property and liberty as inherited rights; and the Protestantism that was still the established religion in most states. They were, in other words, Anglo-American traditionalists, who regarded national identity as rooted in the particular traditions of a people, and expected newcomers to adopt these traditions as a prerequisite to becoming citizens. For the most part, they looked forward to the decline of slavery and its eventual abolition.
Against this nationalist vision of America, there emerged a confederalist vision that was eventually called “democratic republicanism”—and finally gave its name to the Democratic Republican Party. This view, whose greatest spokesmen were Thomas Jefferson and Tom Paine, regarded the American Revolution as having been fought not only against British monarchy and aristocracy, but more generally against Britain’s centralized government, established religion, and financial system. On this view, political society was founded on the virtue and natural rights of the consenting individual, who owed little or nothing to national and religious tradition. Such a society needed little government besides local government, an arrangement as close as possible to the small republics of the ancient world, with no armed forces beyond the local militia except in times of emergency. For democratic republicans, the ideal citizen was the independent farmer, to a great degree self-sufficient, even if this sometimes involved owning slaves to work his fields if he was successful; whereas large-scale commerce, manufacturing, and public debt were regarded as threats to the independence and virtues of the individual. In a country as large as America, the only way to maintain such a regime was by creating a loose confederative arrangement of individuals, cooperating within a larger confederation of states.
The first American constitution, the Articles of Confederation adopted in 1777, had been cast in precisely this democratic-republican mold, and for a decade the political viewpoint that had created it remained ascendant. But a decade later, with the states embarking on an increasingly angry tariff war against one another, Hamilton seized the opportunity at a failed conference on interstate trade to call for a national convention to discuss revising the Articles of Confederation, to be held in Philadelphia the following year in May of 1787. This initiative was unconstitutional, having no relation to the decision-making processes of the Articles. But the need for it was dramatically demonstrated in the fall and winter of 1786, when the states found themselves unable to raise an army to meet “Shays’ Rebellion,” an organized insurrection in Western Massachusetts that had to be put down by privately funded troops. Against the backdrop of these events, Jay and Hamilton sought and won Washington’s agreement to serve as chairman of their proposed constitutional convention. In this effort, they were joined by other nationalists such as Robert Morris, Gouverneur Morris, and James Wilson, as well as by James Madison, until recently a protégé of Jefferson, who had swung into the nationalist camp after his mentor left to serve as ambassador to France. The constitutional convention met from May 25 until September 17, 1787.
Scholars have tended to downplay the extent to which the constitutional convention was orchestrated by what would soon become the Federalist Party. Of the initiators and the most consequential participants, most were longtime nationalists and later Federalists—the principal exceptions being Madison and his fellow Virginian Edmund Randolph. Only four years into civilian life, Washington was intent on avoiding the suggestion of a military intervention in political affairs. But the fact is that Washington agreed to participate as the highly visible chairman of the convention only on condition that its agenda would be the establishment of a nationalist government. These prior guarantees to Washington largely assured the outcome of the convention, but it also helped that thirty-five of the fifty-five participants of the convention were former officers who had served under Washington in the Continental Army. Thus while the nationalists were forced to compromise on some points, the convention did indeed open by passing a resolution outlining a new national government along lines agreeable to Washington. Thereafter, the text of America’s second constitution was drafted by a committee controlled by nationalists John Rutledge, Oliver Ellsworth, and James Wilson. And the final draft was written by a leading nationalist, Gouverneur Morris.
It was at the constitutional convention, as well, that the term “Federalist” came into use to refer to the nationalist party and its program. Up until this point, Americans had used the terms federal and confederal interchangeably to describe the cooperation of the thirteen independent states under the Articles of Confederation and Perpetual Union. However, as the constitutional convention opened in 1787, nationalists discovered that the word “national” was troubling in the eyes of some of the participants, precisely because it implied a single, unified nation rather than a coalition of independent states. The nationalists at the convention decided to concede on the semantics, while preserving the substantive achievement of a national government. On June 20, the prominent Connecticut nationalist Oliver Ellsworth moved to simply strike the word “national” from the proposed constitution. Thereafter, all of the descriptions of the American government as “national” were removed and replaced by the terms “United States,” “general” and “federal.” In this way, the term federal became a synonym for national (and the opposite of “confederal”). It was soon widely popularized after Hamilton initiated a defense of the new national constitution in a series of newspaper essays in which he was joined by Jay and Madison, which were known collectively as The Federalist: A Collection of Essays in Favor of the New Constitution. From this point on, the term “Federalist” denoted a political view and then a party devoted to a nationalist government. American nationalists were called “federalists.”
The nationalists’ success at the convention and in the subsequent ratifying conventions of the states amounted to what has rightly been called the Second American Revolution. A series of unconstitutional, but democratic and peaceful, political maneuvers led to the overthrow of the decentralized and anti-nationalist American constitution of 1777, and to its replacement by the new nationalist Constitution of 1787, modeled on the British constitution. The first Administration under this nationalist Constitution was inaugurated when Washington took office as President on April 30, 1789. In addition to nationalists such as himself, John Adams (Vice President), Hamilton (Secretary of the Treasury), Henry Knox (Secretary of War), and Jay (Chief Justice of the Supreme Court), Washington sought to give his administration the appearance of a unity government by appointing the leading democratic republican, Jefferson, as Secretary of State; and the moderate Randolph as Attorney General.
But Washington also made sure that both foreign policy and judicial matters remained firmly under nationalist control, and a frustrated Jefferson began orchestrating public pressure against the administration in which he was serving by means of a proxy war in the press. In 1791, Jefferson and Madison founded a newspaper to counter Federalist policies. Jefferson resigned from office two years later, and in 1795 launched a campaign against a treaty of friendship and commerce with Britain initiated by Hamilton and concluded by Jay, still serving as Chief Justice. The famous “Jay Treaty” in effect ended America’s alignment with France, blocking the Jeffersonians’ desired alliance with the revolutionaries who had overthrown and executed the French king. Jefferson, Madison, and others, including many former opponents of the nationalist Constitution, assembled what became the Democratic Republican Party, which supported states’ rights over the power of the national government, state courts over national jurisdiction, the disestablishment of religion, the expansion of slavery, and a foreign policy favorable to France, while opposing the Federalist Party’s nationalist economic and immigration policies.
Democratic Republican enthusiasm for the French Revolution increased support for the Federalist Party among some southern conservatives, despite the nationalists’ generally northern and commercial orientation. But it wasn’t enough. Too many Americans resented the Federalists’ affinity for Britain and their opposition to essentially unregulated immigration. After Jefferson’s Democratic Republicans swept the elections of 1800, the political fortunes of the Federalist Party swiftly declined. At this critical moment, the Federalist Party lost its three most prominent leaders in quick succession: During the latter part of the Adams presidency, a quarrel between the President and Hamilton degenerated into an ugly pamphlet war that destroyed the reputations of both men, as well as the chances of the party uniting behind one of them. Meanwhile, Washington, the man who might have imposed a truce or thrown his weight behind one of them, died suddenly in 1799. Adams retired from political life when his presidency ended in 1801, and Hamilton’s death in a duel with the Democratic Republican Aaron Burr in 1804 deprived the Federalists of the last remaining leader of sufficient stature to reenergize them. Lacking clear leadership and split into feuding factions, the Federalists quickly disintegrated as a political force outside of New England. Their declining fortunes were dealt a death blow by their perceived disloyalty to the Democratic Republican government during its war with Britain beginning in 1812. Rufus King, the last Federalist senator, was also the party’s last, informal candidate for president when he ran unsuccessfully in 1816.
However, the political decline of the Federalist Party did not mean the end of the American nationalists’ political ideals. Washington and Adams appointed only committed Federalists to the Supreme Court, which was dominated by justices such as Jay, Ellsworth, Rutledge, Wilson, William Cushing, Bushrod Washington, and John Marshall through the 1830s. Wielding the doctrine that the Supreme Court was responsible for interpreting the Constitution, and later the resulting power of “judicial review,” these Federalist judges continued to protect the nationalist Constitution of 1787 until many of their ideas had been adopted, whether completely or partially, even by their Jeffersonian opponents.
It is true that nationalism did not come easily to Americans. Hostility to British rule brought many to regard the “Spirit of ’76” as being opposed to strong government in general, and to distant, national government in particular. But the Second American Revolution—and the “Spirit of ’87”—was by no means conducted along these lines. The new nationalist Constitution was a restoration of the Anglo-American political inheritance that Washington and many of his supporters and officers had in fact been fighting to preserve during the War of Independence. The constitutional convention of 1787 brought America’s nationalists, the Federalist Party, into a position of decisive influence, permitting them to unite the American nation and establish nearly all of the institutions and traditions that came to characterize it. The Federalists’ principles went on to serve as the model for subsequent American nationalism.
II. The Nationalist Principles of the Federalist Party
What were the nationalist principles of the American Federalist Party? As in any political alliance, there were many differences of opinion and temperament among the Federalists. Moreover, the views of some of the leading Federalists clearly evolved over time. Nevertheless, we can point to eight broad political principles that may be said to have characterized the Federalists in their struggle against the anti-nationalists and the Democratic Republican Party. All of these principles derived from the belief in the existence of a unique American nation, with a unique cultural inheritance derived from Britain, and the desire to unite the various parts of this American nation under a strong central government.
Let us look more closely at each of these principles.
1. A Distinct American Nation of British Heritage
As America gained its independence, opposing nationalist and anti-nationalist visions were advanced as to what should replace British rule. This argument ultimately came down to the question of whether there really was such a thing as an American nation in any significant sense. The weakness of the constitution of 1777 was, in other words, a direct consequence of the frailty of the fellow-feeling tying the states to one another. For example, the Virginian Patrick Henry, a great proponent of independence from Britain, was also a great opponent of American nationalism. Henry rejected the very concept of an “American people,” arguing that the Constitution of 1787 would amount to taxation without representation, much as British rule had. As he put it:
Suppose every delegate from Virginia in the new [national] government opposed a law levying a new tax, but it passes. So you are taxed not by your own consent, but by the people who have no connection to you.
The idea that there was effectively “no connection” between the peoples of the various states obviously spoke to the feelings of a large public. Yet it was opposed by nationalists who felt that a genuine mutual loyalty already did exist on the part of a great many Americans, and could be kindled in the hearts of many more.
In The Federalist 2, for example, John Jay supplied the nationalist framework for the entire series by describing exactly the kind of bond of mutual loyalty that he could see animating the American nation. As he wrote:
I have as often taken notice that Providence has been pleased to give this one connected country to one united people—a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs, and who, by their joint counsels, arms, and efforts, fighting side by side throughout a long and bloody war, have nobly established general liberty and independence…It appears as if it was the design of Providence that an inheritance so proper and convenient for a band of brethren, united to each other by the strongest ties, should never be split into a number of unsocial, jealous, and alien sovereignties.
This is as compelling a nationalist view as one finds anywhere, arguing that a shared ancestry, language, religion, laws, and customs, as well as a common history of war against shared enemies, have made the American nation “a band of brethren united to each other by the strongest ties.” At the outset, then, The Federalist rejects the concept of a “creedal nation” bound by nothing other than reason and consent, of which Jefferson and Paine were the forerunners. Instead, Jay describes a thick matrix of inherited language, values and history, which those of foreign descent—such as Jay himself, descended from French Huguenot and Dutch immigrants—could nevertheless choose to adopt.
A similar nationalist view is evident in Hamilton’s writings as early as his Continentalist essays of 1781, and certainly so in The Federalist, where Hamilton too refers to “the affinity of language and manners; the familiar habits of intercourse” that characterize Americans. Hamilton’s nationalism was likewise rooted in a culture one chooses to adopt, for he was himself a relative newcomer to America, having been born to a Scottish father and a half-French mother on the Caribbean island of Nevis, and having arrived in the country only in 1772. Thus the “affinity” and even more the “familiar habits” to which he was referring were to him not native or local, but acquired and Anglo-American ones.
The same outlook informed George Washington’s “Farewell Address” of 1796, in which he argued that:
The name of American, which belongs to you in your national capacity, must always exalt the just pride of patriotism more than any appellation derived from local discriminations. With slight shades of difference, you have the same religion, manners, habits, and political principles.
2. American Constitutional Continuity with the British Constitution
The Federalists of the 1780s and 1790s were not revolutionaries who regarded America as a clean slate upon which to try out new schemes devised by the philosophers of the “Age of Reason.” Indeed, they came to abhor Jefferson and others who favored such schemes, especially after 1789 when these became increasingly identified with the murderous policies of the French Revolution. The Federalists understood that the freedom of Americans was a gift of the British constitutional tradition and the English common law, which had been incorporated into American colonial law for a century before independence, often formally so in the constitutions of the colonies. Indeed, it is telling that in the four years prior to independence, no fewer than twenty-one editions of Blackstone’s Commentaries on the Laws of England had been published in America. And when the thirteen newly independent states turned to writing their own constitutions after 1776, these were to a significant extent designed on the pattern of the English system of dispersed power, with a strong executive balanced by a bicameral legislature and an independent court system.
Today, it is difficult to imagine how bizarre this adherence to the English constitution seemed to Enlightenment philosophers writing at the time. French philosophers such as Anne-Robert-Jacques Turgot and the Abbé de Mably, as well as the English Lockean Richard Price (the “Dr. Price” who is the target of Edmund Burke’s Reflections on the Revolution in France), argued that by the light of reason, the best and most effective government would obviously be one in which all powers—legislative, executive, and judicial—would be combined in a single popular assembly. This view was influential in radical circles in America, where it was adopted as the basis for the Pennsylvania Constitution of 1776, which placed dictatorial powers in the hands of a single assembly, unchecked by an executive or an upper house. The mob violence and attacks on property that followed in many respects foreshadowed the revolutionary regime in France a few years later. Similarly radical experiments were undertaken in Georgia and Vermont. But the most glaring deviation from the English constitutional tradition had been the first American constitution itself, the Articles of Confederation, which likewise mandated a government consisting of a single assembly in which all powers were vested.
Against these revolutionary proposals, the Federalists sought to restore the continuity of the American constitutional tradition with the English constitution and common law. It has often been pointed out that much of the Constitution of 1787 and the subsequent Bill of Rights is borrowed from English constitutional documents or conventions. This English inheritance includes a long list of constitutional procedures and legal concepts, including the bicameral legislature, the taxation initiative vested in the lower house of the legislature, the executive veto and the pardoning power, the procedure of impeachment, due process of law, the jury trial, the right to free speech, to bear arms, and to be immune from unreasonable search and the quartering of soldiers, and so forth. At least 16 of the 21 sections making up the first four articles of the Constitution, as well as much of the first eight amendments to the Constitution, implicitly refer to English sources.
But it is rarely acknowledged that this conformity of the Constitution of 1787 to the English constitution was a matter of theoretical significance for the American nationalists. Earlier that year, John Adams, writing in London during Shays’ Rebellion, published the first volume of his Defence of the Constitutions of the United States of America, which sought to defend the Anglo-American constitutional tradition in the face of attacks by rationalist philosophers. Adams’ argument is two-pronged: He argues, first, that even if English constitutional traditions were neither good nor evil in themselves, the fact that “the people, by their birth, education, and habits, were familiarly attached to them” would provide “motive particular enough for their preservation,” it being better to preserve them than to “endanger the public tranquility… by renouncing them.” But Adams does not believe that English laws are neither good nor evil. Rather, he takes up a survey of constitutions throughout history in order to demonstrate that the greatest insights into the nature of free government have been implemented only in the English constitution, which is therefore closer to perfection than any other known to mankind. As he writes:
The English constitution is, in theory, the most stupendous fabric of human invention, both for the adjustment of the balance, and the prevention of its vibrations; and the Americans ought to be applauded instead of censured for imitating it as far as they have. Not the formation of languages, nor the whole art of navigation and ship building, does the human understanding more honor than this system of government.
Indeed, although Adams recommends reforms in the British House of Commons so that it may better carry out its democratic function, he nonetheless foresees the possibility that the Americans will, with time, “make transitions to a nearer resemblance of the British constitution.”
Adams’ book arrived in the United States in mid-April of 1787. A few weeks later, Washington, Madison, and other members of the Virginia and Pennsylvania delegations agreed upon the so-called “Virginia Plan” outlining a national government based on three branches of government and a bicameral legislature. At the convention itself, Oliver Ellsworth, John Dickinson, and others defended the British constitution. But the most prominent Federalist figure in this respect was Hamilton, who told the delegates explicitly that the closer the Constitution could be brought to the British one, the better, explaining that “the British government was the best in the world, and that he doubted much whether anything short of it would do in America.” Like Adams, Hamilton praised the English constitution for balancing a strong democratic element in a representative elected lower house against an executive and upper house that served for life, and so were shielded from wild swings in public opinion. In this way, the British constitution “unites public strength with individual security.”
These views concerning the English constitution made both Adams and Hamilton early opponents of French revolutionary ideas. Adams was especially proud to have published his book even before the outbreak of the French revolution, and sometimes suggested it had influenced Burke’s Reflections on the Revolution in France, which appeared in 1790. Adams’ later Discourses on Davila (1790) was written in the same anti-revolutionary spirit. Meanwhile, Hamilton encouraged and even funded several anti-revolutionary publications in the 1790s, himself composing a series of Burkean Letters of Pacificus in 1793. At the same time, the Federalist judges on the newly appointed national Supreme Court determined, in a series of rulings during the 1790s, that the entire body of English common law was the inherited law of the federal government at its creation.
The chief opponent of the nationalist conservatives in these debates was Jefferson. Long venerated for his role in securing American independence, Jefferson is now a hero to a large section of conservatives who idolize him, especially, for his opposition to a strong national government. It is therefore sometimes hard to grasp today the ferocity with which the nationalist conservatives loathed their opponent, whom they saw as representing everything they abhorred: rationalism as opposed to traditionalism, states’ rights and the philosophy of the individual as opposed to the building up of the American nation, agrarianism as opposed to an urban and commercial future, and of course the twin evils of atheism and slavery. Nor was Jefferson a friend to the Federalists’ Constitution of 1787. As a devotee of Enlightenment rationalist philosophy, he held tradition to be unimportant at best, and considered constitutions to be mere transitory and technical devices, to be rewritten from scratch every twenty years. For him, the only real constitution was the universal rights of man, which could be known by reason and had no need of constraints inherited from the past.
The historian Gertrude Himmelfarb told us some twenty years ago, only half-jokingly, that the absence of Jefferson from the American constitutional convention was the clearest sign of Providence intervening in American history. There is much to be said for this view. Jefferson was in Paris from 1784 to 1789, where he and Tom Paine were active in assisting the efforts of the French revolutionaries. But France’s loss was America’s gain. For the fact that the two most outspoken and radical figures among American philosophical rationalists, Jefferson and Paine, were abroad in the crucial period when the restorationist Constitution of 1787 was composed and ratified, meant not only that they were not around to oppose it. Their absence also meant that moderate Virginians such as Madison and Randolph were released from Jefferson’s orbit and able to render crucial assistance to the nationalist effort. Indeed, by 1786, Madison had become one of Washington and Hamilton’s closest nationalist allies, and he remained allied with the Federalists until shortly after Jefferson returned to America.
In 1791, Jefferson’s hostility to the Federalists was made public when a laudatory note from his pen was published (apparently without his permission) as a preface to the American edition of Tom Paine’s Rights of Man, which had been written as a refutation of Burke’s defense of the English constitution, Reflections on the Revolution in France. In the preface, Jefferson, the sitting Secretary of State, endorsed Paine and praised his support for the French Revolution, attributing to it the same ideals that had animated the American Revolution. The same preface went on to condemn the “political heresies” of the new American “monarchists”—an obvious reference to Adams and Hamilton. In a follow-up letter to Washington, Jefferson laid out the battle lines for the coming struggle, explaining that the American press was now divided into two camps, with the camp of Hamilton and Adams (“since his apostasy to hereditary monarchy and nobility”) taking the side of Burke, and the other side supporting Paine.
This division between anti-nationalist liberals and national conservatives remains very much with us to this day.
3. Supreme Court Responsible for Interpreting the Constitution
The desire to unite the American nation and bring it under an effective national government was the most prominent Federalist goal, which early American nationalists pursued by a variety of means. The best-known part of this program was the Federalists’ support for a powerful chief executive modeled after the British one—a stance that earned Hamilton, Adams, and Washington the reputation of being “monarchists” or “Tories” in the eyes of Democratic Republicans who decried the very broad powers of such an executive. A concise expression of the Federalist view appears in the first of Hamilton’s Letters of Pacificus, in which he emphasizes that “The general doctrine then of our Constitution is that the executive power of the nation is vested in the President, subject only to the exceptions and qualifications which are expressed in that instrument.”
Less familiar, but no less significant, is the Federalists’ protracted struggle to forge a national judiciary with the authority to interpret the Constitution of 1787. Such authority would enable it to impose the Constitution on the states and individual citizens, and to stand as a bulwark against anti-nationalist challenges to the powers of the national government, both in its executive and judicial branches. However, Article III of the Constitution did not explicitly grant such authority to the national judiciary, and attaining it required the Federalists to design and carry out a concerted campaign to establish a judicial branch of the national government more powerful than in any other country, including even Britain, from which this effort drew much of its inspiration.
Although much remains unknown about the origins and execution of this Federalist effort, its main outlines are clear. As mentioned above, three nationalist jurists—Ellsworth, Wilson, and Rutledge—dominated the five-man committee that wrote the first draft of the Constitution of 1787. As far as the official record is concerned, none of them raised the issue of the Supreme Court’s authority to interpret the Constitution during the convention itself. But at their respective state ratifying conventions, both Ellsworth and Wilson, as well as Hamilton, Marshall and others, went on record arguing explicitly for the vital importance of a national judiciary with the authority to interpret the Constitution—a united effort that may already have been agreed upon in Philadelphia. As Hamilton laid out the argument in Federalist 78, the lifetime tenure of judges was necessary so that the courts, freed from public pressure, would be able to execute their duty of defending the Constitution against the whims of the legislature:
Limitations [on legislative power] can be preserved in practice no other way than through the medium of courts of justice, whose duty it must be to declare all acts contrary to the manifest tenor of the Constitution void… The courts were designed… to keep [the legislature] within the limits assigned to their authority. The interpretation of the laws is the proper and peculiar province of the courts.
Hamilton thus argues that the courts, exercising “judicial discretion” and strengthened by the institution of lifetime appointments, will be able to act as “the faithful guardians of the Constitution” against legislative encroachments. He concludes by commenting: “The experience of Great Britain affords an illustrious comment on the excellence of this institution.”
During the first session of the national Congress in the fall of 1789, Ellsworth, now a senator from Connecticut, took the next step by drafting the Judiciary Act, which gave form to the federal judiciary and explicitly granted it appellate jurisdiction over decisions of state courts that touched upon the interpretation of the national Constitution or on treaties and laws enacted by the national government. This meant that all state and local laws, having been upheld by state supreme courts, could be appealed to the national Supreme Court, which if it chose, could reject them as being unconstitutional or incompatible with nationally enacted treaties or laws. This remarkable piece of nationalist legislation was passed that year in a package deal together with the Bill of Rights.
Soon afterwards, Washington appointed the first United States Supreme Court, packing it with nationalists. During the Court’s first decade of existence, Jay, Rutledge and Ellsworth served consecutively as the first three chief justices. Together with Wilson and the prominent Massachusetts jurist William Cushing, they constituted a permanent nationalist majority on the court that was preserved as additional Federalists were appointed to fill vacancies. Indeed, it was only in 1804 that William Johnson, the first justice who was not a Federalist, was appointed to the Supreme Court by Jefferson.
The role of these first Supreme Court justices is today obscured by the subsequent long dominance of the court by another important Washington ally and nationalist, John Marshall. Nevertheless, it was these early nationalist justices who laid the foundations that permitted Marshall to carry out the nationalist program. We have already mentioned the early Court’s determination that the English common law was the law of the United States—a determination that Jefferson realized would subordinate the laws of the states to the national courts by way of the common law principle of “judicial review” of the laws throughout the nation. As Jefferson put it in a letter to Randolph: “Of all the doctrines which have ever been broached by the federal government, the novel one, of the common law being in force and cognizable as an existing law in their courts, is to me the most formidable,” as it would give the national government general jurisdiction over “all cases and persons.”
The Federalists’ aim of establishing a united American nation under a single constitution was given dramatic expression when the Supreme Court heard Chisholm v. Georgia (1793), in which the state of Georgia was sued by a private individual seeking payment for goods supplied during the Revolution. The state of Georgia refused to appear, claiming that it was a sovereign state and that a sovereign, by definition, could not be summoned without its consent. In a 4-1 ruling, the Supreme Court determined that the property rights of a private citizen were protected by the national government even against the states. As Wilson put it:
This is a case of uncommon magnitude. One of the parties to it is a State, certainly respectable, claiming to be sovereign. The question to be determined is whether this State, so respectable, and whose claim soars so high, is amenable to the jurisdiction of the Supreme Court of the United States? This question, important in itself, will depend on others, more important still, and may, perhaps, be ultimately resolved into one, no less radical than this: Do the people of the United States form a Nation?
Wilson’s question was, in other words, whether there exists an overarching American nation on behalf of whom the Supreme Court is responsible to impose its law. His answer to this question was unequivocal:
Whoever considers, in a combined and comprehensive view, the general texture of the Constitution, will be satisfied that the people of the United States intended to form themselves into a nation for national purposes. They instituted, for such purposes, a national government, complete in all its parts, with powers legislative, executive and judiciary; and, in all those powers, extending over the whole nation.
The decision caused an immediate uproar, bringing about a congressional resolution to amend the Constitution under Article V. The Eleventh Amendment, which grants the states immunity from suits by individuals, was adopted by twelve state legislatures by 1795. President Washington declined to certify its ratification, and it was only recognized as a part of the Constitution by Adams three years later.
Although the Supreme Court’s ruling in Chisholm was overturned by Congress and the states, the very fact that a constitutional amendment was necessary served to firmly establish the authority of the national court to interpret the Constitution and to rule against the states. The nationalist ruling in the case of Ware v. Hylton (1796), which held that treaties made under the Constitution supersede state law, cemented this authority. And in Calder v. Bull (1798), Justice James Iredell, the most prominent Federalist in North Carolina, affirmed that laws of Congress and of the states could not be voided for violating abstract “principles of natural justice,” but that they could be overturned and declared void by the Supreme Court if they violated an explicit textual provision of the Constitution. As he wrote:
The principles of natural justice are regulated by no fixed standard; the ablest and the purest men have differed upon the subject… If any act of Congress, or of the legislature of a state, violates those constitutional provisions, it is unquestionably void… If, on the other hand, the legislature of the Union, or the legislature of any member of the Union, shall pass a law within the general scope of their constitutional power, the Court cannot pronounce it to be void merely because it is, in their judgment, contrary to the principles of natural justice.
Thus by the time Adams appointed the Federalist John Marshall as the fourth Chief Justice in January 1801, he had in his hands the tools necessary to assert the national judiciary’s role as the principal interpreter of the Constitution, including voiding acts of Congress and of the states regarded as unconstitutional. This authority was exercised in the famous decision of Marbury v. Madison (1803), which overturned a portion of an act of Congress as unconstitutional. Echoing Hamilton, Marshall emphasized in his opinion that “It is emphatically the province and duty of the judicial department to say what the law is.” Marshall’s long tenure of thirty-four years gave nationalists effective leadership of the court up until the 1830s.
Although the anti-nationalist theory of “nullification”—the putative ability of a state to nullify a federal law, which was proposed by Jefferson and Madison as early as 1798—never succeeded in gaining the support of the national Supreme Court, another contentious issue involving national authority did indicate the eventual abandonment of the Federalist constitutional tradition. In 1857, the Supreme Court decided in the Dred Scott case, by a majority of 7-2, that blacks had no standing to sue a state in a federal court because they “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution, and can therefore claim none of the rights and privileges which that instrument provides for and secures to citizens of the United States.” In other words, the ruling declared that blacks might be citizens of particular states such as New York or Connecticut, but not of the United States—thus effectively overthrowing the idea of a single American nation. This ruling unified nationalist opinion against what was seen as a corruption of the Constitution, and became a major step on the path to Civil War. Not incidentally, the two dissenters from the majority were nationalists: Benjamin Robbins Curtis, the only Supreme Court justice appointed by a Whig president; and John McLean, who, while originally appointed by Andrew Jackson, had moved gradually towards nationalist positions, and in 1860 even contended for the Republican Party nomination for President, which Abraham Lincoln won.
4. Economic Nationalism
Although led by George Washington, a Virginian, the Federalists were from the outset a party dominated for the most part by businessmen, lawyers, and soldiers from northern cities such as New York (Jay, Hamilton, G. Morris), Philadelphia (R. Morris, Wilson), and Boston (Adams, Knox). Whereas the Democratic Republicans followed Virginians Jefferson and Madison in envisioning America as a vast confederacy of plantations and farms, American nationalists recognized that the might of the new nation would be determined by its capabilities in manufacturing and commerce. The Federalists looked to the ideal of the most developed economy of that time, Great Britain, which had built its advantage over all other nations through a nationalist economic policy focused on developing its own manufactures while suppressing, as far as possible, the manufacturing capacity of its rivals.
As early as 1781, Robert Morris’ “Report on Public Credit” proposed to found a solid national credit by establishing a national bank, a system of compulsory national taxes and a tariff on imports, as well as the assumption of state debts by the national government. The plan failed due to lack of support from the states, including state legislation that prevented Morris’ Bank of North America from functioning as a de facto central bank. But the Federalist economic program was put into action in 1789 when Washington chose Hamilton, upon Morris’ suggestion, as the first Secretary of the Treasury of the United States. In 1791, Hamilton proposed establishing a federal mint as well as a new central bank. The proposal was opposed by Jefferson, Randolph, and Madison, who feared the growth of the national government and opposed the bank, especially, as an instrument that would encourage merchants and investors at the expense of the majority of Americans who were farmers. Jefferson and Randolph appealed to Washington, arguing that a national bank was unconstitutional since provision had not been explicitly made for it. To this, Hamilton replied that the Constitution authorized any activity that was needed to attain the purposes for which government was established, so long as it was not explicitly unconstitutional or immoral. As he wrote:
This general principle is inherent in the very definition of government, and essential to every step of progress to be made by that of the United States, namely: That every power vested in a government is in its nature sovereign, and includes, by force of the term, a right to employ all the means requisite and fairly applicable to the attainment of the ends of such power, and which are not precluded by restrictions and exceptions specified in the Constitution, or not immoral, or not contrary to the essential ends of political society.
Hamilton’s “First Bank of the United States” was established by act of Congress in February 1791. Later that year, Hamilton submitted his famous Report on Manufactures, in which he argued for a nationalist policy of government-subsidized industrial development, whose aim was to actively promote economic activity in key sectors and to assist those importing manufacturing technologies. Hamilton favored national subsidies and tariffs on imports in order to keep out goods from rival nations while American manufacturing was developing to competitive levels. He even supported industrial espionage and technological piracy—including gross violations of British law by issuing American patents for inventions manifestly stolen from Britain—where such policies could be used to challenge British economic supremacy. Hamilton also advocated an immigration policy focused on skilled workers in specific industries that the United States wished to develop, including targeted recruiting abroad, travel expenses for immigrant artisans, and customs exemptions for their tools and machinery.
Thus Hamilton’s desire to build an American manufacturing economy in Britain’s image found expression in a dual policy: On the one hand, he sought close trade ties with Britain and chose to acquiesce in some British restrictions on American commerce—such as limitations on American cotton exports under the Jay Treaty of 1795—when these were needed to reach a mutually beneficial agreement. On the other, Hamilton encouraged aggressive competition with the British where collaboration was plainly impossible.
After the Jeffersonians took control of government in 1801, they gradually dismantled many of these nationalist economic policies, and even let the charter of the First Bank expire in 1811. But in the wake of war with Britain from 1812 to 1815, a new generation of nationalists emerged calling for a renewal of Hamiltonian economic policies. These nationalists included diehard Federalists under Daniel Webster, as well as former Federalists such as John Quincy Adams and William Plumer, who combined with former Jeffersonians led by Henry Clay to found what eventually became the aptly named National Republican Party. This nationalist economic coalition supported what Clay called the “American System,” which sought to end economic dependence on foreign imports by protecting “infant industries” and those facing unfair competition from abroad, establishing a new central bank tasked with regulating credit and issuing currency, and developing national infrastructure such as roads and railroads. These policies were elaborated by the Baltimore lawyer Daniel Raymond, whose Elements of Political Economy (1820) criticized classical liberal economic theories for ignoring the frequent divergence between national interest and individual interest. Clay’s coalition succeeded in establishing a Second Bank of the United States, which operated from 1816 to 1836, as well as in reinstating protective tariffs.
Although this revival was cut short by the rise of the Jacksonian Democrats, the nationalist economic ideas of Hamilton and Clay were taken up by the American Whig Party and then put fully into effect by Abraham Lincoln in a series of laws designed by his economic advisor Henry Carey. This revived economic nationalism included tripling the average protective tariff, the subsidized construction of a transcontinental railroad, and the Legal Tender Act of 1862, which empowered the Secretary of the Treasury to issue paper money not immediately redeemable in gold or silver. Economic nationalism remained the policy of Lincoln’s Republican Party during its long period of ascendancy from the Civil War into the twentieth century.
5. Nationalist Immigration Policy
During the 1790s, immigration policy emerged as an important point of contention between American nationalists and their Jeffersonian opponents. Because nationalists regarded membership in the nation as arising from shared traditions and values, they tended to prefer regulated immigration policies and a long process of naturalization during which new citizens could become acclimated to American values and traditions. It was Jay, for example, who suggested in a letter to Washington that the American president should be “natural born,” an idea that was adopted by the constitutional convention and included in the Constitution of 1787. A typical Federalist concern for the assimilation of immigrants into American “customs, measures, and laws” was expressed by President Washington in a letter to Adams in 1794:
The policy or advantage of [immigration] taking place in a body (I mean the settling of them in a body) may be much questioned; for, by so doing, they retain the language, habits, and principles (good or bad) which they bring with them. Whereas by an intermixture with our people, they, or their descendants, get assimilated to our customs, measures, and laws—in a word, soon become one people.
Jefferson was not initially keen on immigration either, and in his Notes on the State of Virginia (1782), he worried that European immigrants would bring a belief in monarchy with them—or that, having thrown off their customary support for monarchy, they would exchange it for “an unbounded licentiousness, passing, as is usual, from one extreme to another.” But by the 1790s, it had become clear that new immigrants arriving in America usually wanted to own and work their own land, and so were inclined to support the Democratic Republican vision of America as a nation of farmers. This led Jefferson to reconsider, and in his first message to Congress as President in 1801, he expressed strong support for immigration and for a rapid grant of citizenship to newcomers. As he wrote:
I cannot omit recommending a revisal of the laws on the subject of naturalization…. Shall we refuse to the unhappy fugitives from distress that hospitality which the savages of the wilderness extended to our fathers arriving in this land? Shall oppressed humanity find no asylum on this globe? … Might not the general character and capabilities of a citizen be safely communicated to everyone manifesting a bona fide purpose of embarking his life and fortunes permanently with us? With restrictions, perhaps, to guard against the fraudulent usurpation of our flag?
Here, Jefferson proposes that citizenship may be safely granted to “everyone manifesting a bona fide purpose of embarking his life and fortunes permanently with us.” Other than cases of outright fraud, he sees no purpose in restrictions on immigration or a lengthy process of naturalization, in effect proposing a policy of open borders.
The Federalists responded to this reversal with astonishment. In three essays on immigration, Hamilton replied by quoting Jefferson’s own earlier views at length, arguing that this change of heart was the result of electoral considerations. The truth, Hamilton urged, is that unrestricted immigration endangers the “common national sentiment” of the country, and that immigration should be managed with due concern to maintain, as much as possible, a “uniformity of principles and habits”:
The safety of a republic depends essentially on the energy of a common national sentiment; on a uniformity of principles and habits; on the exemption of the citizens from foreign bias and prejudice; and on that love of country which will almost invariably be found to be closely connected with birth, education and family… In the recommendation to admit indiscriminately foreign emigrants of every description to the privileges of American citizens on their first entrance into our country, there is an attempt to break down every pale which has been erected for the preservation of a national spirit and a national character; and to let in the most powerful means of perverting and corrupting both the one and the other… To admit foreigners indiscriminately to the rights of citizens the moment they put foot in our country would be nothing less than to admit the Grecian horse into the citadel of our liberty and sovereignty.
Concerns of this kind led to Federalist efforts to make the path to citizenship more restrictive and gradual. In 1790, naturalization required only two years of residency in the United States. But in 1795, Congress created a five-year process requiring two years of residency before filing a declaration of intent to become a citizen, followed by a waiting period of three additional years. Finally, in 1798, the Federalists passed the Alien and Sedition Acts, which extended the residency requirement to fourteen years and required the declaration of intent to be filed at least five years before naturalization. The Federalists’ embrace of such a difficult citizenship procedure reflected a rising fear that new immigrants would help shift the country’s politics in a more radical direction. But on a deeper level, it reflected the Federalist view that the American nation was characterized by distinctive values and traditions that one had to adopt in order to join it. In short, the Federalists wanted newcomers to become Americans before they became American citizens.
The Jeffersonians vehemently contested this approach to immigration and citizenship. Ultimately, theirs became a Lockean, voluntarist view of citizenship, in which everyone has a natural right to leave one country and become a citizen of another at will. By simply choosing to come to the United States, one was made fit to be a citizen. The growing strength of such views effectively neutralized Federalist efforts to control immigration, with Democratic Republican officials at the state level resisting the implementation of the national government’s restrictive naturalization policies. It appears that the immigrant vote did play an important role in Jefferson’s victory of 1800, which led to the 1802 Naturalization Act and the formal return to the naturalization requirements of 1795.
6. Alliance with Britain
In the early 1790s, the ideas of the French Revolution were still openly advocated in Britain, challenging the legitimacy of the English constitution and threatening to overthrow it. But by 1793, the situation had changed considerably. Revolutionary France was now at war with Britain, and public opinion in that country had rallied around the traditionalist views of Burke and the government of William Pitt. In America, however, a ferocious debate broke out over how to respond to the conflict. Democratic Republicans such as Jefferson, Paine, and Madison, combining a deep hostility to Britain with an attraction to French revolutionary notions, wished to see the United States openly side with France. They argued that America was still bound by the defensive Treaty of Alliance it had concluded with the Kingdom of France in 1778—a rather remarkable proposition given that the revolutionaries in Paris had abolished the monarchy and had executed the king who had signed the treaty. But more than this, the Jeffersonians proposed that the interests of the United States and France coincided because of their ideological affinity as sister republics. In effect, they proposed an international alliance of revolutionaries that would work to establish a new world order.
The Federalists’ view was precisely the opposite, supporting a formal neutrality that would, in practice, lean towards Britain. As we have seen, the American nationalists saw the United States as sharing not only a language, but a political, legal, and religious tradition with Britain, and believed that America would benefit materially from good commercial and political relations with London. But they also thought that Americans had little in common with revolutionary France, and no interest in aiding the French project of bringing revolution to every corner of Europe. Indeed, the American nationalists tended to endorse Washington’s view, declared shortly after he became President in 1789, that it was in America’s interest that God “protect and guide all sovereigns and nations (especially such as have shewn kindness unto us).”
In April 1793, President Washington issued a Neutrality Proclamation, declaring that the United States would maintain impartial and friendly relations with each of the belligerent powers. This policy, whose immediate consequence was the termination of the Treaty of Alliance with France, was immediately lauded by Federalists and attacked by Jeffersonians. Hamilton joined the debate in June and July, publishing seven public letters under the pseudonym Pacificus (and two more the following year under the pseudonym Americanus) taking Washington’s side, and defending the president’s constitutional authority to conduct foreign affairs against Jeffersonian claims that Congress must be involved in making such decisions. The heart of Hamilton’s argument was that the interests of the nation, rather than any internationalist revolutionary brotherhood, had to be the primary consideration guiding foreign relations. As he wrote:
[Americans ought] not to over-rate foreign friendships—[we ought] to be upon our guard against foreign attachments. The former will generally be found hollow and delusive; the latter will have a natural tendency to lead us aside from our own true interest, and to make us the dupes of foreign influence. They introduce a principle of action, which in its effects, if the expression may be allowed, is anti-national.
In this passage, Hamilton follows Vattel in arguing against permanent alliances, instead proposing that “self-preservation is the first duty of a nation.” But in addition to offering this nationalist framework for thinking about foreign affairs, Hamilton also observed that revolutionary France presented the United States with a peculiar situation, in which a foreign state was waging a war that was not only offensive, but also had the explicit ideological purpose of exporting revolution everywhere possible:
It is not warrantable for any nation beforehand to hold out a general invitation to insurrection and revolution, by promising to assist every people who may wish to recover their liberty, and to defend those citizens of every country who may have been or who may be vexed for the cause of liberty; still less to commit to the generals of its armies the discretionary power of judging when the citizens of a foreign country have been vexed for the cause of liberty by their own government.
Hamilton understood that a blanket invitation to “insurrection and revolution” in all countries in the name of liberty would not advance the cause of freedom, whether in France or anywhere else. Rather, in raising armies whose aim was to overthrow the constitutional order throughout Europe, France might “find herself at length the slave of some victorious Sylla or Marius or Caesar”—a prediction that soon proved correct.
When Jefferson, still serving as Secretary of State, read the Pacificus essays in July 1793, he wrote a furious letter to Madison demanding that he write a rebuttal: “For God’s sake, my dear sir, take up your pen, select the most striking heresies and cut him to pieces in the face of the public.” Madison, now firmly back in Jefferson’s orbit, acquiesced and penned a series of five essays in which he accused the admirers of Pacificus of being “degenerate citizens” who “hate our republican government”:
Several pieces with the signature of Pacificus were lately published, which have been read with singular pleasure and applause by the foreigners and degenerate citizens among us, who hate our republican government and the French Revolution.
Madison goes on to deplore Montesquieu for his “warped” admiration, “bordering on idolatry,” for the British constitution; and ridicules Hamilton for arguing that foreign policy is chiefly the prerogative of the American president: “The power of making treaties and the power of declaring war, are royal prerogatives in the British government, and are accordingly treated as executive prerogatives by British commentators,” he wrote.
The Federalists were not content with mere neutrality in the war between Britain and revolutionary France. While maintaining military neutrality, Washington and Hamilton sought to establish a more supportive relationship with Britain, designing a treaty that would put the United States on friendly terms with London and resolve issues remaining unresolved from the 1783 accords. The Treaty of Amity, Commerce and Navigation, commonly known as the Jay Treaty, was signed in November 1794. It allowed the two former enemies to trade on a reciprocal “most favored nation” status—quite an achievement for the young American government. In return, the United States acquiesced in British maritime policies designed to damage France. Several contentious border issues were also resolved by the treaty, paving the way for a decade of peaceful and mutually profitable trade between the United States and Britain in the midst of the French Revolutionary Wars.
The Jeffersonians bitterly opposed the Jay Treaty for effectively aligning America with Britain. Randolph, who had replaced Jefferson as Secretary of State, resigned. Opposition to the treaty became the signature issue for the Democratic Republicans in the 1796 presidential election, in which Jefferson carried all of the southern states as well as Pennsylvania. Adams, winning by a mere three electoral votes (71-68), was able to extend America’s pro-British orientation for another four years. But after Jefferson’s victory of 1800, relations with Britain began a downward spiral that eventually culminated in an unnecessary war during Madison’s presidency.
7. Alliance Between Religion and State
In 1776, the Continental Congress called upon the thirteen states to write their own constitutions as part of the drive for independence from Britain. This meant the relationship between religion and the government of the various states was reexamined in the midst of a revolutionary backlash against all symbols of British rule. Nine of the thirteen colonies—New York, Maryland, Virginia, North Carolina, South Carolina, Georgia, New Hampshire, Massachusetts and Connecticut—had established churches at the time, of which all but the last three were Anglican. Subordinated to English bishops who were in fact a branch of the British government, these Anglican churches in America were quickly disestablished almost everywhere: in Maryland and North Carolina in 1776, in Georgia and New York in 1777, and in South Carolina in 1778. Only in Virginia did the Anglican church hang on for another decade, finally succumbing, after concerted efforts by Jefferson and Madison to eradicate it, in 1786. This was in sharp contrast to the established churches in New England, which were Congregationalist and continued to receive state support for two more generations: until 1818 in Connecticut, 1819 in New Hampshire, and 1833 in Massachusetts.
The Federalists who came to the constitutional convention in Philadelphia tended to favor some kind of alliance between Christianity and the state. But the form this alliance would take was uncertain: although only a few northern states still had established churches, most of the other states required some kind of religious test or oath for officeholders, and continued to assist Protestant churches in various ways. Under these circumstances, the Constitution of 1787 left these matters in the hands of the states. Similarly, the First Amendment to the Constitution (1791) prohibited the national Congress from making any “law respecting an establishment of religion, or prohibiting the free exercise thereof”—again reserving the question to the states.
But the Federalists were not supporters of a Jeffersonian policy of building a “wall of separation between church and state.” On the contrary, they hoped to cultivate a tolerant Protestant nationalism, which they believed would strengthen the constitutional republic they had created, providing it with citizens capable of morality, self-discipline, and deference. In addition to defending the existing state provisions for the encouragement of Christianity, Federalists at the national, state, and local levels were prominent in honoring and promoting traditional expressions of Christian public religion, including government-sponsored prayer services and days of thanksgiving and fasting. Washington was particularly concerned with public religion. As early as 1777, as head of the Continental Army, he proclaimed a day of prayer and thanksgiving after the great victory at Saratoga. As President of the new national government in 1789, he issued a proclamation designating November 26 as a national day devoted to thanksgiving, emphasizing that all nations have a duty to honor God:
It is the duty of all nations to acknowledge the providence of Almighty God, to obey his will, to be grateful for his benefits, and humbly to implore his protection and favor… Now therefore I do recommend and assign Thursday the 26th day of November next to be devoted by the People of these States to the service of that great and glorious Being, who is the beneficent Author of all the good that was, that is, or that will be—That we may then all unite in rendering unto him our sincere and humble thanks—for his kind care and protection of the people of this country previous to their becoming a nation—for the signal and manifold mercies, and the favorable interpositions of his Providence which we experienced in the course and conclusion of the late war—for the great degree of tranquility, union, and plenty, which we have since enjoyed—for the peaceable and rational manner, in which we have been enabled to establish constitutions of government for our safety and happiness, and particularly the national one now lately instituted… We may then unite in most humbly offering our prayers and supplications to the great Lord and Ruler of Nations and beseech him to pardon our national and other transgressions.
During the 1790s, in the wake of the horrors of the atheistic French Revolution, and the perceived adherence of the Jeffersonians to its ideals, the Federalists became more active in their attempts to find a role for religion, even if non-established, in the public life of the nation. An evangelical Christian, John Jay repeatedly defended his and America’s religious beliefs in debates with European supporters of the French Revolution and Paine’s Rights of Man, later joining another prominent American nationalist, Elias Boudinot, in founding the American Bible Society. Similarly, John Adams, as President in 1798 during the French Revolutionary Wars, condemned the “principles and manners which are now producing desolation in so many parts of the world,” and emphasized that America’s political order must be a religious one:
We have no government armed with power capable of contending with human passions unbridled by morality and religion. Avarice, ambition, revenge or gallantry would break the strongest cords of our Constitution as a whale goes through a net. Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.
Although “a zealous believer in the fundamental doctrines of Christianity” as a young man, Hamilton did not initially show much interest in matters of religion and state. However, during the 1790s, he too came to regard religious faith as the indispensable antidote to the spread of radical views. By 1802, while discussing the decline in support for the Federalist Party, he explained that he had reached the conclusion that the struggle for nationalist conservative principles could not be won only with reasoned arguments, because the Jeffersonians, while “eulogizing the reason of men and professing to appeal only to that faculty,” were constantly “courting the strongest and most active passion[s] of the human heart.” Unless the Federalists could enlist “some strong feelings of the mind,” all of their political plans and efforts would eventually be in vain. He therefore proposed the formation of a “Christian Constitutional Society,” whose objective would be to “support the Christian religion” and the Constitution—in effect a Christian nationalist coalition, with clubs meeting across the country. But Hamilton’s death less than two years later prevented him from attempting to implement this plan.
8. Opposition to Slavery
As a party, the Federalists regarded the goal of unifying the thirteen states under a national government as precluding an attempt to abolish slavery in their generation. At the constitutional convention in 1787, for example, the debate over slavery was largely suppressed to allow the southern states to join the Union. Nevertheless, the Federalist Party was from the outset distinguished from its Democratic Republican opponents by the fact that many of the leading Federalists, including Jay, Hamilton, Adams, Gouverneur Morris, Oliver Ellsworth, and William Cushing, were prominent in the effort to end slavery in America (although Jay owned slaves and released them only upon his death). Washington, too, upon his death, became the only southern plantation owner among the American founders to free all of his slaves. Thus while the founding generation of Federalists did not elevate anti-slavery into a central political principle, it remains the case that the opponents of slavery found a home in the Federalist Party. This pronounced anti-slavery tendency grew directly from other Federalist principles we have discussed: from the vision of America as an industrial and commercial republic, in which men would be free to sell their labor as they chose; from an affinity for Britain, whose courts had decreed slavery to be odious and without support in the common law; and from a Christian commitment to all men as having been created in the image of God. And the Federalist Party laid the foundations for the anti-slavery views of the next generation of American nationalists, including Federalists such as Daniel Webster and Rufus King, Whigs such as John Quincy Adams and Henry Clay, and, ultimately, the Republican Party of Abraham Lincoln.
During the Revolutionary War, the Act for the Gradual Abolition of Slavery (1780) was part of the radical democratic program of the anti-nationalist regime in Pennsylvania. But with independence won, the effort to end slavery was taken up by leading Federalists. In 1783, William Cushing, Chief Justice of the Massachusetts Supreme Court (later a Federalist appointment to the U.S. Supreme Court), followed the example set in Britain eleven years earlier by Lord Mansfield, who had ruled that slavery had no basis in the laws of England. Now, in Massachusetts, Cushing wrote:
As to the doctrine of slavery and the right of Christians to hold Africans in perpetual servitude, and sell and treat them as we do our horses and cattle…, nowhere is it expressly enacted or established… This being the case, I think the idea of slavery is inconsistent with our own conduct and Constitution; and there can be no such thing as perpetual servitude of a rational creature unless his liberty is forfeited by some criminal conduct or given up by personal consent or contract.
Reasoning that slavery had never been enacted as law in Massachusetts, Cushing found the institution incompatible with the state constitution drafted by Adams and adopted in 1780. In this way, legal protection for slavery was brought to an end, eliminating it in the state almost immediately and beginning a process that brought about the gradual freeing of slaves throughout New England.
But the largest emancipation of black slaves in North America before the Civil War took place in the state of New York, where this effort was led by prominent nationalists. Jay had drafted a state law abolishing slavery in 1777, but it had been defeated. Then in 1785, Jay and Hamilton were founding members of the “New York Society for Promoting the Manumission of Slaves and Protecting Such of Them as Have Been or May be Liberated,” which succeeded that year in passing a state law prohibiting the sale of slaves brought into the state. Jay was the Society’s first president, with Hamilton briefly serving after him. In 1799, as Governor of New York, Jay finally signed into law the Act for the Gradual Abolition of Slavery, which decreed that from July 4 of that year, all children born to slave parents would be free, and that more than 30,000 adult slaves would gradually be freed thereafter. In 1821, the Federalists Rufus King and Peter Augustus Jay (son of John) successfully blocked an attempt to introduce into the New York State Constitution a clause that would have disenfranchised black voters.
While the Federalists and their allies were able to make steady progress in dismantling slavery in some of the states, they had much more limited success at the national level. At the constitutional convention in 1787, another Federalist member of the New York Manumission Society, Gouverneur Morris, who authored the final text of the Constitution, gave an impassioned speech condemning slavery, calling it “the curse of heaven.” According to Madison’s notes, written in the third person, Morris said of slavery that:
It was a nefarious institution. It was the curse of heaven on the states where it prevailed. Compare the free regions of the Middle States, where a rich and noble cultivation marks the prosperity and happiness of the people, with the misery and poverty which overspread the barren wastes of Virginia and Maryland, and the other states having slaves…The moment you leave the [north-]eastern states and enter New York, the effects of the institution become visible…Proceed southwardly, and every step you take through the regions of slaves presents a desert increasing with the increasing proportion of these wretched beings. Upon what principle is it that the slaves shall be computed in the representation? Are they men? Then make them citizens and let them vote…He would sooner submit himself to a tax for paying for all the Negroes in the United States, than saddle posterity with such a Constitution.
In the end, the Federalists yielded to the notorious three-fifths formula for calculating the representation of southern slave populations in Congress, in exchange for a provision permitting Congress to end the importation of slaves in 1808. Similarly, in 1789, Federalists supported the Northwest Ordinance, which banned slavery northwest of the Ohio River, while having to capitulate to the southern states in allowing slavery in the Mississippi and Southwest territories, as well as in the nascent District of Columbia. This would remain the pattern at the national level until the Civil War, with nationalists generally opposing the extension of slavery, while repeatedly proposing federally funded manumission schemes that failed to gain sufficient support. As late as the 1820s, the last Federalist senator, Rufus King, proposed a plan for the federal government to encourage manumission of slaves, but it was once more rejected. Only with the founding of the Republican Party in 1854 did the nationalist campaign for a united American nation free from the curse of slavery finally attain critical mass.
III. The Federalists and Modern American Nationalism
American nationalists took the leading role in writing and ratifying the Constitution of 1787, and in establishing the national government of the United States during its first, formative decade. Indeed, the decline of the Federalists as a formal political party occurred, in no small part, because of the grudging acceptance by the first Democratic Republican presidents, Jefferson and Madison, of key aspects of the Federalist platform: Most Americans did come to regard themselves as members of a single nation and to accept the Federalists’ national government with its strong executive and judiciary. Moreover, their American national identity remained attached to a powerful Anglo-American tradition in language, religion, and law that was still plainly visible to Tocqueville when he traveled in the United States during the 1830s.
Even as the Federalist Party waned, nationalism continued to be a force in American politics. Younger Federalists in Congress organized around Daniel Webster and combined with a group of renegade Democratic Republicans led by Henry Clay, who became the standard bearer for a return to Hamiltonian ideas. They succeeded in securing the election of former Federalist John Quincy Adams as president in 1824, and created first the National Republican Party, and later the American Whig party—a name strikingly intended to invoke the Anglo-American conservative tradition and the ideas of Edmund Burke. These American nationalists came together around Federalist causes such as economic nationalism and opposition to the expansion of slavery, even as they supported Congress against Andrew Jackson’s strong executive. In the 1850s, Whigs such as William Seward and the almost unknown Abraham Lincoln coalesced into a new nationalist political grouping, the Republican Party. This nationalist revival succeeded, at the cost of a terrible civil war, in saving the Federalists’ national government and implementing Hamiltonian economic policies, while at the same time burning to the ground the most monstrous legacy of Jeffersonian America—the institution of slavery. Thus while it is true that Lincoln was comfortable borrowing Jefferson’s rhetoric to mix with his imposing biblical imagery, his policies were in a tradition the Federalists would have easily recognized. After the Civil War, even Lincoln’s assassination could not derail this decisive nationalist victory, which forged an American political consensus that lasted into the twentieth century.
Far from being un-American, nationalism to a great degree made America what it is. To be sure, the United States has changed immensely since the days of Washington, Jay, Adams, and Hamilton, the leaders of the original American nationalist party. Nevertheless, it is difficult to miss the way in which, at a time in which the indications of national dissolution grow ever more insistent, the issues that animated the Federalist Party have returned to the fore in our own time. For decades, American political life has been dominated by a Jeffersonian discourse that focused on universal theories of individual rights at the expense of a careful consideration of America’s unity and strength as a nation. This Jeffersonian intellectual hegemony encouraged regime-change adventures in distant lands, recklessly indiscriminate immigration and trade policies, the elimination of even the slightest echo of religious observance from public life, and a growing hatred toward the country’s Anglo-American constitutional and cultural inheritance. Today, nationalists are rediscovering the worth of Federalist ideas: Of a foreign policy based primarily on national interest, and on an alliance with English-speaking countries and like-minded national states sharing America’s commitment to national independence and individual liberties. Of an economic policy directed toward a renewal of American industry and technological leadership in the face of dangerous rivals abroad. And of immigration policies emphasizing the need for newcomers to integrate into a culture that cherishes inherited American traditions and the values they bear. 
It may be that as Americans regain an appreciation of the Federalist Party’s principles, their wisdom will be retrieved in other areas as well, leading to a recognition, for example, that the American nation will not endure without a return of religion to public life, and without ensuring that the descendants of slaves are an integral and honored part of the American nation.
On one issue, however, today’s nationalists may well wonder at the views of their Federalist predecessors. The Federalists’ national Supreme Court, with the power to void legislation, played a crucial role in establishing a unified American national state—just as they intended. But the Federalists assumed that the justices would be traditionalists, wishing to serve as “faithful guardians of the Constitution,” as Hamilton wrote. None of them imagined the circumstances that most Western nations face today, in which jurists use the national Supreme Court to impose what is in effect a new constitution—one that is post-nationalist and hostile to Christianity—by judicial fiat. Under these conditions, contemporary nationalists have no choice but to seek ways of limiting the power of the judiciary to subvert the Constitution, just as their forefathers sought ways to limit the power of the legislature to do so.
In these areas and others, a nationalist politics must be built upon a fundamental understanding that was embraced by the Federalists—and that can be embraced again by their nationalist heirs in our day as well: the insight that Americans are not merely a collection of individuals, an essentially arbitrary subset within some universal brotherhood of individuals. They are rather a distinct nation, with a proud and important heritage that is unique in the world, and that still has much to achieve and much to contribute, both to America and to others.
Ofir Haivry is Distinguished Senior Fellow at the Edmund Burke Foundation and author of John Selden and the Western Political Tradition. Yoram Hazony serves as Chairman of the Edmund Burke Foundation and is author of The Virtue of Nationalism.
The Republican Party has become the war party. These conservatives, supposedly committed to an American republic based on individual liberty and limited government, advocate that the U.S. should make every foreign crisis America’s own, defend every rich friend, engage in nation-building everywhere, turn policy over to politically influential allies, dictate to great powers, and make new enemies at every turn.
At least that is the policy advocated by the Republican Study Committee (RSC), the conservative voice within the House Republican caucus, in its embarrassing new screed “Strengthening America & Countering Global Threats.” With the U.S. beset by crises at home, these self-styled conservatives would divert American attention, waste valuable resources, and sacrifice precious lives to engage in counterproductive social engineering abroad.
The RSC declares its lack of seriousness in the paper’s introduction. It believes the fount of all the world’s ills is the Obama administration. There is much to criticize in the latter, but it followed a true horror show, the big-spending, war-mongering Bush administration. And then, against Barack Obama in 2008, the GOP nominated John McCain, who never found a war he did not want America to fight and who, in the midst of a financial crisis, admitted his economic ignorance.
Opined the RSC: “For eight years, President Obama’s failed policies allowed our greatest adversaries to grow stronger while weakening America’s position as the world’s preeminent power. During this time, Communist China and Russia went completely unchecked, Iran was gifted a plane full of cash, jihadist groups such as ISIS were casually dismissed as the ‘JV squad,’ key allies were offended, foreign aid and United Nations dues failed to advance U.S. interests, and America behaved sheepishly on the world stage.”
Drivel and nonsense.
How was the U.S. weakening? The Bush administration entangled America in endless conflict in Afghanistan and made what international scholars widely view as the worst foreign policy blunder in decades, the disastrous invasion of Iraq. President George W. Bush looked into Vladimir Putin’s eyes and saw love and affection. The Bush administration demanded the Palestinian elections which brought Hamas to power in Gaza. President Bush backed the lawless independence of Kosovo, setting a precedent for Russian backing of Abkhazia’s and South Ossetia’s secession. The administration continued NATO expansion and heedlessly pushed for inclusion of Georgia and Ukraine, encouraging the former’s reckless behavior—Mikhail Saakashvili’s government started the shooting in Georgia’s catastrophic war against Moscow—and inflaming Russian hostility.
The Bush administration rejected negotiations with Tehran as leading conservatives demanded war with Iran. So the latter sped up its nuclear research program. President Obama did not “gift” money to Iran; instead, Tehran’s own funds were returned for agreeing to detailed restrictions and comprehensive inspections. The Trump administration’s “maximum pressure” campaign has been a disastrous failure, making Iran even more hostile and disruptive.
ISIS threatened not the U.S., but a gaggle of corrupt, authoritarian governments across the Middle East. They were capable of defeating the Islamic State but didn’t need to do so once Washington took over. The RSC whines about offending allies that deserve criticism and worse—vile oppressors, rich welfare dependents, and failed states.
As for allegedly “sheepish” behavior, the Obama administration spent more on the military, twice increased troop levels in Afghanistan, intervened in Libya, backed the Saudi invasion of Yemen, reinserted U.S. forces in Iraq, took America into the Syrian civil war, and pushed the reluctant Europeans to confront Russia over Ukraine. The Obama administration’s problem was promiscuous, foolish intervention, not inaction.
Alas, the GOP House conservatives have learned nothing from the bipartisan experience of turning Uncle Sam into GloboCop. They begin with China, which does pose an important challenge to the U.S., as the document contends. However, in dealing with Beijing’s worsening behavior—Chinese President Xi Jinping looks a lot like Mao Zedong reincarnated—the RSC is high on posturing. It urges meaningless sanctions on Chinese apparatchiks, which won’t cause them to stop oppressing the Chinese people. And the RSC proposes a “statement of policy making Xinjiang a major issue in U.S.-China relations,” as if Beijing would notice.
The most serious issue is potential military confrontation between the two nations. Yet on this issue the RSC says little other than headlining the relevant section “strengthening our alliances and partnerships in the Indo-Pacific and beyond.” In GOP parlance, “strengthening” alliances always means increasing military subsidies and support for states which should be doing more themselves. Washington’s objective should be to encourage allies to deter Chinese aggressiveness, backstopping their independence rather than guaranteeing interests, such as territorial claims, that are of only peripheral importance to America.
Indeed, the RSC makes a common mistake when it claims: “China continues its military buildup in the South China Sea threatening the United States as well as allies and partners.” Actually, Beijing threatens U.S. influence, not America itself. There is no Chinese plan to attack the U.S.; rather, what China wants is a variant of the Monroe Doctrine, to stop Washington’s threats against Beijing in its own neighborhood. It costs America far more to project power than it costs China to deter the use of that power. The RSC fails to grapple with the fundamental question: how much are Americans prepared to spend and risk to confront another nation in its home territory over security interests of other states that are not vital to the U.S.?
A more serious blunder is the RSC’s screed against Russia, entitled “rolling back aggression through a strategy of deterrence.” The chapter is hysterical, featuring a map of “Russia’s expanding aggression,” highlighted by such forgettable border territories as Transnistria, Donetsk, and Abkhazia, which are not American security interests. The RSC seems most exercised over Russian behavior that is irrelevant to America, such as support for Bashar al-Assad’s Syria, a Moscow ally for decades. Russia is a negative actor, but hardly the dire security threat the overwrought RSC claims.
Russia has returned to being a pre-1914-style great power, demanding respect for its interests and secure borders. Yet Washington violated its pledge against expanding NATO up to Russia’s borders, dismantled Serbia without concern for Moscow’s interests, encouraged a street putsch against an elected, albeit corrupt, president of Ukraine friendly to Russia, and treated the erratic government of Georgia, which fired on Russian military forces, as a major ally. This does not justify Moscow’s military adventurism, but imagine Washington’s reaction to similar behavior by Russia (such as inviting Mexico to join the Warsaw Pact)—there would have been little American concern for democratic or diplomatic niceties. As for interfering in other nations’ elections, the U.S. has done so more than 80 times, including in Russia, most ostentatiously in 1996, without apology.
Anyway, all the RSC can think of is more sanctions on Russia and military support for NATO. The RSC would increase American misuse of its financial dominance, further pushing Europe toward Russia, China, and other states in looking for alternative financial mechanisms. Worse, sanctions ensure continuing hostility from Moscow for no purpose. Russia will not surrender Crimea under any circumstances, nor will it abandon its confrontational foreign policy in the face of what it sees as U.S. aggression. Better would be to find an accommodation with Moscow. There is no better evidence that GOP foreign policy is moribund, even braindead, than that it has encouraged a potentially dangerous China-Russia condominium against America.
As for NATO, why can’t the Europeans, 75 years after the end of World War II, take over their own defense? With 11 times Russia’s GDP and more than three times Russia’s population, Europe is capable of protecting itself. The Pentagon should not create an international defense dole. Why must America, busy elsewhere in the world and overburdened financially, spend even more to “reassure” European nations that prefer to use their resources to fund generous welfare states?
The RSC’s proposed policy in the Mideast is, if anything, even worse. It continues Washington’s obsession with Tehran: “Iran is not a great power or strategic competitor, but it still presents a significant challenge as a rogue regime backed by a military and intelligence apparatus while being the world’s leading state sponsor of terrorism.”
This is largely nonsense. The Mideast no longer is important to America. The region’s energy role is diminishing and Israel is a regional superpower that can defend itself. Iran has a decrepit military and is surrounded by enemies; reliance on proxy forces is a sign of weakness, not strength. Saudi Arabians have done far more to promote terrorism, including funding and staffing the 9/11 attacks, a role covered up by the Bush administration. Washington deems Iran “terrorist” not because it attacks America, but because it supports Hamas and Hezbollah, quasi-governments that battle Israel.
U.S. policy toward Iran has verged on criminal: backing a coup to overthrow democracy in 1953, supporting the brutal Shah for a quarter century, aiding Saddam Hussein’s war of aggression in the 1980s, shooting down an Iranian airliner, arming the even more repressive and aggressive Saudi dictatorship, and constantly threatening war against Iran. Yet the RSC advocates intensifying the “maximum pressure” campaign which has bolstered Tehran’s hardliners and further destabilized the region. Demanding that Iran surrender its independent foreign policy while threatening its destruction and aiding its enemies is a policy best characterized as idiotic, designed for failure.
The best evidence that the RSC is bereft of moral as well as geopolitical sense is its support for the Saudi Arabian dictatorship. Indeed, the conservative Republican organization backs the Trump administration’s sacrifice of American interests to the corrupt Saudi royal family. Saudi Arabia is a totalitarian state, without political or religious liberty. It invaded Yemen, kidnapped the Lebanese prime minister, backed jihadist insurgents in Syria, supported strongman Khalifa Haftar in the Libyan civil war, financed tyranny in Egypt, used troops to sustain the dictatorial Sunni monarchy against the Shia majority democracy movement in Bahrain, and launched economic war against Qatar designed to turn the latter into a satellite state. This is the regime which the RSC would make dominant in the Persian Gulf.
The paper closes with a standard Republican bromide: “New global threats make American leadership more imperative now than ever before.” The U.S. must forever attempt to dominate the globe. America must defend everyone. America must confront everyone. America must impose its will on everyone.
This agenda is neither sustainable nor desirable. It is hubris masquerading as foreign policy. The next administration should put into effect what George W. Bush advocated but almost immediately abandoned, a “humble” foreign policy that really puts America first.
Doug Bandow is a Senior Fellow at the Cato Institute. A former Special Assistant to President Ronald Reagan, he is author of Foreign Follies: America’s New Global Empire.
The post ‘Conservative’ House GOP Wants America to Wage War Against All appeared first on The American Conservative.
Imagine visiting the German city of Frankfurt and walking down Heinrich Himmler Avenue, taking a left on Reinhard Heydrich Plaza, continuing down Joseph Goebbels Boulevard, and ending up in front of a golden statue of Adolf Hitler. It would never happen—in fact, it sounds like a scene out of the TV show The Man in the High Castle.
Anyone who has visited Germany will know that all the Nazi emblems were purged, in both West and East Germany, following World War II. Monuments today are reserved for politicians of the post-war period, the reunification of the 1990s, and resistance fighters of the Nazi era. Germany, without a doubt, has done a commendable job of educating the public about the horrors of its past and preserving artifacts for historical reappraisal. As far as controversial monuments in Germany go, the only one I can think of is the recently erected statue of Karl Marx in his birthplace of Trier.
A Godwinesque opener of sorts, and yet travel through Europe and you’ll soon discover that not all countries have done what Germany did. Having studied at a French university, I can attest that the French have never accepted responsibility for the crimes committed by their collaborationist government in the south, commonly known as Vichy France. That label was only applied later: the regime’s official title was the French State (État français), which governed the non-occupied zone from the armistice of 1940 until the liberation in 1944. President de Gaulle held that the real French Republic had never ceased to exist, leading to the popular saying “Non, Vichy n’était pas la France” (“No, Vichy was not France”). The persecution and deportation of the Jewish community there was only recognized by President Chirac in 1995, sparking a historical debate that continues today.
But the master of having an ambiguous relationship with one’s past is Italy. That country is filled with fascist insignia and monuments, while practically every flea market will sell you artifacts that blatantly glorify Mussolini. Suggesting that Italy has not dealt with its own past can turn a conversation heated in no time. Not only have fascist monuments not been torn down, new ones have been erected as recently as 2012.
That is not to say that giving activists free rein over the removal of statues is a good idea. The conversation surrounding Black Lives Matter and its opposition to some monuments has reached Europe, and it has most intensely affected the United Kingdom, where the purging of statues has kicked COVID-19 out of the headlines. On its list of 78 statues it would like to see removed, Black Lives Matter UK names former prime minister William Gladstone (in office, across four terms, between 1868 and 1894). As Dr. David Jeffrey writes, the four-time Liberal PM was against the slave trade, opposed the colonization of Africa, introduced the secret ballot, expanded the vote among working-class men, legalized trade unions, introduced universal schooling from ages five to 12, pushed for home rule for Ireland, fought against landlord and aristocratic privilege, and almost singlehandedly built the modern tax system. Yet despite his roaring liberal credentials, his statue is being targeted over unfounded accusations that he owned slaves himself.
Then there’s the push to remove statues of Winston Churchill, which would be a particular blow to Prime Minister Boris Johnson (he wrote a book on Churchill after all). Churchill has come under fire over Britain’s involvement in the Bengal famine of 1943, which cost the lives of between two and three million people. Churchill’s knowledge, responsibility, and opinion on the matter have been hotly debated in recent weeks, adding to the general understanding that Churchill is a complex figure who needs more historical study. Associating him with the famine is at present based on conjecture and bad faith. Hillsdale College has collected the relevant documentation, which largely shows that Churchill acted in good conscience.
In Belgium, meanwhile, protesters have a much better case in demanding the removal of King Leopold II. Belgium’s second monarch is credited with significant social reforms, such as largely outlawing child labor, compensating workplace accidents, and giving workers Sundays off. However, he is better known for the private colonization of the Congo, which began in 1885. With the help of mercenaries, he ruthlessly exploited the country for resources through forced labor, with those who worked too slowly having their hands cut off. The historical consensus is that his reign caused the deaths of as many as 10 million people in the Congo. Before his own death, he handed over his authority and “property” to the Belgian state, effectively making Belgium a colonial power in the process. Keeping a statue of him on the streets of Brussels is like erecting a statue of Hitler for building the Autobahn. By every reasonable standard, Leopold II monuments belong only in museums.
The fact is that two things can be true at once: yes, the left cancels those whom it finds inconvenient; and yes, there are statues that do not deserve public glorification. Removing a statue doesn’t mean erasing its history, nor does keeping a statue endorse all of the views—public or otherwise—of a particular person. A statue is erected for the actions, successes, and principles that a person stood for. We need to strike a balance between recognizing bad deeds and not judging historical figures by the social standards of today.
What ought to happen to statues of Confederate leaders is for Americans to decide. But whatever the case, it should be undertaken on the basis of a moral compass that is understandable to all.
Bill Wirtz comments on European politics and policy in English, French, and German. His work has appeared in Newsweek, the Washington Examiner, CityAM, Le Monde, Le Figaro, and Die Welt.
The post Why Some Statues in Europe Should Be Torn Down (And Others Shouldn’t) appeared first on The American Conservative.
The United States must maintain an aggressive posture toward Iran and prolong its military intervention in Iraq indefinitely, CENTCOM Commander General Kenneth F. McKenzie Jr. argued in a livestreamed conversation with the Middle East Institute on Wednesday. This high-ranking support for demonstrably failed strategies—and the specious arguments McKenzie advanced to support them—is an ill omen for U.S. foreign policy. If McKenzie and those who share his thinking get their way, our government will continue to waste blood and treasure on reckless antagonism.
Nearly reciting from CENTCOM’s “priorities” brief, McKenzie described U.S.-Iran relations as in a state of “contested deterrence,” which “really obtained from the January exchange where we [killed Iranian General Qassem] Soleimani, and they attacked our forces at Erbil and also at Al Asad Airbase.” The Soleimani strike broke a cycle of escalation, he argued, because “the Iranians have had to recalculate…just what we’re willing to do.”
As he elaborated, however, McKenzie’s case unraveled.
He claimed the Soleimani assassination re-established deterrence, but he did so only by reversing the order of events. “In 2019,” he said, “we saw state-on-state attacks generated from Iran against Saudi Arabia—the Aramco attack—and then we saw a state-on-state attack against us in early January—you know, in Iraq, when they attacked the Al Asad Airbase—so I believe right now they are deterred from undertaking those activities, because they have seen that we have both the capability and the will to respond.”
It’s a compelling narrative until you notice that the order of McKenzie’s telling (Aramco, Al Asad, Soleimani) is not the order in which the events occurred (Aramco, Soleimani, Al Asad).
How can the Soleimani hit be said to have established deterrence against attacks like the Al Asad strike if the Al Asad strike took place after Soleimani was killed? (In fact, the strike was a direct response to Soleimani’s death, albeit one calculated to avoid plunging into outright war.) McKenzie’s reversal is a clumsy abuse of chronology.
The better explanation for Iran’s decision to at least temporarily scale down its regional troublemaking in 2020 as compared to 2019 is threefold.
First, Iran is dealing with a severe COVID-19 outbreak that’s focusing some of Tehran’s attention at home. Second, the Iranian regime has reportedly decided to lie low in the Middle East until after the U.S. presidential election later this year.
Third—and most important—in January, we came to the brink of war with Iran, and Tehran’s primary goal is regime survival, which becomes impossible if the United States invades. In that sense, the sequence of events McKenzie mentioned helped push Iran to back down—but that sequence must be seen in the context McKenzie neglects to mention. For one thing, Washington also backed down in January. For another, all this took place against the backdrop of the Trump administration’s withdrawal from the Iran nuclear deal and institution of the consistently counterproductive “maximum pressure” campaign, which has incentivized the very regional provocations McKenzie wants credit for stopping.
McKenzie’s treatment of Iraq is equally troubling. He framed the possibility of ending our war there and bringing American soldiers home as a win for Iran, which ignores that such a move is the closest thing to victory available to the United States.
The general explained that, from his perspective, “we’re in Iraq to finish the defeat of [the Islamic State],” not only in the already-accomplished goal of unmaking the territorial caliphate but also in ending ISIS’s ability “to carry out attacks.” However, McKenzie also conceded that this is a Sisyphean task: the threat of ISIS “is not going to go away,” he said. “There’s never going to be a time, I believe, when either ISIS or whatever follows ISIS is going to be completely absent from the global stage.”
McKenzie’s plan, then, amounts to a permanent U.S. military presence in Iraq.
Ideally, he said, that would mainly be a supporting role backing local military—but if the past two decades have shown anything, it is that this sort of partial drawdown always leaves the door open to a new round of escalation. The war in Iraq was technically “over” when the fight against the Islamic State began, but eight years later it is meaningless to speak of this as anything but a continuous 17-year conflict. As long as U.S. troops remain in Iraq, this war will continue, and new cycles of large-scale fighting will be possible—perhaps even a full-on war with Iran.
Instead of juggling dates and attempting to justify perpetual war, the task at hand with Iran and Iraq alike is true de-escalation and a pivot to realist diplomacy. As McKenzie himself agrees, U.S. military might already deters significant Iranian aggression, and extremist groups like ISIS will never be eliminated (at least not by military means). Further military intervention will not make us safer or the Mideast more peaceful.
It is time to abandon the feckless, military-first foreign policy that has brought us so much grief for nearly 20 years.
Bonnie Kristian is a fellow at Defense Priorities, contributing editor at The Week, and columnist at Christianity Today. Her writing has also appeared at CNN, Politico, USA Today, the Los Angeles Times, Defense One, and The American Conservative, among other outlets.
Current debates over trade policy illuminate both the premises of American foreign policy in the aftermath of the Second World War and the way that high neoliberalism might be in tension with it. While in some ways continuous with the preceding project of global engagement, the policies of the post-1989 (and especially post-2001) era might actually undermine the architecture of the “liberal international order.”
This was a key context for Josh Hawley’s May speech that called for the U.S. to exit the World Trade Organization. The Missouri senator was not demanding a return to isolationism. Instead he praised the General Agreement on Tariffs and Trade, the 1947 pact that helped establish an infrastructure for multilateral trading agreements. He also argued for the continuing importance of global trade. Hawley’s critique of the WTO was less about the virtues (or vices) of global trade itself and more about geopolitics: at this present moment, certain transnational institutions serve neither the national interest nor the project of sustaining democratic governance.
As Hawley noted, the exigencies of the Cold War played a major role in the construction of the global trading order after the Second World War. While many American policymakers might have celebrated the virtues of “free trade,” they also saw trade agreements as a vehicle for helping to solidify an alliance to contain the Soviet Union. Mutual access to national economic markets could, they hoped, be a way of reducing possible geopolitical conflict within the anti-Soviet alliance. Favorable access to the American market could encourage cooperation with American leadership. And, of course, many American corporations had hopes of export opportunities in other countries.
However, American policymakers during this period were not doctrinaire free-traders. Trade was subordinated to broader geopolitical aims, especially regarding the Soviet Union. The United States maintained Smoot-Hawley-level tariffs on Soviet products until the mid-1970s (and continued many tariffs after then, too). It also restricted the export of many industrial products to the USSR. America sought to ensure that the commanding functionaries of the Soviet Union would have minimal leverage over its own economic activity and that the Soviets would not be able to make use of key American technologies. More broadly, policymakers supported a number of measures to ensure the continued vitality and industrial capacity of the American economy.
The establishment of the World Trade Organization in 1995 was part of a broader movement toward the bureaucratic formalization of postwar efforts. The WTO had a more expansive mandate than its GATT predecessor, and a more powerful bureaucracy to manage global trade. This powerful transnational bureaucracy has now elicited criticism from Senator Hawley and others.
The entrance of the People’s Republic of China into the WTO in 2001 caused significant national and global economic turbulence. Many observers hoped that the admission would cause China to liberalize, and some—like George W. Bush—even predicted that it would reduce the American-Chinese trade deficit. Instead, the trade deficit exploded, and “China shock” rattled many American communities. Far from embracing some American model of liberalization, the Communist Party of China has trumpeted its vision of “socialism with Chinese characteristics” across the globe. A recent analysis from the Lowy Institute found that entrance into the WTO has allowed the PRC to become a more important trading partner for countries across the globe, often displacing the United States. The close trading relationship between the United States and the PRC has also given the Communist Party of China the increased ability to intervene in American politics. For instance, a New York Times analysis found that the Chinese government had calibrated its trade retaliations to inflict particular harm on Republican-held districts in the lead-up to the 2018 midterms.
These trends have assailed the foundations of multilateral institutions. The success of the “liberal” order relies on deeper social and political infrastructures, which neoliberal policy too often disregards. Along with a sense of diminished sovereign control, economic dislocation has stoked the fires of bitter outsider politics on both sides of the Atlantic. In the United States, annual real economic growth dropped by half during the 2001 to 2015 period, long before Donald Trump took his famous ride down the golden escalator. The Great Recession—fueled in part by the financialization that has characterized the neoliberal period—helped ignite various populist movements and delivered a body blow to global trade. Trade as a percentage of world GDP peaked in 2008.
One of the major insights of the architects of postwar American foreign policy was that engagement abroad depended upon consensus and prosperity at home. A national social safety net could help diffuse social tensions and avoid radical economic dislocations. National infrastructure projects—such as a federal highway system—had obvious national security implications, but they could also provide economic benefits. At times, promoting those domestic goals required intervention in global trade flows. For instance, Ronald Reagan’s negotiation of import quotas for Japanese automobiles helped lessen domestic political tensions over the effects of trade with Japan and encouraged Japanese auto manufacturers to open factories in the United States.
There are lessons here for present-day policymakers. Policies that support economic resilience and industrial capacity might not be opposed to the project of global engagement; they might actually be necessary to make that engagement possible. A United States that does not have ready access to medical supplies, military equipment, foodstuffs, and other key economic goods will be unable to exercise the responsibilities of a great power on the global stage. Diversified networks of trade can be one vehicle for such access, as can industrial policies to encourage domestic production of key economic and strategic goods. Senator Tom Cotton and Congressman Mike Gallagher have proposed the “Protecting our Pharmaceutical Supply Chain from China Act,” which speaks to both aims. Over time, it requires certain federal agencies to purchase pharmaceutical products with PRC-free active pharmaceutical ingredients. And it offers various incentives for companies to expand pharmaceutical manufacturing at home.
Efforts at nurturing domestic industrial capacity are not incompatible with continued participation in global networks of trade. Senator Hawley has called for a coalition of trade among American allies, and Marshall Auerback has argued that, even within the WTO framework, there is plenty of room for trade reform that reinforces economic sovereignty and encourages domestic manufacturing. Many countries—Germany and South Korea among them—have used industrial policy to develop domestic economic infrastructure while also trading with the rest of the world. The choice between industrial policy and participation in multilateral institutions is a false one. What exactly a reformed vision of globalization should involve is up for debate, but it nevertheless seems possible to consolidate some of the gains from the current iteration of globalization while also addressing some key challenges of civic and economic integrity.
One could go even further and suggest that an absence of American economic reform might undermine the standing of many multilateral institutions created in the postwar era. The strength of the postwar international order rested to some extent upon the strength of its member nations, especially key stakeholders. The United States during the Cold War tried to combine international engagement with the maintenance of the internal resources for a resilient democratic republic. While there may be some tension between the project of openness to trade and sustaining robust internal supply chains, politics is in part the art of navigating tensions. Striking some balance between those imperatives can allow the United States to continue to be a cornerstone for many of the institutions it helped erect out of the ashes of war.
Fred Bauer is a writer from New England. You can follow him on Twitter: @fredbauerblog.
The post Economic Resilience and Global Trade Are Not Mutually Exclusive appeared first on The American Conservative.
It’s been a staple of television for decades: a dramatic or comedic show following the work of a pair of police officers as they go about their sworn duty to protect and serve.
From Dragnet to Police Squad (in color!) to NYPD Blue to The Wire to Brooklyn 99, the cop show has taken many forms. Some have merely used police as a more or less incidental backdrop to comedy (in the way that Irish cult classic sitcom Father Ted wasn’t really about religion or priests). Others have shown the darker and grittier side of police work, not altogether sympathetically. And some have managed to deal with serious issues, including critiques of police misconduct, with humor. In the wake of George Floyd’s murder, there’s been a lot of talk about canceling or reimagining cop shows. There’s also one worth watching anew: Dragnet’s late-60s, full-color cousin Adam-12.
The police procedural follows a pair of LAPD officers, veteran Pete Malloy and rookie Jim Reed, as they drive around late-60s Los Angeles responding to various, often humorous incidents, with a criminal encounter for the denouement. Like most midcentury television dramas, it was a fairly serious show that was also free of explicit content of any kind. Like Dragnet, its encounters and crimes were purportedly pulled from real police reports. And like most cop shows of its era, it casts the police in an almost entirely positive light (though the idealized LAPD of the show employs a mostly white force).
When a young officer with children is gravely injured and dies in the hospital, we are invited to mourn and to empathize with his family. Characters of all races and walks of life populate the show, from zany counterculture druggies to jumpy, inexperienced criminals to a group of young Mexican children who mob a passerby, mistaking him for the U.S. president. One would think some liberty has been taken with the incidents the officers respond to, and it probably has, though police, like priests, have seen almost everything.
Reed and Malloy never become angry, overreact, or shoot first and ask questions later. Routine stops never devolve into extrajudicial execution. They are skilled at deescalation, negotiation, and conflict resolution, and evince no hint of a militaristic mindset towards police work. When brutality or corruption does occur—rarely—it is never tolerated or treated lightly, and it is never committed by the protagonists. When our officers do make a mistake, it prompts discussions on how to do better next time. The officers drive ordinary cars, not souped-up black Chargers; they carry ordinary service handguns rather than black rifles; and, of course, they wear blue, the origin of all the pro-police slogans riffing off that color. (Why, then, do the police prefer menacing black equipment today?)
Reed and Malloy are unswervingly patient, professional, and courteous, even to the criminals they arrest; in one episode, for example, a Latino boy turns himself in for an unsolved arson, and the officers thank him for being honest as they cuff him.
Wikipedia describes Adam-12 as a “realistic police drama.” Given what we’ve seen from the police only in the last month, that is rather like calling Pravda a disinterested journal of Soviet affairs. Those who know the full extent of police brutality and misconduct over the decades can hardly watch these midcentury procedurals without rolling their eyes. Such shows did not portray the police as they are, and not even necessarily as they would like to be. Rather, they portray the police as they would like to be portrayed.
And it’s not as if the producers simply happened to like cops. Real police departments have had formal and informal influence over their depictions on television. Dragnet, like Adam-12, was advised by the LAPD itself, and featured real LAPD buildings and cars. Adam-12’s end credits even include a brief frame that reads: “Technical advice for the filming of Adam-12 came from the office of Chief Thomas Reddin, Los Angeles Police Department.” What, exactly, “technical advice” consisted of is uncertain, but it would be naive to think that it referred only to portraying the proper make and model of a squad car or service pistol. Kent McCord, who played Jim Reed, gave an interview in 2016 that revealed some of these details. Most of the police involvement was overtly technical, but the relationship between the producers and the department was close enough to allow for informal influence. At The Atlantic, Conor Friedersdorf wrote at length about the LAPD’s essentially propagandistic role in Dragnet’s police portrayals, taking exemplary police work and presenting it as the norm to the producers. The murky but very real influence of police departments on police-themed TV shows is not unlike the pervasive and longstanding influence of the military-industrial complex on movies and video games, and their largely positive or at best uncritical portrayals of war and weaponry. In his interview, McCord adds that Adam-12 episodes were actually used as training videos for police departments around the country. It would seem there was sincerity and idealism mixed with self-promotion.
Though it’s worth noting that the LAPD is the same department that inspired N.W.A.’s “Fuck Tha Police.”
Author Douglas Rushkoff wrote, in his 1994 book Media Virus! Hidden Agendas in Popular Culture, that Adam-12 “marked the last gasp of this righteous style of cop TV.” Yet “it was as if the world was laughing at the ‘straight’ roles these cops had to play in an increasingly un-straight world….‘to protect and serve’ meant to acknowledge and permit a certain amount of bizarre activity in early seventies Los Angeles.”
It is easy to assume, if you don’t know very much history, that this all changed with the crack epidemic or when the Bush-era Middle East wars blew back in the form of desert-tan military equipment in the hands of cops. But the police needed to be forced to give Miranda warnings all the way back in 1966—two years before Adam-12’s first episode. The police beat civil rights protestors, and throughout the late 19th and 20th centuries, they permitted and in some cases participated in lynchings of black men. Bull Connor’s brutal officers didn’t deploy tanks and carry M-16s. Dressing the police like Reed and Malloy, although it would be a good start, will not turn copaganda into documentary. That requires a reckoning with racism and institutional culture, and a much more substantial set of reforms.
Shows like Adam-12 were produced with good intentions, and they made it easier for young people—certain young people, anyway—to admire the police and perhaps to seek a career in the force. In 2015, following the death of Martin Milner, who portrayed Malloy, LAPD Chief Charlie Beck put it this way: “As you watch any of the ‘Adam-12’ episodes, you see professional, compassionate, internally driven, hardworking, clean-cut, impeccably tailored, fit Los Angeles police officers—those police officers that have no dark side, that do the right things for the right reasons every time. And that is the image that drew us all to this place.”
It’s no surprise cops like Adam-12. It’s a good show, and it certainly portrays two ideal officers and a clean, professional department. Real-life police departments could do worse than supplementing their training videos, once again, with a few episodes.
The post Adam-12 Was ‘Copaganda’—And Cops Should Emulate It appeared first on The American Conservative.
“John Roberts is David Souter,” Newsweek’s conservative opinion editor, Josh Hammer, said Thursday.
The nation’s highest court — led by its putatively conservative chief, Roberts — has just stiff-armed the Trump administration’s attempt to demolish the Deferred Action for Childhood Arrivals (DACA) program, a signature Obama-era policy. Hammer, arguing that history repeats itself, was referring to President George H.W. Bush’s first appointee to the Supreme Court; the Rockefeller Republican’s lurch to the left bedeviled “values voters” for the better part of two decades. In smacking down the Department of Homeland Security, Roberts said: “The dispute before the court is not whether DHS may rescind DACA. … All parties agree that it may. The dispute is instead primarily about the procedure the agency followed in doing so.” Roberts’ archconservative colleague — Justice Clarence Thomas — called the majority’s rationale a “mystifying” cop-out. And Justice Brett Kavanaugh said “the only practical consequence… appears to be some delay.”
But for President Trump, it was yet another morning of bitter recriminations — in the face of mounting legal defeats: “Do you get the impression that the Supreme Court doesn’t like me?” Earlier this week, in a case Trump appeared to care less about, the Court’s majority — joined by one of Trump’s appointees, Neil Gorsuch — extended massive civil rights protections to gay and transgender people. It’s worth remembering that the president ran first and foremost on immigration. Though popular enough with the demographic, Trump felt the need to shore up the Religious Right with the selection of his running mate, Michael R. Pence. Not a religious man, the president said of the week’s earlier decision: “They’ve ruled and we live with their decision. That’s what it’s all about. We live with the decision of the Supreme Court. Very powerful. A very powerful decision actually. But they have so ruled.”
But however you slice it, two legal defeats for the right wing in one week is bitter medicine for the head of the Republican Party. It couldn’t have come at a worse time. Trump is already juggling a pandemic, a new economic depression, and a flawed response to social unrest that’s been both lacerated in the liberal press and derided by hardliners in his own camp. A person close to the president told me he looks “weak” and “that’s the one thing voters won’t forgive.” Though Roberts was a George W. Bush appointee — and it’s worth noting that Gorsuch did not join the DACA majority — Trump, nonetheless, now risks personally owning what is increasingly viewed by doctrinaire conservatives as a generation of failed political strategy as it relates to the judicial branch.
“I will be releasing a new list of Conservative Supreme Court Justice nominees, which may include some, or many of those already on the list, by September 1, 2020,” Trump said Thursday. He continued (in perhaps the first public sign that he knows he trails in this race): “If given the opportunity, I will only choose from this list, as in the past, a Conservative Supreme Court Justice… Based on decisions being rendered now, this list is more important than ever before (Second Amendment, Right to Life, Religous [sic] Liberty, etc.) – VOTE 2020!”
But it’s now an open question whether many loyal Republicans, after decades of presidential dominance — twenty-four years of control of the White House since 1981 — will conclude that it just doesn’t matter who gets appointed to the court if the culture is, from their point of view, lost. It’s a perspective many would argue is overheated, but from a political standpoint, pessimism about the public square — perhaps best exemplified by The American Conservative’s Rod Dreher and “the Benedict Option” — is commanding an increasing following. At the very least, that impression could have real reverberations in the coming election, given what’s on offer. The Republican Party boasts the White House, the U.S. Senate, and a majority of Supreme Court seats and governors’ mansions. And yet, over the last several months, the country has undergone the most sweeping social change in at least a half-century.
Another figure close to the president draws another parallel to George H.W. Bush, the last incumbent in the White House to lose re-election. The forty-first president invoked the Insurrection Act — something Donald Trump declined to do — to quell the 1992 riots in Los Angeles. But the damage was done. The images of a major American city ablaze were seared into national memory. Voters opted for something different come November — even flirting with the most credible third-party challenge in nearly a hundred years — but the prize was eventually won by the Democratic Party. By any measure, the urban wreckage in 2020 — in tandem with months of national lockdown — is more significant than in 1992, and national disillusionment as severe as in 1968, that year of assassination and metropolitan meltdown. Some conservatives console themselves that history will repeat — with the Left punished as it was in 1968. That interpretation elides that the Republicans, not the Democrats, are in power. If anything, the logic works in reverse.
Thirty years ago, I published an article in a rather obscure legal publication (the Real Property, Probate, and Trust Law Journal) entitled “Suburban Zoning: Two Modest Proposals.” My proposals were to relax local restrictions on accessory apartments and on business occupations in single-family homes. The article was reprinted in an annual handbook of zoning articles and in a publication addressed to municipal attorneys, but was otherwise stillborn. Its hour has come around at last.
The devastating impact of the virus on the institutionalized elderly almost certainly will reduce the number of Americans who are prepared to consign their parents and other elderly relatives to the tender mercies of nursing homes. In some states, 60 percent of coronavirus deaths involve residents of these institutions.
The “granny house” is a familiar European institution. The two-family house where rent from a second unit pays the mortgage is also familiar and finds some recognition here in FHA loan legislation, though not in common practice. Tax laws abroad encourage shared housing. In Germany, owner-occupants who construct an additional rental unit can deduct against taxes 5 percent of the cost for 8 years and 2 ½ percent thereafter, while in Finland and Great Britain large portions or all of the rent can be disregarded for tax purposes. Forty percent of American suburbs allow accessory housing in some form; richer suburbs are in the vanguard, since without such housing it would be difficult to recruit modestly paid teachers and policemen.
There has been a dramatic change in household size that renders much housing built in earlier periods needlessly large. The number of persons per household declined from 3.33 in 1960 to 2.52 in 2019. The number of adults living alone increased from 6.9 million in 1960 to 36.5 million in 2019. Creation of new small units may directly benefit minority and lower-income persons, who can be introduced to neighborhoods in small numbers, vouched for by landlord-neighbors. Negative effects are few, since most suburban neighborhoods were built in anticipation of larger occupancies per housing unit. The existing housing stock is thus the sleeping giant of housing policy. In most homes with more than one bathroom, all that is necessary to create a second unit is an additional kitchen, at a cost far below that of new subsidized construction, which typically runs hundreds of thousands of dollars per unit.
The virus crisis and attendant shutdowns have also resulted in a massacre of small businesses and in increased demand for home offices for remote work. Most zoning ordinances allow professional but not retail businesses in residences. It has been urged that the zoning exceptions favoring professional offices rest not on functional distinctions but on social-class ones. Smaller retailers and artisans have been slaughtered by the present crisis largely because they cannot reduce their principal fixed cost, namely rent. Allowing use of one ground-floor room in a home for a retail business, subject to appropriate traffic and health regulations, would allow many very small businesses to withstand economic shocks, while enhancing convenience, reducing the need for automobile use in suburbs, and providing opportunities for youth employment.
Lancaster County, Pennsylvania has a generous home occupation ordinance allowing use for business purposes of not more than 25 percent of a dwelling unit or 500 square feet, with not more than one outside employee. Provision is also made for “live-work” units in some residential zones where the business is confined to the ground floor, occupies no more than 50 percent of the building, and employs no more than five outside employees, subject to limits on signage and deliveries by large commercial trucks.
The late Bernard Siegan, in his study of Houston, an un-zoned city, concluded that commercial uses will normally limit themselves to no more than five percent of structures and will provide services to local residents and augment the viability of neighborhoods, particularly for the young and the old without easy access to automobiles. Small local convenience stores will also provide competition for the Walmarts, Amazons, and Targets that have been allowed to expand as a result of ill-designed coronavirus restrictions.
No politician gets to cut ribbons when a second kitchen is installed in a single-family home or when a small grocery store appears in one room in a large featureless housing development, nor will such entities be fertile sources of campaign contributions. But some green shoots of this nature will do something to remedy the depressed spirits caused by the decimation of the residents of nursing homes—and the disappearance of variety, color, and choice in commercial shopping areas.
George Liebmann is the author of a number of books on local and sub-local institutions, most recently America’s Political Inventors: The Lost Art of Legislation.
The post To Save Elderly at Risk of Covid, Legalize ‘Granny Houses’ appeared first on The American Conservative.
Wednesday, Secretary of State Mike Pompeo met for seven hours at Hickam Air Force Base in Hawaii with the chief architect of China’s foreign policy, Yang Jiechi.
The two had much to talk about.
As The Washington Post reports, the “bitterly contentious relationship” between our two countries has “reached the lowest point in almost half a century.” Not since Nixon went to China have relations been so bad.
Early this week, Chinese and Indian soldiers fought with rocks, sticks and clubs along the Himalayan truce line that dates back to their 1962 war. Twenty Indian soldiers died, some pushed over a cliff into a freezing river in the highest-casualty battle between the Asian giants in decades.
Among the issues surely raised with Pompeo by the Chinese is the growing bipartisan vilification of China and its ruling Communist Party by U.S. politicians the closer we come to November.
The U.S. has been putting China in the dock for concealing information on the coronavirus until it had spread, lying about it, and then letting Wuhan residents travel to the outside world even as it restricted their movement inside China.
In America, it has become good politics to be tough on China.
The reasons are many.
High among them are the huge trade deficits with China that led to an historic deindustrialization of America, China’s emergence as the world’s first industrial power, and a U.S. dependency on Chinese imports for the vital necessities of our national life.
Then there is the systematic theft of intellectual property from U.S. companies in China and Beijing’s deployment of thousands of student-spies into U.S. colleges and universities to steal security secrets.
Then there is the suppression of Christianity, the denial of rights to the people of Tibet and the discovery of an archipelago of concentration camps in western China to “reeducate” Muslim Uighurs and Kazakhs to turn them into more loyal and obedient subjects.
Among the strategic concerns of Pompeo: China’s fortification of islets, rocks and reefs in the South China Sea and use of its warships to drive Vietnamese, Malaysian, Indonesian and Philippine fishing vessels out of their own territorial waters that China now claims.
Another worry for Pompeo: China’s buildup of medium- and intermediate-range ballistic missiles, a nuclear arsenal not contained or covered by the Cold War arms agreements between Russia and the United States.
Then there were those provocative voyages by a Chinese aircraft carrier through the Taiwan Strait to intimidate Taipei and show Beijing’s hostility toward the recently reelected pro-U.S. government on the island.
Finally, there are China’s growing restrictions on the freedoms the people of Hong Kong have enjoyed under the Basic Law negotiated with the United Kingdom when the territory was ceded back to Beijing in 1997.
Also on the menu at Hickam was almost surely the new bellicosity out of Pyongyang. This week, the building in Kaesong, just inside North Korea, where bilateral peace talks have been held between the two Koreas, was blown up by the North. With the explosion came threats from the North to send combat troops back into positions they had vacated along the DMZ.
The rhetoric out of the North against South Korean President Moon Jae-in has been scalding, much of it coming from Kim Yo Jong, the 32-year-old sister of North Korean dictator Kim Jong Un and the rising star of the regime.
In a statement this week, Kim Yo Jong derided Moon as a flunky of the Americans: “It is our fixed judgment that it is no longer possible to discuss the North-South ties with such a servile partner engaging only in disgrace and self-ruin, being soaked by deep-rooted flunkyism.”
North Korea’s state media published photos of the destruction of the joint liaison office. Pyongyang is shutting off communications with Seoul, and a frustrated South looks ready to reciprocate.
The North-South detente appears dead, and President Trump’s special relationship with Kim Jong Un may not be far behind.
There are rumors of a renewal of nuclear weapons and long-range missile tests by the North, suspension of which was one of the diplomatic achievements of Trump.
Whether Trump’s cherished trade deal with China can survive the growing iciness between the two nations remains to be seen.
What the Chinese seem to be saying with their actions — against India, Vietnam, Malaysia, Indonesia, the Philippines, Taiwan, Australia, Hong Kong and Japan — is this: Your American friends and allies are yesterday. We are tomorrow. The future of Asia belongs to us. Deal with it!
No one should want a hot war, or a new cold war, with China or North Korea.
But if Trump was relying on his special relationships with Kim Jong Un and Xi Jinping, his trade deal with China and his commitment by Kim to give up nuclear weapons for recognition, trade and aid, he will have to think again.
For the foreseeable future, Communist bellicosity out of Beijing and Pyongyang seems in the cards, if not worse.
Patrick J. Buchanan is the author of Nixon’s White House Wars: The Battles That Made and Broke a President and Divided America Forever.
Around the middle of the last century, American conservatives came to regard “relativism” as an essential characteristic of the Left.
Political theorist and onetime Yale professor Willmoore Kendall, who had been the teacher of William F. Buckley, was the best-known exponent of this position. Kendall was particularly concerned that liberals in post-World War II America were unwilling to stand up to communist infiltration and Soviet aggression, at least not in the decisive manner that he and his student, who became the animating spirit of the conservatism of that age, would have desired.
Kendall’s explanation, which others echoed and, in some cases, anticipated, was that many intellectuals believed “in an unlimited right to think and say what you please, with impunity and without let or hindrance.” Particularly in the face of the communist threat, Kendall thought that Americans would have to give up the idea of an “open society.” They would have to grasp that “any viable society has an orthodoxy—a set of fundamental beliefs, implicit in its way of life, that it cannot, should not, and, in any case, will not submit to the vicissitudes of the marketplace.”
Kendall viewed English democrat and feminist John Stuart Mill as a particularly dangerous thinker on political questions. He was convinced Mill’s best-known work, On Liberty, had gone too far in advocating an “open society.”
Mill set out to defend the right of totally free inquiry but, according to Kendall, landed squarely on relativism. In The Conservative Affirmation (1963), Kendall traced the non-judgmentalism of many Americans when faced by the communist threat to Mill’s willingness to consider all views and opinions. According to Kendall, Mill helped create America’s “national religion of skepticism” and made it increasingly difficult for Americans to hold on to what was left of a traditional society. Mill also dealt with moral issues by encouraging the pursuit of truth without accepting “truth itself with all its accumulated riches to date.”
In a moving tribute to Kendall, Tom Woodlief, writing recently at The American Conservative, declared that “this outcast Yale professor predicted 2020 better than his erstwhile colleagues.” Kendall had warned against “the suicidal pact with relativism,” which is now driving the antifascist Left. According to him, “the doyens of the suicidal society will feel an irresistible compulsion to silence the voices insisting that there is truth, even Truth, and that therefore many other beliefs are in error.”
Please note that I fully share Mr. Woodlief’s admiration for Kendall and especially for his writings on the formation of American constitutional government and his perceptive reading of the political theory of John Locke. Where I must part company is in Kendall’s attribution to the Left of a fixation on an “open society.” Equally open to question is Kendall’s treatment of Mill’s On Liberty, a work that Maurice Cowling, Linda Raeder, and Joseph Hamburger have all interpreted differently from Kendall. These scholars have documented that Mill was far less interested in open discussion than in other ends. Above all, he was trying to build a secular society based on a consensus centered on scientific truth. Mill was an explicit 19th-century progressive who believed that open inquiry would advance his teleological goals.
Moreover, with due respect to Woodlief and Kendall, those who support Antifa and Black Lives Matter have hardly failed to recognize that there is “Truth” in the world. They simply reject the moral right of their enemies to express other views. This is a moral stand, hardly a relativistic one, and it is a political-existential one, in the sense in which Carl Schmitt understood “the Concept of the Political” as the most intensely antagonistic of human relationships. It is unimaginable that the more fervent and more activist side in our culture wars is not driven by its own morality, which expresses itself in rage.
One might also question whether the Left has ever believed consistently in something called “moral relativism” or whether it has merely appealed to it as a tactic to disarm opponents. Certainly the pro-communist leftists with whom Kendall debated were not likely to “relativize” Nazism or even the Francoist regime in Spain or South African Apartheid the way they did Soviet tyranny.
Russell Kirk liked to tell the story of a leftist acquaintance who claimed to have a perfectly open mind. When Kirk asked his interlocutor who was morally superior, “Jesus of Nazareth or Stalin,” this fellow seemed unable to rate those figures by the required standard. But when he was asked who was worse, Hitler or Stalin, Kirk’s acquaintance would immediately respond “Hitler.” I had similar experiences with advocates of the “open society” before the Left gave up its facade of universal tolerance. It may be that dishonesty, not relativism, was the problem with how the Left has presented itself.
If one were to ask what exactly the Left has believed about morality over the decades, I would begin by pointing out that the most important concept is equality. The Left has never denied this and I see no reason to question that commitment. What seems to me striking is the Left’s preoccupation with equality to the neglect of other values that seem at least as deserving of respect, such as deference to elders, respect for the achievements of one’s civilization, piety, freedom, and so on. We might also question how the Left understands its highest value, which is clearly different from the way non-leftists might approach it. For example, some may think that equality before the law is enough; others may want equal voting rights; and still others may believe it is the duty of the state to reduce its citizens or subjects to the same living conditions.
The present Left also seems interested in imposing equality of esteem for those whom it designates as historical victims. This is certainly not an expression of relativism but an attempt to carry a highest value one step beyond where it was carried in the past. The drive toward a more total equality brings with it a host of human problems, anarcho-tyranny as seen in cities like Seattle right now being the most obvious. But the belief that all values are relative does not in any way seem to have influenced this course of events.
Another curious characteristic of the Left is how furiously it reacts to Western failures to meet its fastidious standards of equality. The existence of economic disparities in Western countries drove generations of leftists to look for answers in communism, or at least to treat communist governments as efforts to create more “just” or more “scientifically run” societies. The enemy then and afterwards was “fascism” and it remained so long after the Second World War. Fascism has been defined as a chronic Western disease, arising out of specifically Western cultural and religious attitudes rooted in bigotry. Fascism used to be explained with reference to those who controlled the means of production. It was an ideological tool for oppressing the poor and maintaining colonial empires. In its more contemporary form, fascism has become whatever the intersectional Left considers to be morally reprehensible. Since the list of fascist offenses continues to grow by the minute, the only moral way to deal with this right-wing pestilence is by “canceling culture.” Only by getting rid of all reminders of a traditional Western society can we protect ourselves from the pervasive fascist menace.
Yet somehow the evils we are supposed to combat never appear anywhere outside the West. Other societies live in a perpetual state of grace as victims of the West or as examples of what we might become with the proper reeducation. The late Paul Hollander wrote a voluminous study on “political pilgrims” who visited “progressive” or Marxist societies, where they hoped to find human perfection. Hollander’s “pilgrims” were hardly relativists. They were fixated on a highest value, usually equality, but equality combined with scientific management, which they imagined was being realized in some distant place but not in their own country.
Where Kendall was correct was in grasping that the Left was destroying traditional human attachments, where people are integrated into families and communities. There, morality operates in an inherited social context, and not in the pursuit of highest values. Although one may be skeptical about the portentous importance that Kendall ascribed to relativism, his description of a society without shared premises descending into “ever-deepening differences of opinion” is accurate. So was his prediction that such a society would descend “into the abandonment of the discussion process and the arbitrament of public questions by violence and civil war.”
Paul Gottfried is the editor-in-chief of Chronicles. He is also Raffensperger Professor of Humanities Emeritus at Elizabethtown College, where he taught for 25 years, a Guggenheim recipient, and a Yale Ph.D. He is the author of 13 books, most recently Fascism: Career of a Concept and Revisions and Dissents.
A quick run-through of father figures in classic American literature reveals a surprisingly small list.
Many heroes or heroines of American literary classics are orphans, raised by grandparents or extended family—Tom Sawyer and Huck Finn quickly come to mind. Others have fathers who are alive but absent, fighting in wars or otherwise removed from the immediate scene. Mr. March from Little Women fits this profile, as does the dad in A Wrinkle in Time.
Stand-in father figures abound in works like The Last of the Mohicans, in which Delaware Indian Chingachgook adopts the orphaned Natty Bumppo, creating a meaningful father-son relationship. The fatherhood of these characters, however, is not a central aspect of the storylines. When specifically considering fathers who are front and center in the lives of their families in the classic American literary landscape, the list comes down to Pa Ingalls and Atticus Finch. These two dads are the gold standard of fatherhood in American literature.
The character of Charles Ingalls in the Little House books, referred to as Pa, is based on and named for author Laura Ingalls Wilder’s own father. Pa is the quintessential model father. He tells stories, plays games, and has a great sense of humor. He protects and provides for his family, always placing its needs before his own. Pa is also a fair disciplinarian of his children, expecting them to do exactly as he tells them, which ensures their safety. Pa’s love for his family is never in doubt.
Similarly, Atticus Finch is modeled on To Kill a Mockingbird (1960) author Harper Lee’s own attorney father. The children call him Atticus rather than Dad or Pa, indicating an unusual relationship. He is their only living parent, which means that Scout and Jem must rely on Atticus in ways that children with a living mother would not. They do have a mother figure in Calpurnia, the housekeeper, but Atticus is clearly the main parent. He is not interested in hunting, fishing, or playing sports, things that other fathers do. Instead, he reads the newspaper and practices law. The kids are thus amazed to learn, when he must take down a rabid dog, that their father is one of the best shots in the county.
Most importantly, Atticus models compassion, kindness, and a sense of justice that applies to all people equally, no matter their class or skin color. These two literary fathers, Charles Ingalls and Atticus Finch, present models of fatherhood in large part because they are created as tributes to great real-life fathers. The love of authors Laura Ingalls Wilder and Harper Lee for their own fathers shapes their literary creations. Pa Ingalls and Atticus Finch may seem to offer unattainable heights for father-child relationships, but their basis in actual people makes their depiction one of realism rather than fantasy.
Readers first meet Pa Ingalls in Laura Ingalls Wilder’s book, Little House in the Big Woods (1932). Wilder’s intention with the book was to preserve her father’s stories for future generations, believing them to be worth saving. These tales are entertaining yet didactic, teaching his young daughters important frontier lessons. Editors at Harper & Brothers asked Wilder to expand her narrative with explanations of pioneer life. Pa looms large in these as well, as readers see through Laura’s eyes Pa’s range of skills: from butchering pigs, to smoking meat, to making bullets, to cleaning rifles. Sprinkled throughout these explanations are the Ingalls family experiences of seasonal highlights, for which Pa—often with fiddle in hand—is the center: harvest, Christmas, maple sugaring and its accompanying dance, extended family visits. In subsequent books, as Laura grows up, her relationship with Pa deepens. They are kindred souls.
Laura inherits Pa’s love for adventure and deep-seated need to keep moving west. In By the Shores of Silver Lake, she watches the birds depart and feels that emigrant restlessness before the family has even located a homestead in Dakota Territory. “The wings and the golden weather and the tang of frost in the mornings made Laura want to go somewhere. She did not know where. She wanted only to go. . . . ‘Oh Pa, let’s go on west!’” In response, Pa acknowledges his similar yearnings. “‘I know, little Half-Pint. . . You and I want to fly like the birds. But long ago I promised your Ma that you girls should go to school. You can’t go to school and go west. When this town is built there’ll be a school here. I’m going to get a homestead, Laura, and you girls are going to school.’ Laura looked at Ma, and then again at Pa, and she saw that it must happen; Pa would stay on a homestead, and she would go to school.”
Pa sets the example for Laura, making personal sacrifices for the good of the entire family. In that same conversation, Ma attempts to soften the blow by explaining that both Laura and Pa will thank her some day for tying them to civilization. Pa demonstrates for Laura the proper response: “Just so you’re content, Caroline, I’m satisfied.” Wilder as author follows that statement with insight into Pa’s heart: “That was true, but he did want to go west.”
As Laura grows older, Pa must rely on her help with the heavier homesteading chores. He cannot do them alone, and there is no money to hire a farmhand. This reliance on Laura’s help further strengthens their special bond, as does Laura’s deepening appreciation for Pa’s leadership in the newly formed town of De Smet. When blizzards block trains from bringing supplies for almost half a year, Pa’s ingenuity at home and guidance among townspeople keep everyone alive. Laura’s boredom the following winter prompts Pa to create weekly “Literaries,” which not only entertain the townspeople but provide a platform for Pa’s humor and talent. Throughout the entire Little House series, Wilder pays tribute to the real Charles Ingalls in her portrayal of him as a loving, fair, and inventive father, deserving of respect from his neighbors as well as his children.
In much the same way, Harper Lee depicts Atticus Finch as affectionate, wise, and of the highest character. Lee called her work “a simple love story,” but she did not mean what one would normally think of as a love story. There’s no big romance in To Kill a Mockingbird. Instead, Lee focuses on the love of a father and his two children, and his desire to teach them integrity through his own example. The children daily watch for his return from his law office, running to meet him at the corner post office the moment they catch a glimpse of him. When Atticus takes on a case defending a black man accused of raping a white woman, Scout and Jem are plagued by the taunts of almost everyone in town that their father is partial to African Americans. As far as Scout is concerned, those are fighting words. She and Jem readily defend their father with their fists. Atticus demands that they stop. These insults make for some confusing moments for young Scout, who doesn’t understand why her father’s legal defense of a black man should cause anyone to look askance at her family. During the trial, Jem and Scout sneak into the courtroom and sit in the balcony with the African Americans. When Atticus leaves the building at the end of the trial, Scout is intent on the action on the courtroom floor and oblivious to her surroundings. “Someone was punching me, but I was reluctant to take my eyes from the people below us, and from the image of Atticus’s lonely walk down the aisle. ‘Miss Jean Louise?’ I looked around. They were standing. All around us and in the balcony on the opposite wall, the Negroes were getting to their feet. Reverend Sykes’s voice was as distant as Judge Taylor’s: ‘Miss Jean Louise, stand up. Your father’s passin’.’”
The respect shown to Atticus amazes the children, who see him only as their somewhat inept father. When Atticus’s devotion to justice is met by violence directed at his children, he questions his parenting abilities. In the next moment, however, Scout reveals that she has internalized her father’s lessons about compassion and justice. What more could a parent desire than that?
The integrity and good parenting modeled by Pa Ingalls and Atticus Finch stem from the love of novelist daughters for their own fathers, to whom both Laura Ingalls Wilder and Harper Lee penned glorious tributes. In turn, these literary figures based on real fathers serve as fine reminders of what the best fathers bring to their families.
Dedra McDonald Birzer is a lecturer in history and rhetoric at Hillsdale College. This article is a tribute to the model fathers in her own life: her father Ken McDonald and her husband Brad Birzer.
The post Daughters That Turned the Love for Father Into Literary Classics appeared first on The American Conservative.
Last August, CNN published a moving feature on inner-city Baltimore by its enterprise writer, John Blake. In it, he tells the story of an unlikely Trump voter: his late 91-year-old African American father, who provided for his family with “a well-paying job, with union benefits, as a merchant marine.”
Blake paints a vivid portrait of his childhood memories in what was once known as “The Greatest City in America”:
The community I grew up in during the 1970s and ’80s was full of men and women like my father. Many of them had blue-collar jobs at places like the Bethlehem Steel plant or the Domino Sugar plant in the city’s inner harbor. They proudly purchased big Chevy Impalas, kept their homes in impeccable condition and had crab cookouts in their backyards.
Today, however, the city is in dire straits. Blake describes his family neighborhood in West Baltimore as an “economic wasteland,” noting that “Those stable, career jobs have now been replaced with minimum-wage service jobs and temp work.” Conservatives like to emphasize the importance of personal responsibility, church attendance, and stable families as a cure for inner-city poverty. Blake, however, urges them to reconsider what he views as the most successful “anti-poverty” and “anti-drug” program known to man: JOBS, JOBS, JOBS.
It’s a message that comes naturally to conservatives. In fact, one could argue that it was the central theme of the Trump campaign in 2016. But Republicans have failed to translate that message into a majority coalition over the past 3.5 years. Instead, the President sits with a job approval rate at 40.9%, according to recent polling from FiveThirtyEight. Even worse, only 8% of blacks voted for Trump in 2016, and at the start of 2020, 83% of black Americans believed that the President was a racist.
It didn’t have to be this way.
When J.D. Vance wrote his NYT bestselling memoir, Hillbilly Elegy, he described the plight of the white working class in a way that captivated the nation. For perhaps the first time in a generation, Republicans truly began to understand the connection between bad economic policy and bad social outcomes.
The argument was simple: Free trade sent jobs overseas to China and Mexico. America’s great cities deindustrialized. The businesses, civic organizations, and churches that supported company towns like Detroit, Pittsburgh, and Dayton dried up. The social fabric of communities ripped apart. And those who were left behind turned to opioids (ironically, made in China) to numb the pain, leading to a drug epidemic that kills nearly 70,000 people per year and counting.
Against that backdrop, Donald Trump rode a wave of anti-globalization sentiment into the White House, and the rest is history. Republicans learned an important lesson too: while some of the problems in poor Rust Belt communities stem from failures of personal responsibility, there were also more systemic economic issues at play. Bad decisions made on Wall Street and in Washington had wreaked havoc on the Heartland. For those with eyes to see, the “American Carnage” was all around.
Yet, for many on the right, the lesson didn’t translate to how they viewed black communities. Many blacks moved up from the South during the Great Migration to take manufacturing jobs in cities like Baltimore. While they faced the horrors of segregation, there existed thriving black communities with union jobs, black-owned businesses, healthy churches, and vibrant cultural institutions.
Much of this was uprooted by the very same deindustrialization that pillaged white working-class communities. Matters were made worse by the very real barriers posed by racism. The jobs left behind were often, as Blake described, “minimum-wage service jobs and temp work.” In other words, not a lot of opportunity to pursue the American dream. And blacks who moved to the city in the latter half of the 20th century to buy homes in formerly white neighborhoods often fell victim to predatory developers in the form of “blockbusting,” which in some cases artificially inflated the values of homes sold to African-Americans by 80 to 100 percent, placing an “onerous burden on black homeowners.”
And to top it all off, in the decades following the Immigration and Naturalization Act of 1965, mass immigration depressed working-class wages and forced blacks to compete for the new, low-wage service jobs, which replaced their well-paying, unionized jobs in manufacturing. As South Side Chicago pastor Corey Brooks told TAC in a recent interview: “a lot of African Americans do believe that…[on account of] being so loose with immigration…a lot of jobs that young African Americans could have, they’re not available.” This sentiment is backed by data from a 2007 study published by Harvard’s George J. Borjas, U. Chicago’s Jeffrey Grogger, and UC San Diego’s Gordon H. Hanson, which argued that “The 1980-2000 immigrant influx, therefore, generally ‘explains’ about 20 to 60 percent of the decline in wages, 25 percent of the decline in employment, and about 10 percent of the rise in incarceration rates among blacks with a high school education or less.”
All of these factors combined into a perfect storm that has buffeted the black community again and again over the past half century, producing understandable anger and frustration. Yes, two-parent families and increased church attendance would help to ameliorate some of the problems in America’s inner cities. However, just as the white working class was under siege by forces beyond its control, even more so was the black community.
If only we had a President who could champion the interests of the working and middle classes, both white and black. If only we had a president who could stand behind a podium in West Baltimore, just as well as Youngstown, Ohio, and blast globalist politicians and hedge fund managers who sold these communities down the river to make a buck off of cheap labor from Communist China. If only we had a president who could explain, in very clear terms, how tight labor markets raise wages in both inner-city and rural communities. If only we had a messenger who could inspire a new, working- and middle-class majority to restore a sense of solidarity and patriotism across racial and class divides.
This weekend, President Trump travels to Tulsa for a “Great American Comeback” rally—the first stadium event he’s held since the coronavirus lockdown—that’s certain to attract tens of thousands of MAGA hat-wearing fans and provoke the ire of the mainstream media and cultural elites.
In a very different universe, one can imagine a similar rally taking place in downtown Baltimore. The crowd—a cross section of the city’s diverse residents—cheering wildly as a champion of a pro-worker, pro-family, “one-nation conservatism” takes the stage. He or she looks out at the crowd, and with a resolute eye, points to an Acela train in the distance whizzing past the burnt-out, boarded-up neighborhoods of Baltimore’s skyline, and vows with utter sincerity and perfect moral clarity: Never again.
Am I a racist? Are you? People tell me I sort of have to be a racist; it’s not really my choice. Today, if you’re old, white, from the Midwest, and a bit conservative, then you’re racist. Maybe you don’t say racist things specifically, and maybe you never did anything to disadvantage a black person yourself, but by original sin, you’re part of “systemic racism.”
Now maybe your immigrant parents arrived in the U.S. 75 years after slavery, or maybe you, white racist that you are, have trouble finding a privileged job that pays a living wage. No matter, you’re still privileged thanks to a system going back 400 years, whether you like it or not. You can’t change what you are and people hate you for it. That’s the systemic part, defined as “not something that a few people choose to practice. Instead it has been a feature of the social, economic, and political systems in which we all exist.”
I’d like to say that was from the news, but in recent days I heard most of that from a close relative, and the rest from a friend of many years, neither of whom want to interact with me anymore. I’ve been sending one checks since her birthdays were in the single digits. I grew up alongside the other. They have both taken themselves out of my life because the internet told them I am a racist.
Crowd-sourced (what old timers call a mob) leftist fundamentalism has given us a country where everyone can be called a Nazi, er, racist, and dismissed. Once the red line was only actual Nazis. So no “Thank you, Elie Wiesel for that moving account. Now in rebuttal, Hitler’s deputy, Martin Bormann…” You had to be an actual Nazi to hold an opinion outside the boundaries of legitimacy.
Not any more. Racism scholar Ibram X. Kendi says one is now either racist or anti-racist; there is no such thing as a “non-racist.” The New York Times said white allies should “Text your relatives and loved ones telling them you will not be visiting them or answering phone calls until they take significant action in supporting black lives.” Another article described my own situation, claiming “BLM protesters are breaking up with their racist, Facebook-addled relatives.” A Twitter thread about one such family dissolution had over 800,000 likes. HuffPo ran an article by a biracial woman eviscerating her white mother for being too white.
High school debate clubs used to propose a topic in advance but not assign a “side” until just before the match. The idea was you would vigorously support or attack a position you may not personally agree with. You were supposed to learn something intellectual from all this along with the ability to see things from another point of view. It is a vision of the world a long way from calling someone a witch, er, racist, and dismissing them whole.
We don’t understand debate, or its cousin compromise, anymore. There is no longer any tolerance for others’ views because the current fascism of the left does not see opinions as such; they are not acquired thoughts so much as they are innate to who we are, the inside and the outside fixed by color and class. You can’t change, only apologize, before being ignored at family gatherings, unfriended, and canceled. From the New York Times firing an editor for running an op-ed by a senator, to me wondering about the practicality of defunding the police and losing a friend over it, there is no legitimate other side. So I can’t speak, I can only whitesplain (used to be mansplain). People arbitrate my intent before I open my slack jaw. It’s even a job title—a writer at a black news site calls himself a “wypipologist.”
I am unsure where all these woke white people came from. The world around me, since George Floyd’s death, is flooded with overzealous sympathy, the media a waste can for guilt, and people who had never heard of the idea a week ago pronouncing themselves deeply committed to defunding the police.
Companies are stumbling over each other like they just found Jesus at an AA meeting to add Black Lives Matter to their websites, just above the ad banners. The Washington Post reports that African Americans have said they’ve been overwhelmed by the number of white friends checking in, with some sending cash because guilt is an expensive hobby. White celebs are swarming to confess their past ignorance on race. In what may be the ultimate expression of shallowness, someone who calls herself an influencer and life coach posted an Instagram guide on “how to check in on your black friends.” Which corner was everyone standing in solidarity on last week?
The Slack for a hospitality company I worked for pre-COVID exploded last week when a benign HR data request went out on #BlackOutTuesday. The almost all-white staff went insane with accusations of racism. Of course, the blindsided (and now racist) HR drone didn’t think about Tuesday being some private racial Ramadan when we all fasted from reality; she doesn’t follow the right people on Twitter. The mob, sounding like they’d drunk a human growth hormone and Adderall smoothie, barked until the company issued a sort-of apology. Then they celebrated as if they’d brought George Floyd back to life.
It shouldn’t have caught HR so off guard. The unemployees live in a world where “journalism is a profession of agitation.” They were taught nothing matters more than starting a sentence with “as a… (woman, harassment survivor, deep sea diver)” because no argument, and certainly no assembled historical fact, could be more important than a single lived experience. They were brought up on TV shows that juxtaposed white and black characters like someone was stringing together magic diversity beads. They made the boss apologize even though nothing was really different except that made-up racial “holidays” are now on the list of things where there is only one allowable opinion. Soon enough we’ll all be asked over the PA to take a knee for the national anthem at sporting events.
The harsh self-righteousness oozed. It sounded very much like people wanted to imagine they were on the cutting edge of a revolution, the long-awaited (well, for four years) Reichstag fire. So what makes this moment a turning point?
Not much. Less than taking a stand, it feels more like radical chic from people who have been cooped up for months, cut off from bars and the gym. They don’t seem to know we’ve had this week before, after the deaths of Rodney King, Eric Garner, Freddie Gray, and Michael Brown. The protests feel like the last round of BLM, Occupy, Pink Hats, March for Our Lives, even Live Aid in 1985, when Queen sang for everyone’s racist parents to end hunger forever. Remember in 1970 when Leonard Bernstein threw a cocktail party for the Black Panthers Defense Fund and Tom Wolfe wrote about it? That changed everything; I mean, people used to say “Negro” back then. But I’m pretty sure a year from now there will still be funded police departments.
It took some rough nights to work out the rules and root out the looters, but even as the protests have faded, the whole thing has become a set piece: the demonstrators arrive with water bottles and healthy snacks. The route is established with the police a long way from “by any means necessary” boulevard. As long as everyone enjoys their revolutionary cosplay inside the white lines, the cops don’t have to spank anyone with pepper spray. The AP describes the once-violent protests outside the White House now as having a “street fair vibe.” See, it got complicated explaining how looting beer from a convenience store run by Yemeni refugees was connected to racial justice.
It all reveals itself as hollow because this fight isn’t between racism and anti-racism. It’s Black Rage versus White Guilt. The cops quickly quiet down the former and the media slowly wears out the latter. That means little of the action will have much to do with the real issues but everyone will feel self-righteously better. Until next time.
Along the way, however, the collateral damage of wokeness is producing the totalitarianism it purports to challenge by denying any view that challenges it. Ideas are redefined by one side as the bad -isms of racism, sexism, fascism, and pulled out of the marketplace along with the people who want to talk about them. No invite to the barbecue, no seat at the Thanksgiving table. In a political system built on compromise, I’m not sure how we’re supposed to get things done.
For me, I am not a racist. I’ll get over my problem with lost friends. America, I’m not so sure.
Peter Van Buren, a 24-year State Department veteran, is the author of We Meant Well: How I Helped Lose the Battle for the Hearts and Minds of the Iraqi People, Hooper’s War: A Novel of WWII Japan, and Ghosts of Tom Joad: A Story of the #99 Percent.
It’s been thirteen years of bad luck for Walter Russell Mead and his thesis of Anglo-American global dominance, which he laid out in his once widely hailed God and Gold: Britain, America, and the Making of the Modern World. No less than The Economist, The Financial Times, and The Washington Post named God and Gold—which in its Whiggish view of history lauds the dynamism and individualism of the Anglo-American world—one of the best non-fiction books of its year.
Thirteen years later, the liberal progressivism and individualism that Mead so glowingly honors seem not only inadequate to address the many troubles that ail us—foreign policy overextension, family collapse, manifold types of addiction, racial identity politics, to name but a few—but perhaps even inimical, in some senses, to human flourishing.
Mead published God and Gold in the final years of a George W. Bush presidency that after 9/11 had transitioned from focusing on education reform (remember “No Child Left Behind”?) to an aggressive foreign policy aimed at promoting liberal democracy around the globe. In 2007 there were 150,000 U.S. troops in Iraq, many of whom were engaged in defeating a Sunni insurgency that threatened Shia Nouri al-Maliki’s “coalition government.” There were another 25,000 U.S. troops in Afghanistan supporting Hamid Karzai’s fledgling democracy, along with approximately 15,000 soldiers from a variety of U.S. allies, with some of the largest contingents from our “Anglo” allies: the United Kingdom, Australia, Canada, and New Zealand.
A 2020 study reported that Washington to date has spent over $2 trillion on the Iraq War—an average of about $8,000 for every taxpayer. We have spent another $2 trillion on Afghanistan. Thirteen years later, Iraq is one of the most unstable countries in the region, while Afghanistan remains violent, with a corrupt, incapable central government still threatened by the Taliban. The U.S. military, in turn, suffered about 60,000 casualties, not counting the thousands more who suffered, or continue to suffer, from PTSD. Moreover, since 2007, U.S. foreign policy has contributed to instability in Libya, Syria, and Yemen that has resulted in humanitarian disasters and millions of new refugees, many of whom are likely to permanently change the political and cultural identities of the European nations where they have settled. Islamic extremism—which Mead believed capable of complete integration into the West—appears an even more global, intransigent threat than it was at the time of 9/11, given the rise of ISIS and the spread of thousands of Saudi-funded Wahhabist madrassas across the world.
Things in the homeland, meanwhile, have also deteriorated across many indicators. While Mead praised the dynamism of the low-trade-barriers global economic system, the American middle class, that great socio-political stabilizing force, was shrinking, suffering from stagnant wages, the Great Recession of 2007-2009, and the 2010 foreclosure crisis, among other factors. While Mead extolled global capitalism for blessing both Anglo-America and the world, the nation’s infrastructure was failing, requiring trillions of dollars of new investment. (Too bad we spent so much on those Iraqis and Afghans!) While Mead noted the amazing vibrancy of Anglo-American learning and research, the quality of American education and the average American IQ were declining. Education expert Mark Bauerlein has labeled millennials the “dumbest generation.” Mead lionized our academic freedom, but that praise now seems misplaced given how our radicalized campuses treat those who won’t conform to progressivist ideology.
Religion, Mead argued, provides “the psychological strength and social support” that undergirds Anglo-American dynamism and “help[s] human beings cope with the new demands of life [in] an open, changing society.” Yet with the rise of the “nones,” the religiously unaffiliated, as well as rising distrust of American ecclesial leaders in the wake of sex and corruption scandals, the faith seems more like a sideline spectacle. Mead’s wonderment at “the remarkable revival of American evangelical religion” now reads like a description of a distant historical curiosity, given the slow but noticeable decline of that religious demographic. Religiosity, that quality of American culture that French political theorist Alexis de Tocqueville deemed essential to the preservation of our national civic character, is plummeting among younger Americans.
Nor is this decline unique to the United States. The United Kingdom, which Mead describes as our great older brother, the nation that first established Anglo political and economic global dominance in the 17th century, has also been struggling. Even before the coronavirus contributed to a significant contraction of the UK’s economy, Britain had been weakening in the face of a Scottish independence movement, dangerously high immigration, and the social alienation and disruption evidenced in the need for a “Ministry of Loneliness.” In the wake of political oppression in Hong Kong, many of that region’s residents would prefer to be governed once more by the Brits—yet John Bull is too impotent to hoist the Union Jack on any foreign shores. Australia, in turn, is so dependent on Beijing for its economic vitality that China may be capable of cowing Australian politicians into submission. Canada has likewise been intimidated by Chinese threats following Ottawa’s arrest and attempt to extradite a corrupt Huawei executive.
Brexit, like the election of an American president who ran on an anti-globalist, anti-immigration platform that rejected international economic treaties like the Trans-Pacific Partnership and promoted building walls on our borders, manifests what Russian thinker Eugene Vodolazkin has called “The Age of Concentration.” By this Vodolazkin means that various Western global powers—like the Anglo-American “Five Eyes” alliance—are entering a period of disengagement, turning inward from a globalist system that has perhaps been more of a curse than a blessing.
Mead’s analysis possessed a strong sense of optimism regarding Anglo-American dynamism:
Have strip malls and townhouses sprung up in the meadows and forests where one played as a child? Are gender roles melting and changing even as new immigrant groups fill the land? Is the old industrial economy of union labor and stable employment mutating into something mysterious, complex, dynamic and new? … For the dynamic believer, change is both a sign of progress and an opportunity to show the crowning virtue of faith.
Such words, in light of the American social and economic collapse in flyover country described by J.D. Vance in Hillbilly Elegy, seem inadequate, to say the least. The economic crisis, as Vance’s memoir so bitterly observes, is compounded by the American family crisis—the number of children raised by their biological parents has dramatically declined, while the share of children raised by a single parent has tripled since 1960. University of Virginia sociologist W. Bradford Wilcox, among others, has noted the disastrous consequences of these trends. Substance abuse (another Vance theme) has ravaged entire American communities.
“The world is about to become a much better place” is a central tenet of the Anglo-American worldview, Mead noted. It may well be, but the fact that our irrational cancel culture is engaged in debates about the legitimacy of transgenderism or whether to eliminate the police—ideas that Americans a generation ago would have found preposterous and socially suicidal—suggests our nation is going in the wrong direction. Mead praised the unprecedented “level of dignity and security” reached in western Europe, which also seems risible given the continent’s low birth rates, rising immigration and terrorism, and growing social acceptance of abortion and euthanasia. The West is killing itself, not only demographically and culturally, but quite literally.
Mead sees social and economic disruption as emblematic of a dynamism necessary for man’s good. Perhaps in some senses he’s right—if a society isn’t seeking to grow and improve, it’s on its way to decline and death. Yet one has to wonder whether the kinds of dramatic upheavals America has witnessed in the last two generations have been so disruptive as to essentially undermine the survival of our democratic experiment. Social and economic distemper, as many pundits have noted, has reached fever pitch. Progressivism and individualism—two pillars of what Mead perceives to be essential to Anglo-American success—don’t seem to be solutions to our nation’s current crises, but their cause. We need less radicalism and atomism; more patriotism that promotes civic engagement, more rationality that tempers emotivism, and a more Thomistic impulse that prioritizes our immediate neighbor over solving problems elsewhere. In reference to those, at least, Mead’s thesis is accurate—we’ll need God to persevere.
Casey Chalk covers religion and other issues for The American Conservative and is a senior writer for Crisis Magazine. He has degrees in history and teaching from the University of Virginia and a master’s in theology from Christendom College.
I’m writing like this because I didn’t know how better to respond to emails in which you related, in passing, that despite reservations about aspects of progressivism, you were “kind of socialist.” You wrote this well before the United States descended into a riotous state in May, and my reply here will hardly touch on current events. You probably noted that I did not respond immediately. I wasn’t sure how to. It would have seemed trite to ask “what kind” in an email—clearly you are not Stalinist, Maoist, or Trotskyite in any orthodox sense. And flippant to respond, “Yeah, great, but that never seems to have worked out too well in actual practice.” But I assumed that if you were “socialist” in the way of François Mitterrand during most of his presidency, or François Hollande, or, as Bernie Sanders says, “like Denmark” (probably to deflect scrutiny of his past and beliefs), you would not have bothered to make the remark. European social democracy has made such a durable peace with parliamentary institutions, civil liberties, and the right of individuals to own property that neither its partisans nor its left-wing detractors depict it as “socialism.”
I am aware that the discourse I am going to inflict will seem very last-century and beside the point to many people. You and I are part of the last generation of Americans for whom the issues of communism and the Cold War were taken seriously on our college campuses, if then only by a small subset of students and professors. The colleges we attended were overwhelmingly liberal in their students and faculties, but there was still a major contingent of liberal anti-communists on the faculty, people who themselves had come of age when the revolutionary communist movement allied with the Soviet Union was ideologically powerful and plausibly viewed as potentially victorious. Even when communism on the Soviet model was no longer widely admired in the West—the major disenchantment did not set in until the late 1940s, and considerably later in Western Europe—the struggle between the Free World and the communist world, which at least once brought us to the brink of nuclear war, was the scale upon which virtually every political event throughout the world was weighed by the intelligence services and diplomatic corps of Washington and Moscow and many other capitals.
Never has a battle of such importance faded so quickly from memory. This was in part due to circumstance; the Soviet empire’s collapse, in the midst of seemingly minor unrest, almost completely unanticipated by professional observers, was sudden and bloodless. Washington and the West, eager to integrate the former Soviet countries into a pacific neoliberal world, had no wish to celebrate their victory. There would be no Nuremberg-type trials of Soviet bloc leaders, no solemn efforts at decommunization. Indeed, how could there have been? By 1989 the Soviet Union was seeking to reform itself. For a generation it had been more sclerotic dictatorship than practitioner of internal revolutionary terror.
The result is that very few Americans born after 1970 have any sense of the texture and content, the passions, of the battles waged around the communist idea. History in the schools and universities of the West now seems to be taught almost entirely through the lens of the crimes whites have committed against colonial subjects and the people of color they ruled over—at least such is the impression one gets whenever young people talk about the lessons of history. The Holocaust constitutes the major exception, and the horrific enormity of the Nazi project is documented in the Holocaust Memorial Museum, which has had 40 million visitors since its opening in 1993. In popular culture, Nazis and their crimes have remained a bottomless source of significant books and movies. In striking contrast, the victims of communism are commemorated officially by a ten-foot statue in a tiny park near Union Station, tended by a small private foundation.
But when you speak of socialism, my historical antennae, and my not entirely forgotten knowledge of the history which created those victims, are aroused. For as important as fascism and Nazism were to the history of the West in the past century, the history of communism, and its sometime synonym socialism, in its Bolshevik, Maoist, and other iterations was more central, more critical to what politically engaged people all over the world were arguing about, fighting over, dying over. Communism, not fascism, was the major story of the 20th century.
Of course, their histories were profoundly intertwined. Ostensibly enemies, they were also siblings, both born in the wake of a war whose length and brutality no European alive in 1914 could have conceived of, in what was considered the most civilized part of the earth. By its end, millions of men who had endured military conscription and experienced violence on scales previously unknown had entered European politics. For the war’s non-winners, this new age of the masses was colored by a sense of meaningless sacrifice. The war shattered the Russian state, leaving power to be scooped up in October 1917 by the tiniest of armed factions with a willful leader. A few years later Mussolini toppled a weak government in Italy. In Germany, initial communist attempts to emulate Lenin’s successful putsch failed, put down by right-wing veterans of the Freikorps, and Hitler too was rebuffed in his first attempt at seizing power by force. But the Bolsheviks, Mussolini’s fascists, and the Nazis shared prior convictions which the war had magnified: a hatred of the bourgeoisie, of capitalism, of parliamentary democracy, of all the political habits and social constraints woven into the societies of 19th-century Europe.
François Furet, author of the indispensable work about Western reactions to communism, The Passing of an Illusion, describes a sentiment widespread in the wake of the war:
Today it is hard to imagine the hatred aroused by parliamentary deputies at the time. The deputy was hated as the essence of all the lies of bourgeois politics. He symbolized oligarchy posing as democracy, domination posing as law, corruption lurking beneath the affirmation of republican virtue. The deputy was seen as exactly the opposite of what he pretended to be, of what he ought to be: in theory the representative of the people; in reality, the man through whom money—that universal master of the bourgeois—takes possession of the will of the people.
The power in this scabrous description lies in the fact that it was at least partially true. Building upon it, Lenin, Mussolini, and Hitler in succession forged an antithesis: three regimes, each emulating and building upon the actions of the others, that created a totally new kind of political power—the party state, government by a single party to which all political life was subordinated. The singular feature of the three regimes was the treatment of political murder and internal violence as a virtue, the treatment of one’s fellow citizens as enemies in war. Mussolini boasted of “our ferocious totalitarian will” in 1925, though his government never approached the domination of its subjects or the levels of internal violence achieved by Hitler, Lenin, and Stalin. By the 1930s, long before Hannah Arendt made the point in her famous The Origins of Totalitarianism, observers noted the similarity between Stalin’s Soviet regime and Nazi Germany. Some socialists like Karl Kautsky made the point repeatedly, as did liberals like Élie Halévy in his famous lecture The Era of Tyrannies. Numerous political figures, in France especially, rotated between communism and various parts of the anti-parliamentary Right in the 1930s. Hitler himself put the matter succinctly, in a 1934 conversation recorded by Hermann Rauschning:
It is not Germany that will turn Bolshevist, but Bolshevism that will become a sort of National Socialism. Besides there is more that binds us to Bolshevism than separates us from it. There is above all genuine revolutionary feeling, which is alive everywhere in Russia except where there are Jewish Marxists. I have always made allowance for this circumstance, and given orders that the Communists are to be admitted to the party at once. The petit bourgeois social democrat and trade union boss will never make a good National Socialist but the Communist always will.
The victims of communism are less commemorated than the victims of fascism, but there are more of them. Far more, in great part because communism was a living social model for a longer time and ruled over more people. The multi-authored Black Book of Communism, published in France in 1997 and two years later in the United States, estimates a death toll of 100 million. Most of what the authors depict was known long ago, though recently opened Soviet archives fill out the picture. Terror—the arrest, imprisonment, or execution of political opponents—was an inextricable part of the Leninist regime from its very outset. The concept of “enemies of the people” was introduced into law in November 1917. The fastest-growing organ of the new government was the Cheka, the political police, whose ranks grew from 100 to 12,000 in six months and which was liberated by Lenin from any need to follow, as its chief Felix Dzerzhinsky put it, “the nit-picking legalism of the ancien régime.” In January 1918, the Bolsheviks disbanded the first freely elected assembly in Russian history and shot those who protested publicly. Non-Bolshevik newspapers were shut down, and the leaders of other political parties imprisoned. By the summer of 1918, the Red Terror was underway in earnest; according to Cheka documents, 10,000-15,000 executions were carried out in two months, more than double the total executed under Tsarist governments from 1825 to 1917.
Grigory Zinoviev, a top Bolshevik and close Lenin confederate, exclaimed in 1918 that the Bolsheviks needed their own socialist terror—that they could get 90 million Russians on their side, and as for “the other 10 million, we’ll have to get rid of them.” Zinoviev’s estimate proved to be roughly half the eventual Soviet death toll, which would later, fittingly, include Zinoviev himself, purged and executed by Stalin in 1936. In any case the new regime initiated rolling waves of terror against different segments of the population—striking workers in the cities, peasants who resisted their land being taken, the Cossacks of the Don, the bourgeoisie. Of course, there was a brutal campaign against priests and nuns. Said one Cheka leader, “we are exterminating the bourgeoisie as a class. In your investigations, don’t look for documents or evidence about what the defendant has done…the first question you should ask him is what class he comes from, what are his roots, his education, his training.” As an editorial in the first issue of the Kiev Cheka newspaper put it, “Our morality has no precedent, and our humanity is absolute, because it rests on a new ideal. Our aim is to destroy all forms of oppression and violence. To us everything is permitted, for we are the first to raise the sword not to oppress races and reduce them to slavery but to liberate humanity from its shackles. Blood, let it flow like water!”
It did. In terms of sheer number of victims, the most deadly Soviet effort was the campaign to “exterminate the kulaks as a class.” Kulaks were farmers who owned some land. After Stalin consolidated his rule, dekulakization was begun on a grand scale. GPU (the secret police which succeeded the Cheka) brigades went to expropriate kulak property—pillows, shoes, and underwear as well as land. Kulaks who resisted were executed or deported to Siberia; nearly 2 million were deported to forced labor camps in nearly uninhabitable locations. They perished by the hundreds of thousands. The great famine of Ukraine in 1932-1933, the result of Stalin’s collectivization policy, killed four million in Ukraine, another million in Kazakhstan, and another million in the northern Caucasus. Five years after the man-made famines, Stalin initiated his next wave of terror, going after the intelligentsia, party members, and industrial administrators: this period produced the famous trials in which top Bolshevik leaders confessed to outlandish crimes. These provoked at least some of communism’s progressive admirers in the West to question their adulation of Bolshevism.
Communism became a global system, so the roughly 20 million deaths caused by Lenin and Stalin were only a beginning. In the Black Book’s estimate of 100 million deaths, the largest element comes from China’s famine during Mao’s “Great Leap Forward” from 1957 to 1960. There, a combination of forcing peasants off private plots into communes, in emulation of Stalin’s anti-kulak policies, and an obsession with raising homegrown steel production at any cost produced the deadliest famine in the history of the world. Thirty years later the Chinese government estimated the death toll at 18 million, but many scholars put the figure at more than twice that. Rounding out the victim toll was China’s forced labor prison system, modeled on the Stalinist gulag. China’s Great Proletarian Cultural Revolution of the late 1960s, more visible to the outside world than famines or labor camps because it took place in cities, killed fewer people. But the sheer joy young Red Guards took in their public humiliations of so-called class enemies—teachers, scientists, authors, individuals with more knowledge or talent or prestige than they possessed—seemed an almost perfect embodiment of the communist spirit. It is probably not an accident that this period coincided with the greatest growth of Maoist groups in the West, though they never matched the adulation Stalin had received from progressives. In his review of The Black Book of Communism, Tony Judt related that he had heard a tenured Cambridge economist remark, at the height of the Cultural Revolution, that Maoism represented mankind’s best hope. On a per capita basis, Cambodia’s Khmer Rouge are probably responsible for communism’s most violent interlude, leaving dead a quarter of their country’s population in a mere few years in power.
Despite its death toll, communism has seldom received the same degree of moral obloquy as Nazism. One reason is simply historical. Stalin became (eventually, after Hitler broke the Hitler-Stalin pact) an ally in the war against Nazi Germany; Soviet armies were critical to the Allied victory; and Soviet jurors served at Nuremberg, passing judgment on Nazi crimes against humanity. A second is that after Stalin’s death, the Soviet system mellowed; gulag prisoners were freed, and political dissidents faced imprisonment or exile rather than torture and death. A third reason is the strong attraction Stalinism held, for many years, for a huge band of American intellectuals—eventually to the extent that communists or fellow travelers or progressive anti-anti-communists controlled many of the most influential publications in the United States. They stood ready to extol the Soviet regime or vilify those who sought to tell the truth about the terror or the labor camps. Finally, and this remains true today, many who unambiguously and publicly deplored Stalinist terror still found a moral difference between Hitler’s killing and Stalin’s or Mao’s. Stalin and Mao killed “class enemies” viewed as impediments to a state of social equality; Hitler killed “race enemies,” an irremediable condition, which permitted no salvation for the victim’s children. In the useful formulation of the late Berkeley historian Martin Malia, “fascism never pretended to be virtuous.”
* * *
At this point, Philip, or indeed earlier, you may wonder: what does all this ancient history—about Lenin, the Cheka, Stalin and the party-state, the politically induced famines, the purge trials, the gulags, China’s similar labor camps and deadlier famines—have to do with the socialism that summons you? If one asked Bernie Sanders, or the leaders of the Democratic Socialists of America, they would protest that the blood-soaked history of communism has nothing to do with the system they seek to impose. Socialist journals in the United States gave up their adulation of Stalin long ago. For decades, the revolutionary socialist position in the West has been that Stalin deviated from the righteous path that Lenin (or Lenin and Trotsky) laid out in 1917. But Lenin was undeniably the creator of the one-party state, the initiator of a terror regime against political opponents, the terminator of the first freely elected parliament in Russia’s history. One can now find (in the pages of Jacobin, for instance, a fairly new socialist publication—one whose name intentionally evokes the terror phase of the French Revolution) acknowledgment that Lenin, frankly, made some errors. This is put forward in a matter-of-fact way, rather as mainstream American historians acknowledge certain failings of the Founding Fathers.
There actually is a strain of socialism genuinely untainted by this history of “really existing socialism,” as it used to be called. From the moment of the Bolshevik Revolution, there was resistance within the Second International—the grouping of European socialist parties—from those who argued that a putsch and a terroristic dictatorship in relatively backward Russia could not possibly be the fulfillment of the socialist dream they had been working for. Karl Kautsky, the dean of Marxist theoreticians of his era, was anti-Bolshevik; so was Léon Blum, in 1920 the most eminent of French socialists. These two retained their Marxism and faith in proletarian revolution, a position which put them at a rhetorical disadvantage in argument with their “enemy brother” communists.
In the United States too, some socialists were unquestionably democratic—sustaining an unimpeachable commitment to competitive free elections and civil liberties. On the murderous nature of Lenin’s and Stalin’s communism they were, more so than most liberals and many conservatives, completely undeceived. Throughout the 1930s and into the Cold War period, they were willing to speak out about Stalin’s crimes, often in alliance with liberals in anti-communist organizations. Norman Thomas, several times a Socialist Party candidate for president, was one of these; so, in subsequent generations and with less strident anti-communism, were Michael Harrington, famous for his important 1962 book on poverty, The Other America, and Irving Howe, founder of Dissent, and many others. But just as in Europe, American non-communist or anti-communist socialists were always a far smaller part of the Left than those in the communist orbit as party members, reliable fellow travelers, admirers of whatever new revolutionary dictatorship had emerged in the Third World, or general floaters in the sea of pro-communist progressivism.
The Bolshevik Revolution excited American progressives—“I have seen the future, and it works,” exulted Lincoln Steffens, in 1917 probably America’s most prominent liberal journalist, born of wealth and a friend to presidents. Steffens didn’t deny Lenin’s crimes but excused them, the price to be paid for entry to the socialist Promised Land. By the 1930s, with Nazism on the rise and the world economy in the throes of the Depression, communism was arguably the dominant ideology among American intellectuals. One can’t go through a book like A Better World, William O’Neill’s meticulous study of American intellectuals and Stalinism, without being astonished at how many of the memorable literary names of the 1930s and 1940s were at one point or another firmly in the communist orbit. This often required extraordinary mental gymnastics to exonerate or whitewash mass murder. Their platforms included the nation’s most important political magazines—The Nation and The New Republic—and popular ones like The Saturday Evening Post. Prestigious publishers worked not only to extol Stalinism but to stifle its critics. You may know of the difficulty George Orwell had in finding either an American or a British publisher for Animal Farm, his well-known allegorical novel critical of socialism. Bennett Cerf, the founder of Random House and the best-known American publisher of his generation, at one point proposed that the publishing industry withdraw all books critical of Russia.
The cultural power of American Stalinism began to wane after World War II, and was more or less forced into full retreat after the Stalinist takeover of Eastern Europe began to sink in. But what remains remarkable about the period was how much genuine enthusiasm Stalin’s regime generated among American intellectuals for a long period. No matter how many innocents it imprisoned or killed, the Russian Revolution remained a sentimental favorite on the broader Left, in circles well beyond those who adhered to party line discipline or “orthodox” Marxism. As Raymond Robins, a wealthy benefactor of progressive causes, put it in a 1947 letter to Senator Claude Pepper: “Jefferson’s POLITICAL FREEDOM and Promise of EQUALITY has met LENIN’S ECONOMIC FREEDOM AND FULFILLED EQUALITY—and the new wine of LIBERTY in old bottles has begun to ferment.” I grew up hearing variants of such progressive bromides in my own home, from good people I loved. Many did.
Of course communism eventually proved, to the communists born into the system if not to Western leftists, a failure and a lie. According to the Marxist philosophy of history, the replacement of private ownership with a socialism defined by planning and the power of a party claiming to represent the proletariat was supposed to bring about the end of social inequalities, a classless and stateless society. Nothing remotely like this happened in real life. The actual working class, in both communist Eastern Europe and the West, was more interested in the kind of economic amelioration workers in the United States and Western Europe achieved without communism.
The late 1970s were a dark time for progressives and socialists. The Vietnamese communists finally won, and within years millions of Vietnamese were trying to escape their country in small boats. The communist victory in Cambodia resulted in the death of a quarter of the population. Alexander Solzhenitsyn’s The Gulag Archipelago dealt a mortal blow to communism’s prestige in France, the last Western country where it was considerable. These events had troubling implications for democratic socialists as well, those who still kept faith in the possibility of a socialism without forced labor camps and with actual free elections. In 1978, Commentary magazine published a symposium on this question, asking whether political freedom was compatible with socialism at all. Various neoconservatives, many of them former communists or socialists, faced off against democratic socialists, who were by then a bit of a dying breed. At the time I was especially struck by a passage by William Barrett, a City College philosopher and veteran of the little-magazine wars over Stalinism in the ’40s and ’50s. He wrote:
How could we ever have believed that you could deprive human beings of the fundamental right to initiate and engage in their own economic activity without putting every other human right in jeopardy? And to pass from questions of rights to those of fact: everything we observe about the behavior of human beings in groups, everything we know about that behavior from history, should tell us that you cannot unite political and economic power in one center without opening the door to tyranny.
If one had to pick a coda for the American liberal intellectual fascination with communism, a good candidate occurred a few years after Barrett wrote. Poland’s commissars had just clamped down on Solidarity, the upstart labor union which they realized had far more legitimacy than the communist government. At a forum at New York’s Town Hall, before a progressive audience which felt itself under a kind of siege after Ronald Reagan’s election, Susan Sontag, renowned author and critic, the dark queen of New York’s literary intellectuals, took the stage. Sontag was still a woman of the Left, but she had in recent years made numerous contacts with dissident writers in the communist world and advocated for them. She opened with fighting words: “Imagine, if you will, someone who read only the Reader’s Digest [roughly the Fox News of its era] between 1950 and 1970, and someone in the same period who read only The Nation or The New Statesman. Which reader would have been better informed about the realities of communism? The answer, I think, should give us pause. Can it be that our enemies were right?”
She concluded—putting into aphorism the point I laboriously sketched earlier in this letter—that “Communism is Fascism—successful Fascism if you will…what we have called Fascism…has largely failed…but Communism is in itself a variant, the most successful variant of Fascism, Fascism with a human face.”
Many in the crowd reacted with hoots and jeers. However long ago they had abandoned any faith in the Soviet bloc as a lodestar of progressive aspiration, they did not take well to being compared unfavorably to Reader’s Digest readers. Sontag was proved wrong, and rather soon after, about communism’s durability, but only about that.
* * *
Now, nearly 40 years later, we have a new New Left, symbolized by the relative success of Bernie Sanders’s campaign, and you write that you find socialism appealing. Of course, many in the Sanders movement would deny vigorously that communism has anything to do with what they seek for America. Sanders himself likes to proclaim various Scandinavian social democracies as his model. This is not obviously true; he retained Trotskyite affiliations well into his 40s, honeymooned in the Soviet Union, and to this day cannot restrain himself from making exculpating and relativizing remarks about socialist dictatorships formerly allied with the Soviet Union. A non-socialist can nonetheless appreciate the Sanders appeal; many of his foreign policy views are far from crazy, and most of all he comes across as comfortingly familiar: the Jewish Marxist from Brooklyn is almost an American archetype, one which in real life has probably done less harm to other humans than any other type of communist. Nonetheless the forces which sustained his campaign are new, with both similarities and differences from previous socialist movements.
First, and most obviously, this new New Left does not prioritize, or even pretend to prioritize, the working class. Honestly, it has been a long time since the socialist Left did. In its early stages the Students for a Democratic Society was animated by middle-class concerns about alienation and meaning in an affluent mass society; as the Sixties Left hardened during that pseudo-revolutionary decade, its leading voices made it clear that its revolutionary coalition would be blacks in the “internal colonies” of “Amerikkka” fighting in alliance with revolutionary youth. The lack of labor participation in what was called “the Movement” was one of its most notable features.
Those were prosperous years for American workers, as recent ones were not. The research of Princeton professors Anne Case and Angus Deaton has demonstrated a dramatic rise in “deaths of despair” among working-class whites—as the factories where they worked were moved abroad, marriages collapsed, rates of alcoholism and opioid addiction soared. This was the first time in America’s history (excluding the plight of Native Americans) that a large segment of the American population went backward rather than forward over a sustained period in life expectancy and living standards. This American tragedy was contemporaneous with a surge in left-wing campus activism, the rise of MeToo feminism, Black Lives Matter, a rhetorical war on cops, an explosion of student and professor denunciations of systemic structural racism, white privilege, and border enforcement, an obsessive attention to what pronouns people use to describe themselves, indeed a new radicalism touching upon virtually every conceivable issue. Except one—if any campus activists sought to protest or even call attention to the literal death of working-class whites, they did it so quietly as to escape notice.
The animating force of today’s Left has to do not with class in the Marxian sense, but with race. If this was apparent well before the protests and riots over the killing of George Floyd in Minneapolis, it is now an unavoidable reality. But it is not obvious what program or policy can be derived from the belief that racism defines America. Certainly no critical race theorist approaches the brilliance or sheer rhetorical panache of Karl Marx, one of the great minds of the 19th century.
Durable and attractive to intellectuals as it was, Marxism as social prophecy was false—the working class did not experience increased pauperization under capitalism, and as a consequence the dictatorial parties which sought to rule in its name had to lie repeatedly about the social reality around them. The current vogue of left-wing theory, which could be reduced to the belief that nearly every imaginable inequality is rooted in white racism, will prove no more true than Marxism: to make the most obvious point about it, in recorded history nations and peoples were spectacularly unequal in their levels of commerce, technology, literacy, and every other measure, not excluding the centuries when the white populations of Western Europe were mired in the Dark Ages. Any socialism derived from critical race theory will be forced to lie, and enforce public lies, about social reality every bit as much as its Marxist forebears did, and will be no more tolerant of democracy or intellectual freedom than communism turned out to be. The word “racist” has already replaced “bourgeois” as the new master term of the Left, and is the one it would use, given the opportunity, to justify expropriation, resettlement, imprisonment, or execution.
Well before the recent spate of American leftist intellectuals applauding violent insurrection in their own democratic country, one could observe, if only impressionistically, that the burgeoning new Left and the Sanders movement possess no magical “American” quality which would make them less authoritarian in power than every socialist Left which has preceded them. Few liberals realize this, but the simple exercise of free speech rights has been made difficult for conservatives on virtually every American campus; where the Left actually does hold power, it wields it by denying free speech to its opponents whenever it can. Varied snapshots of the Sanders campaign are no more encouraging. Videos recorded by James O’Keefe’s Project Veritas, assiduously unreported by the national media, show paid Sanders staffers expounding with remarkable candor their desire for forced labor camps in the United States and for Republicans to be confined in them, while sometimes extolling specific camps in the Soviet gulag. The New York Times described the hosts of the popular socialist podcast Chapo Trap House revving up a crowd of Sanders campaign workers in Iowa with appreciatively received remarks of “let the hate feed you.” When British prime minister Boris Johnson was hospitalized with COVID-19, a prominent D.C. organizer for Democratic Socialists of America tweeted out that she hoped he “drowns in his own mucus.” Such men and women are clearly aspiring Felix Dzerzhinskys far more than they are young Irving Howes.
America was in crisis before COVID-19 hit, and the safest prediction in the world is that the economic devastation caused by the pandemic will lead to political instability. Clearly the working class collapse chronicled by Case and Deaton reveals that the neoliberal globalist model was deeply flawed; whether it can be improved is far from clear. All kinds of social theories are now competing for attention on the Right and Left, and the future is unknown. But a great deal is known about one of these theories, Philip—the concentration of economic and political power that socialism calls for leaves people poorer and less free, at best. At worst, and very often, it leads to blood-soaked totalitarian terror states. So when you say you are “kind of socialist” I have to express my alarm. Social democracy, as practiced by the political heirs of those who definitively rejected the Bolshevik Revolution, tends to be a humane and moderately effective system of government. The socialist dream is more ambitious, more exciting, more geared to satisfy the very human yearning for total and unchallenged power. But I have learned to hate and fear it, and hope others do as well.
Scott McConnell is a founding editor of The American Conservative and the author of Ex-Neocon: Dispatches From the Post-9/11 Ideological Wars.
If anyone is benefiting from the pandemic, it’s TikTok. Amid continued stay-at-home orders and gruelingly slow reopening processes, the popular social media platform that allows folks to create 15-second videos—often lip-synced dances, comedy sketches, and tutorials—has enjoyed a spike in growth like no other.
Since January, there’s been a nearly 50 percent uptick in its U.S. unique visitors, and it added more than 12 million unique visitors in March alone. Previously known as a platform for Gen-Zers, TikTok has seen the quarantine lockdown push the Millennial generation and mainstream Hollywood onto the app. Everyone is anxious to blow off some steam these days, and TikTok has surely helped.
But with the platform’s skyrocketing prominence has come plenty of reason for concern.
The difference between TikTok and other major social media? TikTok isn’t an American company—it’s owned by a Chinese firm, ByteDance. And as TikTok continues to dominate in growth and reach (it has 800 million active users as of late April), it might be smart for Congress and agencies like the Federal Trade Commission—even the Office of the Director of National Intelligence—to take a closer look at its practices.
ByteDance, a company now valued at $100 billion, launched TikTok in 2017, and entered the U.S. market when it bought the Shanghai-founded app Musical.ly, merging it with TikTok. The company has since tried to distance its American product from China by operating a similar but separate app, Douyin, in the Chinese market. It maintains, too, that the American data it collects from TikTok stays on servers in the United States.
Somehow, TikTok thinks it can convince people it’s not really a Chinese company. But its executives have declined to testify before the Senate because they live in China, according to Senator Josh Hawley. Yet the spokesman for the company has insisted that TikTok isn’t a Chinese company, on the basis that ByteDance is incorporated in the Cayman Islands.
It isn’t relevant how many offshore registrations a company has. We should call a spade a spade: TikTok is a Chinese company, and its track record and immense data collection present a danger to Americans and democracy. We know what the Chinese government does with data collection in its own country—it violates human rights in part by operating a mass censorship and surveillance apparatus against its own people.
TikTok has used its reach and content moderation capability to help cultivate the Chinese government’s preferred global narrative. The platform’s content moderation decisions are even made by ByteDance in Beijing, according to former U.S. employees of TikTok. It makes sense, then, that leaked documents have shown TikTok moderators being instructed to remove posts about political topics sensitive to China, including the Hong Kong protests. TikTok has also blocked videos about China’s human rights abuses, including the Xinjiang re-education camps.
Back in March, The Intercept disclosed that TikTok moderators were instructed to suppress posts created by users deemed “too ugly, poor, or disabled” and to censor political speech in livestreams, banning users who “endanger” China’s “national honor” and criticize “state organs such as police.”
TikTok, and consequently the Chinese government, collect private information from all of their 800 million users. We’re talking usage information, IP addresses, users’ mobile carrier information, unique device identifiers, keystroke patterns, and location data. TikTok also has built-in facial recognition software. According to a lawsuit against it in the U.S. (that was recently settled), the software can be used to evaluate the quality of uploaded videos, determine the user’s age, and acquire facial geometry. And while TikTok insists that American data doesn’t make it to China, a class-action lawsuit has alleged a transfer of the personally identifiable information of Americans to servers located in China.
Congress is finally catching up. Legislation recently introduced would prohibit all federal employees from using or downloading the app. For its part, TikTok has been hiring top talent from other tech companies, like former Disney executive Kevin Mayer, its new CEO. This is a clear attempt to give the company an American facade as it faces tough questions in Washington.
Congress and consumers are entitled to answers. Only an enormous amount of transparency, cooperation, and foundational change would transform TikTok into a tech company we can trust with our data. That’s not impossible, but it’s going to take longer than 15 seconds.
Ashkhen Kazaryan is the director of civil liberties at TechFreedom, a Young Voices contributor, and an expert at the Federalist Society’s Emerging Technology Working Group.
The post Time’s Up: Chinese-Owned TikTok Has Some Explaining to Do appeared first on The American Conservative.
Barbara Boland, TAC foreign policy reporter: What happens when middle-aged salesmen, bakers, police officers, and bankers, too old to be conscripted into the army, are sent to commit genocide? The book Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland, by Christopher Browning, takes a look at that very question, spotlighting those conscripted into a police battalion and tasked with enacting Hitler’s Final Solution.
The book is built from first-hand accounts of the “ordinary men” themselves (from their interrogations when they were put on trial in the 1960s). You wouldn’t think these men would have turned out to be mass murderers. But in short order, that’s exactly what happened, as they became directly responsible for the deaths of 38,000 men, women, and children, and another 45,200 who were rounded up and sent to the Nazi extermination camp Treblinka.
This book is a deep dive into the many reasons it was possible for ordinary men to so quickly become killers—and how easy it is to create a social dynamic where following orders and executing others becomes routine. Ordinary Men should be required reading, especially in our chaotic times.
The book begins with a poignant scene. At the site of the first mass murder his unit will commit, in the Polish town of Jozefow, a commander tells his men with tears in his eyes that if they are not up for the task, they may stand aside. Out of hundreds of men, only 12 do. The rest set about shooting 1,500 Jews in the back of the head and neck. Some of their victims are instantly killed, but many fall into a mass grave and are suffocated by the bodies that fall on top of them.
From there, the book proceeds through a number of graphic, stomach-churning scenes. The unit rounds up and deports Jews to Treblinka and participates in chilling mass murders, including the Judenjagd, the “Jew Hunt” in the Polish countryside.
The book gives many personal accounts, showing that some people enjoyed killing, while a handful of others did everything they could to avoid it without reducing the effectiveness of the battalion. But the vast majority of the men did what they were ordered to do, and were amazingly effective at it.
The contrast between the horrors the men committed and the ordinary people they were before the war, as well as the explanations they gave later for their actions, is striking. Although Browning explores many rationales, the reader will walk away from this book unsettled. Like Hannah Arendt, Browning concludes that the reasons men commit mass murder are fairly mundane: they defer to authority, feel a psychological need to conform, are afraid to look “weak” in front of other members of the battalion, are detached from the people they kill, and have been indoctrinated by the Nazis to dehumanize the Jews. Not everyone killed for the same reasons, and none of these reasons alone would likely have been enough, but together they were sufficient to carry out one of Hitler’s most brutal killing sprees. What’s even scarier is how very few chose to step aside from these mass killings when given the chance.
Browning’s closing words make the stakes plain: “I fear that we live in a world in which war and racism are ubiquitous, in which the powers of government mobilization and legitimization are powerful and increasing, in which a sense of personal responsibility is increasingly attenuated by specialization and bureaucratization, and in which the peer group exerts tremendous pressures on behavior and sets moral norms. In such a world, I fear, modern governments that wish to commit mass murder will seldom fail in their efforts for being unable to induce ‘ordinary men’ to become their ‘willing executioners.’”
In his recent monologue about mob rule and cancel culture, Tucker Carlson made what seems like an uncontroversial claim: “No child is born evil.” Beyond setting Calvinists aflutter on Theology Twitter, however, Tucker’s assertion exposes why conservative fusionism was predestined to disintegrate.
Market-lovers, churchgoers, and commie-haters had plenty uniting them back in 1979. The Soviets threatened world annihilation, Jimmy Carter had us mired in stagflation, and the recently legalized abortion industry was cranking out record numbers of corpses. Leaders calling themselves fusionists, combining all three strains of conservatism, promised solutions.
But with success came divisions. In the 1990s, having dispensed with the Soviet Union, neocons turned to Middle East nation-building. Christians found themselves increasingly unwelcome in a liberalizing and globalizing world. Multinational panjandrums shifted manufacturing overseas. The partners Frank Meyer once described as interdependent pillars began to realize they might have irreconcilable differences.
How could this be, when they’d been so good together? The traditionalists kept the individualists mindful of virtue, and the individualists kept the traditionalists alert for infringements on liberty. They needed each other, didn’t they?
Maybe not as much as we thought, because they weren’t so different after all. This was Jonah Goldberg’s claim a couple of years ago: “The libertarian individualists of the 1960s were more virtue-oriented than they appreciated. The traditionalists of the period were more concerned with freedom than they often let on. And many of the arguments about fusionism amounted to the sorts of squabbles we associate with the faculty lounge; they were so vicious because the stakes were so low.”
The real problem, in Goldberg’s view, is that the ignorant rabble displaced the intellectuals. Whereas before we had Brent Bozell intelligently disputing Frank Meyer, now we get Diamond & Silk starring at CPAC, while the leader of the GOP tweets murder rumors about one of his critics.
But whereas Goldberg roots conservatism’s demise in changes to technology and political party structure, in the spirit of originalism we might consider what American conservatism’s founders believed. It’s hard to find two early conservative thinkers more antagonistic toward one another than Harry Jaffa and Willmoore Kendall, yet they agreed political philosophy—and thereby political strategy—is fundamentally driven by beliefs about the nature of man.
“All societies,” wrote Kendall in Basic Symbols of the American Political Tradition, “think of themselves, once they begin to think of themselves at all, as representing a truth, a meaning, about the nature and destiny of man, and thus about that which, in the constitution of being, is above and beyond man.”
Jaffa, who otherwise sought to eviscerate Kendall in his extensive rebuttal of the aforementioned book, concurred on this point: “what men are by nature, that is, prior to civil society, determines what purposes civil society may rightfully serve.”
Writing before either of them, Richard Weaver made the same claim in the opening pages of Ideas Have Consequences, asserting that rejection of the transcendental had placed man at the center of things, thus leading to “the abandonment of the doctrine of original sin.” This in turn opened the door to all manner of progressive nonsense rooted in the fundamental belief that any flaws in man’s behavior are due to correctible social forces—a conviction Stalin embraced wholeheartedly.
We can see how factions in today’s disintegrating conservative coalition differ on Tucker Carlson’s assertion, and how that disagreement drives them apart. For the individualists, be they libertarians, free-market globalization advocates, or just Chreaster Republicans with business degrees, man is essentially pretty good, and will add all kinds of value to the GDP if government will only get out of his way. Cut taxes and regulation, keep oil prices low, and maybe even consider reducing our debt, and the country will do just fine.
To the neocons and their establishment DC heirs, man is as good or bad as his ideas. The Russkies got Marx, we got Madison, and therein lies all the difference. Use diplomacy or a smart bomb or a few thousand expendable young Marines to depose a guy clutching The Communist Manifesto, replace him with a guy who has The Federalist Papers stuffed into his back pocket, and that’s how you illuminate a darkened world one freedom beacon at a time.
To the traditionalists, meanwhile, if man is not born evil then he certainly comes out of the womb bent in that direction. His worst impulses must therefore be restrained, his virtue cultivated. What’s more, because original sin implies the existence of God, the aims society ought to pursue, as Eric Voegelin argued, extend before the cradle and past the grave.
Historical conditions drove these disparate tribes together for a time, but we shouldn’t confuse soldiers sharing a foxhole with Siamese twins. Past policy divisions aside, just look at how their worldviews shape each tribe’s response to the coronavirus pandemic: To the market-lover, the economy’s health trumps the health of the vulnerable elderly. To the neocon, it’s time to go toe-to-toe with China. The traditionalists, meanwhile, are concerned less about the financial survival of Tyson and Smithfield Foods than with keeping local institutions open, because optimizing lifespan or the national GDP is less important than giving people regular work, community engagement, and communion with God.
Rumbling beneath the break-up on the American right is the echo of the question Tucker hinted at: What is man? Friends and allies can certainly disagree about the answer, but sooner or later they’re going to want different foxholes. And eventually they’re going to train their guns on each other.
Tony Woodlief is a writer who lives in North Carolina.
The post Why Conservative Fusionism Was Destined to Disintegrate appeared first on The American Conservative.
The release of John Bolton’s book today has become a Washington cultural event, because he is, by all measures, Washington’s creature.
Those who dislike the Trump administration have been pleased to find in The Room Where It Happened confirmation of much of what they already believed about the Ukraine scandal and the president’s lack of capacity for the job. Some accusations in the book, such as the story about Trump seeking reelection help from China through American farm purchases, are new, and in an alternative universe could have formed the basis of a different or, had Bolton gotten his way, more comprehensive impeachment inquiry.
While Bolton’s book has been found politically useful by the president’s detractors, the work is also important as a first-hand account from the top of the executive branch over a 19-month period, from April 2018 to September 2019. It also, mostly inadvertently, reveals much about official Washington, the incentive structures that politicians face, and the kind of person that is likely to succeed in that system. Bolton may be a biased self-promoter, but he is nonetheless a credible source, as his stories mostly involve conversations with other people who are free to eventually tell their own side. Moreover, the John Bolton of The Room Where It Happened is no different from the man we know from his three-decade career as a government official and public personality. No surprises here.
There are three ways to understand John Bolton. In increasing order of importance, they are intellectually, psychologically, and politically—that is, as someone who is both a product of and antagonist to the foreign-policy establishment—in many ways typical, and in others a detested outlier.
On the first of these, there simply isn’t much there. Bolton takes the most hawkish position on every issue. He wants war with North Korea and Iran, and if he can’t have that, he’ll settle for destroying their economies and sabotaging any attempts by Trump to reach a deal with either country. He takes the maximalist positions on great powers like China and Russia, and third world states that pose no plausible threat like Cuba and Venezuela. At one point, he brags about State reversing “Obama’s absurd conclusion that Cuban baseball was somehow independent of its government, thus in turn allowing Treasury to revoke the license allowing Major League Baseball to traffic in Cuban players.” How this helps Americans or Cubans is left unexplained.
Bolton’s hawkishness is combined with an equally striking lack of originality. It is possible to be an unorthodox or partisan hawk, as we see in populists who want to get out of the Middle East but ramp up pressure on China, or Democrats who have a particular obsession with Russia. Bolton takes the most belligerent position on every issue without regard for partisanship or popularity, a level of consistency that would almost be honorable if it wasn’t so frightening. No alliance or commitment is ever questioned, and neither, for that matter, is any rivalry.
Anyone who picks up Bolton’s over 500-page memoir hoping to find serious reflection on the philosophical basis of American foreign policy will be disappointed. The chapters are broken up by topic area, most beginning with a short background explainer on Bolton’s views of the issue. In the chapter on Venezuela, we are told that overthrowing the government of that country is important because of “its Cuba connection and the openings it afforded Russia, China, and Iran.” The continuing occupation of Afghanistan is necessary for preventing terrorists from establishing a base, and, in an argument I had not heard anywhere before, for “remaining vigilant against the nuclear-weapons programs in Iran on the west and Pakistan on the east.” Iran needs to be deterred, though from what we are never told.
Bolton lacks any intellectual tradition or popular support base that he can call his own. Domestic political concerns are almost completely missing from his book, although we learn that he follows “Adam Smith on economics, Edmund Burke on society,” is happy with Trump’s judicial appointments, and favors legal, but not illegal, immigration. Other than these GOP clichés, there is virtually no commentary or concern about the state of American society or its trajectory. Unlike those who worry about how global empire affects the United States at home, to Bolton the country is simply a vehicle for smiting his enemies abroad. While Bolton’s views have been called “nationalist” because he doesn’t care about multilateralism, nation-building, or international law, I have never seen a nationalist who gives so little thought to his nation.
The more time one spends reading Bolton, the more one comes to the conclusion that the guy just likes to fight. In addition to seeking out and escalating foreign policy conflicts, he seems to relish going to war with the media and the rest of the Washington bureaucracy. His book begins with a quote from the Duke of Wellington rallying his troops at Waterloo: “Hard pounding, this, gentlemen. Let’s see who will pound the longest.” The back cover quotes the epilogue on his fight with the Trump administration, responding “game on” to attempts to stop publication. He takes a mischievous pride in recounting attacks from the media or foreign governments, such as when he was honored to hear that North Korea worried about his influence over the President. Bolton is too busy enjoying the fight, and as will be seen below, profiting from it, to reflect too carefully on what it’s all for.
Bolton could be ignored if he were simply an odd figure without much power. Yet the man has been at the pinnacle of the GOP establishment for thirty years, serving in appointed roles under every Republican president since Reagan. The story of how he got his job in the Trump administration is telling. According to Bolton’s account, he was courted throughout the transition process and the early days of the administration by Steve Bannon and Jared Kushner, ironic considering the reputation of the former as a populist opposed to forever wars and the latter as a more liberal figure within the White House. Happy with his life outside government, Bolton would accept a position no lower than Secretary of State or National Security Advisor. Explaining his reluctance to enter government in a lower capacity, Bolton provides a list of his commitments at the time, including “Senior Fellow at the American Enterprise Institute; Fox News contributor; a regular on the speaking circuit; of counsel at a major law firm; member of corporate boards; senior advisor to a global private-equity firm.”
Clearly, advocacy for policies that can destroy the lives of millions abroad, and a complete lack of experience in business, have proved no hindrance to Bolton’s success in corporate America.
Bolton recounts how his two top aides, Charles Kupperman and Mira Ricardel, had extensive experience working for Boeing. Patrick Shanahan similarly became acting Secretary of Defense after spending thirty years at that company, until he was replaced by Mark Esper, a Raytheon lobbyist. Why working for a company that manufactures aircraft and weapons prepares one for a job in foreign policy, the establishment has never felt the need to explain, any more than it needs to explain continuing Cold War-era military commitments three decades after the collapse of the Soviet Union.
Ricardel resigned after a dispute over preparations for the First Lady’s trip to Africa, an example of how too often in the Trump administration, nepotism and self-interest have been the only checks on bad policy or even greater corruption (“Melania’s people are on the warpath,” Trump is quoted as saying). Another is when Trump, according to Bolton, was less than vigorous in pursuing destructive Iranian sanctions due to personal relationships with the leaders of China and Turkey. At the 2019 G7 summit, when Pompeo and Bolton tried to get Benjamin Netanyahu to reach out to Trump to talk him out of meeting with the Iranian foreign minister, Kushner prevented the call from going through on the grounds that a foreign government shouldn’t be telling the President of the United States whom to meet with.
The most important question raised by the career of John Bolton is how someone with his views has been able to achieve so much power. While Bolton gets much worse press and always goes a step too far even for most of the foreign policy establishment, in other ways he is all too typical. Take James Mattis, a foil for Bolton throughout much of the first half of the book. Although more popular in the media, the “warrior monk” slow-walked and obstructed attempts by the president to pull out of the Middle East, and after a career supporting many of the same wars and commitments as Bolton, now makes big bucks in the private sector, profiting off of his time in government.
In the coverage of Bolton, this is what should not be lost. The former National Security Advisor is the product of a system with its own internal logic. Largely discredited and intellectually hollow, and without broad popular support, it persists in its practices and beliefs because it has been extremely profitable for those involved. The most extreme hawks are simply symptoms of larger problems, with the flamboyant Bolton being much more like mainstream members of the foreign policy establishment than either side would like to admit.
Richard Hanania is a research fellow at the Saltzman Institute of War and Peace Studies at Columbia University.
The post Bolton May Be a Beast, But He’s Washington’s Creature appeared first on The American Conservative.
We are told that ours is a government of laws and not of men. But is it? Those rote words of assurance are called into question by the sad saga of President Obama’s executive initiatives for Deferred Action for Childhood Arrivals, or DACA, and by President Trump’s ill-fated effort to reverse those actions through his own executive authority. The outcome should be alarming to anyone who cares about constitutional government as pieced together by the American Founders.
The alarm is particularly acute in relation to one man, Chief Justice John Roberts, who seems bent on ensuring that the Supreme Court, as currently constituted, never tilts toward conservatism with any consistency. He was nominated for his current position by President George W. Bush because of his conservative record, but it isn’t clear—and has never been clear, when we look back on it—precisely what he stands for, aside from his own extravagant ambition.
Joan Biskupic, in her biography, The Chief: The Life and Turbulent Times of Chief Justice John Roberts, recounts that Roberts, as he was angling for a seat on the U.S. Court of Appeals for the D.C. Circuit, wished to remain aloof from the conservative Federalist Society, even as he accepted the Federalists’ endorsement for the position. The endorsement was helpful in getting him considered for the court nomination by the second President Bush in 2005, but it could have proved problematic at confirmation time. Roberts’s political conundrum was explored by The Washington Post’s Charles Lane at the time of his nomination to the appeals court.
“Roberts burnished his legal image carefully,” wrote Lane. “In conservative circles, membership in or association with the [Federalist] society has become a badge of ideological and political reliability….But the society’s alignment with conservative GOP politics and public policy makes Roberts’s relationship with the organization a potentially sensitive point for his confirmation because many Democrats regard the organization with suspicion.”
So he sought to fuzz up the matter, even to the point of being “irked” when a Post business reporter identified him as a Federalist member. He asked for a correction, though he had attended society meetings regularly and had cultivated an ideological alignment with the organization for years. Thus do we see a man seeking to obscure his true convictions, whatever they may have been, in an elaborate finesse. Nothing particularly unusual about this in the annals of Washington politics. What’s alarming with regard to Roberts, however, is that he’s still doing it now as Chief Justice of the United States—and doing it in ways that reveal an airy disregard for some of the fundamentals of the American system. In the DACA case, a clear presidential violation of the U.S. Constitution doesn’t seem to bother him in the least.
At issue in the DACA case, DHS v. Regents of the University of California, was whether Trump could employ his executive authority to reverse previous executive actions by Obama to extend a kind of immigration reprieve to so-called Dreamers who were brought to the United States illegally as children, through no fault of their own. There is widespread support throughout the country, including within the Trump administration, for extending some kind of legal status for the Dreamers. But the question that emanated from Obama’s action was whether the president could constitutionally issue such an order on his own, thus bypassing Congress. The answer clearly is no.
Obama himself acknowledged that constitutional reality on numerous occasions before he decided to take the action anyway. Under pressure from his liberal supporters to wave his executive wand over the Dreamers, he repeatedly refused on the basis of his not having the authority to do so. “I am not king. I can’t do these things just by myself,” he said in 2010. In March 2011, he added that with “respect to the notion that I can just suspend deportations through executive order, that’s just not the case.” Two months later he added that he couldn’t “just bypass Congress and change the [immigration] law myself….That’s not how democracy works.”
Even after Obama reversed himself on the constitutionality question in 2012, no one ever disputed in any serious way the reality that federal immigration laws, enacted by Congress, don’t confer upon the president any authority to suspend execution of those laws. Indeed, Congress had rejected previous efforts to pass new laws enabling such an approach to the DACA issue.
Then the judiciary gave further clarity to the matter when Obama sought to follow up his 2012 DACA actions with a 2014 executive initiative designed to give an administrative amnesty, along with some federal benefits, to certain parents of Dreamers—up to 4.3 million illegal immigrants. In the same series of actions, Obama also initiated a substantial expansion of DACA.
The courts struck down both. After Texas and 25 other states sued the administration over this second overreach, the Fifth Circuit Court of Appeals upheld a nationwide injunction against it. The president’s action, said the court, “does not transform presence [of illegals] deemed unlawful by Congress into lawful presence and confer eligibility for otherwise unavailable benefits based on that change.”
The Supreme Court subsequently affirmed the Fifth Circuit ruling and the injunction—as well as the well-established principle that Congress has full constitutional authority over immigration law. The president must bow to that. Obama was right the first time.
Based on those rulings, and an opinion by then-Attorney General Jeff Sessions that the rulings demonstrated that DACA also was illegal, President Trump in June 2017 exercised his executive authority to terminate Obama’s DACA policy. In other words, he used his executive authority to reverse an unconstitutional executive action by his predecessor.
He was stymied by the Court. And the man who threw the wrench was Roberts, who joined the four liberal justices and wrote the majority opinion. Studiously avoiding the constitutional issues involved (a Roberts hallmark, it increasingly seems), he argued that the problem was that the Trump administration hadn’t properly followed the niceties of federal laws requiring certain rule-making procedures, with notice and comment-period requirements. Never mind that the Obama administration hadn’t followed any such procedures either in promulgating its previous unconstitutional rule-making.
This is astounding. Justice Clarence Thomas, in a spirited dissent joined in part by Justices Samuel Alito and Neil Gorsuch, called the majority decision “mystifying” in that DACA was “unlawful from the start, and that alone is sufficient to justify its termination.” He also took issue with Roberts’s quibbling assault on a Justice Department memo that sought to justify Trump’s actions based on the DACA illegality. Thomas faulted the Roberts ruling for requiring the Trump administration to “overlook DACA’s obvious deficiencies and provide additional policy reasons and justifications before restoring the rule of law.” This, he added, “will hamstring all future agency attempts to undo actions that exceed statutory authority.”
As The Wall Street Journal noted, this is an “invitation for executive mischief, especially by Presidents at the end of their terms. They’ll issue orders that will invite years of legal challenge if the next president reverses them.”
We know why the four liberal justices jumped on Roberts’s reasoning as their vehicle for retaining DACA even in the face of its clear unconstitutionality. Based on years of judicial activism, it seems clear that they don’t care about such things; it’s the outcome that animates them. But what was Roberts’s motivation? Difficult to say, except that he seems to delight in making mischief through jesuitical tangents seemingly designed to avoid getting to the heart of the constitutional issues brought before his court.
There are enough instances of this kind of judicial review to call into question what Roberts actually believes in. His first dramatic tilt came in his famous 2012 actions in the case involving Obama’s Affordable Care Act, in which Roberts accepted the unconstitutionality of the act’s “individual mandate” under the Constitution’s Commerce Clause but justified it, through contortions of logic, as a tax.
As Biskupic writes in her biography, “Some conservatives believed he was not voting his true sentiment, but trying to shore up his reputation and institutional legacy.”
Then there was Roberts’s bizarre majority opinion in last year’s case involving the administration’s desire to ask a citizenship question in the census. While acknowledging that the executive branch has broad discretion on what questions to ask, Roberts declared that Commerce Secretary Wilbur Ross’s rationale for wanting the question “appears to be contrived.” Because of timing pressures, the ruling effectively thwarted the administration’s interest without actually addressing the merits of the case; and it did so by peering into Ross’s head and purporting to discern what he was thinking. When laws are assessed based on that kind of rationale, the concept of “a nation of laws” is in serious danger.
In the current court session, Roberts also orchestrated a 6-3 decision stretching the language of the 1964 Civil Rights Act to include under the law’s protections sexual orientation and gender identity, notwithstanding that Congress had specifically rejected such actions. The Court, with Roberts and Justice Neil Gorsuch joining the liberals, essentially amended the statute from the bench, something Roberts had repeatedly criticized during his Senate confirmation proceedings.
But it is the DACA case that truly reveals Roberts’s willingness to tinker with the law and trifle with the Constitution to serve his institutional ends, whatever they may be. His actions left in place an unconstitutional executive-branch action by throwing up artificial roadblocks against a constitutional effort to undo that unconstitutional action.
Back when Roberts engineered the Affordable Care Act decision, The Wall Street Journal perceived what was emerging on the Court. “One thing is clear,” said the paper. “This was a one-man show, and that man is John Roberts.” Today that perception looks more and more like the central reality of the Supreme Court’s internal dynamics. That isn’t good news for conservatives.
Robert W. Merry, former Wall Street Journal Washington correspondent and Congressional Quarterly CEO, is the author most recently of President McKinley: Architect of the American Century (Simon & Schuster).
The post If John Roberts Isn’t a Conservative, What is He, Exactly? appeared first on The American Conservative.
There are four branches of government in the United States. In addition to the three you learned about in civics class, there is a vast civil service that is theoretically answerable to the executive but in practice is certainly not. During the Trump era, the independence of this fourth branch of government has been buttressed by the media’s willingness to portray any attempt to subject it to control by elected officials as a “purge,” and in an Orwellian inversion of reality, antithetical to democracy.
The firing of Geoffrey Berman, the U.S. Attorney for the Southern District of New York, is just the latest example; resistance to it rests on the theory that U.S. Attorneys should not be subject to the Attorney General. The conspiracy theory concocted to obscure this false sense of where authority lies is that he was fired to cover up Trump’s corruption.
No evidence exists for this theory, and it’s just as likely he was fired for failing to bring charges against Jeffrey Epstein’s co-conspirators. Berman had also recently clashed with Washington over his unwillingness to put his name to a letter criticizing the New York mayor’s double standard on quarantine enforcement: encouraging protests on the one hand while using the NYPD to break up a Jewish funeral on the other. If the Department of Justice, as part of a duly elected executive government, wishes to make an issue of that, it should be able to have prosecutors who are willing to do so.
This isn’t a difficult principle, and in matters of state the executive has even more latitude. Yet late last week, another agency made headlines as the victim of yet another Trump purge: the U.S. Agency for Global Media, home to the various quasi-journalism operations that form the United States’ public diplomacy apparatus. The heads of Radio Free Europe/Radio Liberty, Middle East Broadcasting Network, Radio Free Asia, and the Open Technology Fund were all sacked by incoming USAGM executive Michael Pack, who was recently confirmed by the Senate. No career employees were fired; all four had special contracts.
NPR’s David Folkenflik set the tone for coverage of the sackings, with a story alleging in its headline that they were “raising fears of meddling.” Meddling to what end, however, Folkenflik never made clear. Pressed by CBS about what this could portend for the outlets’ editorial direction, he told the anchor that it could have ramifications for Radio Free Asia’s “bolder reporting” about Uyghurs in China, citing John Bolton’s revelation that Trump had downplayed China’s persecution in a meeting with Xi Jinping. He didn’t mention that Trump signed the Uyghur Human Rights Policy Act the same week the sackings took place.
Concerns about “editorial independence” are at the heart of the controversy, according to CBS. Editorial independence is one thing that the recently departed head of VOA, Amanda Bennett, had emphasized as part of the organization’s mission. In a statement to VOA staff before she departed, she said, “Michael Pack swore before Congress to respect and honor the firewall that guarantees VOA’s independence, which in turn plays the single most important role in the stunning trust our audiences around the world have in us.” Woe betide Pack if after being confirmed by the Senate he dares to exercise control over an institution chartered and funded by Congress.
Even if you believe that a government-funded broadcaster could, through the magic of some bureaucratic or legal fiction, exercise meaningful editorial independence, the lack thereof is a fact: VOA’s editorials are cleared by the State Department. If this shake-up portends an end to that fiction, that is a good thing.
The theory of Bennett and various Pack critics goes something like this: Other countries will recognize what a long leash we give our state-funded broadcasters, and therefore the editorial independence of VOA and its siblings itself becomes part of the soft-power pitch being made. It’s an interesting argument, but also somewhat self-serving. The VOA charter says, “VOA will present the policies of the United States clearly and effectively.”
It’s hard to deny that, at least at Radio Free Europe, the coverage sounds more like Trump’s Atlanticist critics: prone to seeing Russian meddling everywhere, skeptical of populist politicians like Salvini, Orban, and Duda, and fond of NATO. You can think their view is the right one, but it’s not the view of the president the American people put in the White House. By that standard, the VOA is out of step with its charter.
By any objective measure, a shake-up at USAGM is fully justified: for years it has lingered at the bottom of the lists of federal agencies in terms of employee satisfaction, and two years ago VOA had to fire half the staff of one of its Africa broadcasters for accepting bribes. Nor can the shake-up be attributed to partisanship: the former heads of the European and Middle East broadcasters, Jamie Fly and Alberto Fernandez, worked in the Bush administration. Perhaps this is why CNBC, for example, uses the term “Trump loyalists”; it’s true, at least, that Fly, a leading neoconservative and Never Trumper, is a Trump disloyalist. The big question is why they hired him in the first place.
A pattern common to many Trump-era controversies also holds true here: the hawks are mad about it. The leading interventionist of the senatorial opposition, Bob Menendez, said the firings were part of Trump’s “efforts to transform U.S. institutions rooted in the principles of democracy into tools for the President’s own personal agenda.” Texas Rep. Michael McCaul, an Iran hawk and ranking Republican on the House Foreign Affairs Committee, also expressed his concern. What they object to is American foreign policy ever being subject to democratic control: they claim to be defending democratic institutions while thwarting the demos’s revealed preference for a more restrained posture abroad.
What personal benefit Trump could possibly derive from getting his message out on an African radio station or a TV studio in Dubai is unclear; perhaps Senator Menendez could clarify that for us. As with Russiagate, the backlash to Pack’s shake-up is really about policy differences, not concerns about corruption. It remains to be seen what the administration intends to do with USAGM, but given the differences of opinion between the administration and many career officials involved in both public and official diplomacy, it’s hard to blame them for wanting both a shorter leash and a fresh start.
The post Michael Pack Is Right To Rein In State-Funded Broadcasters appeared first on The American Conservative.
The left’s war on America’s past crossed several new frontiers last week.
Portland’s statue of George Washington, the Father of his Country and the first president of the United States, the greatest man of his age, was toppled and desecrated.
While the statue stood, an American flag was draped over its head and set ablaze. After it was pulled down, a new fire was set on another American flag spread across the statue, and also burned. The vacated pedestal was painted with the words, “You’re on Native Land.”
In Portland also, a statue of Thomas Jefferson that stood at the entrance of a high school named for the author of the Declaration of Independence was torn down. In New York, city council members demanded that the Jefferson statue in city hall be removed.
Anticipating what was coming, the New York Museum of Natural History got the permission of city hall to have the giant statue of Theodore Roosevelt astride a horse, flanked by an African and a Native American, removed from the front of the museum.
What was wrong with the 80-year-old statue?
Said museum president Ellen Futter, the problem is its “hierarchical composition.” Only Roosevelt was mounted.
With Washington, Jefferson and Roosevelt all under attack, three of the four presidents on Mount Rushmore are now repudiated by the left.
Our Taliban have moved on, past Columbus and the Confederate generals, to dislodge and dishonor the Founding Fathers and their patriot sons.
In Philadelphia, the Tomb of the Unknown Soldier of the American Revolution, with its statue of Washington, was defaced. The tomb is the final resting place for thousands of soldiers, known but to God, who died in the struggle for American independence.
“Committed Genocide” is the charge scrawled on the memorial.
Local authorities or police did not stop the vandals. One wonders what will happen should the haters of Washington and Jefferson decide to torch their ancestral homes at Mount Vernon and Monticello.
Still another line was crossed last week in the war against the past.
A statue of Ulysses S. Grant in San Francisco’s Golden Gate Park was toppled. Police watched as hundreds gathered to take down the general and 18th president, who accepted the surrender of General Robert E. Lee’s Army of Northern Virginia.
Also pulled down in Golden Gate Park was a statue of Francis Scott Key, who wrote our national anthem, “The Star-Spangled Banner,” after he watched all through the night in 1814 as British warships bombarded Fort McHenry.
A third statue torn down in Golden Gate Park was that of Father Junipero Serra, the Franciscan priest who founded nine of the 21 Spanish missions in California that run from San Diego to San Francisco.
Serra lived in the 18th century, long before the U.S. acquired California and decades before Mexico won its independence. Pope Francis canonized him in 2015.
At the end of last week, the last statue of a Confederate soldier in the nation’s capital, that of Gen. Albert Pike, who spent his years after the war doing good works, was pulled down, while Mayor Muriel Bowser’s D.C. cops watched from police cruisers.
We erect statues to remember, revere and honor those whom we memorialize. And what is the motivation of the people who tear them down and desecrate them?
In a word, it is hate. A goodly slice of America’s young hates this country’s history and the men who made it. It hates the discoverers and explorers like Columbus, the conquistadores and colonists. It hates the Founding Fathers and the first 15 presidents, all of whom either had slaves or coexisted with the injustice of slavery. But hating history and denying history and tearing down the statues of the men who made that history does not change history.
So, where are we going?
Today, as was true in the 1960s, the American establishment is on the run. It recoils from mob action but cannot bring itself to condemn those tearing down the statues, for it basically agrees with them and seeks to marshal their energy to help it get back into power in November.
But this cannot go on. The political and propaganda war on the cops, the vandalism of the statues and memorials, the disgracing and dishonoring of American heroes cannot go on indefinitely.
At some point in the near future, the establishment, and its questionable political instrument, Joe Biden, will have to have a Sister Souljah moment, and stand up and say, “This should stop.”
For, whatever happens in this election, the American people will not stay united around a party and a movement built on the proposition that America has been, from before its birth, a racist criminal enterprise.
You cannot lead a people whose history and heroes you hate.
A house divided against itself cannot stand. And a society whose history is hated by millions of its members will not survive.
Patrick J. Buchanan is the author of Nixon’s White House Wars: The Battles That Made and Broke a President and Divided America Forever.
Ten years ago, “restraint” was considered code for “isolationism” and its purveyors were treated with nominal attention and barely disguised condescension. Today, agitated national security elites who can no longer ignore the restrainers—and the positive attention they’re getting—are trying to cut them down to size.
We saw this recently when Peter Feaver, Hal Brands, and William Inboden, who all made their mark promoting George W. Bush’s war policies after 9/11, published “In Defense of the Blob” for Foreign Affairs in April. My own pushback received an attempted drubbing in The Washington Post by national security professor Daniel Drezner (he of Twitter fame): “For one thing, her essay repeatedly contradicts itself. The Blob is an exclusive cabal, and yet Vlahos also says it’s on the wane.”
One can be both, Professor. As they say, Rome didn’t fall in a day. What we are witnessing are individuals and institutions sensing existential vulnerabilities. The restrainers have found a nerve and the Blob is feeling the pinch. Now it’s starting to throw its tremendous girth around.
The latest example is from Michael J. Mazarr, senior political scientist at the Rand Corporation, which since 1948 has essentially provided the brainpower behind the Military Industrial Congressional Complex. Mazarr published this voluminous warrant against restrainers in the most recent issue of The Washington Quarterly, which is run by the Elliott School of International Affairs at George Washington University. Its editorial board reeks of the conventional internationalist thinking that has prevailed over the last 70 years.
In “Rethinking Restraint: Why It Fails in Practice,” Mazarr insists that the critics have it all wrong: “American primacy” is way overstated and the U.S. has been more moderate in military interventions than it’s given credit for. Moreover, he says, the restrainers divide current “US strategy into two broad caricatures—primacy or liberal hegemony at one extreme, and restraint at the other. …Such an approach overlooks a huge, untidy middle ground where the views of most US national security officials reside and where most US policies operate.”
There is much to unpack in his nearly 10,000-word brief, and much to counter it. For example, Monica Duffy Toft has done incredible research into the history of U.S. interventions over the last 70 years, in part studying the number of times we’ve used force in response to incidents of foreign aggression. While the United States engaged in 46 military interventions from 1948 to 1991, from 1992 to 2017, that number increased fourfold to 188 (chart below). Kind of calls Mazarr’s “frequent impulse to moderation” theory into question.
But I would like to zero in on the most infuriating charge, which mimics Drezner, Brands, Feaver, et al.: that the idea of a powerful, largely homogeneous foreign policy establishment dominating top levels of government, think tanks, media, and academia is really all in our heads. It’s not real.
This weak attempt to gaslight the rest of us is an insult to George Cukor’s 1944 Hollywood classic. It’s unworthy. In the section “There is No Sinister National Security Elite,” Mazarr turns to Stephen Walt (who wrote an entire book on the self-destructive Blob) and Andrew Bacevich (who has written that the ideology of American exceptionalism and primacy “serves the interests of those who created the national security state and those who still benefit from its continued existence”). This elite, both men charge, enjoys “status, influence, and considerable wealth” in return for supporting the consensus.
To this Mazarr contends, “Apart from collections of anecdotes, those convinced of the existence of such a homogenous elite offer no objective evidence—such as surveys, interviews, or comprehensive literature reviews—to back up these sweeping claims.” Then failing to offer his own evidence, he argues:
on specific policy questions—whether to go to war or conduct a humanitarian intervention, or what policy to adopt toward China or Cuba or Russia or Iran—debates in Washington are deep, intense, and sometimes bitter. To take just a single example from recent history, the Obama administration’s decision to endorse a surge in Afghanistan came only after extended deliberation and soul-searching, and it included a major, and highly controversial, element of restraint—a very public deadline to begin a graduated withdrawal.
Let’s go back to 2009, because some of us actually remember these “deep, intense, and sometimes bitter” times.
First, the only “bitter debates” were between the military, which wanted to “surge” 40,000 troops into Afghanistan in the first year of Obama’s presidency, and the president, who had promised to bring the war to an end. After months, Obama “compromised” when, in December 2009, he announced a plan for 30,000 new troops (which would bring the then-current number to 98,000) and a timetable for withdrawal 18 months hence, which pleased no one, not even the outlier restrainers, despite what Mazarr suggests.
In fact, restrainers knew the timetable was bunk, and it was: in 2011, there were still 100,000 troops on the ground, and troop levels didn’t get back down to pre-2009 numbers until December 2013.
But let it be clear: the only contention in December 2009 was over the timetable (the hawks at the Heritage Foundation and AEI wanted an open-ended commitment) and whether the president should have been more deferential to his generals (General Stanley McChrystal had just been installed as commander in Afghanistan and the mainstream media was fawning). Otherwise, every major think tank in town and national security pundit blasted out press releases and op-eds supporting the president’s strategy with varying degrees of enthusiasm. None, aside from the usual TAC suspects, raised a serious note against it. Examples:
John “Eating Soup with a Knife” Nagl, Center for a New American Security: “This strategy will protect the Afghan population with international forces now and build Afghan security forces that in time will allow an American drawdown–leaving behind a more capable Afghan government and a more secure region which no longer threatens the United States and our allies.” Each of the CNAS fellows on this press release offers a variation on the same theme, with some more energetic than others. Ditto for this one from The Council on Foreign Relations.
Vanda Felbab-Brown, Brookings Institution: “there would have been no chance to turn the security situation around, take the momentum away from the Taliban, and hence, enable economic development and improvements in governance and rule of law, without the surge.”
David Ignatius, The Washington Post: “Obama has made what I think is the right decision: The only viable ‘exit strategy’ from Afghanistan is one that starts with a bang—by adding 30,000 more U.S. troops to secure the major population centers, so that control can be transferred to the Afghan army and police.”
Ahead of Obama’s decision (during the “bitter debate”), the Brookings Institution’s Michael O’Hanlon, a fixture on The Washington Post op-ed pages and cable news shows, was pushing for the maximum: “President Barack Obama should approve the full buildup his commanders are requesting, even as he also steels the nation for a difficult and uncertain mission ahead.”
Meanwhile, all of the so-called progressive national security groups, including the Center for American Progress, Third Way, and the National Security Network, heralded Obama’s plan as “a smarter, stronger strategy that stated clear objectives and is based on American security interests, namely preventing terrorist attacks.”
“Counterintuitively,” they said in a joint statement, “sending more troops will allow us to get out more quickly.”
Anthony Cordesman at the Center for Strategic and International Studies (CSIS) has always been a thoughtful skeptic, but he never fails to offer a hedge on whatever new plan comes down the pike. Here he is on Obama’s surge, exemplifying how difficult it was/is for the establishment to just call a failure a failure:
The strategy President Obama has set forth in broad terms can still win if the Afghan government and Afghan forces become more effective, if NATO/ISAF national contingents provide more unity of effort, if aid donors focus on the fact that development cannot succeed unless the Afghan people see real progress where they live in the near future, and if the United States shows strategic patience and finally provides the resources necessary to win.
That’s a lot of “ifs,” but they provide amazing cover for those who don’t want to admit the cause is lost—or can’t—because their work depends on giving the military and State Department something to do. This is what happens when your think tank relies on government contracts and grants and arms industry money. According to The New York Times, major defense contractors Lockheed Martin and Boeing gave some $77 million to a dozen think tanks between 2010 and 2016.
They aren’t getting the money to advocate that troops, contractors, NGOs, and diplomats come home and stay put. Money and agenda underwrite who heads the think tanks, who speaks for the national security programs, and who populates conferences, book launches, speeches, and television appearances. Mazarr doesn’t think this can be quantified, but it’s rather easy. Google “2009 Afghanistan conference/panel/speakers” and plenty of events come up. Pick any year; the results are predictable.
Here’s a Brookings panel in August 2009, assessing the Afghanistan election, including Anthony Cordesman, Kimberly Kagan, and Michael O’Hanlon. Not a lot of “diversity” there. Here’s a taste of the 2009 annual CNAS conference, which featured the usual suspects, including David Petraeus, Ambassador Nicholas Burns, and 1,400 people in attendance. Aside from Andrew “Skunk at the Garden Party” Bacevich, there was little to distinguish one world view from another among the panelists. (CNAS was originally founded in support of Hillary Clinton’s 2008 campaign; she spoke at the inaugural conference in 2007. Co-founder and former CNAS president Michèle Flournoy later landed in the E-Ring of the Pentagon.) Meanwhile, here’s a Hudson Institute tribute to David Petraeus, attended by Scooter Libby, and a December 2009 Atlantic Council panel with—you guessed it—Kimberly Kagan, plus two military representatives thrown in to pump up McChrystal, NATO, and staying the course.
On top of it all, these events and their people never failed to get the attention of the major corporate media, which just loved the idea of warrior-monk generals “liberating” Afghanistan through a “government in a box” counterinsurgency (COIN) strategy.
Honestly, thank goodness for Cato, which before the new Quincy Institute, was the only think tank to feature COIN critics like Colonel Gian Gentile, and not just as foils. The Center for the National Interest also harbored skeptics of the president’s strategy. But they were outnumbered too.
This is what I want to convey. Mazarr boasts there is a galaxy of opinion today over U.S. policy in Iran, China, Russia, NATO. I would argue there is a narrow spectrum of technical and ideological disagreement in all these cases, but nowhere was it more important to have strong, competing voices than during the Iraq and Afghanistan wars, and there was none of that in any realistic sense of the word.
I challenge him and the others to take down the straw men and own the ecosystem to which they owe their success in Washington (Mazarr just published a piece called “Toward a New Theory of Power Projection,” for goodness’ sake). Stop trying to pretend that what is there isn’t. Realists and restrainers are happy to debate the merits of our different approaches, but gaslighting is for nefarious lovers and we’re no Ingrid Bergman.
The post Gaslighting Nobody, The Blob Struggles for Primacy appeared first on The American Conservative.
Depicting revolutionary France, Dickens wrote, “Six tumbrils roll along the streets. Change these back again to what they were, thou powerful enchanter, Time, and they shall be seen to be the carriages of absolute monarchs, the equipages of feudal nobles, the toilettes of flaring Jezebels, the churches that are not my father’s house but dens of thieves, the huts of millions of starving peasants!”
Today America’s tumbrils are clattering about, carrying toppled statues, ruined careers, unwoke brands. Over their sides peer those deemed racist by left-wing identitarians and sentenced to cancellation, even as the evidentiary standard for that crime falls through the floor. Rioters over the weekend destroyed a statue of Ulysses S. Grant, the general who finished off the Confederacy. Falsehoods and innuendoes outpace the truth: in Oakland, a panic arose over what were supposedly nooses in a public park; it turned out they were just exercise equipment that had been there for months. But no matter. America’s Jacobins are in no mood to reason. As in Dickens’ France, genuine social problems have mushroomed into a national orgy of self-harm.
But who are these cultural revolutionaries? The conventional wisdom goes that this is the inner cities erupting, economically disadvantaged victims of racism enraged over the murder of George Floyd. The reality is something more…bourgeois. As Kevin Williamson observed last week, “These are the idiot children of the American ruling class, toy radicals and Champagne Bolsheviks playing Jacobin for a while until they go back to graduate school.” Most of the culling is taking place not in the streets, but in the faculty lounge, the corporate boardroom, the upstart real estate firm with a socially conscious Twitter footprint and a penchant for Mean Girls GIFs. The most high-profile casualty so far isn’t even a person but a maple syrup, Aunt Jemima, whose threat to world peace seems rather manageable.
Such superficial victories are a clear sign of the bourgeoisie’s soft hand. Meaningful police legislation, the kind that might prevent future George Floyds and that serious reformers are working on right now, is a difficult push. Whereas reducing policymaking to maximalist slogans is easy; spray-painting a statue is even easier; whining about a visage on a syrup bottle is easier still. And ease is the currency of these weekend warriors, these erstwhile stoppers of Kony. Who is the face of their revolution? It’s tempting to name Melissa Click, the white (check) communications professor (check) who at a 2015 protest over racial issues (check) exhorted others (check) to beat up a student journalist. (Click was later fired for her misconduct by the University of Missouri, only to be scooped up by Gonzaga. The culling, it seems, only ever goes in one direction.)
But there’s another figure whom I think is even more representative than Click—not from the French Revolution or our present Tantrum of the Tenured, but from the 1960s.
In early 1970, a townhouse in Greenwich Village exploded, leaving a charred hole in the facade on West 11th Street. Police later concluded that the cause was a short-circuited bomb, which young radicals had been building in the basement. This was the work of the Weather Underground, the left-wing terrorist group, which was planning to plant the explosive at an officers’ ball at Fort Dix. Among those killed in the blast was a young woman named Diana Oughton, who may have been holding the bomb when it accidentally went off.
It will not surprise you to learn that Oughton did not have a difficult childhood. Her father, James Oughton, was one of the wealthiest men in the state of Illinois thanks to his vast agricultural holdings. Diana grew up in a small town, Dwight, but amid immense privilege, and when she arrived right on time at Bryn Mawr, she was a paint-by-numbers Republican who supported abolishing Social Security. Her transformation over the intervening decade is one of the most instructive and fascinating cases of radicalization ever documented, one that’s inspired both Hollywood (the movie Katherine is loosely based on her life) and the news media (a four-part UPI profile of Oughton by journalists Lucinda Franks and Thomas Powers won a Pulitzer). Somehow Oughton went from a lively and caring rich girl to very nearly one of America’s worst mass murderers.
How did this happen? Some of it had to do with her volunteer work at a far-flung village in Guatemala, which opened her eyes to poverty, inequality, and the corruption of American foreign aid. Some of it had to do with her beau, leftist lowlife Bill Ayers, who later became one of the Weather Underground’s leaders. But a good deal more had to do with her gilded upbringing, which drew the contempt of her fellow radicals and seemed to turn her hatred inwards. It will also not surprise you to learn that the Weathermen were white. What drove them to madness was a cloying need to repudiate their privilege and prove themselves worthy comrades of the African Americans then fighting for liberation. (It didn’t work: the Black Panthers ultimately denounced them as “chauvinistic” and “scatterbrains.”)
Oughton, Franks and Powers note, came to detest “everything that she was.” They conclude, “She regarded the world she saw around her as the implacable enemy of everything she believed in. Like the rest of the Weathermen, the privileged children of that world, in the end Diana had only one ambition: to be its executioner.”
The Diana Oughtons of today aren’t about to start blowing up federal buildings, as the Weather Underground did. But they do share that mentality: in deploring their privilege, they’ve come to reject everything that bestowed it upon them—their history, their nationality, their traditions, their culture, most of the past and some of the present as well. As recently as a decade ago, President Barack Obama portrayed America as an imperfect but worthy project, one that could extend its ideals of opportunity and equality to those still left behind. Today such incrementalism is a dirty word. Even the waypoints of societal progress are deemed intolerable. All must measure up to a uniform yet constantly changing woke yardstick. Thus did a CNN contributor casually suggest that the Washington Monument might be demolished because George Washington was a slaveowner. Anyone suspected of harboring racism—even a founding father who, however imperfect himself, helped codify principles that ended slavery—must be brought to his knees.
It’s worth repeating that this isn’t a working class production. It’s driven by a new generation of bratty Bolsheviks, those spoiled enough to think they can set a single standard and then tear down the world for not living up to it. In lashing out at the society that coddled them, they swing first at themselves. Once our tumbrils were police vans and pickup trucks with Confederate flag bumper stickers; now they’re vehicles for upper-class solipsism and masochism.