President Ronald Reagan: A Legacy of Freedom in Europe

Ronald and Nancy Reagan walking together (AP Images)

A file photo of President and Mrs. Reagan at the White House in 1986.

By Jeanne Holden
Special Correspondent

Washington — On February 6, many people will celebrate the 100th anniversary of the birth of Ronald Reagan, who played a major role in ending the Cold War and promoting freedom in Europe. In fact, many of Reagan’s policies outlasted his presidency and continue to serve U.S. interests to this day.

Among his accomplishments, President Reagan advanced three key principles that remain fundamental to security relationships in Europe: “trust, but verify”; no artificial divisions into “blocs” or “spheres of influence”; and “mutual assured destruction” is not an acceptable nuclear deterrence policy.

ARMS CONTROL

In 1982, Reagan restarted arms talks with the Soviet Union, but his goal was not limiting the arms race as the SALT talks had done. Rather, he sought substantial reductions of the superpowers’ stockpiles of nuclear weapons. In a 1982 speech to the British Parliament, Reagan declared that “our purpose is clear: reducing the risk of war by reducing the means of waging war on both sides.”

As Secretary of Defense Robert Gates told reporters in March 2010 in arguing for the New Strategic Arms Reduction Treaty (New START) with Russia, when President Reagan and Soviet leader Mikhail Gorbachev signed the Intermediate-Range Nuclear Forces Treaty in December 1987 — which eliminated an entire class of nuclear weapons — it marked the transition from arms control to the actual reduction of nuclear arsenals.

Reagan also believed that verification had to be a critical element in any nuclear arms reduction agreement and repeatedly used the Russian proverb “trust, but verify” in his speeches.

It is a phrase that has withstood the test of time. As Secretary of State Hillary Rodham Clinton said in 2010 in arguing for the New START accord, “Verification provides the transparency and builds the trust needed to reduce the chance for misunderstandings and miscalculations.”

THE COLD WAR

Probably Reagan’s most enduring foreign policy legacy stemmed from his steadfast belief that the Soviet Union had no moral right to dominate Eastern Europe. This rejection of spheres of influence has guided subsequent U.S. administrations.

As Secretary Clinton said in July 2010: “The United States does not recognize spheres of influence.”

Reagan did not believe that freedom could be left to wither in accommodation with autocracy. He told the British Parliament in 1982, “Our mission today: to preserve freedom as well as peace. … I believe the renewed strength of the democratic movement, complemented by a global campaign for freedom, will strengthen the prospects for arms control and a world at peace.”

Reagan believed that a U.S. military buildup, resolve in dealings with the Soviet Union, and open support for anti-communist resistance and opposition groups throughout the world would eventually bring the Soviet Union to the negotiating table. Although these policies were often described as confrontational, Reagan sought to achieve “peace through strength.” Moreover, Reagan made it clear in his public and private communications that he and his government were open to talks with Soviet leaders, and that he believed in the possibilities of negotiation and compromise.

Reagan was quick to seize the opportunity to engage with Gorbachev and the new generation of Soviet leaders who came to power in the mid-1980s. He was willing to explore bold disarmament initiatives and work with his Soviet counterpart to jointly dismantle the bipolar, Cold War world order despite criticism from many of his own supporters.

On June 12, 1987, Reagan challenged Gorbachev to improve the Soviet economy, to move forward on reforms, and to abolish the “Iron Curtain” splitting Europe. Speaking at the Berlin Wall in front of an audience of thousands, Reagan said: “General Secretary Gorbachev, if you seek peace, if you seek prosperity for the Soviet Union and Eastern Europe, if you seek liberalization, come here to this gate. Mr. Gorbachev, open this gate! Mr. Gorbachev, tear down this wall!”

In 1989 after the Soviet leader declared that the Soviet Union would no longer intervene in the affairs of the states of Eastern Europe, repressive Communist regimes across the region rapidly collapsed. On November 9, 1989, the Berlin Wall was torn down. Two years later, the Soviet Union itself ceased to exist.

When Reagan died in June 2004, former British Prime Minister Margaret Thatcher summed up Reagan’s legacy in her eulogy:

“Others prophesied the decline of the West; he inspired America and its allies with renewed faith in their mission of freedom. … Others hoped, at best, for an uneasy cohabitation with the Soviet Union; he won the Cold War — not only without firing a shot, but also by inviting his enemies out of their fortress and turning them into friends.”


United States and Russia Conclude New START Arms Cut Pact

Sergey Lavrov and Hillary Clinton seated signing documents as aides stand by (AP Images)

Secretary Clinton, seated at right, and Russian Foreign Minister Sergey Lavrov sign documents in Munich for the New START nuclear arms reduction pact.

By Merle David Kellerhals Jr.
Staff Writer

Washington — Secretary of State Hillary Rodham Clinton and Russian Foreign Minister Sergey Lavrov exchanged diplomatic documents February 5 in Munich, concluding a two-year effort to reduce nuclear arsenals to their lowest levels in more than 50 years.

The instruments of ratification signed by Clinton and Lavrov bring the New START pact into force, committing each nation to reduce its deployed nuclear warheads from 2,200 to 1,550. The treaty succeeds the 1991 Strategic Arms Reduction Treaty, which expired in December 2009.

Clinton and Lavrov exchanged the agreements on the sidelines of the 47th annual Munich Security Conference, which is an informal meeting of some 350 major policymakers from around the world that examines security threats and challenges.

“We exchange the instruments of ratification for a treaty that lessens the nuclear dangers facing the Russian and American people and the world,” Clinton said February 5.

Clinton said this new treaty is a significant milestone in U.S.-Russian relations.

The treaty, which is the first major arms reduction pact since the last days of the Cold War, will reduce the two nations’ nuclear arsenals to 1,550 nuclear warheads each over seven years. The treaty is set to expire in 10 years unless it is extended for one five-year term. It also includes strict limits on the number of vehicles that can be used to launch the warheads. The United States and Russia hold 90 percent of the nuclear weapons in the world.

The treaty was signed April 8, 2010, by President Obama and Russian President Dmitry Medvedev in Prague. It is a centerpiece of Obama’s foreign policy and reflects his broader view of a world free of nuclear weapons. Obama was awarded the 2009 Nobel Peace Prize for his efforts to foster arms control and nuclear nonproliferation worldwide.

“This is the most significant arms control agreement in nearly two decades,” Obama said after the U.S. Senate voted in favor of the treaty. “And it will make us safer.”

The U.S. Senate approved ratification December 22, 2010, and Russia’s parliament gave its final approval in January. Obama signed ratification documents February 2.

But Clinton also said that reaching this agreement figures heavily into renewing close relations between two former Cold War foes.

“With the exchange of these instruments, we commit ourselves to a course of action that builds trust, lessens risks, and improves predictability, stability and security,” Clinton said. “Our countries will immediately begin notifying each other of changes in our strategic forces.”

Within 45 days, the United States and Russia will exchange full data on existing nuclear weapons and facilities and the means to deliver them. In 60 days, both nations will resume on-site inspections that “allow each side to trust, but verify.”


The Politics of Hope: Election of 2008 and Emergence of Barack Obama

Barack Obama in front of crowd holding signs (AP Images)

Democratic presidential candidate Senator Barack Obama at a campaign rally in Charlotte, North Carolina, in September 2008.

“The strongest democracies flourish from frequent and lively debate, but they endure when people of every background and belief find a way to set aside smaller differences in service of a greater purpose.”
– President Barack Obama, 2009

ELECTION OF 2008 AND EMERGENCE OF BARACK OBAMA

Having served two terms, President George W. Bush was constitutionally prohibited from being elected again to the presidency. After a spirited preconvention campaign, the Republicans chose as their candidate Senator John McCain of Arizona. A Vietnam veteran respected for his heroic resistance as a prisoner of war, McCain possessed strong foreign policy credentials and was a relatively moderate conservative on domestic issues. He chose as his running mate Governor Sarah Palin of Alaska. Much admired by Christian evangelicals and cultural conservatives, she drew almost as much attention as McCain himself.

In late 2007, it seemed nearly certain that the Democratic nomination would go to Senator Hillary Rodham Clinton of New York. The wife of former president Bill Clinton, she had quickly established herself as a leading member of Congress and possessed a strong national constituency among women and liberal Democrats. However, she faced a phenomenon not unusual in democratic societies — a relatively unknown, but charismatic, challenger whose appeal rested not on ideological or programmatic differences but on style and personal background.

Barack Hussein Obama was only in his second year as a U.S. senator from Illinois, but his comparative youth and freshness were assets in a year when the electorate was weary of politics as usual. So was his multicultural background. He was born in Honolulu on August 4, 1961, to a Kenyan father studying at the University of Hawaii and a white mother originally from a small town in Kansas. In 1963, the senior Obama left his new family to pursue graduate study at Harvard; he later returned to Kenya. When Obama was six, his mother remarried and relocated to Indonesia, where Obama briefly attended a Muslim school. He eventually returned to Hawaii, living with his maternal grandparents while he attended a private U.S. high school. He went on to study at two of the best universities in the United States — Columbia and Harvard. His personal style mixed a rare speaking talent with a hip informality that had great appeal to younger voters. Americans of all ages could consider him an emblematic representative of their society’s tradition of providing opportunity for all.

After a close, hard-fought six months of party caucuses and primary elections, Obama eked out a narrow victory over Clinton. He made Senator Joseph Biden of Delaware his vice-presidential selection. Most measures of popular sentiment indicated that the public wanted a change. The two candidates began the fall campaign season as strong favorites.

Any chance that McCain and Palin could pull ahead was ended by the sharp financial crisis that began in the last half of September and sent the economy crashing. Caused by excessive speculation in risky mortgage-backed securities and other unstable investments, the crash led to the bankruptcy of the venerable Lehman Brothers investment house and momentarily imperiled the entire financial superstructure of the nation. The Federal Deposit Insurance Corporation (FDIC), created during the New Deal, shut down numerous banks without loss to depositors, but had no jurisdiction over the giant financial investment companies that did not engage in commercial banking. Moreover, it had only limited capabilities to deal with those corporations that did both.

Fearing a general financial meltdown reminiscent of the darkest days of the Great Depression, the U.S. Treasury and the Federal Reserve engineered a Troubled Assets Relief Program (TARP) that was funded by a $700 billion congressional appropriation. The TARP program kept the endangered investment banks afloat. What it could not do was stave off a sharp economic collapse in which millions of U.S. workers lost their jobs.

That November, the voters elected Obama president of the United States, with approximately 53 percent of the vote to McCain’s 46 percent.

OBAMA: THE FIRST YEAR

Obama was inaugurated president of the United States on January 20, 2009, in an atmosphere of hope and high expectations. In his inaugural address, he declared: “The time has come to reaffirm our enduring spirit; to choose our better history; to carry forward that precious gift, that noble idea, passed on from generation to generation: the God-given promise that all are equal, all are free, and all deserve a chance to pursue their full measure of happiness.” He proclaimed an agenda of “remaking America” by reviving and transforming the economy in ways that would provide better and less-expensive health care for all, foster environmentally friendly energy, and develop an educational system better suited to the needs of a new century.

Speaking to the international community, he pledged U.S. cooperation in facing the problem of global warming. He also delivered a general message of international engagement based on compassion for poorer, developing countries and respect for other cultures. To Muslims around the world he said, “We seek a new way forward, based on mutual interest and mutual respect.”

The speech revealed the wide scope of Obama’s aspirations. His rhetoric and his strong personal presence won wide approval — so much so that in October, he was awarded the Nobel Peace Prize in recognition of his goals. But, as always in the complex system of American representative government, it was easier to state large ambitions than to realize them.

At home, the administration addressed the mounting economic crisis with a $787 billion stimulus act designed to bring growing unemployment down to manageable levels. The legislation doubtless saved or created many jobs, but it failed to prevent unemployment — officially estimated at 7.7 percent of the labor force when Obama took office — from increasing to a high of 10.1 percent, then receding just a bit. The loans to large investment and commercial banks begun during the Bush administration with the objective of restoring a stable financial system were mostly repaid with a profit to the government, but a few remained outstanding as the president began his second year in office. In addition, the government invested heavily in two giant auto makers — General Motors and Chrysler — shepherding them through bankruptcy and attempting to reestablish them as major manufacturers.

Obama’s other major objective — the establishment of a national health care system — had long been a goal of American liberalism. With large Democratic majorities in both houses of Congress, it seemed achievable. However, developing a plan that had to meet the medical needs of more than 300 million Americans proved extraordinarily difficult. The concerns of numerous interests had to be dealt with — insurance companies, hospitals, physicians, pharmaceutical companies, and the large majority of Americans who were already covered and reasonably satisfied. In addition, a comprehensive national plan had to find some way to control skyrocketing costs. In the spring of 2010, the president signed complex legislation that mandated health insurance for every American, with implementation to take place over several years.

In foreign policy, Obama sought to reach out to the non-Western world, and especially to Muslims who might interpret the American military actions in Iraq and Afghanistan as part of a general war on Islam. “America and Islam are not exclusive and need not be in competition,” he told an audience at Cairo University. In Tokyo, he reassured Asians that America would remain engaged with the world’s fastest-growing region. While hoping to distinguish itself in tone from the Bush administration, the Obama government found itself following the broad outlines of Bush’s War on Terror. It affirmed the existing agreement to withdraw American troops from Iraq in 2011 and reluctantly accepted military plans for a surge in Afghanistan. In his Nobel acceptance speech, President Obama quoted the celebrated American theologian Reinhold Niebuhr to the effect that evil existed in the world and could be defeated only by force.

At the conclusion of his first year in office, Obama remained, for many Americans, a compelling personification of their country’s ideals of liberty and equal opportunity.

AFTERWORD

From its origins as a set of obscure colonies hugging the Atlantic coast, the United States has undergone a remarkable transformation into what political analyst Ben Wattenberg has called “the first universal nation,” a population of almost 300 million people representing virtually every nationality and ethnic group on the globe. It is also a nation where the pace and extent of change — economic, technological, cultural, demographic, and social — is unceasing. The United States is often the harbinger of the modernization and change that inevitably sweep up other nations and societies in an increasingly interdependent, interconnected world.

Yet the United States also maintains a sense of continuity, a set of core values that can be traced to its founding. They include a faith in individual freedom and democratic government, and a commitment to economic opportunity and progress for all. The continuing task of the United States will be to ensure that its values of freedom, democracy, and opportunity — the legacy of a rich and turbulent history — are protected and flourish as the nation, and the world, move through the 21st century.


Bridge to the 21st Century: The Clinton and Bush II years

Firefighters beneath destroyed World Trade Center after September 11, 2001 (AP Images)

Firefighters beneath the destroyed World Trade Center after the September 11, 2001, terrorist attack in New York. (© AP Images)

“As we look ahead into the next century, leaders will be those who empower others.”

– Microsoft co-founder and chairman Bill Gates, 2007

For most Americans the 1990s would be a time of peace, prosperity, and rapid technological change. Some attributed this to the “Reagan Revolution” and the end of the Cold War, others to the return of a Democrat to the presidency. During this period, the majority of Americans — political affiliation aside — asserted their support for traditional family values, often grounded in their faiths. New York Times columnist David Brooks suggested that the country was experiencing “moral self-repair,” as “many of the indicators of social breakdown, which shot upward in the late 1960s and 1970s, and which plateaued at high levels in the 1980s,” were now in decline.

Improved crime and other social statistics aside, American politics remained ideological, emotional, and characterized by intense divisions. Shortly after the nation entered the new millennium, moreover, its post-Cold War sense of security was jolted by an unprecedented terrorist attack that launched it on a new and difficult international track.

1992 PRESIDENTIAL ELECTION

As the 1992 presidential election approached, Americans found themselves in a world transformed in ways almost unimaginable four years earlier. The familiar landmarks of the Cold War – from the Berlin Wall to intercontinental missiles and bombers on constant high alert – were gone. Eastern Europe was independent, the Soviet Union had dissolved, Germany was united, Arabs and Israelis were engaged in direct negotiations, and the threat of nuclear conflict was greatly diminished. It was as though one great history volume had closed and another had opened.

Yet at home, Americans were less sanguine, and they faced some deep and familiar problems. The United States found itself in its deepest recession since the early 1980s. Many of the job losses were occurring among white-collar workers in middle management positions, not solely, as earlier, among blue-collar workers in the manufacturing sector. Even when the economy began recovering in 1992, its growth was virtually imperceptible until late in the year. Moreover, the federal deficit continued to mount, propelled most strikingly by rising expenditures for health care.

President George Bush and Vice President Dan Quayle easily won renomination by the Republican Party. On the Democratic side, Bill Clinton, governor of Arkansas, defeated a crowded field of candidates to win his party’s nomination. As his vice presidential nominee, he selected Senator Al Gore of Tennessee, generally acknowledged as one of the Congress’s strongest advocates of environmental protection.

The country’s deep unease over the direction of the economy also sparked the emergence of a remarkable independent candidate, wealthy Texas entrepreneur H. Ross Perot. Perot tapped into a deep wellspring of frustration over the inability of Washington to deal effectively with economic issues, principally the federal deficit. He possessed a colorful personality and a gift for the telling one-line political quip. He would be the most successful third-party candidate since Theodore Roosevelt in 1912.

The Bush re-election effort was built around a set of ideas traditionally used by incumbents: experience and trust. George Bush, 68, the last of a line of presidents who had served in World War II, faced a young challenger in Bill Clinton who, at age 46, had never served in the military and had participated in protests against the Vietnam War. In emphasizing his experience as president and commander-in-chief, Bush drew attention to Clinton’s inexperience at the national level.

Bill Clinton organized his campaign around another of the oldest and most powerful themes in electoral politics: youth and change. As a high-school student, Clinton had once met President Kennedy; 30 years later, much of his rhetoric consciously echoed that of Kennedy in his 1960 campaign.

As governor of Arkansas for 12 years, Clinton could point to his experience in wrestling with the very issues of economic growth, education, and health care that were, according to public opinion polls, among President Bush’s chief vulnerabilities. Where Bush offered an economic program based on lower taxes and cuts in government spending, Clinton proposed higher taxes on the wealthy and increased spending on investments in education, transportation, and communications that, he believed, would boost the nation’s productivity and growth and thereby lower the deficit. Similarly, Clinton’s health care proposals called for much heavier involvement by the federal government than Bush’s.

Clinton proved to be a highly effective communicator, not least on television, a medium that highlighted his charm and intelligence. The incumbent’s very success in handling the end of the Cold War and reversing the Iraqi thrust into Kuwait lent strength to Clinton’s implicit argument that foreign affairs had become relatively less important, given pressing social and economic needs at home.

On November 3, Bill Clinton won election as the 42nd president of the United States, receiving 43 percent of the popular vote against 37 percent for Bush and 19 percent for Perot.

A NEW PRESIDENCY

Clinton was in many respects the perfect leader for a party divided between liberal and moderate wings. He tried to assume the image of a pragmatic centrist who could moderate the demands of various Democratic Party interest groups without alienating them.

Avoiding ideological rhetoric that declared big government to be a positive good, he proposed a number of programs that earned him the label “New Democrat.” Control of the federal bureaucracy and judicial appointments provided one means of satisfying political claims of organized labor and civil rights groups. On the ever-controversial abortion issue, Clinton supported the Roe v. Wade decision, but also declared that abortion should be “safe, legal, and rare.”

President Clinton’s closest collaborator was his wife, Hillary Rodham Clinton. In the campaign, he had quipped that those who voted for him “got two for the price of one.” She supported her husband against accusations about his personal life.

As energetic and as activist as her husband, Ms. Clinton assumed a more prominent role in the administration than any first lady before her, even Eleanor Roosevelt. Her first important assignment would be to develop a national health program. In 2000, with her husband’s administration coming to a close, she would be elected a U.S. senator from New York.

LAUNCHING A NEW DOMESTIC POLICY

In practice, Clinton’s centrism demanded choices that sometimes elicited vehement emotions. The president’s first policy initiative was designed to meet the demands of gays, who, claiming a group status as victims of discrimination, had become an important Democratic constituency.

Immediately after his inauguration, President Clinton issued an executive order rescinding the long-established military policy of dismissing known gays from the service. The order quickly drew furious criticism from the military, most Republicans, and large segments of American society. Clinton quickly modified it with a “don’t ask, don’t tell” order that effectively restored the old policy but discouraged active investigation of service members’ sexual orientation.

The effort to achieve a national health plan proved to be a far larger setback. The administration set up a large task force, chaired by Hillary Clinton. Composed of prominent policy intellectuals and political activists, it labored in secrecy for months to develop a plan that would provide medical coverage for every American.

The working assumption behind the plan was that a government-managed “single-payer” plan could deliver health services to the entire nation more efficiently than the current decentralized system with its thousands of insurers and disconnected providers. As finally delivered to Congress in September 1993, however, the plan mirrored the complexity of its subject. Most Republicans and some Democrats criticized it as a hopelessly elaborate federal takeover of American medicine. After a year of discussion, it died without a vote in Congress.

Clinton was more successful on another matter with great repercussions for the domestic economy. The previous president, George Bush, had negotiated the North American Free Trade Agreement (NAFTA) to establish fully open trade between Canada, the United States, and Mexico. Key Democratic constituencies opposed the agreement. Labor unions believed it would encourage the export of jobs and undermine American labor standards. Environmentalists asserted that it would lead American industries to relocate to countries with weak pollution controls. These were the first indications of a growing movement on the left wing of American politics against the vision of an integrated world economic system.

President Clinton nonetheless accepted the argument that open trade was ultimately beneficial to all parties because it would lead to a greater flow of more efficiently produced goods and services. His administration not only submitted NAFTA to the Senate, it also backed the establishment of a greatly liberalized international trading system to be administered by the World Trade Organization (WTO). After a vigorous debate, Congress approved NAFTA in 1993. It would approve membership in the WTO a year later.

Although Clinton had talked about a “middle class tax cut” during the presidential campaign, he submitted to Congress a budget calling for a general tax increase. It originally included a wide tax on energy consumption designed to promote conservation, but that was quickly replaced by a nominal increase in the federal gasoline tax. It also taxed social security benefits for recipients of moderate income and above. The big emphasis, however, was on increasing the income tax for high earners. The subsequent debate amounted to a rerun of the arguments between tax cutters and advocates of “fiscal responsibility” that had marked the Reagan years. In the end, Clinton got his way, but very narrowly. The tax bill passed the House of Representatives by only one vote.

By then, the congressional election campaigns of 1994 were under way. Although the administration already had made numerous foreign policy decisions, issues at home were clearly most important to the voters. The Republicans depicted Clinton and the Democrats as unreformed tax and spenders. Clinton himself was already beleaguered with charges of past financial impropriety in an Arkansas real estate project and new claims of sexual impropriety. In November, the voters gave the Republicans control of both houses of Congress for the first time since the election of 1952. Many observers believed that Bill Clinton would likely be a one-term president. Apparently making a decision to conform to new political realities, Clinton instead moderated his political course. Policy initiatives for the remainder of his presidency were few. Contrary to Republican predictions of doom, the tax increases of 1993 did not get in the way of a steadily improving economy.

The new Republican leadership in the House of Representatives, by contrast, pressed hard to achieve its policy objectives, a sharp contrast with the administration’s new moderate tone. When right-wing extremists bombed an Oklahoma City federal building in April 1995, Clinton responded with a tone of moderation and healing that heightened his stature and implicitly left some doubts about his conservative opponents. At the end of the year, he vetoed a Republican budget bill, shutting down the government for weeks. Most of the public seemed to blame the Republicans.

The president also co-opted part of the Republican program. In his State of the Union address of January 1996, he ostentatiously declared, “The era of big government is over.” That summer, on the eve of the presidential campaign, he signed a major welfare reform bill that was essentially a Republican product. Designed to end permanent support for most welfare recipients and move them to work, it was opposed by many in his own party. By and large, it would prove successful in operation over the next decade.

THE AMERICAN ECONOMY IN THE 1990s

By the mid-1990s, the country had not simply recovered from the brief, but sharp, recession of the Bush presidency. It was entering an era of booming prosperity, and doing so despite the decline of its traditional industrial base. Probably the major force behind this new growth was the blossoming of the personal computer (PC).

Less than 20 years after its introduction, the PC had become a familiar item, not simply in business offices of all types, but in homes throughout America. Vastly more powerful than anyone could have imagined two decades earlier, able to store enormous amounts of data, available at the cost of a good refrigerator, it became a common appliance in American homes.

Employing prepackaged software, people used it for bookkeeping, word processing, or as a depository for music, photos, and video. The rise of the Internet, which grew out of a previously closed defense data network, provided access to information of all sorts, created new shopping opportunities, and established e-mail as a common mode of communication. The popularity of the mobile phone created a huge new industry that cross-fertilized with the PC.

Instant communication and lightning-fast data manipulation speeded up the tempo of many businesses, greatly enhancing productivity and creating new opportunities for profit. Fledgling industries that fed demand for the new equipment became multi-billion-dollar companies almost overnight, creating an enormous new middle class of software technicians, managers, and publicists.

A final impetus was the turn of the millennium. A huge push to upgrade outdated computing equipment that might not recognize the year 2000 brought data technology spending to a peak.

These developments began to take shape during Clinton’s first term. By the end of his second one they were fueling a surging economy. When he had been elected president, unemployment was at 7.4 percent. When he stood for re-election in 1996, it was at 5.4 percent. When voters went to the polls to choose his successor in November 2000, it was 3.9 percent. In many places, the issue was less one of taking care of the jobless than of finding employable workers.

No less a figure than Federal Reserve Chairman Alan Greenspan viewed a rapidly escalating stock market with concern and warned of “irrational exuberance.” Investor exuberance, at its greatest since the 1920s, continued in the conviction that ordinary standards of valuation had been rendered obsolete by a “new economy” with unlimited potential. The good times were rolling dangerously fast, but most Americans were more inclined to enjoy the ride while it lasted than to plan for a coming bust.

THE ELECTION OF 1996 AND THE POLITICAL AFTERMATH

President Clinton undertook his campaign for re-election in 1996 under the most favorable of circumstances. If not an imposing personality in the manner of a Roosevelt, he was a natural campaigner who many felt had an infectious charm. He presided over a growing economic recovery. He had positioned himself on the political spectrum in a way that made him appear a man of the center leaning left. His Republican opponent, Senator Robert Dole of Kansas, Republican leader in the upper house, was a formidable legislator but less successful as a presidential candidate.

Clinton, promising to “build a bridge to the 21st century,” easily defeated Dole in a three-party race, 49.2 percent to 40.7 percent, with 8.4 percent to Ross Perot. He thus became the second American president to win two consecutive elections with less than a majority of the total vote. (The other was Woodrow Wilson in 1912 and 1916.) The Republicans, however, retained control of both the House of Representatives and the Senate.

Clinton never articulated much of a domestic program for his second term. The highlight of its first year was an accord with Congress designed to balance the budget, further reinforcing the president’s standing as a fiscally responsible moderate liberal.

In 1998, American politics entered a period of turmoil with the revelation that Clinton had carried on an affair inside the White House with a young intern. At first the president denied this, telling the American people: “I did not have sexual relations with that woman.” The president had faced similar charges in the past. In a sexual harassment lawsuit filed by a woman he had known in Arkansas, Clinton denied the White House affair under oath. This fit most Americans’ definition of perjury. In October 1998, the House of Representatives began impeachment hearings, focusing on charges of perjury and obstruction of justice.

Whatever the merits of that approach, a majority of Americans seemed to view the matter as a private one to be sorted out with one’s family, a significant shift in public attitude. Also significantly, Hillary Clinton continued to support her husband. It surely helped also that the times were good. In the midst of the House impeachment debate, the president announced the largest budget surplus in 30 years. Public opinion polls showed Clinton’s approval rating to be the highest of his six years in office.

That November, the Republicans took further losses in the midterm congressional elections, cutting their majorities to razor-thin margins. House Speaker Newt Gingrich resigned, and the party attempted to develop a less strident image. Nevertheless, in December the House voted the first impeachment resolution against a sitting president since Andrew Johnson (1868), thereby handing the case to the Senate for a trial.

Clinton’s impeachment trial, presided over by the Chief Justice of the United States, held little suspense. In the midst of it, the president delivered his annual State of the Union address to Congress. He never testified, and no serious observer expected that any of the several charges against him would win the two-thirds vote required for removal from office. In the end, none got even a simple majority. On February 12, 1999, Clinton was acquitted of all charges.

AMERICAN FOREIGN RELATIONS IN THE CLINTON YEARS

Bill Clinton did not expect to be a president who emphasized foreign policy. However, like his immediate predecessors, he quickly discovered that all international crises seemed to take a road that led through Washington.

He had to deal with the messy aftermath of the 1991 Gulf War. Having failed to depose Saddam Hussein, the United States, backed by Britain, attempted to contain him. A United Nations-administered economic sanctions regime, designed to allow Iraq to sell enough oil to meet humanitarian needs, proved relatively ineffective. Saddam funneled much of the proceeds to himself, leaving large masses of his people in misery. Military “no-fly zones,” imposed to prevent the Iraqi government from deploying its air power against rebellious Kurds in the north and Shiites in the south, required constant U.S. and British air patrols, which regularly fended off anti-aircraft missiles.

The United States also provided the main backing for U.N. weapons inspection teams, whose mission was to ferret out Iraq’s chemical, biological, and nuclear programs, verify the destruction of existing weapons of mass destruction, and suppress ongoing programs to manufacture them. Increasingly obstructed, the U.N. inspectors were finally expelled in 1998. On this, as on earlier occasions of provocation, the United States responded with limited missile strikes. Saddam, Secretary of State Madeleine Albright declared, was still “in his box.”

The seemingly endless Israeli-Palestinian dispute inevitably engaged the administration, although neither President Clinton nor former President Bush had much to do with the Oslo agreement of 1993. That accord established a Palestinian “authority” to govern the Palestinian population within the West Bank and the Gaza Strip and obtained Palestinian recognition of Israel’s right to exist.

As with so many past Middle Eastern agreements in principle, however, Oslo eventually fell apart when details were discussed. Palestinian leader Yasser Arafat rejected final offers from peace-minded Israeli leader Ehud Barak in 2000 and January 2001. A full-scale Palestinian insurgency, marked by the use of suicide bombers, erupted. Barak fell from power, to be replaced by the far tougher Ariel Sharon. U.S. identification with Israel was considered by some a major problem in dealing with other issues in the region, but American diplomats could do little more than hope to contain the violence. After Arafat’s death in late 2004, new Palestinian leadership appeared more receptive to a peace agreement, and American policy makers resumed efforts to promote a settlement.

President Clinton also became closely engaged with “the troubles” in Northern Ireland. On one side was the violent Irish Republican Army, supported primarily by those Catholic Irish who wanted to incorporate Northern Ireland into the Republic of Ireland. On the other side were Unionists, with equally violent paramilitary forces, supported by most of the Protestant Scots-Irish population, who wanted the province to remain in the United Kingdom.

Clinton gave the separatists greater recognition than they had ever obtained in the United States, but also worked closely with the British governments of John Major and Tony Blair. The ultimate result, the Good Friday peace accords of 1998, established a political process but left many details to be worked out. Over the next several years, peace and order held better in Northern Ireland than in the Middle East, but remained precarious. The final accord continued to elude negotiators.

The post-Cold War disintegration of Yugoslavia – a state ethnically and religiously divided among Serbs, Croats, Slovenes, Bosnian Muslims, and Albanian Kosovars – also made its way to Washington after European governments failed to impose order. The Bush administration had refused to get involved in the initial violence; the Clinton administration finally did so with great reluctance after being urged to do so by the European allies. In 1995, it negotiated an accord in Dayton, Ohio, to establish a semblance of peace in Bosnia. In 1999, faced with Serbian massacres of Kosovars, it led a three-month NATO bombing campaign against Serbia, which finally forced a settlement.

In 1994, the administration restored ousted President Jean-Bertrand Aristide to power in Haiti, where he would rule for nine years before being ousted again. The intervention was largely a result of Aristide’s carefully cultivated support in the United States and American fears of waves of Haitian illegal immigrants.

In sum, the Clinton administration remained primarily inward looking, willing to tackle international problems that could not be avoided and, in other instances, forced by the rest of the world to do so.

INTIMATIONS OF TERRORISM

Near the close of his administration, George H. W. Bush sent American troops to the chaotic East African nation of Somalia. Their mission was to spearhead a U.N. force that would allow the regular movement of food to a starving population.

Somalia became yet another legacy for the Clinton administration. Efforts to establish a representative government there became a “nation-building” enterprise. In October 1993, American troops sent to arrest a recalcitrant warlord ran into unexpectedly strong resistance, losing two Black Hawk helicopters and suffering 18 deaths. The warlord was never arrested. Over the next several months, all American combat units were withdrawn.

From the standpoint of the administration, it seemed prudent enough simply to end a marginal, ill-advised commitment and concentrate on other priorities. It only became clear later that the Somali warlord had been aided by a shadowy and emerging organization that would become known as al-Qaida, headed by a fundamentalist Muslim named Osama bin Laden. A fanatical enemy of Western civilization, bin Laden reportedly felt confirmed in his belief that Americans would not fight when attacked.

By then the United States had already experienced an attack by Muslim extremists. In February 1993, a huge car bomb was exploded in an underground parking garage beneath one of the twin towers of the World Trade Center in lower Manhattan. The blast killed six people and injured more than a thousand, but it failed to bring down the huge building with its thousands of workers. New York and federal authorities treated it as a criminal act, apprehended four of the plotters, and obtained life prison sentences for them. Subsequent plots to blow up traffic tunnels, public buildings, and even the United Nations were all discovered and dealt with in a similar fashion.

Possible foreign terrorism was nonetheless overshadowed by domestic terrorism, primarily the Oklahoma City bombing. The work of right-wing extremists Timothy McVeigh and Terry Nichols, it killed 168 and injured hundreds, a far greater toll than the 1993 Trade Center attack. But on June 25, 1996, another huge bomb exploded at the Khobar Towers U.S. military housing complex in Saudi Arabia, killing 19 and wounding 515. A federal grand jury indicted 13 Saudis and one Lebanese man for the attack, but Saudi Arabia ruled out any extraditions.

Two years later, on August 7, 1998, powerful bombs exploding simultaneously destroyed U.S. embassies in Kenya and Tanzania, killing 224 people and injuring thousands more. In retaliation Clinton ordered missile attacks on terrorist training camps run by bin Laden in Afghanistan, but they appear to have been deserted. He also ordered a missile strike to destroy a suspect chemical factory in Sudan, a country which earlier had given sanctuary to bin Laden.

On October 12, 2000, suicide bombers rammed a speedboat into the U.S. Navy destroyer Cole, on a courtesy visit to Yemen. Heroic action by the crew kept the ship afloat, but 17 sailors were killed. Bin Laden had pretty clearly been behind the attacks in Saudi Arabia, Africa, and Yemen, but he was beyond reach unless the administration was prepared to invade Afghanistan to search for him.

The Clinton administration was never willing to take such a step. It even shrank from the possibility of assassinating him if others might be killed in the process. The attacks had been remote and widely separated. It was easy to accept them as unwelcome but inevitable costs associated with superpower status. Bin Laden remained a serious nuisance, but not a top priority for an administration that was nearing its end.

THE PRESIDENTIAL ELECTION OF 2000 AND THE WAR ON TERROR

The Democratic Party nominated Vice President Al Gore to head its ticket in 2000. To oppose him, the Republicans chose George W. Bush, the governor of Texas and son of former president George H.W. Bush.

Gore ran as a dedicated liberal, intensely concerned with damage to the environment and determined to seek more assistance for the less privileged sectors of American society. He seemed to position himself to the left of President Clinton.

Bush established a position on the right wing of the Republican Party, closer to the heritage of Ronald Reagan than to that of his father. He softened this image by displaying a special interest in education and calling himself a “compassionate conservative.” His embrace of evangelical Christianity, which he declared had changed his life after a misspent youth, was of particular note. It underscored an attachment to traditional cultural values that contrasted sharply with Gore’s technocratic modernism. Corporate critic Ralph Nader ran well to Gore’s left as the candidate of the Green Party. Conservative Republican Patrick Buchanan mounted an independent candidacy.

The final vote was nearly evenly divided nationally; so were the electoral votes. The pivotal state was Florida, where a razor-thin margin separated Bush and Gore, and thousands of ballots were disputed. After a series of court challenges at the state and federal levels, the U.S. Supreme Court handed down a narrow decision that effectively gave the election to Bush. The Republicans maintained control of both houses of Congress by a small margin.

The final totals underscored the tightness of the election: Bush won 271 electoral votes to Gore’s 266, but Gore led him in the national popular vote 48.4 percent to 47.9 percent. Nader polled 2.1 percent and Buchanan 0.4 percent. Gore, his states colored blue in media graphics, swept the Northeast and the West Coast; he also ran well in the Midwestern industrial heartland. Bush, whose states were colored red, rolled over his opponent in the South, the rest of the Midwest, and the mountain states. Commentators everywhere remarked on the vast gap between “red” and “blue” America, a divide characterized by cultural and social, rather than economic, differences, and all the more deep-seated and emotional for that reason. George W. Bush took office in a climate of extreme partisan bitterness.

Bush expected to be a president primarily concerned with domestic policy. He wanted to meld traditional Republican Party belief in private enterprise, low taxation, and small government with a sense of social responsibility for the less fortunate groups in American society. He had talked during his campaign about reforming the Social Security system. Impressed by Reagan’s supply-side economics, he advocated lower taxes to stimulate economic growth.

The economy was beginning to slip back from its lofty peak of the late 1990s. This helped Bush secure passage of a tax cut in May 2001. Lower taxes would indeed buoy the economy, but at the cost of an ominously growing federal budget deficit. At the end of the year, Bush also obtained the “No Child Left Behind” Act, which required public schools to test reading and mathematical proficiency on an annual basis; it prescribed penalties for those schools unable to achieve a specified standard. Social Security remained unaddressed despite Bush’s efforts to make it a priority in his second term.

The Bush presidency changed irrevocably on September 11, 2001, as the United States suffered the most devastating foreign attack ever against its mainland. That morning, Middle Eastern terrorists simultaneously hijacked four passenger airplanes and used two of them as suicide vehicles to destroy the twin towers of the World Trade Center in New York City. A third crashed into the Pentagon building, the Defense Department headquarters just outside of Washington, D.C. The fourth, probably aimed at the U.S. Capitol, dived into the Pennsylvania countryside as passengers fought the hijackers.

The death toll, most of it consisting of civilians at the Trade Center, was approximately 3,000, exceeding that of the Japanese attack on Pearl Harbor. The economic costs were also heavy. Several other buildings near the Trade Center also were destroyed, shutting down the financial markets for several days. The effect was to prolong the already developing recession.

As the nation began to recover from the attack, an unknown person or group sent out letters containing small amounts of anthrax bacteria. Some went to members of Congress and administration officials, others to obscure individuals. No notable person was infected. But five victims died, and several others suffered serious illness. The mailings touched off a wave of national hysteria, then stopped as suddenly as they had begun, and remained a mystery. In June 2008, the Federal Bureau of Investigation announced that the likely culprit was a troubled government scientist; in July 2008 the suspected scientist committed suicide.

In a televised speech on September 20, 2001, President Bush told a joint session of Congress, “Our ‘war on terror’ begins with al-Qaeda, but it does not end there. It will not end until every terrorist group of global reach has been found, stopped and defeated.” The administration obtained passage of the USA Patriot Act in October 2001. Designed to fight domestic terrorism, the new law considerably broadened the search, seizure, and detention powers of the federal government. Its opponents argued that it amounted to a serious violation of constitutionally protected individual rights. Its backers responded that a country at war needed to protect itself.

After initial hesitation, the Bush administration also decided to support the establishment of a gigantic new Department of Homeland Security. Authorized in November 2002 and designed to coordinate the fight against domestic terrorist attack, the new department consolidated 22 federal agencies.

The administration, like its predecessor, had been unprepared for the unimaginable. However, it retaliated quickly. Determining that the attack had been an al-Qaida operation, it launched a military offensive against Osama bin Laden and the fundamentalist Muslim Taliban government of Afghanistan that had provided him refuge. The United States secured the passive cooperation of the Russian Federation, established relationships with the former Soviet republics that bordered Afghanistan, and, above all, resumed a long-neglected alliance with Pakistan, which provided political support and access to air bases.

Utilizing U.S. Army Special Forces and Central Intelligence Agency paramilitary operatives, the administration allied with long-marginalized Afghan rebels. Given effective air support, the coalition ousted the Taliban government in two months. Bin Laden, Taliban leaders, and many of their fighters, however, escaped into remote, semi-autonomous areas of northwestern Pakistan. From there they would try to regroup and attack the new Afghan government.

In the meantime, the Bush administration was looking elsewhere for sources of enemy terrorism. In his 2002 State of the Union address, the president identified an “axis of evil” that he thought threatened the nation: Iraq, Iran, and North Korea. Of these three, Iraq seemed to him and his advisers the most troublesome and probably easiest to bring down.

Saddam Hussein had ejected United Nations weapons inspectors. The economic sanctions against Iraq were breaking down, and, although the regime was not believed to be involved in the 9/11 attacks, it had engaged in some contacts with al-Qaida. It was widely believed, not just in the United States but throughout the world, that Iraq had large stockpiles of chemical and biological weapons and might be working to acquire a nuclear capability. Why else throw out the inspection teams and endure continuing sanctions?

Throughout the year, the administration pressed for a United Nations resolution demanding resumption of weapons inspections with full and free access. In October 2002, Iraq declared it would comply. Nonetheless, the new inspectors complained of bad faith. In January 2003, their chief, Hans Blix, presented a report to the U.N. declaring that Iraq had failed to account for its weapons of mass destruction; he recommended that inspections continue.

Bush in the meantime had received congressional authorization for the use of military force; the resolution passed the Senate by a vote of 77–23. The U.S. military began a buildup of personnel and materiel in Kuwait.

The American plans for war with Iraq encountered unusually strong opposition in much of Europe. France, Russia, and Germany all were against the use of force. Even in those nations whose governments supported the United States, there was strong popular hostility to cooperation. Britain became the major U.S. ally in the war that followed; most of the newly independent Eastern European nations contributed assistance. The governments of Italy and (for a time) Spain also lent their backing. Turkey, long a reliable American ally, declined to do so.

Nevertheless, on March 19, 2003, American and British troops, supported by small contingents from several other countries, began an invasion of Iraq from the south. Groups airlifted into the north coordinated with Kurdish militia. On both fronts, resistance was occasionally fierce, but usually melted away. Baghdad fell on April 8. On April 14, the military campaign in Iraq was declared over.

Taking Iraq turned out to be far easier than administering it. In the first days after the end of major combat, the country experienced pervasive looting. Hit-and-run attacks on allied troops followed and became increasingly organized, despite the capture of Saddam Hussein and the deaths of his two sons and heirs. Different Iraqi factions seemed on the verge of war with each other.

New weapons inspection teams were unable to find the expected stockpiles of chemical and biological weaponry. It became clear that Iraq had never restarted the nuclear program it had been pursuing before the first Gulf War. After his apprehension, Saddam Hussein admitted that he had engaged in a gigantic bluff to forestall attack from abroad or insurrection at home.

In the year and a quarter after the fall of Baghdad, the United States and the United Kingdom, with increasing cooperation from the United Nations, moved ahead with establishment of a provisional government that would assume sovereignty over Iraq. The effort occurred amid increasing violence that included attacks not only on allied troops, but also on Iraqis connected in any way with the new government. Most of the insurgents appeared to be Saddam loyalists; some were indigenous Muslim sectarians; others were foreign fighters.

2004 PRESIDENTIAL ELECTION AND GEORGE W. BUSH’S SECOND TERM

By mid-2004, with the United States facing a violent insurgency in Iraq, considerable foreign opposition to the war there, and increasingly sharp divisions about the conflict at home, the country faced another presidential election. The Democrats nominated Senator John Kerry of Massachusetts, a decorated Vietnam veteran in his fourth Senate term. Kerry’s dignified demeanor and speaking skills made him a formidable candidate. A reliable liberal on domestic issues, he was a critic of the Iraq war. Bush, renominated without opposition by the Republicans, portrayed himself as frank and consistent in speech and deed, a man of action willing to take all necessary steps to protect the United States.

Marked by intense feelings on both sides about the war and the cultural conflicts that increasingly defined the differences between the two major parties, the campaign revealed a nation nearly as divided as in 2000. The strong emotions of the race fueled a voter turnout 20 percent higher than four years earlier. Bush won a narrow victory, 51 percent to 48 percent with the remainder of the vote going to Ralph Nader and other independents. The Republicans scored small but important gains in Congress.

George W. Bush began his second term in January 2005, facing challenges aplenty: Iraq, increasing federal budget deficits, a chronic international balance-of-payments shortfall, the escalating cost of social entitlements, and a shaky currency. None were susceptible to quick or easy solutions.

Iraq was the largest and most visible problem. The country had adopted a new constitution and held parliamentary elections in 2005. Saddam Hussein, tried by an Iraqi tribunal, was executed in December 2006. All the same, American forces and the new government faced a mounting insurgency. Composed of antagonistic factions — among them Sunni supporters of Saddam and dissident Shiites aided by Iran — the insurgency could be contained, but not quelled without using harsh tactics that would be unacceptable at home and would alienate the Iraqi population. The constitutional Iraqi government lacked the power and stability needed to impose order, yet the costs — human and financial — of the American occupation eroded support at home.

In January 2007, the president adopted an anti-insurgency strategy advocated by General David Petraeus — one of outreach and support for Sunni leaders willing to accept a new democratic order in Iraq, along with continued backing of the predominantly Shiite government in Baghdad. He accompanied this with a “surge” of additional troops. Over the next year, the strategy appeared to calm the country. The United States began to turn over increased security responsibilities to the Iraqis and negotiated an agreement for complete withdrawal by 2011. Nonetheless, Iraq remained very unstable, its fragile peace regularly disrupted by bombings and assassinations, its Sunni-Shiite conflict complicated by Kurdish separatists. It was not clear whether a democratic nation could be created out of such chaos, but it was clear that the United States could not impose one if the Iraqis did not want it.

As Iraq progressed uncertainly toward stability, Afghanistan moved in the other direction. The post-Taliban government of Hamid Karzai proved unable to establish effective control over the historically decentralized country. Operating from the Pakistani tribal areas to which they had escaped in 2001, the Taliban and al-Qaida began to filter back into Afghanistan and establish significant areas of control in the southern provinces. Using remote-controlled drone aircraft equipped with guided missiles, U.S. forces staged attacks against enemy encampments and leaders within Pakistan. In 2009, the new American president, Barack Obama, approved a U.S. military buildup and anti-insurgency effort similar to the Iraq surge. As with Iraq, the outcome remained in doubt.

As the first decade of the 21st century drew to a close, the United States found itself adjusting to a world considerably more complex than that of the Cold War. The bipolar rivalry of that era, for all its dangers and challenges, had imposed an unprecedented simplicity on international affairs. The newer, messier world order (or disorder) featured the rapid rise of China as a major economic force. India and Brazil were not far behind. Post-Soviet Russia re-emerged as an oil and natural gas power seeking to regain lost influence in Eastern Europe. The United States remained the pre-eminent power in the world, but was now first in a complex multipolar international system.

At home, the nation remained generally prosperous through most of the Bush years. After a weak first year, gross domestic product grew at a relatively steady, if unspectacular, rate and unemployment held at fairly low levels. Yet the prosperity was fragile. Most noticeable was the rapid decline of American manufacturing, a trend that was well along by the time George W. Bush became president and was in sharp contrast to the rise of China as an industrial power. Increasingly, the economy was sustained by consumer spending, finance, and a construction boom led by residential housing. Federal policy, reflecting the American ideal that every person should have an opportunity to own a home, encouraged the extension of mortgage loans to individuals whose prospects for repayment were dim. The financial institutions in turn repackaged these loans into complex securities, represented them as sound investments, and sold them to institutional investors. These ultimately unsustainable investments were fueled to excess by an easy-money policy as the nation’s central bank, the Federal Reserve System, held interest rates at low levels. Similar economic currents flowed in much of the rest of the developed Western world, but the United States was the pacesetter.

In line with the theme of compassionate conservatism, Bush proposed a major overhaul of the Social Security system that would allow individuals some discretion in investing the taxes they paid into it. The plan aroused nearly unanimous Democratic opposition, generated little public enthusiasm, and never got to a vote in Congress. Bush’s other major project — the enhancement of Medicare by the addition of a voluntary prescription drug program — proved much more popular. It appeased conservative qualms about big government by subsidizing qualified private insurance plans, required fairly large out-of-pocket payments from those who bought into it, but still provided real savings to elderly patients who required multiple medications. Yet, as was the case with already existing Medicare provisions, the costs of the drug program were not fully covered. It added substantially to a federal deficit that seemed uncontrollable.

The growing deficit became a major issue among not simply opposition Democrats but many Republican conservatives, who thought their party was spending too freely. In addition, the difficult war in Iraq was increasingly unpopular. In the 2006 midterm elections, Republicans lost control of Congress to the opposition Democrats, who more than ever looked with confidence to the next presidential election.


The New Conservatism and a New World Order: The Reagan Years

President Reagan and USSR President Gorbachev

President Reagan and Soviet leader Mikhail Gorbachev after signing the Intermediate-Range Nuclear Forces Treaty in December 1987. (Time Life Pictures)

(The following article is taken from the U.S. Department of State publication, Outline of American History.)

“I have always believed that there was some divine plan that placed this great continent between two oceans to be sought out by those who were possessed of an abiding love of freedom and a special kind of courage.”
— California Governor Ronald Reagan, 1974

A SOCIETY IN TRANSITION

Shifts in the structure of American society, begun years or even decades earlier, had become apparent by the time the 1980s arrived. The composition of the population and the most important jobs and skills in American society had undergone major changes.

The dominance of service jobs in the economy became undeniable. By the mid-1980s, nearly three-fourths of all employees worked in the service sector, for instance, as retail clerks, office workers, teachers, physicians, and government employees.

Service-sector activity benefited from the availability and increased use of the computer. The information age arrived, with hardware and software that could aggregate previously unimagined amounts of data about economic and social trends. The federal government had made significant investments in computer technology in the 1950s and 1960s for its military and space programs.

In 1976, two young California entrepreneurs, working out of a garage, assembled the first widely marketed computer for home use, named it the Apple, and ignited a revolution. By the early 1980s, millions of microcomputers had found their way into U.S. businesses and homes, and in 1982, Time magazine dubbed the computer its “Machine of the Year.”

Meanwhile, America’s “smokestack industries” were in decline. The U.S. automobile industry reeled under competition from highly efficient Japanese carmakers.  By 1980 Japanese companies already manufactured a fifth of the vehicles sold in the United States.  American manufacturers struggled with some success to match the cost efficiencies and engineering standards of their Japanese rivals, but their former dominance of the domestic car market was gone forever.  The giant old-line steel companies shrank to relative insignificance as foreign steel makers adopted new technologies more readily.

Consumers were the beneficiaries of this ferocious competition in the manufacturing industries, but the painful struggle to cut costs meant the permanent loss of hundreds of thousands of blue-collar jobs.  Those who could made the switch to the service sector; others became unfortunate statistics.

Population patterns shifted as well. After the end of the postwar “baby boom” (1946 to 1964), the overall rate of population growth declined and the population grew older. Household composition also changed. In 1980 the percentage of family households dropped; a quarter of all households were now classified as “nonfamily households,” in which two or more unrelated persons lived together.

New immigrants changed the character of American society in other ways. The 1965 reform in immigration policy shifted the focus away from Western Europe, facilitating a dramatic increase in new arrivals from Asia and Latin America.  In 1980, 808,000 immigrants arrived, the highest number in 60 years, as the country once more became a haven for people from around the world.

Additional groups became active participants in the struggle for equal opportunity. Homosexuals, using the tactics and rhetoric of the civil rights movement, depicted themselves as an oppressed group seeking recognition of basic rights. In 1975, the U.S. Civil Service Commission lifted its ban on employment of homosexuals.  Many states enacted anti-discrimination laws.

Then, in 1981, came the discovery of AIDS (Acquired Immune Deficiency Syndrome). Transmitted sexually or through blood transfusions, it struck homosexual men and intravenous drug users with particular virulence, although the general population proved vulnerable as well. By 1992, over 220,000 Americans had died of AIDS.  The AIDS epidemic has by no means been limited to the United States, and the effort to treat the disease now encompasses physicians and medical researchers throughout the world.

CONSERVATISM AND THE RISE OF RONALD REAGAN

For many Americans, the economic, social, and political trends of the previous two decades – crime and racial polarization in many urban centers, challenges to traditional values, the economic downturn and inflation of the Carter years – engendered a mood of disillusionment. That mood also strengthened a renewed suspicion of government and its ability to deal effectively with the country’s social and political problems.

Conservatives, long out of power at the national level, were well positioned politically in the context of this new mood.  Many Americans were receptive to their message of limited government, strong national defense, and the protection of traditional values.

This conservative upsurge had many sources. A large group of fundamentalist Christians was particularly concerned about crime and sexual immorality. Its members hoped to return religion, or the moral precepts often associated with it, to a central place in American life.  One of the most politically effective groups in the early 1980s, the Moral Majority, was led by a Baptist minister, Jerry Falwell. Another, led by the Reverend Pat Robertson, built an organization, the Christian Coalition, that by the 1990s was a significant force in the Republican Party.  Using television to spread their messages, Falwell, Robertson, and others like them developed substantial followings.

Another galvanizing issue for conservatives was divisive and emotional: abortion. Opposition to the 1973 Supreme Court decision, Roe v. Wade, which upheld a woman’s right to an abortion in the early months of pregnancy, brought together a wide array of organizations and individuals. They included, but were not limited to, Catholics, political conservatives, and religious evangelicals, most of whom regarded abortion under virtually any circumstances as tantamount to murder.  Pro-choice and pro-life (that is, pro- and anti-abortion rights) demonstrations became a fixture of the political landscape.

Within the Republican Party, the conservative wing grew dominant once again. Conservatives had briefly seized control of the party in 1964 with its presidential candidate, Barry Goldwater, then faded from the spotlight. By 1980, however, with the apparent failure of liberalism under Carter, a “New Right” was poised to return to dominance.

Using modern direct mail techniques as well as the power of mass communications to spread their message and raise funds, drawing on the ideas of conservatives like economist Milton Friedman and journalists William F. Buckley Jr. and George Will, and relying on research institutions like the Heritage Foundation, the New Right played a significant role in defining the issues of the 1980s.

The “Old” Goldwater Right had favored strict limits on government intervention in the economy. This tendency was reinforced by a significant group of “libertarian conservatives” within the New Right who distrusted government in general and opposed state interference in personal behavior.  But the New Right also encompassed a stronger, often evangelical faction determined to wield state power to encourage its views.  The New Right favored tough measures against crime, a strong national defense, a constitutional amendment to permit prayer in public schools, and opposition to abortion.

The figure who drew all these disparate strands together was Ronald Reagan. Reagan, born in Illinois, achieved stardom as an actor in Hollywood movies and television before turning to politics. He first achieved political prominence with a nationwide televised speech in 1964 in support of Barry Goldwater. In 1966 Reagan won the governorship of California and served until 1975. He narrowly missed winning the Republican nomination for president in 1976 before succeeding in 1980 and going on to win the presidency from the incumbent, Jimmy Carter.

President Reagan’s unflagging optimism and his ability to celebrate the achievements and aspirations of the American people persisted throughout his two terms in office. He was a figure of reassurance and stability for many Americans. Wholly at ease before the microphone and the television camera, Reagan was called the “Great Communicator.”

Taking a phrase from the 17th-century Puritan leader John Winthrop, he told the nation that the United States was a “shining city on a hill,” invested with a God-given mission to defend the world against the spread of Communist totalitarianism.

Reagan believed that government intruded too deeply into American life. He wanted to cut programs he contended the country did not need, and to eliminate “waste, fraud, and abuse.”  Reagan accelerated the program of deregulation begun by Jimmy Carter.  He sought to abolish many regulations affecting the consumer, the workplace, and the environment.  These, he argued, were inefficient, expensive, and detrimental to economic growth.

Reagan also reflected the belief held by many conservatives that the law should be strictly applied against violators. Shortly after becoming president, he faced a nationwide strike by U.S. air traffic controllers.  Although the job action was forbidden by law, such strikes had been widely tolerated in the past.  When the air controllers refused to return to work, he ordered them all fired.  Over the next few years the system was rebuilt with new hires.

THE ECONOMY IN THE 1980s

President Reagan’s domestic program was rooted in his belief that the nation would prosper if the power of the private economic sector was unleashed. The guiding theory behind it, “supply side” economics, held that a greater supply of goods and services, made possible by measures to increase business investment, was the swiftest road to economic growth. Accordingly, the Reagan administration argued that a large tax cut would increase capital investment and corporate earnings, so that even lower taxes on these larger earnings would increase government revenues.

Despite only a slim Republican majority in the Senate and a House of Representatives controlled by the Democrats, President Reagan succeeded during his first year in office in enacting the major components of his economic program, including a 25-percent tax cut for individuals to be phased in over three years. The administration also sought and won significant increases in defense spending to modernize the nation’s military and counter what it felt was a continual and growing threat from the Soviet Union.

Under Chairman Paul Volcker, the Federal Reserve imposed draconian increases in interest rates that squeezed out the runaway inflation that had begun in the late 1970s. The recession hit bottom in 1982, with the prime interest rate approaching 20 percent and the economy falling sharply. That year, real gross domestic product (GDP) fell by 2 percent; the unemployment rate rose to nearly 10 percent, and almost one-third of America’s industrial plants lay idle. Throughout the Midwest, major firms like General Electric and International Harvester released workers.  Stubbornly high petroleum prices contributed to the decline.  Economic rivals like Germany and Japan won a greater share of world trade, and U.S. consumption of goods from other countries rose sharply.

Farmers also suffered hard times. During the 1970s, American farmers had helped India, China, the Soviet Union, and other countries suffering from crop shortages, and had borrowed heavily to buy land and increase production. But the rise in oil prices pushed up costs, and a worldwide economic slump in 1980 reduced the demand for agricultural products.  Their numbers declined, as production increasingly became concentrated in large operations.  Small farmers who survived had major difficulties making ends meet.

The increased military budget – combined with the tax cuts and the growth in government health spending – resulted in the federal government spending far more than it received in revenues each year. Some analysts charged that the deficits were part of a deliberate administration strategy to prevent further increases in domestic spending sought by the Democrats. However, both Democrats and Republicans in Congress refused to cut such spending. From $74 billion in 1980, the deficit soared to $221 billion in 1986 before falling back to $150 billion in 1987.

The deep recession of the early 1980s successfully curbed the runaway inflation that had started during the Carter years.  Fuel prices, moreover, fell sharply, with at least part of the drop attributable to Reagan’s decision to abolish controls on the pricing and allocation of gasoline.  Conditions began to improve in late 1983.  By early 1984, the economy had rebounded.  By the fall of 1984, the recovery was well along, allowing Reagan to run for re-election on the slogan, “It’s morning again in America.”  He defeated his Democratic opponent, former Senator and Vice President Walter Mondale, by an overwhelming margin.

The United States entered one of the longest periods of sustained economic growth since World War II. Consumer spending increased in response to the federal tax cut.  The stock market climbed as it reflected the optimistic buying spree. Over a five-year period following the start of the recovery, gross national product (GNP) grew at an annual rate of 4.2 percent. The annual inflation rate remained between 3 and 5 percent from 1983 to 1987, except in 1986, when it fell to just under 2 percent, the lowest level in decades. From 1982 to 1987, the economy created more than 13 million new jobs.

Steadfast in his commitment to lower taxes, Reagan signed the most sweeping federal tax-reform measure in 75 years during his second term. This measure, which had widespread Democratic as well as Republican support, lowered income tax rates, simplified tax brackets, and closed loopholes.

However, a significant percentage of this growth was based on deficit spending. Moreover, the national debt, far from being stabilized by strong economic growth, nearly tripled.  Much of the growth occurred in skilled service and technical areas.  Many poor and middle-class families did less well. The administration, although an advocate of free trade, pressured Japan to agree to a voluntary quota on its automobile exports to the United States.

The economy was jolted on October 19, 1987, “Black Monday,” when the stock market suffered the greatest one-day crash in its history, falling 22.6 percent.  The causes of the crash included the large U.S. international trade and federal-budget deficits, the high level of corporate and personal debt, and new computerized stock trading techniques that allowed instantaneous selling of stocks and futures.  Despite the memories of 1929 it evoked, however, the crash was a transitory event with little impact.  In fact, economic growth continued, with the unemployment rate dropping to a 14-year low of 5.2 percent in June 1988.

FOREIGN AFFAIRS

In foreign policy, Reagan sought a more assertive role for the nation, and Central America provided an early test. The United States provided El Salvador with a program of economic aid and military training when a guerrilla insurgency threatened to topple its government. It also actively encouraged the transition to an elected democratic government, but efforts to curb active right-wing death squads were only partly successful. U.S. support helped stabilize the government, but the level of violence there remained undiminished.  A peace agreement was finally reached in early 1992.

U.S. policy toward Nicaragua was more controversial. In 1979 revolutionaries calling themselves Sandinistas overthrew the repressive right-wing Somoza regime and established a pro-Cuba, pro-Soviet dictatorship. Regional peace efforts ended in failure, and the focus of administration efforts shifted to support for the anti-Sandinista resistance, known as the contras.

Following intense political debate over this policy, Congress ended all military aid to the contras in October 1984, then, under administration pressure, reversed itself in the fall of 1986, and approved $100 million in military aid. However, a lack of success on the battlefield, charges of human rights abuses, and the revelation that funds from secret arms sales to Iran (see below) had been diverted to the contras undercut congressional support to continue this aid.

Subsequently, the administration of President George H.W. Bush, who succeeded Reagan as president in 1989, abandoned any effort to secure military aid for the contras. The Bush administration also exerted pressure for free elections and supported an opposition political coalition, which won an astonishing upset election in February 1990, ousting the Sandinistas from power.

The Reagan administration was more fortunate in witnessing a return to democracy throughout the rest of Latin America, from Guatemala to Argentina. The emergence of democratically elected governments was not limited to Latin America; in Asia, the “people power” campaign of Corazón Aquino overthrew the dictatorship of Ferdinand Marcos, and elections in South Korea ended decades of military rule.

By contrast, South Africa remained intransigent in the face of U.S. efforts to encourage an end to racial apartheid through the controversial policy of “constructive engagement,” quiet diplomacy coupled with public endorsement of reform. In 1986, frustrated at the lack of progress, the U.S. Congress overrode Reagan’s veto and imposed a set of economic sanctions on South Africa. In February 1990, South African President F.W. de Klerk announced Nelson Mandela’s release and began the slow dismantling of apartheid.

Despite its outspoken anti-Communist rhetoric, the Reagan administration’s direct use of military force was restrained. On October 25, 1983, U.S. forces landed on the Caribbean island of Grenada after an urgent appeal for help by neighboring countries. The action followed the assassination of Grenada’s leftist prime minister by members of his own Marxist-oriented party. After a brief period of fighting, U.S. troops captured hundreds of Cuban military and construction personnel and seized caches of Soviet-supplied arms. In December 1983, the last American combat troops left Grenada, which held democratic elections a year later.

The Middle East, however, presented a far more difficult situation.  A military presence in Lebanon, where the United States was attempting to bolster a weak but moderate pro-Western government, ended tragically when 241 U.S. Marines were killed in a terrorist bombing in October 1983. In April 1986, U.S. Navy and Air Force planes struck targets in Tripoli and Benghazi, Libya, in retaliation for Libyan-instigated terrorist attacks on U.S. military personnel in Europe.

In the Persian Gulf, the earlier breakdown in U.S.-Iranian relations and the Iran-Iraq War set the stage for U.S. naval activities in the region. Initially, the United States responded to a request from Kuwait for protection of its tanker fleet; but eventually the United States, along with naval vessels from Western Europe, kept vital shipping lanes open by escorting convoys of tankers and other neutral vessels traveling up and down the Gulf.

In late 1986 Americans learned that the administration had secretly sold arms to Iran in an attempt to resume diplomatic relations with the hostile Islamic government and win freedom for American hostages held in Lebanon by radical organizations that Iran controlled.  Investigation also revealed that funds from the arms sales had been diverted to the Nicaraguan contras during a period when Congress had prohibited such military aid.

The ensuing Iran-contra hearings before a joint House-Senate committee examined issues of possible illegality as well as the broader question of defining American foreign policy interests in the Middle East and Central America. In a larger sense, the hearings were a constitutional debate about government secrecy and presidential versus congressional authority in the conduct of foreign relations.  Unlike the celebrated Senate Watergate hearings 14 years earlier, they found no grounds for impeaching the president and could reach no definitive conclusion about these perennial issues.

U.S.-SOVIET RELATIONS

In relations with the Soviet Union, President Reagan’s declared policy was one of peace through strength. He was determined to stand firm against the country he would in 1983 call an “evil empire.” Two early events increased U.S.-Soviet tensions: the suppression of the Solidarity labor movement in Poland in December 1981, and the destruction of an off-course civilian airliner, Korean Air Lines Flight 007, by a Soviet jet fighter on September 1, 1983, with the loss of all 269 people aboard. The United States also condemned the continuing Soviet occupation of Afghanistan and continued the aid, begun by the Carter administration, to the mujahedeen resistance there.

During Reagan’s first term, the United States spent unprecedented sums for a massive defense build-up, including the placement of intermediate-range nuclear missiles in Europe to counter Soviet deployments of similar missiles. And on March 23, 1983, in one of the most hotly debated policy decisions of his presidency, Reagan announced the Strategic Defense Initiative (SDI) research program to explore advanced technologies, such as lasers and high-energy projectiles, to defend against intercontinental ballistic missiles. Although many scientists questioned the technological feasibility of SDI and economists pointed to the extraordinary sums of money involved, the administration pressed ahead with the project.

After re-election in 1984, Reagan softened his position on arms control.

Moscow was amenable to agreement, in part because its economy already expended a far greater proportion of national output on its military than did the United States.  Further increases, Soviet leader Mikhail Gorbachev felt, would cripple his plans to liberalize the Soviet economy.

In November 1985, Reagan and Gorbachev agreed in principle to seek 50-percent reductions in strategic offensive nuclear arms as well as an interim agreement on intermediate-range nuclear forces. In December 1987, they signed the Intermediate-Range Nuclear Forces (INF) Treaty providing for the destruction of that entire category of nuclear weapons.  By then, the Soviet Union seemed a less menacing adversary.  Reagan could take much of the credit for a greatly diminished Cold War, but as his administration ended, almost no one realized just how shaky the USSR had become.

THE PRESIDENCY OF GEORGE H. W. BUSH

President Reagan enjoyed unusually high popularity at the end of his second term in office, but under the terms of the U.S. Constitution he could not run again in 1988.  The Republican nomination went to Vice President George Herbert Walker Bush, who was elected the 41st president of the United States.

Bush campaigned by promising voters a continuation of the prosperity Reagan had brought. In addition, he argued that he would support a strong defense for the United States more reliably than the Democratic candidate, Michael Dukakis. He also promised to work for “a kinder, gentler America.”  Dukakis, the governor of Massachusetts, claimed that less fortunate Americans were hurting economically and that the government had to help them while simultaneously bringing the federal debt and defense spending under control. The public was much more engaged, however, by Bush’s economic message: no new taxes. In the balloting, Bush had a 54-to-46-percent popular vote margin.

During his first year in office, Bush followed a conservative fiscal program, pursuing policies on taxes, spending, and debt that were faithful to the Reagan administration’s economic program. But the new president soon found himself squeezed between a large budget deficit and a deficit-reduction law.  Spending cuts seemed necessary, and Bush possessed little leeway to introduce new budget items.

The Bush administration advanced new policy initiatives in areas not requiring major new federal expenditures.  Thus, in November 1990, Bush signed sweeping legislation imposing new federal standards on urban smog, automobile exhaust, toxic air pollution, and acid rain, but with industrial polluters bearing most of the costs. He accepted legislation requiring physical access for the disabled, but with no federal assumption of the expense of modifying buildings to accommodate wheelchairs and the like.  The president also launched a campaign to encourage volunteerism, which he called, in a memorable phrase, “a thousand points of light.”

BUDGETS AND DEFICITS

Bush administration efforts to gain control over the federal budget deficit, however, were more problematic. One source of the difficulty was the savings and loan crisis. Savings banks – formerly tightly regulated, low-interest safe havens for ordinary people – had been deregulated, allowing these institutions to compete more aggressively by paying higher interest rates and by making riskier loans.  Increases in the government’s deposit insurance guarantee reduced consumers’ incentive to shun less-sound institutions.  Fraud, mismanagement, and the choppy economy produced widespread insolvencies among these thrifts (the umbrella term for consumer-oriented institutions like savings and loan associations and savings banks).  By 1993, the total cost of selling and shuttering failed thrifts was staggering: nearly $525 billion.

In January 1990, President Bush presented his budget proposal to Congress.  Democrats argued that administration budget projections were far too optimistic, and that meeting the deficit-reduction law would require tax increases and sharper cuts in defense spending. That June, after protracted negotiations, the president agreed to a tax increase.  All the same, the combination of economic recession, losses from the savings and loan industry rescue operation, and escalating health care costs for Medicare and Medicaid offset all the deficit-reduction measures and produced a shortfall in 1991 at least as large as the previous year’s.

END TO THE COLD WAR

When Bush became president, the Soviet empire was on the verge of collapse. Gorbachev’s efforts to open up the USSR’s economy appeared to be foundering.  In 1989, the Communist governments in one Eastern European country after another simply collapsed, after it became clear that Soviet troops would not be sent to prop them up.  In mid-1991, hard-liners attempted a coup d’etat, only to be foiled by Gorbachev rival Boris Yeltsin, president of the Russian republic.  At the end of that year, Yeltsin, now dominant, forced the dissolution of the Soviet Union.

The Bush administration adeptly brokered the end of the Cold War, working closely with Gorbachev and Yeltsin.  It led the negotiations that brought the unification of East and West Germany (September 1990), agreement on large arms reductions in Europe (November 1990), and large cuts in nuclear arsenals (July 1991).  After the liquidation of the Soviet Union, the United States and the new Russian Federation agreed to phase out all multiple-warhead missiles over a 10-year period.

The disposal of nuclear materials and the ever-present concerns of nuclear proliferation now superseded the threat of nuclear conflict between Washington and Moscow.

THE GULF WAR

The euphoria caused by the winding down of the Cold War was dramatically overshadowed by the August 2, 1990, invasion of the small nation of Kuwait by Iraq. Iraq, under Saddam Hussein, and Iran, under its Islamic fundamentalist regime, had emerged as the two major military powers in the oil-rich Persian Gulf area.  The two countries had fought a long, inconclusive war in the 1980s.  Less hostile to the United States than Iran, Iraq had won some support from the Reagan and Bush administrations.  The occupation of Kuwait, posing a threat to Saudi Arabia, changed the diplomatic calculation overnight.

President Bush strongly condemned the Iraqi action, called for Iraq’s unconditional withdrawal, and sent a major deployment of U.S. troops to the Middle East.  He assembled one of the most extraordinary military and political coalitions of modern times, with military forces from Asia, Europe, and Africa, as well as the Middle East.

In the days and weeks following the invasion, the U.N. Security Council passed 12 resolutions condemning the Iraqi invasion and imposing wide-ranging economic sanctions on Iraq.  On November 29, it approved the use of force if Iraq did not withdraw from Kuwait by January 15, 1991.  Gorbachev’s Soviet Union, once Iraq’s major arms supplier, made no effort to protect its former client.

Bush also confronted a major constitutional issue.  The U.S. Constitution gives the legislative branch the power to declare war. Yet in the second half of the 20th century, the United States had become involved in Korea and Vietnam without an official declaration of war and with only murky legislative authorization. On January 12, 1991, three days before the U.N. deadline, Congress granted President Bush the authority he sought in the most explicit and sweeping war-making power given a president in nearly half a century.

The United States, in coalition with Great Britain, France, Italy, Saudi Arabia, Kuwait, and other countries, succeeded in liberating Kuwait with a devastating, U.S.-led air campaign that lasted slightly more than a month. It was followed by a massive invasion of Kuwait and Iraq by armored and airborne infantry forces. With their superior speed, mobility, and firepower, the allied forces overwhelmed the Iraqi forces in a land campaign lasting only 100 hours.

The victory, however, was incomplete and unsatisfying. The U.N. resolution, which Bush enforced to the letter, called only for the expulsion of Iraq from Kuwait.  Saddam Hussein remained in power, savagely repressing the Kurds in the north and the Shiites in the south, both of whom the United States had encouraged to rebel.  Hundreds of oil-well fires, deliberately set in Kuwait by the Iraqis, took until November 1991 to extinguish. Saddam’s regime also apparently thwarted U.N. inspectors who, operating in accordance with Security Council resolutions, worked to locate and destroy Iraq’s weapons of mass destruction, including nuclear facilities more advanced than had previously been suspected and huge stocks of chemical weapons.

The Gulf War enabled the United States to persuade the Arab states, Israel, and a Palestinian delegation to begin direct negotiations aimed at resolving the complex and interlocked issues that could eventually lead to a lasting peace in the region. The talks began in Madrid, Spain, on October 30, 1991. In turn, they set the stage for the secret negotiations in Norway that led to what at the time seemed a historic agreement between Israel and the Palestine Liberation Organization, signed at the White House on September 13, 1993.

PANAMA AND NAFTA

The president also received broad bipartisan congressional backing for the brief U.S. invasion of Panama on December 20, 1989, which deposed dictator General Manuel Antonio Noriega. With addiction to crack cocaine reaching epidemic proportions in the 1980s, President Bush had put the “War on Drugs” at the center of his domestic agenda, and Noriega, an especially brutal dictator, had attempted to maintain himself in power with rather crude displays of anti-Americanism. After seeking refuge in the Vatican embassy, Noriega turned himself over to U.S. authorities.  He was later tried and convicted in U.S. federal court in Miami, Florida, of drug trafficking and racketeering.

On the economic front, the Bush administration negotiated the North American Free Trade Agreement (NAFTA) with Mexico and Canada.  It would be ratified after an intense debate in the first year of the Clinton administration.


Decades of Change – 1960-1980: The Rise of Cultural and Ethnic Pluralism

Decades of Change – 1960-1980

The Rise of Cultural and Ethnic Pluralism

Astronaut on the moon, July 20, 1969

Astronaut on the moon, July 20, 1969. (NASA)

“I have a dream that one day on the red hills of Georgia, the sons of former slaves and the sons of former slave owners will be able to sit down together at the table of brotherhood.”
– Martin Luther King Jr., 1963

By 1960, the United States was on the verge of a major social change.  American society had always been more open and fluid than that of the nations in most of the rest of the world.  Still, it had been dominated primarily by old-stock, white males.  During the 1960s, groups that previously had been submerged or subordinate began more forcefully and successfully to assert themselves: African Americans, Native Americans, women, the white ethnic offspring of the “new immigration,” and Latinos.  Much of the support they received came from a young population larger than ever, making its way through a college and university system that was expanding at an unprecedented pace.  Frequently embracing “countercultural” lifestyles and radical politics, many of the offspring of the World War II generation emerged as advocates of a new America characterized by a cultural and ethnic pluralism that their parents often viewed with unease.

THE CIVIL RIGHTS MOVEMENT, 1960-1980

The struggle of African Americans for equality reached its peak in the mid-1960s. After progressive victories in the 1950s, African Americans became even more committed to nonviolent direct action. Groups like the Southern Christian Leadership Conference (SCLC), made up of African-American clergy, and the Student Nonviolent Coordinating Committee (SNCC), composed of younger activists, sought reform through peaceful confrontation.

In 1960 African-American college students sat down at a segregated Woolworth’s lunch counter in North Carolina and refused to leave. Their sit-in captured media attention and led to similar demonstrations throughout the South. The next year, civil rights workers organized “freedom rides,” in which African Americans and whites boarded buses heading south toward segregated terminals, where confrontations might capture media attention and lead to change.

They also organized rallies, the largest of which was the “March on Washington” in 1963. More than 200,000 people gathered in the nation’s capital to demonstrate their commitment to equality for all. The high point of a day of songs and speeches came with the address of Martin Luther King Jr., who had emerged as the preeminent spokesman for civil rights. “I have a dream that one day on the red hills of Georgia the sons of former slaves and the sons of former slave owners will be able to sit down together at the table of brotherhood,” King proclaimed. Each time he used the refrain “I have a dream,” the crowd roared.

The level of progress initially achieved did not match the rhetoric of the civil rights movement.  President Kennedy was initially reluctant to press white Southerners for support on civil rights because he needed their votes on other issues.  Events, driven by African Americans themselves, forced his hand. When James Meredith was denied admission to the University of Mississippi in 1962 because of his race, Kennedy sent federal troops to uphold the law. After protests aimed at the desegregation of Birmingham, Alabama, prompted a violent response by the police, he sent Congress a new civil rights bill mandating the integration of public places.  Not even the March on Washington, however, could extricate the measure from a congressional committee, where it was still bottled up when Kennedy was assassinated in 1963.

President Lyndon B. Johnson was more successful.  Displaying negotiating skills he had so frequently employed during his years as Senate majority leader, Johnson persuaded the Senate to limit delaying tactics preventing a final vote on the sweeping Civil Rights Act of 1964, which outlawed discrimination in all public accommodations.  The next year’s Voting Rights Act of 1965 authorized the federal government to register voters where local officials had prevented African Americans from doing so. By 1968 a million African Americans were registered in the deep South.  Nationwide, the number of African-American elected officials increased substantially. In 1968, the Congress passed legislation banning discrimination in housing.

Once unleashed, however, the civil rights revolution produced leaders impatient with both the pace of change and the goal of channeling African Americans into mainstream white society.  Malcolm X, an eloquent activist, was the most prominent figure arguing for African-American separation from the white race. Stokely Carmichael, a student leader, became similarly disillusioned by the notions of nonviolence and interracial cooperation. He popularized the slogan “black power,” to be achieved by “whatever means necessary,” in the words of Malcolm X.

Violence accompanied militant calls for reform. Riots broke out in several big cities in 1966 and 1967.  In the spring of 1968, Martin Luther King Jr. fell before an assassin’s bullet. Several months later, Senator Robert Kennedy, a spokesman for the disadvantaged, an opponent of the Vietnam War, and the brother of the slain president, met the same fate. To many these two assassinations marked the end of an era of innocence and idealism.  The growing militancy on the left, coupled with an inevitable conservative backlash, opened a rift in the nation’s psyche that took years to heal.

By then, however, a civil rights movement supported by court decisions, congressional enactments, and federal administrative regulations was irreversibly woven into the fabric of American life.  The major issues were about implementation of equality and access, not about the legality of segregation or disenfranchisement.  The arguments of the 1970s and thereafter were over matters such as busing children out of their neighborhoods to achieve racial balance in metropolitan schools or about the use of “affirmative action.”  These policies and programs were viewed by some as active measures to ensure equal opportunity, as in education and employment, and by others as reverse discrimination.

The courts worked their way through these problems with decisions that were often inconsistent.  In the meantime, the steady march of African Americans into the ranks of the middle class and once largely white suburbs quietly reflected a profound demographic change.

THE WOMEN’S MOVEMENT

During the 1950s and 1960s, increasing numbers of married women entered the labor force, but in 1963 the average working woman earned only 63 percent of what a man made. That year Betty Friedan published The Feminine Mystique, an explosive critique of middle-class living patterns that articulated a pervasive sense of discontent that Friedan contended was felt by many women. Arguing that women often had no outlets for expression other than “finding a husband and bearing children,” Friedan encouraged her readers to seek new roles and responsibilities and to find their own personal and professional identities, rather than have them defined by a male-dominated society.

The women’s movement of the 1960s and 1970s drew inspiration from the civil rights movement. It was made up mainly of members of the middle class, and thus partook of the spirit of rebellion that affected large segments of middle-class youth in the 1960s.

Reform legislation also prompted change. During debate on the 1964 Civil Rights bill, opponents hoped to defeat the entire measure by proposing an amendment to outlaw discrimination on the basis of gender as well as race. First the amendment, then the bill itself, passed, giving women a valuable legal tool.

In 1966, 28 professional women, including Friedan, established the National Organization for Women (NOW) “to take action to bring American women into full participation in the mainstream of American society now.” While NOW and similar feminist organizations boast of substantial memberships today, arguably they attained their greatest influence in the early 1970s, a time that also saw the journalist Gloria Steinem and several other women found Ms. magazine. They also spurred the formation of counter-feminist groups, often led by women, including most prominently the political activist Phyllis Schlafly. These groups typically argued for more “traditional” gender roles and opposed the proposed “Equal Rights” constitutional amendment.

Passed by Congress in 1972, that amendment declared in part, “Equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.” Over the next several years, 35 of the necessary 38 states ratified it. The courts also moved to expand women’s rights. In 1973 the Supreme Court in Roe v. Wade sanctioned women’s right to obtain an abortion during the early months of pregnancy – seen as a significant victory for the women’s movement – but Roe also spurred the growth of an anti-abortion movement.

In the mid- to late-1970s, however, the women’s movement seemed to stagnate. It failed to broaden its appeal beyond the middle class. Divisions arose between moderate and radical feminists. Conservative opponents mounted a campaign against the Equal Rights Amendment, and it died in 1982 without gaining the approval of the 38 states needed for ratification.

THE LATINO MOVEMENT

In post-World War II America, Americans of Mexican and Puerto Rican descent had faced discrimination.  New immigrants, coming from Cuba, Mexico, and Central America – often unskilled and unable to speak English – suffered from discrimination as well.  Some Hispanics worked as farm laborers and at times were cruelly exploited while harvesting crops; others gravitated to the cities, where, like earlier immigrant groups, they encountered difficulties in their quest for a better life.

Chicanos, or Mexican-Americans, mobilized in organizations like the radical Asociación Nacional Mexico-Americana, yet did not become confrontational until the 1960s. Hoping that Lyndon Johnson’s poverty program would expand opportunities for them, they found that bureaucrats failed to respond to less vocal groups. The example of black activism in particular taught Hispanics the importance of pressure politics in a pluralistic society.

The National Labor Relations Act of 1935 had excluded agricultural workers from its guarantee of the right to organize and bargain collectively.  But César Chávez, founder of the overwhelmingly Hispanic United Farm Workers, demonstrated that direct action could achieve employer recognition for his union.  California grape growers agreed to bargain with the union after Chávez led a nationwide consumer boycott. Similar boycotts of lettuce and other products were also successful. Though farm interests continued to try to obstruct Chávez’s organization, the legal foundation had been laid for representation to secure higher wages and improved working conditions.

Hispanics became politically active as well. In 1961 Henry B. González won election to Congress from Texas. Three years later Eligio (“Kika”) de la Garza, another Texan, followed him, and Joseph Montoya of New Mexico went to the Senate. Both González and de la Garza later rose to positions of power as committee chairmen in the House. In the 1970s and 1980s, the pace of Hispanic political involvement increased.  Several prominent Hispanics have served in the Bill Clinton and George W. Bush cabinets.

THE NATIVE-AMERICAN MOVEMENT

In the 1950s, Native Americans struggled with the government’s policy of moving them off reservations and into cities where they might assimilate into mainstream America. Many of the uprooted often had difficulties adjusting to urban life. In 1961, when the policy was discontinued, the U.S. Commission on Civil Rights noted that, for Native Americans, “poverty and deprivation are common.”

In the 1960s and 1970s, watching both the development of Third World nationalism and the progress of the civil rights movement, Native Americans became more aggressive in pressing for their own rights. A new generation of leaders went to court to protect what was left of tribal lands or to recover those which had been taken, often illegally, in previous times. In state after state, they challenged treaty violations, and in 1967 won the first of many victories guaranteeing long-abused land and water rights. The American Indian Movement (AIM), founded in 1968, helped channel government funds to Native-American-controlled organizations and assisted neglected Native Americans in the cities.

Confrontations became more common. In 1969 a landing party of 78 Native Americans seized Alcatraz Island in San Francisco Bay and held it until federal officials removed them in 1971. In 1973 AIM took over the South Dakota village of Wounded Knee, where soldiers in the late 19th century had massacred a Sioux encampment. Militants hoped to dramatize the poverty and alcoholism in the reservation surrounding the town. The episode ended after one Native American was killed and another wounded, with a government agreement to re-examine treaty rights.

Still, Native-American activism brought results. Other Americans became more aware of Native-American needs. Government officials responded with measures including the Indian Self-Determination and Education Assistance Act of 1975 and the Native American Housing Assistance and Self-Determination Act of 1996.  The Senate’s first Native-American member, Ben Nighthorse Campbell of Colorado, was elected in 1992.

THE COUNTERCULTURE

The agitation for equal opportunity sparked other forms of upheaval. Young people in particular rejected the stable patterns of middle-class life their parents had created in the decades after World War II. Some plunged into radical political activity; many more embraced new standards of dress and sexual behavior.

The visible signs of the counterculture spread through parts of American society in the late 1960s and early 1970s. Hair grew longer and beards became common. Blue jeans and tee shirts took the place of slacks, jackets, and ties. The use of illegal drugs increased. Rock and roll grew, proliferated, and transformed into many musical variations. The Beatles, the Rolling Stones, and other British groups took the country by storm. “Hard rock” grew popular, and songs with a political or social commentary, such as those by singer‑songwriter Bob Dylan, became common. The youth counterculture reached its apogee in August 1969 at Woodstock, a three‑day music festival in rural New York State attended by almost half-a-million persons. The festival, mythologized in films and record albums, gave its name to the era, the Woodstock Generation.

A parallel manifestation of the new sensibility of the young was the rise of the New Left, a group of young, college-age radicals.  The New Leftists, who had close counterparts in Western Europe, were in many instances the children of the older generation of radicals.  Nonetheless, they rejected old-style Marxist rhetoric.  Instead, they depicted university students as themselves an oppressed class that possessed special insights into the struggle of other oppressed groups in American society.

New Leftists participated in the civil rights movement and the struggle against poverty.  Their greatest success – and the one instance in which they developed a mass following – was in opposing the Vietnam War, an issue of emotional interest to their draft-age contemporaries.  By the late 1970s, the student New Left had disappeared, but many of its activists made their way into mainstream politics.

ENVIRONMENTALISM

The energy and sensibility that fueled the civil rights movement, the counterculture, and the New Left also stimulated an environmental movement in the mid-1960s.  Many were aroused by the publication in 1962 of Rachel Carson’s book Silent Spring, which alleged that chemical pesticides, particularly DDT, caused cancer, among other ills.  Public concern about the environment continued to increase throughout the 1960s as many became aware of other pollutants surrounding them – automobile emissions, industrial wastes, oil spills – that threatened their health and the beauty of their surroundings. On April 22, 1970, schools and communities across the United States celebrated Earth Day for the first time. “Teach‑ins” educated Americans about the dangers of environmental pollution.

Few denied that pollution was a problem, but the proposed solutions involved expense and inconvenience.  Many believed these would reduce the economic growth upon which many Americans’ standard of living depended.  Nevertheless, in 1970, Congress amended the Clean Air Act of 1967 to develop uniform national air-quality standards. It also passed the Water Quality Improvement Act, which assigned to the polluter the responsibility of cleaning up off-shore oil spills. Also, in 1970, the Environmental Protection Agency (EPA) was created as an independent federal agency to spearhead the effort to bring abuses under control.  During the next three decades, the EPA, bolstered by legislation that increased its authority, became one of the most active agencies in the government, issuing strong regulations covering air and water quality.

KENNEDY AND THE RESURGENCE OF BIG GOVERNMENT LIBERALISM

By 1960 government had become an increasingly powerful force in people’s lives. During the Great Depression of the 1930s, new executive agencies were created to deal with many aspects of American life.  During World War II, the number of civilians employed by the federal government rose from one million to 3.8 million, then stabilized at 2.5 million in the 1950s. Federal expenditures, which had stood at $3.1 billion in 1929, increased to $75 billion in 1953 and passed $150 billion in the 1960s.

Most Americans accepted government’s expanded role, even as they disagreed about how far that expansion should continue. Democrats generally wanted the government to ensure growth and stability. They wanted to extend federal benefits for education, health, and welfare. Many Republicans accepted a level of government responsibility, but hoped to cap spending and restore a larger measure of individual initiative.  The presidential election of 1960 revealed a nation almost evenly divided between these visions.

John F. Kennedy, the Democratic victor by a narrow margin, was at 43 the youngest man ever elected to the presidency. On television, in a series of debates with opponent Richard Nixon, he appeared able, articulate, and energetic. In the campaign, he spoke of moving aggressively into the new decade, for “the New Frontier is here whether we seek it or not.” In his inaugural address, he concluded with an eloquent plea: “Ask not what your country can do for you – ask what you can do for your country.” Throughout his brief presidency, Kennedy’s special combination of grace, wit, and style – far more than his specific legislative agenda – sustained his popularity and influenced generations of politicians to come.

Kennedy wanted to exert strong leadership to extend economic benefits to all citizens, but a razor‑thin margin of victory limited his mandate. Even though the Democratic Party controlled both houses of Congress, conservative Southern Democrats often sided with the Republicans on issues involving the scope of governmental intervention in the economy.  They resisted plans to increase federal aid to education, provide health insurance for the elderly, and create a new Department of Urban Affairs.  And so, despite his lofty rhetoric, Kennedy’s policies were often limited and restrained.

One priority was to end the recession, in progress when Kennedy took office, and restore economic growth. But Kennedy lost the confidence of business leaders in 1962, when he succeeded in rolling back what the administration regarded as an excessive price increase in the steel industry.  Though the president achieved his immediate goal, he alienated an important source of support. Persuaded by his economic advisers that a large tax cut would stimulate the economy, Kennedy backed a bill providing for one.  Conservative opposition in Congress, however, appeared to destroy any hopes of passing a bill most congressmen thought would widen the budget deficit.

The overall legislative record of the Kennedy administration was meager. The president made some gestures toward civil rights leaders but did not embrace the goals of the civil rights movement until demonstrations led by Martin Luther King Jr. forced his hand in 1963.  Like Truman before him, he could not secure congressional passage of federal aid to public education or for a medical care program limited to the elderly.  He gained only a modest increase in the minimum wage. Still, he did secure funding for a space program, and established the Peace Corps to send men and women overseas to assist developing countries in meeting their own needs.

KENNEDY AND THE COLD WAR

President Kennedy came into office pledged to carry on the Cold War vigorously, but he also hoped for accommodation and was reluctant to commit American power.  During his first year-and-a-half in office, he rejected American intervention after the CIA-guided Cuban exile invasion at the Bay of Pigs failed, effectively ceded the landlocked Southeast Asian nation of Laos to Communist control, and acquiesced in the building of the Berlin Wall.  Kennedy’s decisions reinforced impressions of weakness that Soviet Premier Nikita Khrushchev had formed in their only personal meeting, a summit meeting at Vienna in June 1961.

It was against this backdrop that Kennedy faced the most serious event of the Cold War, the Cuban missile crisis.

In the fall of 1962, the administration learned that the Soviet Union was secretly installing offensive nuclear missiles in Cuba. After considering different options, Kennedy decided on a quarantine to prevent Soviet ships from bringing additional supplies to Cuba.  He demanded publicly that the Soviets remove the weapons and warned that an attack from that island would bring retaliation against the USSR. After several days of tension, during which the world was closer than ever before to nuclear war, the Soviets agreed to remove the missiles. Critics charged that Kennedy had risked nuclear disaster when quiet diplomacy might have been effective.  But most Americans and much of the non-Communist world applauded his decisiveness.  The missile crisis made him for the first time the acknowledged leader of the democratic West.

In retrospect, the Cuban missile crisis marked a turning point in U.S.-Soviet relations. Both sides saw the need to defuse tensions that could lead to direct military conflict. The following year, the United States, the Soviet Union, and Great Britain signed a landmark Limited Test Ban Treaty prohibiting nuclear weapons tests in the atmosphere.

Indochina (Vietnam, Laos, Cambodia), a French possession before World War II, was still another Cold War battlefield.  The French effort to reassert colonial control there was opposed by Ho Chi Minh, a Vietnamese Communist, whose Viet Minh movement engaged in a guerrilla war with the French army.

Both Truman and Eisenhower, eager to maintain French support for the policy of containment in Europe, provided France with economic aid that freed resources for the struggle in Vietnam. But the French suffered a decisive defeat at Dien Bien Phu in May 1954.  At an international conference in Geneva, Laos and Cambodia were given their independence.  Vietnam was divided, with Ho in power in the North and Ngo Dinh Diem, a Roman Catholic anti-Communist in a largely Buddhist population, heading the government in the South. Elections were to be held two years later to unify the country.  Persuaded that the fall of Vietnam could lead to the fall of Burma, Thailand, and Indonesia, Eisenhower backed Diem’s refusal to hold elections in 1956 and effectively established South Vietnam as an American client state.

Kennedy increased assistance, and sent small numbers of military advisers, but a new guerrilla struggle between North and South continued. Diem’s unpopularity grew and the military situation worsened.  In late 1963, Kennedy secretly assented to a coup d’etat.  To the president’s surprise, Diem and his brother and chief adviser, Ngo Dinh Nhu, were killed.  It was at this uncertain juncture that Kennedy’s presidency ended three weeks later.

THE SPACE PROGRAM

During Eisenhower’s second term, outer space had become an arena for U.S.-Soviet competition.  In 1957, the Soviet Union launched Sputnik – an artificial satellite – thereby demonstrating it could build more powerful rockets than the United States.  The United States launched its first satellite, Explorer I, in 1958.  But three months after Kennedy became president, the USSR put the first man in orbit.  Kennedy responded by committing the United States to land a man on the moon and bring him back “before this decade is out.”  In 1962, as part of Project Mercury, John Glenn became the first U.S. astronaut to orbit the Earth.

After Kennedy’s death, President Lyndon Johnson enthusiastically supported the space program.  In the mid-1960s, U.S. scientists developed the two-person Gemini spacecraft. Gemini achieved several firsts, including an eight‑day mission in August 1965 – the longest space flight at that time – and in November 1966, the first automatically controlled reentry into the Earth’s atmosphere. Gemini also accomplished the first manned linkup of two spacecraft in flight as well as the first U.S. walks in space.

The three-person Apollo spacecraft achieved Kennedy’s goal and demonstrated to the world that the United States had surpassed Soviet capabilities in space. On July 20, 1969, with hundreds of millions of television viewers watching around the world, Neil Armstrong became the first human to walk on the surface of the moon.

Other Apollo flights followed, but many Americans began to question the value of manned space flight. In the early 1970s, as other priorities became more pressing, the United States scaled down the space program. Some Apollo missions were scrapped; only one of two proposed Skylab space stations was built.

DEATH OF A PRESIDENT

John Kennedy had gained world prestige by his management of the Cuban missile crisis and had won great popularity at home.  Many believed he would win re-election easily in 1964. But on November 22, 1963, he was assassinated while riding in an open car during a visit to Dallas, Texas. His death, amplified by television coverage, was a traumatic event, just as Roosevelt’s had been 18 years earlier.

In retrospect, it is clear that Kennedy’s reputation stems more from his style and eloquently stated ideals than from the implementation of his policies.  He had laid out an impressive agenda but at his death much remained blocked in Congress.  It was largely because of the political skill and legislative victories of his successor that Kennedy would be seen as a force for progressive change.

LYNDON JOHNSON AND THE GREAT SOCIETY

Lyndon Johnson, a Texan who was majority leader in the Senate before becoming Kennedy’s vice president, was a masterful politician. He had been schooled in Congress, where he developed an extraordinary ability to get things done. He excelled at pleading, cajoling, or threatening as necessary to achieve his ends. His liberal idealism was probably deeper than Kennedy’s.  As president, he wanted to use his power aggressively to eliminate poverty and spread the benefits of prosperity to all.

Johnson took office determined to secure the passage of Kennedy’s legislative agenda.  His immediate priorities were his predecessor’s bills to reduce taxes and guarantee civil rights. Using his skills of persuasion and calling on the legislators’ respect for the slain president, Johnson succeeded in gaining passage of both during his first year in office.  The tax cuts stimulated the economy.  The Civil Rights Act of 1964 was the most far-reaching such legislation since Reconstruction.

Johnson addressed other issues as well. By the spring of 1964, he had begun to use the name “Great Society” to describe his socio-economic program.  That summer he secured passage of a federal jobs program for impoverished young people.  It was the first step in what he called the “War on Poverty.”  In the presidential election that November, he won a landslide victory over conservative Republican Barry Goldwater.  Significantly, the 1964 election gave liberal Democrats firm control of Congress for the first time since 1938.  This would enable them to pass legislation over the combined opposition of Republicans and conservative Southern Democrats.

The War on Poverty became the centerpiece of the administration’s Great Society program. The Office of Economic Opportunity, established in 1964, provided training for the poor and established various community-action agencies, guided by an ethic of “participatory democracy” that aimed to give the poor themselves a voice in housing, health, and education programs.

Medical care came next. Under Johnson’s leadership, Congress enacted Medicare, a health insurance program for the elderly, and Medicaid, a program providing health-care assistance for the poor.

Johnson succeeded in the effort to provide more federal aid for elementary and secondary schooling, traditionally a state and local function.  The measure that was enacted gave money to the states based on the number of their children from low‑income families. Funds could be used to assist public- and private-school children alike.

Convinced the United States confronted an “urban crisis” characterized by declining inner cities, the Great Society architects devised a new housing act that provided rent supplements for the poor and established a Department of Housing and Urban Development.

Other legislation had an impact on many aspects of American life.  Federal assistance went to artists and scholars to encourage their work. In September 1966, Johnson signed into law two transportation bills. The first provided funds to state and local governments for developing safety programs, while the other set up federal safety standards for cars and tires. The latter program reflected the efforts of a crusading young radical, Ralph Nader.  In his 1965 book, Unsafe at Any Speed: The Designed‑In Dangers of the American Automobile, Nader argued that automobile manufacturers were sacrificing safety features for style, and charged that faulty engineering contributed to highway fatalities.

In 1965, Congress abolished the discriminatory 1924 national-origin immigration quotas.  This triggered a new wave of immigration, much of it from South and East Asia and Latin America.

The Great Society was the largest burst of legislative activity since the New Deal. But support weakened as early as 1966. Some of Johnson’s programs did not live up to expectations; many went underfunded.  The urban crisis seemed, if anything, to worsen.  Still, whether because of the Great Society spending or because of a strong economic upsurge, poverty did decline at least marginally during the Johnson administration.

THE WAR IN VIETNAM

Dissatisfaction with the Great Society came to be more than matched by unhappiness with the situation in Vietnam.  A series of South Vietnamese strongmen proved little more successful than Diem in mobilizing their country.  The Viet Cong, insurgents supplied and coordinated from North Vietnam, gained ground in the countryside.

Determined to halt Communist advances in South Vietnam, Johnson made the Vietnam War his own. After a North Vietnamese naval attack on two American destroyers, Johnson won from Congress on August 7, 1964, passage of the Gulf of Tonkin Resolution, which allowed the president to “take all necessary measures to repel any armed attack against the forces of the United States and to prevent further aggression.” After his re-election in November 1964, he embarked on a policy of escalation. From 25,000 troops at the start of 1965, the number of soldiers – both volunteers and draftees – rose to 500,000 by 1968. A bombing campaign wrought havoc in both North and South Vietnam.

Grisly television coverage with a critical edge dampened support for the war.  Some Americans thought it immoral; others watched in dismay as the massive military campaign seemed to be ineffective.  Large protests, especially among the young, and a mounting general public dissatisfaction pressured Johnson to begin negotiating for peace.

THE ELECTION OF 1968

By 1968 the country was in turmoil over both the Vietnam War and civil disorder, expressed in urban riots that reflected African-American anger. On March 31, 1968, the president renounced any intention of seeking another term.  Just a week later, Martin Luther King Jr. was shot and killed in Memphis, Tennessee.  John Kennedy’s younger brother, Robert, made an emotional anti-war campaign for the Democratic nomination, only to be assassinated in June.

At the Democratic National Convention in Chicago, Illinois, protesters fought street battles with police. A divided Democratic Party nominated Vice President Hubert Humphrey, once the hero of the liberals but now seen as a Johnson loyalist. White opposition to the civil rights measures of the 1960s galvanized the third-party candidacy of Alabama Governor George Wallace, a Democrat who captured his home state of Alabama, as well as Mississippi, Arkansas, Louisiana, and Georgia, states typically carried in that era by the Democratic nominee. Republican Richard Nixon, who ran on a plan to extricate the United States from the war and to increase “law and order” at home, scored a narrow victory.

NIXON, VIETNAM, AND THE COLD WAR

Determined to achieve “peace with honor,” Nixon slowly withdrew American troops while redoubling efforts to equip the South Vietnamese army to carry on the fight. He also ordered strong American offensive actions. The most important of these was an invasion of Cambodia in 1970 to cut off North Vietnamese supply lines to South Vietnam. This led to another round of protests and demonstrations. Students in many universities took to the streets. At Kent State University in Ohio, National Guard troops called in to restore order panicked and killed four students.

By the fall of 1972, however, troop strength in Vietnam was below 50,000 and the military draft, which had caused so much campus discontent, was all but dead.  A cease-fire, negotiated for the United States by Nixon’s national security adviser, Henry Kissinger, was signed in 1973. Although American troops departed, the war lingered on into the spring of 1975, when Congress cut off assistance to South Vietnam and North Vietnam consolidated its control over the entire country.

The war left Vietnam devastated, with millions maimed or killed. It also left the United States traumatized. The nation had spent over $150 billion in a losing effort that cost more than 58,000 American lives. Americans were no longer united by a widely held Cold War consensus, and became wary of further foreign entanglements.

Yet as Vietnam wound down, the Nixon administration took historic steps toward closer ties with the major Communist powers. The most dramatic move was a new relationship with the People’s Republic of China. In the two decades since Mao Zedong’s victory, the United States had argued that the Nationalist government on Taiwan represented all of China. In 1971 and 1972, Nixon softened the American stance, eased trading restrictions, and became the first U.S. president ever to visit Beijing. The “Shanghai Communique” signed during that visit established a new U.S. policy: that there was one China, that Taiwan was a part of China, and that a peaceful settlement of the question by the Chinese themselves was a U.S. interest.

With the Soviet Union, Nixon was equally successful in pursuing the policy he and Henry Kissinger, his national security adviser and later secretary of state, called détente. He held several cordial meetings with Soviet leader Leonid Brezhnev in which they agreed to limit stockpiles of missiles, cooperate in space, and ease trading restrictions. The Strategic Arms Limitation Talks (SALT) culminated in 1972 in an arms control agreement limiting the growth of nuclear arsenals and restricting anti-ballistic missile systems.

NIXON’S ACCOMPLISHMENTS AND DEFEATS

Vice president under Eisenhower before his unsuccessful run for the presidency in 1960, Nixon was seen as among the shrewdest of American politicians.  Although Nixon subscribed to the Republican value of fiscal responsibility, he accepted a need for government’s expanded role and did not oppose the basic contours of the welfare state. He simply wanted to manage its programs better.  Not opposed to African-American civil rights on principle, he was wary of large federal civil rights bureaucracies.  Nonetheless, his administration vigorously enforced court orders on school desegregation even as it courted Southern white voters.

Perhaps his biggest domestic problem was the economy. He inherited both a slowdown from its Vietnam peak under Johnson, and a continuing inflationary surge that had been a by-product of the war. He dealt with the first by becoming the first Republican president to endorse deficit spending as a way to stimulate the economy; the second by imposing wage and price controls in 1971, a policy in which the Right had no long-term faith. In the short run, these decisions stabilized the economy and established favorable conditions for Nixon’s re-election in 1972. He won an overwhelming victory over peace-minded Democratic Senator George McGovern.

Things began to sour very quickly in the president’s second term. Very early on, he faced charges that his re-election committee had managed a break-in at the Watergate building headquarters of the Democratic National Committee and that he had participated in a cover-up. Special prosecutors and congressional committees dogged his presidency thereafter.

Factors beyond Nixon’s control undermined his economic policies. In 1973 the war between Israel and Egypt and Syria prompted Saudi Arabia to embargo oil shipments to Israel’s ally, the United States. Other member nations of the Organization of the Petroleum Exporting Countries (OPEC) quadrupled their prices. Americans faced both shortages, exacerbated in the view of many by over-regulation of distribution, and rapidly rising prices. Even when the embargo ended the next year, prices remained high and affected all areas of American economic life: In 1974, inflation reached 12 percent, causing disruptions that led to even higher unemployment rates. The unprecedented economic boom America had enjoyed since 1948 was grinding to a halt.

Nixon’s rhetoric about the need for “law and order” in the face of rising crime rates, increased drug use, and more permissive views about sex resonated with more Americans than not.  But this concern was insufficient to quell concerns about the Watergate break-in and the economy.  Seeking to energize and enlarge his own political constituency, Nixon lashed out at demonstrators, attacked the press for distorted coverage, and sought to silence his opponents.  Instead, he left an unfavorable impression with many who saw him on television and perceived him as unstable.  Adding to Nixon’s troubles, Vice President Spiro Agnew, his outspoken point man against the media and liberals, was forced to resign in 1973, pleading “no contest” to a criminal charge of tax evasion.

Nixon probably had not known in advance of the Watergate burglary, but he had tried to cover it up, and had lied to the American people about it.  Evidence of his involvement mounted.  On July 27, 1974, the House Judiciary Committee voted to recommend his impeachment.  Facing certain ouster from office, he resigned on August 9, 1974.

THE FORD INTERLUDE

Nixon’s vice president, Gerald Ford (appointed to replace Agnew), was an unpretentious man who had spent most of his public life in Congress.  His first priority was to restore trust in the government.  However, feeling it necessary to head off the spectacle of a possible prosecution of Nixon, he issued a blanket pardon to his predecessor.  Although it was perhaps necessary, the move was nonetheless unpopular.

In public policy, Ford followed the course Nixon had set.  Economic problems remained serious, as inflation and unemployment continued to rise.  Ford first tried to reassure the public, much as Herbert Hoover had done in 1929. When that failed, he imposed measures to curb inflation, which sent unemployment above 8 percent.  A tax cut, coupled with higher unemployment benefits, helped a bit but the economy remained weak.

In foreign policy, Ford adopted Nixon’s strategy of détente. Perhaps its major manifestation was the Helsinki Accords of 1975, in which the United States and Western European nations effectively recognized Soviet hegemony in Eastern Europe in return for Soviet affirmation of human rights. The agreement had little immediate significance, but over the long run may have made maintenance of the Soviet empire more difficult. Western nations effectively used periodic “Helsinki review meetings” to call attention to various abuses of human rights by Communist regimes of the Eastern bloc.

THE CARTER YEARS

Jimmy Carter, former Democratic governor of Georgia, won the presidency in 1976. Portraying himself during the campaign as an outsider to Washington politics, he promised a fresh approach to governing, but his lack of experience at the national level complicated his tenure from the start. A naval officer and engineer by training, he often appeared to be a technocrat, when Americans wanted someone more visionary to lead them through troubled times.

In economic affairs, Carter at first permitted a policy of deficit spending.  Inflation rose to 10 percent a year when the Federal Reserve Board, responsible for setting monetary policy, increased the money supply to cover deficits. Carter responded by cutting the budget, but cuts affected social programs at the heart of Democratic domestic policy.  In mid-1979, anger in the financial community practically forced him to appoint Paul Volcker as chairman of the Federal Reserve. Volcker was an “inflation hawk” who increased interest rates in an attempt to halt price increases, at the cost of negative consequences for the economy.

Carter also faced criticism for his failure to secure passage of an effective energy policy. He presented a comprehensive program, aimed at reducing dependence on foreign oil, that he called the “moral equivalent of war.” Opponents thwarted it in Congress.

Though Carter called himself a populist, his political priorities were never wholly clear. He endorsed government’s protective role, but then began the process of deregulation, the removal of governmental controls in economic life. Arguing that some restrictions over the course of the past century limited competition and increased consumer costs, he favored decontrol in the oil, airline, railroad, and trucking industries.

Carter’s political efforts failed to gain either public or congressional support. By the end of his term, his disapproval rating reached 77 percent, and Americans began to look toward the Republican Party again.

Carter’s greatest foreign policy accomplishment was the negotiation of a peace settlement between Egypt, under President Anwar al-Sadat, and Israel, under Prime Minister Menachem Begin.  Acting as both mediator and participant, he persuaded the two leaders to end a 30-year state of war. The subsequent peace treaty was signed at the White House in March 1979.

After protracted and often emotional debate, Carter also secured Senate ratification of treaties ceding the Panama Canal to Panama by the year 2000. Going a step further than Nixon, he extended formal diplomatic recognition to the People’s Republic of China.

But Carter enjoyed less success with the Soviet Union. Though he assumed office with detente at high tide and declared that the United States had escaped its “inordinate fear of Communism,” his insistence that “our commitment to human rights must be absolute” antagonized the Soviet government. A SALT II agreement further limiting nuclear stockpiles was signed, but not ratified by the U.S. Senate, many of whose members felt the treaty was unbalanced.  The 1979 Soviet invasion of Afghanistan killed the treaty and triggered a Carter defense build‑up that paved the way for the huge expenditures of the 1980s.

Carter’s most serious foreign policy challenge came in Iran. After an Islamic fundamentalist revolution led by Shiite Muslim leader Ayatollah Ruhollah Khomeini replaced a corrupt but friendly regime, Carter admitted the deposed shah to the United States for medical treatment. Angry Iranian militants, supported by the Islamic regime, seized the American embassy in Tehran and held 53 American hostages for more than a year.  The long-running hostage crisis dominated the final year of his presidency and greatly damaged his chances for re-election.


Postwar America : U.S. dominates global affairs


Moving day in a newly opened suburban community, 1953. (J.R Eyerman/Time Life Pictures/Getty Images)

(The following article is taken from the U.S. Department of State publication, Outline of American History.)

“We must build a new world, a far better world – one in which the eternal dignity of man is respected.”
– President Harry S Truman, 1945

CONSENSUS AND CHANGE

The United States dominated global affairs in the years immediately after World War II. Victorious in that great struggle, its homeland undamaged from the ravages of war, the nation was confident of its mission at home and abroad. U.S. leaders wanted to maintain the democratic structure they had defended at tremendous cost and to share the benefits of prosperity as widely as possible. For them, as for publisher Henry Luce of Time magazine, this was the “American Century.”

For 20 years most Americans remained sure of this confident approach. They accepted the need for a strong stance against the Soviet Union in the Cold War that unfolded after 1945. They endorsed the growth of government authority and accepted the outlines of the rudimentary welfare state first formulated during the New Deal. They enjoyed a postwar prosperity that created new levels of affluence.

But gradually some began to question dominant assumptions.  Challenges on a variety of fronts shattered the consensus. In the 1950s, African Americans launched a crusade, joined later by other minority groups and women, for a larger share of the American dream. In the 1960s, politically active students protested the nation’s role abroad, particularly in the corrosive war in Vietnam.  A youth counterculture emerged to challenge the status quo.  Americans from many walks of life sought to establish a new social and political equilibrium.

COLD WAR AIMS

The Cold War was the most important political and diplomatic issue of the early postwar period. It grew out of longstanding disagreements between the Soviet Union and the United States that developed after the Russian Revolution of 1917. The Soviet Communist Party under V.I. Lenin considered itself the spearhead of an international movement that would replace the existing political orders in the West, and indeed throughout the world.  In 1918 American troops participated in the Allied intervention in Russia on behalf of anti-Bolshevik forces. American diplomatic recognition of the Soviet Union did not come until 1933. Even then, suspicions persisted. During World War II, however, the two countries found themselves allied and downplayed their differences to counter the Nazi threat.

At the war’s end, antagonisms surfaced again. The United States hoped to share with other countries its conception of liberty, equality, and democracy. It sought also to learn from the perceived mistakes of the post-WWI era, when American political disengagement and economic protectionism were thought to have contributed to the rise of dictatorships in Europe and elsewhere. Faced again with a postwar world of civil wars and disintegrating empires, the nation hoped to provide the stability to make peaceful reconstruction possible. Recalling the specter of the Great Depression (1929-1940), America now advocated open trade for two reasons: to create markets for American agricultural and industrial products, and to ensure the ability of Western European nations to export as a means of rebuilding their economies. Reduced trade barriers, American policy makers believed, would promote economic growth at home and abroad, bolstering U.S. friends and allies in the process.

The Soviet Union had its own agenda. The Russian historical tradition of centralized, autocratic government contrasted with the American emphasis on democracy. Marxist-Leninist ideology had been downplayed during the war but still guided Soviet policy. Devastated by the struggle in which 20 million Soviet citizens had died, the Soviet Union was intent on rebuilding and on protecting itself from another such terrible conflict. The Soviets were particularly concerned about another invasion of their territory from the west. Having repelled Hitler’s thrust, they were determined to preclude another such attack. They demanded “defensible” borders and “friendly” regimes in Eastern Europe and seemingly equated both with the spread of Communism, regardless of the wishes of native populations. However, the United States had declared that one of its war aims was the restoration of independence and self-government to Poland, Czechoslovakia, and the other countries of Central and Eastern Europe.

HARRY TRUMAN’S LEADERSHIP

The nation’s new chief executive, Harry S Truman, succeeded Franklin D. Roosevelt as president before the end of the war. An unpretentious man who had previously served as Democratic senator from Missouri, then as vice president, Truman initially felt ill-prepared to govern.  Roosevelt had not discussed complex postwar issues with him, and he had little experience in international affairs.  “I’m not big enough for this job,” he told a former colleague.

Still, Truman responded quickly to new challenges. Sometimes impulsive on small matters, he proved willing to make hard and carefully considered decisions on large ones.  A small sign on his White House desk declared, “The Buck Stops Here.”  His judgments about how to respond to the Soviet Union ultimately determined the shape of the early Cold War.

ORIGINS OF THE COLD WAR

The Cold War developed as differences about the shape of the postwar world created suspicion and distrust between the United States and the Soviet Union.  The first – and most difficult – test case was Poland, the eastern half of which had been invaded and occupied by the USSR in 1939.  Moscow demanded a government subject to Soviet influence; Washington wanted a more independent, representative government following the Western model. The Yalta Conference of February 1945 had produced an agreement on Eastern Europe open to different interpretations. It included a promise of “free and unfettered” elections.

Meeting with Soviet Minister of Foreign Affairs Vyacheslav Molotov less than two weeks after becoming president, Truman stood firm on Polish self-determination, lecturing the Soviet diplomat about the need to implement the Yalta accords. When Molotov protested, “I have never been talked to like that in my life,” Truman retorted, “Carry out your agreements and you won’t get talked to like that.” Relations deteriorated from that point onward.

During the closing months of World War II, Soviet military forces occupied all of Central and Eastern Europe. Moscow used its military power to support the efforts of the Communist parties in Eastern Europe and crush the democratic parties. Communists took over one nation after another. The process concluded with a shocking coup d’état in Czechoslovakia in 1948.

Public statements defined the beginning of the Cold War. In 1946 Stalin declared that international peace was impossible “under the present capitalist development of the world economy.”  Former British Prime Minister Winston Churchill delivered a dramatic speech in Fulton, Missouri, with Truman sitting on the platform.  “From Stettin in the Baltic to Trieste in the Adriatic,” Churchill said, “an iron curtain has descended across the Continent.” Britain and the United States, he declared, had to work together to counter the Soviet threat.

CONTAINMENT

Containment of the Soviet Union became American policy in the postwar years. George Kennan, a top official at the U.S. embassy in Moscow, defined the new approach in the Long Telegram he sent to the State Department in 1946.  He extended his analysis in an article under the signature “X” in the prestigious journal Foreign Affairs.  Pointing to Russia’s traditional sense of insecurity, Kennan argued that the Soviet Union would not soften its stance under any circumstances. Moscow, he wrote, was “committed fanatically to the belief that with the United States there can be no permanent modus vivendi, that it is desirable and necessary that the internal harmony of our society be disrupted.” Moscow’s pressure to expand its power had to be stopped through “firm and vigilant containment of Russian expansive tendencies. …”

The first significant application of the containment doctrine came in the Middle East and eastern Mediterranean. In early 1946, the United States demanded, and obtained, a full Soviet withdrawal from Iran, the northern half of which it had occupied during the war.  That summer, the United States pointedly supported Turkey against Soviet demands for control of the Turkish straits between the Black Sea and the Mediterranean.  In early 1947, American policy crystallized when Britain told the United States that it could no longer afford to support the government of Greece against a strong Communist insurgency.

In a strongly worded speech to Congress, Truman declared, “I believe that it must be the policy of the United States to support free peoples who are resisting attempted subjugation by armed minorities or by outside pressures.”  Journalists quickly dubbed this statement the “Truman Doctrine.”  The president asked Congress to provide $400 million for economic and military aid, mostly to Greece but also to Turkey.  After an emotional debate that resembled the one between interventionists and isolationists before World War II, the money was appropriated.

Critics from the left later charged that to whip up American support for the policy of containment, Truman overstated the Soviet threat to the United States. In turn, his statements inspired a wave of hysterical anti-Communism throughout the country.  Perhaps so.  Others, however, would counter that this argument ignores the backlash that likely would have occurred if Greece, Turkey, and other countries had fallen within the Soviet orbit with no opposition from the United States.

Containment also called for extensive economic aid to assist the recovery of war-torn Western Europe. With many of the region’s nations economically and politically unstable, the United States feared that local Communist parties, directed by Moscow, would capitalize on their wartime record of resistance to the Nazis and come to power. “The patient is sinking while the doctors deliberate,” declared Secretary of State George C. Marshall. In mid-1947 Marshall asked troubled European nations to draw up a program “directed not against any country or doctrine but against hunger, poverty, desperation, and chaos.”

The Soviets participated in the first planning meeting, then departed rather than share economic data and submit to Western controls on the expenditure of the aid. The remaining 16 nations hammered out a request that finally came to $17 billion for a four-year period. In early 1948 Congress voted to fund the “Marshall Plan,” which helped underwrite the economic resurgence of Western Europe. It is generally regarded as one of the most successful foreign policy initiatives in U.S. history.

Postwar Germany was a special problem. It had been divided into U.S., Soviet, British, and French zones of occupation, with the former German capital of Berlin (itself divided into four zones) lying near the center of the Soviet zone. When the Western powers announced their intention to create a consolidated federal state from their zones, Stalin responded. On June 24, 1948, Soviet forces blockaded Berlin, cutting off all road and rail access from the West.

American leaders feared that losing Berlin would be a prelude to losing Germany and subsequently all of Europe. Therefore, in a successful demonstration of Western resolve known as the Berlin Airlift, Allied air forces took to the sky, flying supplies into Berlin. U.S., French, and British planes delivered nearly 2,250,000 tons of goods, including food and coal. Stalin lifted the blockade after 231 days and 277,264 flights.

By then, Soviet domination of Eastern Europe, and especially the Czech coup, had alarmed the Western Europeans.  The result, initiated by the Europeans, was a military alliance to complement economic efforts at containment. The Norwegian historian Geir Lundestad has called it “empire by invitation.”  In 1949 the United States and 11 other countries established the North Atlantic Treaty Organization (NATO).  An attack against one was to be considered an attack against all, to be met by appropriate force.  NATO was the first peacetime “entangling alliance” with powers outside the Western hemisphere in American history.

The next year, the United States defined its defense aims clearly. The National Security Council (NSC) – the forum where the President, Cabinet officers, and other executive branch members consider national security and foreign affairs issues – undertook a full-fledged review of American foreign and defense policy. The resulting document, known as NSC-68, signaled a new direction in American security policy. Based on the assumption that “the Soviet Union was engaged in a fanatical effort to seize control of all governments wherever possible,” the document committed America to assist allied nations anywhere in the world that seemed threatened by Soviet aggression. After the start of the Korean War, a reluctant Truman approved the document.  The United States proceeded to increase defense spending dramatically.

THE COLD WAR IN ASIA AND THE MIDDLE EAST

While seeking to prevent Communist ideology from gaining further adherents in Europe, the United States also responded to challenges elsewhere. In China, Americans worried about the advances of Mao Zedong and his Communist Party.  During World War II, the Nationalist government under Chiang Kai-shek and the Communist forces waged a civil war even as they fought the Japanese. Chiang had been a war-time ally, but his government was hopelessly inefficient and corrupt.  American policy makers had little hope of saving his regime and considered Europe vastly more important. With most American aid moving across the Atlantic, Mao’s forces seized power in 1949.  Chiang’s government fled to the island of Taiwan.  When China’s new ruler announced that he would support the Soviet Union against the “imperialist” United States, it appeared that Communism was spreading out of control, at least in Asia.

The Korean War brought armed conflict between the United States and China. The United States and the Soviet Union had divided Korea along the 38th parallel after liberating it from Japan at the end of World War II.  Originally a matter of military convenience, the dividing line became more rigid as both major powers set up governments in their respective occupation zones and continued to support them even after departing.

In June 1950, after consulting with, and obtaining the assent of, the Soviet Union, North Korean leader Kim Il-sung dispatched his Soviet-supplied army across the 38th parallel and attacked southward, overrunning Seoul. Truman, perceiving the North Koreans as Soviet pawns in the global struggle, readied American forces and ordered World War II hero General Douglas MacArthur to Korea. Meanwhile, the United States was able to secure a U.N. resolution branding North Korea as an aggressor. (The Soviet Union, which could have vetoed any action had it been occupying its seat on the Security Council, was boycotting the United Nations to protest a decision not to admit Mao’s new Chinese regime.)

The war seesawed back and forth. U.S. and South Korean forces were initially pushed into an enclave far to the south around the city of Pusan. A daring amphibious landing at Inchon, the port for the city of Seoul, drove the North Koreans back and threatened to occupy the entire peninsula. In November, China entered the war, sending massive forces across the Yalu River. U.N. forces, largely American, retreated once again in bitter fighting. Commanded by General Matthew B. Ridgway, they stopped the overextended Chinese, and slowly fought their way back to the 38th parallel. MacArthur meanwhile challenged Truman’s authority by attempting to orchestrate public support for bombing China and assisting an invasion of the mainland by Chiang Kai-shek’s forces. In April 1951, Truman relieved him of his duties and replaced him with Ridgway.

The Cold War stakes were high.  Mindful of the European priority, the U.S. government decided against sending more troops to Korea and was ready to settle for the prewar status quo.   The result was frustration among many Americans who could not understand the need for restraint. Truman’s popularity plunged to a 24-percent approval rating, the lowest to that time of any president since pollsters had begun to measure presidential popularity.  Truce talks began in July 1951. The two sides finally reached an agreement in July 1953, during the first term of Truman’s successor, Dwight Eisenhower.

Cold War struggles also occurred in the Middle East. The region’s strategic importance as a supplier of oil had provided much of the impetus for pushing the Soviets out of Iran in 1946.  But two years later, the United States officially recognized the new state of Israel 15 minutes after it was proclaimed – a decision Truman made over strong resistance from Marshall and the State Department. The result was an enduring dilemma – how to maintain ties with Israel while keeping good relations with bitterly anti-Israeli (and oil-rich) Arab states.

EISENHOWER AND THE COLD WAR

In 1953, Dwight D. Eisenhower became the first Republican president in 20 years.  A war hero rather than a career politician, he had a natural, common touch that made him widely popular. “I like Ike” was the campaign slogan of the time.  After serving as Supreme Commander of Allied Forces in Western Europe during World War II, Eisenhower had been army chief of staff, president of Columbia University, and military head of NATO before seeking the Republican presidential nomination.  Skillful at getting people to work together, he functioned as a strong public spokesman and an executive manager somewhat removed from detailed policy making.

Despite disagreements on detail, he shared Truman’s basic view of American foreign policy.  He, too, perceived Communism as a monolithic force struggling for world supremacy.  In his first inaugural address, he declared, “Forces of good and evil are massed and armed and opposed as rarely before in history. Freedom is pitted against slavery, lightness against dark.”

The new president and his secretary of state, John Foster Dulles, had argued that containment did not go far enough to stop Soviet expansion. Rather, a more aggressive policy of liberation was necessary, to free those subjugated by Communism. But when a democratic rebellion broke out in Hungary in 1956, the United States stood back as Soviet forces suppressed it.

Eisenhower’s basic commitment to contain Communism remained, and to that end he increased American reliance on a nuclear shield. The United States had created the first atomic bombs. In 1950 Truman had authorized the development of a new and more powerful hydrogen bomb.  Eisenhower, fearful that defense spending was out of control, reversed Truman’s NSC-68 policy of a large conventional military buildup.

Relying on what Dulles called “massive retaliation,” the administration signaled it would use nuclear weapons if the nation or its vital interests were attacked.

In practice, however, the nuclear option could be invoked only against the gravest attacks, while most actual Communist threats were peripheral. Eisenhower rejected the use of nuclear weapons in Indochina when the French were ousted by Vietnamese Communist forces in 1954. In 1956, British and French forces attacked Egypt following the Egyptian nationalization of the Suez Canal, and Israel invaded the Egyptian Sinai; the president exerted heavy pressure on all three countries to withdraw. Still, the nuclear threat may have been taken seriously by Communist China, which refrained not only from attacking Taiwan but also from occupying small islands held by the Nationalist Chinese just off the mainland. It may also have deterred a Soviet move against Berlin, which reemerged as a festering problem during Eisenhower’s last two years in office.

THE COLD WAR AT HOME

Not only did the Cold War shape U.S. foreign policy, it also had a profound effect on domestic affairs. Americans had long feared radical subversion.  These fears could at times be overdrawn, and used to justify otherwise unacceptable political restrictions, but it also was true that individuals under Communist Party discipline and many “fellow traveler” hangers-on gave their political allegiance not to the United States, but to the international Communist movement, or, practically speaking, to Moscow. During the Red Scare of 1919-1920, the government had attempted to remove perceived threats to American society. After World War II, it made strong efforts against Communism within the United States. Foreign events, espionage scandals, and politics created an anti-Communist hysteria.

When Republicans were victorious in the midterm congressional elections of 1946 and appeared ready to investigate subversive activity, President Truman established a Federal Employee Loyalty Program.  It had little impact on the lives of most civil servants, but a few hundred were dismissed, some unfairly.

In 1947 the House Committee on Un-American Activities investigated the motion-picture industry to determine whether Communist sentiments were being reflected in popular films. When some writers (who happened to be secret members of the Communist Party) refused to testify, they were cited for contempt and sent to prison.  After that, the film companies refused to hire anyone with a marginally questionable past.

In 1948, Alger Hiss, who had been an assistant secretary of state and an adviser to Roosevelt at Yalta, was publicly accused of being a Communist spy by Whittaker Chambers, a former Soviet agent. Hiss denied the accusation, but in 1950 he was convicted of perjury.  Subsequent evidence indicates that he was indeed guilty.

In 1949 the Soviet Union shocked Americans by testing its own atomic bomb. In 1950, the government uncovered a British-American spy network that had transferred atomic bomb secrets to the Soviet Union. Two of its operatives, Julius Rosenberg and his wife Ethel, were sentenced to death. Attorney General J. Howard McGrath declared there were many American Communists, each bearing “the germ of death for society.”

The most vigorous anti-Communist warrior was Senator Joseph R. McCarthy, a Republican from Wisconsin. He gained national attention in 1950 by claiming that he had a list of 205 known Communists in the State Department. Though McCarthy subsequently changed this figure several times and failed to substantiate any of his charges, he struck a responsive public chord.

McCarthy gained power when the Republican Party won control of the Senate in 1952. As a committee chairman, he now had a forum for his crusade. Relying on extensive press and television coverage, he continued to search for treachery among second-level officials in the Eisenhower administration.  Enjoying the role of a tough guy doing dirty but necessary work, he pursued presumed Communists with vigor.

McCarthy overstepped himself by challenging the U.S. Army when one of his assistants was drafted.  Television brought the hearings into millions of homes. Many Americans saw McCarthy’s savage tactics for the first time, and public support began to wane.  The Republican Party, which had found McCarthy useful in challenging a Democratic administration when Truman was president, began to see him as an embarrassment.  The Senate finally condemned him for his conduct.

McCarthy in many ways represented the worst domestic excesses of the Cold War.  As Americans repudiated him, it became natural for many to assume that the Communist threat at home and abroad had been grossly overblown.  As the country moved into the 1960s, anti-Communism became increasingly suspect, especially among intellectuals and opinion-shapers.

THE POSTWAR ECONOMY: 1945-1960

In the decade and a half after World War II, the United States experienced phenomenal economic growth and consolidated its position as the world’s richest country. Gross national product (GNP), a measure of all goods and services produced in the United States, jumped from about $200 billion in 1940 to $300 billion in 1950 to more than $500 billion in 1960. More and more Americans now considered themselves part of the middle class.

The growth had several sources. The economic stimulus provided by large-scale public spending for World War II helped get it started, and two basic middle-class needs did much to keep it going. The number of automobiles produced annually quadrupled between 1946 and 1955, and a housing boom, stimulated in part by easily affordable mortgages for returning servicemen, fueled the expansion. The rise in defense spending as the Cold War escalated also played a part.

After 1945 the major corporations in America grew even larger. There had been earlier waves of mergers in the 1890s and in the 1920s; in the 1950s another wave occurred.  Franchise operations like McDonald’s fast-food restaurants allowed small entrepreneurs to make themselves part of large, efficient enterprises.  Big American corporations also developed holdings overseas, where labor costs were often lower.

Workers found their own lives changing as industrial America changed. Fewer workers produced goods; more provided services. As early as 1956 a majority of employees held white-collar jobs, working as managers, teachers, salespersons, and office operatives.  Some firms granted a guaranteed annual wage, long-term employment contracts, and other benefits. With such changes, labor militancy was undermined and some class distinctions began to fade.

Farmers – at least those with small operations – faced tough times. Gains in productivity led to agricultural consolidation, and farming became a big business.  More and more family farmers left the land.

Other Americans moved too.  The West and the Southwest grew with increasing rapidity, a trend that would continue through the end of the century.  Sun Belt cities like Houston, Texas; Miami, Florida; Albuquerque, New Mexico; and Phoenix, Arizona, expanded rapidly. Los Angeles, California, moved ahead of Philadelphia, Pennsylvania, as the third largest U.S. city and then surpassed Chicago, metropolis of the Midwest. The 1970 census showed that California had displaced New York as the nation’s largest state.  By 2000, Texas had moved ahead of New York into second place.

An even more important form of movement led Americans out of inner cities into new suburbs, where they hoped to find affordable housing for the larger families spawned by the postwar baby boom. Developers like William J. Levitt built new communities – with homes that all looked alike – using the techniques of mass production. Levitt’s houses were prefabricated – partly assembled in a factory rather than at the final location – and modest, but his methods cut costs and allowed new owners to possess a part of the American dream.

As suburbs grew, businesses moved into the new areas. Large shopping centers containing a great variety of stores changed consumer patterns. The number of these centers rose from eight at the end of World War II to 3,840 in 1960.  With easy parking and convenient evening hours, customers could avoid city shopping entirely.  An unfortunate by-product was the “hollowing-out” of formerly busy urban cores.

New highways created better access to the suburbs and their shops. The Highway Act of 1956 provided $26 billion – the largest public works expenditure in U.S. history – to build more than 64,000 kilometers of limited-access interstate highways linking the country together.

Television, too, had a powerful impact on social and economic patterns. Developed in the 1930s, it was not widely marketed until after the war. In 1946 the country had fewer than 17,000 television sets. Three years later consumers were buying 250,000 sets a month, and by 1960 three-quarters of all families owned at least one set. In the middle of the decade, the average family watched television four to five hours a day. Popular shows for children included Howdy Doody Time and The Mickey Mouse Club; older viewers preferred situation comedies like I Love Lucy and Father Knows Best. Americans of all ages were exposed to increasingly sophisticated advertisements for products said to be necessary for the good life.

THE FAIR DEAL

The Fair Deal was the name given to President Harry Truman’s domestic program. Building on Roosevelt’s New Deal, Truman believed that the federal government should guarantee economic opportunity and social stability.  He struggled to achieve those ends in the face of fierce political opposition from legislators determined to reduce the role of government.

Truman’s first priority in the immediate postwar period was to make the transition to a peacetime economy. Servicemen wanted to come home quickly, but once they arrived they faced competition for housing and employment. The G.I. Bill, passed before the end of the war, helped ease servicemen back into civilian life by providing benefits such as guaranteed loans for home-buying and financial aid for industrial training and university education.

More troubling was labor unrest. As war production ceased, many workers found themselves without jobs. Others wanted pay increases they felt were long overdue. In 1946, 4.6 million workers went on strike, more than ever before in American history. They challenged the automobile, steel, and electrical industries. When they took on the railroads and soft-coal mines, Truman intervened to stop union excesses, but in so doing he alienated many workers.

While dealing with immediately pressing issues, Truman also provided a broader agenda for action. Less than a week after the war ended, he presented Congress with a 21-point program, which provided for protection against unfair employment practices, a higher minimum wage, greater unemployment compensation, and housing assistance. In the next several months, he added proposals for health insurance and atomic energy legislation. But this scattershot approach often left Truman’s priorities unclear.

Republicans were quick to attack. In the 1946 congressional elections they asked, “Had enough?” and voters responded that they had. Republicans, with majorities in both houses of Congress for the first time since 1928, were determined to reverse the liberal direction of the Roosevelt years.

Truman fought with the Congress as it cut spending and reduced taxes. In 1948 he sought reelection, despite polls indicating that he had little chance. After a vigorous campaign, Truman scored one of the great upsets in American politics, defeating the Republican nominee, Thomas Dewey, governor of New York. Reviving the old New Deal coalition, Truman held on to labor, farmers, and African-American voters.

When Truman finally left office in 1953, his Fair Deal had achieved only mixed success. In July 1948 he banned racial discrimination in federal government hiring practices and ordered an end to segregation in the military. The minimum wage had risen, and social security programs had expanded. A housing program brought some gains but left many needs unmet. National health insurance, aid-to-education measures, reformed agricultural subsidies, and his legislative civil rights agenda never made it through Congress. The president’s pursuit of the Cold War, ultimately his most important objective, made it especially difficult to develop support for social reform in the face of intense opposition.

EISENHOWER’S APPROACH

When Dwight Eisenhower succeeded Truman as president, he accepted the basic framework of government responsibility established by the New Deal, but sought to hold the line on programs and expenditures.  He termed his approach “dynamic conservatism” or “modern Republicanism,” which meant, he explained, “conservative when it comes to money, liberal when it comes to human beings.” A critic countered that Eisenhower appeared to argue that he would “strongly recommend the building of a great many schools…but not provide the money.”

Eisenhower’s first priority was to balance the budget after years of deficits.  He wanted to cut spending and taxes and maintain the value of the dollar.  Republicans were willing to risk unemployment to keep inflation in check.  Reluctant to stimulate the economy too much, they saw the country suffer three economic recessions in the eight years of the Eisenhower presidency, but none was very severe.

In other areas, the administration transferred control of offshore oil lands from the federal government to the states. It also favored private development of electrical power rather than the public approach the Democrats had initiated.  In general, its orientation was sympathetic to business.

Compared to Truman, Eisenhower had only a modest domestic program. When he was active in promoting a bill, it likely was to trim the New Deal legacy a bit – as in reducing agricultural subsidies or placing mild restrictions on labor unions. His disinclination to push fundamental change in either direction was in keeping with the spirit of the generally prosperous Fifties. He was one of the few presidents who left office as popular as when he entered it.

THE CULTURE OF THE 1950S

During the 1950s, many cultural commentators argued that a sense of uniformity pervaded American society. Conformity, they asserted, was numbingly common.  Though men and women had been forced into new employment patterns during World War II, once the war was over, traditional roles were reaffirmed. Men expected to be the breadwinners in each family; women, even when they worked, assumed their proper place was at home.  In his influential book, The Lonely Crowd, sociologist David Riesman called this new society “other-directed,” characterized by conformity, but also by stability.  Television, still very limited in the choices it gave its viewers, contributed to the homogenizing cultural trend by providing young and old with a shared experience reflecting accepted social patterns.

Yet beneath this seemingly bland surface, important segments of American society seethed with rebellion.  A number of writers, collectively known as the “beat generation,” went out of their way to challenge the patterns of respectability and shock the rest of the culture.   Stressing spontaneity and spirituality, they preferred intuition over reason, Eastern mysticism over Western institutionalized religion.

The literary work of the beats displayed their sense of alienation and quest for self-realization.  Jack Kerouac typed his best-selling novel On the Road on a 75-meter roll of paper. Lacking traditional punctuation and paragraph structure, the book glorified the possibilities of the free life. Poet Allen Ginsberg gained similar notoriety for his poem “Howl,” a scathing critique of modern, mechanized civilization. When police charged that it was obscene and seized the published version, Ginsberg successfully challenged the ruling in court.

Musicians and artists rebelled as well. Tennessee singer Elvis Presley was the most successful of several white performers who popularized a sensual and pulsating style of African-American music, which began to be called “rock and roll.” At first, he outraged middle-class Americans with his ducktail haircut and undulating hips. But in a few years his performances would seem relatively tame alongside the antics of later performers such as the British Rolling Stones. Similarly, it was in the 1950s that painters like Jackson Pollock discarded easels and laid out gigantic canvases on the floor, then applied paint, sand, and other materials in wild splashes of color. All of these artists and authors, whatever the medium, provided models for the wider and more deeply felt social revolution of the 1960s.

ORIGINS OF THE CIVIL RIGHTS MOVEMENT

African Americans became increasingly restive in the postwar years. During the war they had challenged discrimination in the military services and in the work force, and they had made limited gains. Millions of African Americans had left Southern farms for Northern cities, where they hoped to find better jobs. They found instead crowded conditions in urban slums. Now, African-American servicemen returned home, many intent on rejecting second-class citizenship.

Jackie Robinson dramatized the racial question in 1947 when he broke baseball’s color line and began playing in the major leagues. A member of the Brooklyn Dodgers, he faced hostility from opponents and, at first, from some of his own teammates. But an outstanding first season led to his acceptance and eased the way for other African-American players, who now left the Negro leagues to which they had been confined.

Government officials, and many other Americans, discovered the connection between racial problems and Cold War politics. As the leader of the free world, the United States sought support in Africa and Asia. Discrimination at home impeded the effort to win friends in other parts of the world.

Harry Truman supported the early civil rights movement. He personally believed in political equality, though not in social equality, and recognized the growing importance of the African-American urban vote. When apprised in 1946 of a spate of lynchings and anti-black violence in the South, he appointed a committee on civil rights to investigate discrimination.  Its report, To Secure These Rights, issued the next year, documented African Americans’ second-class status in American life and recommended numerous federal measures to secure the rights guaranteed to all citizens.

Truman responded by sending a 10-point civil rights program to Congress. Southern Democrats in Congress were able to block its enactment. A number of the angriest, led by Governor Strom Thurmond of South Carolina, formed a States’ Rights Party to oppose the president in 1948. Truman thereupon issued an executive order barring discrimination in federal employment, ordered equal treatment in the armed forces, and appointed a committee to work toward an end to military segregation, which was largely ended during the Korean War.

African Americans in the South in the 1950s still enjoyed few, if any, civil and political rights.  In general, they could not vote.  Those who tried to register faced the likelihood of beatings, loss of job, loss of credit, or eviction from their land. Occasional lynchings still occurred.  Jim Crow laws enforced segregation of the races in streetcars, trains, hotels, restaurants, hospitals, recreational facilities, and employment.

DESEGREGATION

The National Association for the Advancement of Colored People (NAACP) took the lead in efforts to overturn the judicial doctrine, established in the Supreme Court case Plessy v. Ferguson in 1896, that segregation of African-American and white students was constitutional if facilities were “separate but equal.”  That decree had been used for decades to sanction rigid segregation in all aspects of Southern life, where facilities were seldom, if ever, equal.

African Americans achieved their goal of overturning Plessy in 1954 when the Supreme Court – presided over by an Eisenhower appointee, Chief Justice Earl Warren – handed down its Brown v. Board of Education ruling. The Court declared unanimously that “separate educational facilities are inherently unequal,” and decreed that the “separate but equal” doctrine could no longer be used in public schools. A year later, the Supreme Court demanded that local school boards move “with all deliberate speed” to implement the decision.

Eisenhower, although sympathetic to the needs of the South as it faced a major transition, nonetheless acted to see that the law was upheld in the face of massive resistance from much of the South.  He faced a major crisis in Little Rock, Arkansas, in 1957, when Governor Orval Faubus attempted to block a desegregation plan calling for the admission of nine black students to the city’s previously all-white Central High School.  After futile efforts at negotiation, the president sent federal troops to Little Rock to enforce the plan.

Governor Faubus responded by ordering the Little Rock high schools closed for the 1958-59 school year. A federal court ordered them reopened the following year, and they did so in a tense atmosphere, with only a handful of African-American students enrolled. School desegregation thus proceeded at a slow and uncertain pace throughout much of the South.

Another milestone in the civil rights movement occurred in 1955 in Montgomery, Alabama. Rosa Parks, a 42-year-old African-American seamstress who was also secretary of the state chapter of the NAACP, sat down in the front of a bus in a section reserved by law and custom for whites. Ordered to move to the back, she refused. Police came and arrested her for violating the segregation statutes.  African-American leaders, who had been waiting for just such a case, organized a boycott of the bus system.

Martin Luther King Jr., a young minister of the Baptist church where the African Americans met, became a spokesman for the protest. “There comes a time,” he said, “when people get tired … of being kicked about by the brutal feet of oppression.” King was arrested, as he would be again and again; a bomb damaged the front of his house.  But African Americans in Montgomery sustained the boycott. About a year later, the Supreme Court affirmed that bus segregation, like school segregation, was unconstitutional. The boycott ended. The civil rights movement had won an important victory – and discovered its most powerful, thoughtful, and eloquent leader in Martin Luther King Jr.

African Americans also sought to secure their voting rights. Although the 15th Amendment to the U.S. Constitution guaranteed the right to vote, many states had found ways to circumvent the law.  The states would impose a poll (“head”) tax or a literacy test – typically much more stringently interpreted for African Americans – to prevent poor African Americans with little education from voting.  Eisenhower, working with Senate majority leader Lyndon B. Johnson, lent his support to a congressional effort to guarantee the vote. The Civil Rights Act of 1957, the first such measure in 82 years, marked a step forward, as it authorized federal intervention in cases where African Americans were denied the chance to vote. Yet loopholes remained, and so activists pushed successfully for the Civil Rights Act of 1960, which provided stiffer penalties for interfering with voting, but still stopped short of authorizing federal officials to register African Americans.

Relying on the efforts of African Americans themselves, the civil rights movement gained momentum in the postwar years. Working through the Supreme Court and through Congress, civil rights supporters had created the groundwork for a dramatic yet peaceful “revolution” in American race relations in the 1960s.



The New Deal and World War II

Roosevelt’s leadership through economic reconstruction, war

U.S. battleships burning at Pearl Harbor (National Archives)

U.S. battleships, following the Japanese attack on Pearl Harbor, December 7, 1941. (The National Archives)

(The following article is taken from the U.S. Department of State publication Outline of American History.)

“We must be the great arsenal of democracy.”
‑ President Franklin D. Roosevelt, 1941

ROOSEVELT AND THE NEW DEAL

In 1933 the new president, Franklin D. Roosevelt, brought an air of confidence and optimism that quickly rallied the people to the banner of his program, known as the New Deal. “The only thing we have to fear is fear itself,” the president declared in his inaugural address to the nation.

In one sense, the New Deal merely introduced social and economic reforms familiar to many Europeans for more than a generation. Moreover, the New Deal represented the culmination of a long-range trend toward abandonment of “laissez-faire” capitalism, going back to the regulation of the railroads in the 1880s, and the flood of state and national reform legislation introduced in the Progressive era of Theodore Roosevelt and Woodrow Wilson.

What was truly novel about the New Deal, however, was the speed with which it accomplished what previously had taken generations. Many of its reforms were hastily drawn and weakly administered; some actually contradicted others. Moreover, it never succeeded in restoring prosperity. Yet its actions provided tangible help for millions of Americans, laid the basis for a powerful new political coalition, and brought to the individual citizen a sharp revival of interest in government.

THE FIRST NEW DEAL

Banking and Finance. When Roosevelt took the presidential oath, the banking and credit system of the nation was in a state of paralysis. With astonishing rapidity the nation’s banks were first closed – and then reopened only if they were solvent. The administration adopted a policy of moderate currency inflation to start an upward movement in commodity prices and to afford some relief to debtors. New governmental agencies brought generous credit facilities to industry and agriculture. The Federal Deposit Insurance Corporation (FDIC) insured savings-bank deposits up to $5,000. Federal regulations were imposed upon the sale of securities on the stock exchange.

Unemployment. Roosevelt faced unprecedented mass unemployment. By the time he took office, as many as 13 million Americans – more than a quarter of the labor force – were out of work. Bread lines were a common sight in most cities. Hundreds of thousands roamed the country in search of food, work, and shelter. “Brother, can you spare a dime?” was the refrain of a popular song.

An early step for the unemployed came in the form of the Civilian Conservation Corps (CCC), a program that brought relief to young men between 18 and 25 years of age. CCC enrollees worked in camps administered by the army. About two million took part during the decade. They participated in a variety of conservation projects: planting trees to combat soil erosion and maintain national forests; eliminating stream pollution; creating fish, game, and bird sanctuaries; and conserving coal, petroleum, shale, gas, sodium, and helium deposits.

A Public Works Administration (PWA) provided employment for skilled construction workers on a wide variety of mostly medium- to large-sized projects. Among the most memorable of its many accomplishments were the Bonneville and Grand Coulee Dams in the Pacific Northwest, a new Chicago sewer system, the Triborough Bridge in New York City, and two aircraft carriers (Yorktown and Enterprise) for the U.S. Navy.

The Tennessee Valley Authority (TVA), both a work relief program and an exercise in public planning, developed the impoverished Tennessee River valley area through a series of dams built for flood control and hydroelectric power generation. Its provision of cheap electricity for the area stimulated some economic progress, but won it the enmity of private electric companies. New Dealers hailed it as an example of “grass roots democracy.”

The Federal Emergency Relief Administration (FERA), in operation from 1933 to 1935, distributed direct relief to hundreds of thousands of people, usually in the form of direct payments. Sometimes, it assumed the salaries of schoolteachers and other local public service workers. It also developed numerous small-scale public works projects, as did the Civil Works Administration (CWA) from late 1933 into the spring of 1934. Criticized as “make work,” the jobs funded ranged from ditch digging to highway repairs to teaching. Roosevelt and his key officials worried about costs but continued to favor unemployment programs based on work relief rather than welfare.

Agriculture. In the spring of 1933, the agricultural sector of the economy was in a state of collapse. It thereby provided a laboratory for the New Dealers’ belief that greater regulation would solve many of the country’s problems. In 1933, Congress passed the Agricultural Adjustment Act (AAA) to provide economic relief to farmers. The AAA proposed to raise crop prices by paying farmers a subsidy to compensate for voluntary cutbacks in production. Funds for the payments would be generated by a tax levied on industries that processed crops. By the time the act had become law, however, the growing season was well under way, and the AAA paid farmers to plow under their abundant crops. Crop reduction and further subsidies through the Commodity Credit Corporation, which purchased commodities to be kept in storage, drove output down and farm prices up.

Between 1932 and 1935, farm income increased by more than 50 percent, but only partly because of federal programs. During the same years that farmers were being encouraged to take land out of production – displacing tenants and sharecroppers – a severe drought hit the Plains states. Violent wind and dust storms during the 1930s created what became known as the “Dust Bowl.”  Crops were destroyed and farms ruined.

By 1940, 2.5 million people had moved out of the Plains states, the largest migration in American history. Of those, 200,000 moved to California. The migrants were not only farmers, but also professionals, retailers, and others whose livelihoods were connected to the health of the farm communities. Many ended up competing for seasonal jobs picking crops at extremely low wages.

The government provided aid in the form of the Soil Conservation Service, established in 1935. Farm practices that damaged the soil had intensified the impact of the drought. The service taught farmers measures to reduce erosion. In addition, almost 30,000 kilometers of tree windbreaks were planted to break the force of the winds.

Although the AAA had been mostly successful, it was abandoned in 1936, when its tax on food processors was ruled unconstitutional by the Supreme Court. Congress quickly passed a farm-relief act, which authorized the government to make payments to farmers who took land out of production for the purpose of soil conservation. In 1938, with a pro-New Deal majority on the Supreme Court, Congress reinstated the AAA.

By 1940 nearly six million farmers were receiving federal subsidies. New Deal programs also provided loans on surplus crops, insurance for wheat, and a system of planned storage to ensure a stable food supply. Economic stability for the farmer was substantially achieved, albeit at great expense and with extraordinary government oversight.

Industry and Labor. The National Recovery Administration (NRA), established in 1933 with the National Industrial Recovery Act (NIRA), attempted to end cut-throat competition by setting codes of fair competitive practice to generate more jobs and thus more buying. Although welcomed initially, the NRA was soon criticized for over-regulation and was unable to achieve industrial recovery. It was declared unconstitutional in 1935.

The NIRA had guaranteed to labor the right of collective bargaining through labor unions representing individual workers, but the NRA had failed to overcome strong business opposition to independent unionism. After its demise in 1935, Congress passed the National Labor Relations Act, which restated that guarantee and prohibited employers from unfairly interfering with union activities. It also created the National Labor Relations Board to supervise collective bargaining, administer elections, and ensure workers the right to choose the organization that should represent them in dealing with employers.

The great progress made in labor organization brought working people a growing sense of common interests, and labor’s power increased not only in industry but also in politics. Roosevelt’s Democratic Party benefited enormously from these developments.

THE SECOND NEW DEAL

In its early years, the New Deal sponsored a remarkable series of legislative initiatives and achieved significant increases in production and prices – but it did not bring an end to the Depression. As the sense of immediate crisis eased, new demands emerged. Businessmen mourned the end of “laissez-faire” and chafed under the regulations of the NIRA. Vocal attacks also mounted from the political left and right as dreamers, schemers, and politicians alike emerged with economic panaceas that drew wide audiences. Dr. Francis E. Townsend advocated generous old-age pensions. Father Charles Coughlin, the “radio priest,” called for inflationary policies and blamed international bankers in speeches increasingly peppered with anti-Semitic imagery. Most formidably, Senator Huey P. Long of Louisiana, an eloquent and ruthless spokesman for the displaced, advocated a radical redistribution of wealth. (If he had not been assassinated in September 1935, Long very likely would have launched a presidential challenge to Franklin Roosevelt in 1936.)

In the face of these pressures, President Roosevelt backed a new set of economic and social measures. Prominent among them were measures to fight poverty, create more work for the unemployed, and provide a social safety net.

The Works Progress Administration (WPA), the principal relief agency of the so-called second New Deal, was the biggest public works agency yet. It pursued small-scale projects throughout the country, constructing buildings, roads, airports, and schools. Actors, painters, musicians, and writers were employed through the Federal Theater Project, the Federal Art Project, and the Federal Writers Project. The National Youth Administration gave part-time employment to students, established training programs, and provided aid to unemployed youth. The WPA employed only about three million of the jobless at any one time; by the time it was disbanded in 1943, it had helped a total of nine million people.

The New Deal’s cornerstone, according to Roosevelt, was the Social Security Act of 1935. Social Security created a system of state-administered welfare payments for the poor, unemployed, and disabled based on matching state and federal contributions. It also established a national system of retirement benefits drawing on a “trust fund” created by employer and employee contributions. Many other industrialized nations had already enacted such programs, but calls for such an initiative in the United States had gone unheeded. Social Security today is the largest domestic program administered by the U.S. government.

To these, Roosevelt added the National Labor Relations Act, the “Wealth Tax Act” that increased taxes on the wealthy, the Public Utility Holding Company Act to break up large electrical utility conglomerates, and a Banking Act that greatly expanded the power of the Federal Reserve Board over the large private banks. Also notable was the establishment of the Rural Electrification Administration, which extended electricity into farming areas throughout the country.

A NEW COALITION

In the 1936 election, Roosevelt won a decisive victory over his Republican opponent, Alf Landon of Kansas. He was personally popular, and the economy seemed near recovery. He took 60 percent of the vote and carried all but two states. A broad new coalition aligned with the Democratic Party emerged, consisting of labor, most farmers, most urban ethnic groups, African Americans, and the traditionally Democratic South. The Republican Party received the support of business as well as middle-class residents of small towns and suburbs. This political alliance, with some variation and shifting, remained intact for several decades.

Roosevelt’s second term was a time of consolidation. The president made two serious political missteps: an ill-advised, unsuccessful attempt to enlarge the Supreme Court and a failed effort to “purge” increasingly recalcitrant Southern conservatives from the Democratic Party. When he cut high government spending, moreover, the economy collapsed. These events led to the rise of a conservative coalition in Congress that was unreceptive to new initiatives.

From 1932 to 1938 there was widespread public debate on the meaning of New Deal policies to the nation’s political and economic life. Americans clearly wanted the government to take greater responsibility for the welfare of ordinary people, however uneasy they might be about big government in general. The New Deal established the foundations of the modern welfare state in the United States. Roosevelt, perhaps the most imposing of the 20th-century presidents, had established a new standard of mass leadership.

No American leader, then or since, used the radio so effectively. In a radio address in 1938, Roosevelt declared: “Democracy has disappeared in several other great nations, not because the people of those nations disliked democracy, but because they had grown tired of unemployment and insecurity, of seeing their children hungry while they sat helpless in the face of government confusion and government weakness through lack of leadership.” Americans, he concluded, wanted to defend their liberties at any cost and understood that “the first line of the defense lies in the protection of economic security.”

WAR AND UNEASY NEUTRALITY

Before Roosevelt’s second term was well under way, his domestic program was overshadowed by the expansionist designs of totalitarian regimes in Japan, Italy, and Germany. In 1931 Japan had invaded Manchuria, crushed Chinese resistance, and set up the puppet state of Manchukuo. Italy, under Benito Mussolini, enlarged its boundaries in Libya and in 1935 conquered Ethiopia. Germany, under Nazi leader Adolf Hitler, militarized its economy and reoccupied the Rhineland (demilitarized by the Treaty of Versailles) in 1936. In 1938, Hitler incorporated Austria into the German Reich and demanded cession of the German-speaking Sudetenland from Czechoslovakia. By then, war seemed imminent.

The United States, disillusioned by the failure of the crusade for democracy in World War I, announced that in no circumstances could any country involved in the conflict look to it for aid. Neutrality legislation, enacted piecemeal from 1935 to 1937, prohibited trade in arms with any warring nations, required cash for all other commodities, and forbade American-flag merchant ships from carrying those goods. The objective was to prevent, at almost any cost, the involvement of the United States in a foreign war.

With the Nazi conquest of Poland in 1939 and the outbreak of World War II, isolationist sentiment increased, even though Americans clearly favored the victims of Hitler’s aggression and supported the Allied democracies, Britain and France. Roosevelt could only wait until public opinion regarding U.S. involvement was altered by events.

After the fall of France and the beginning of the German air war against Britain in mid-1940, the debate intensified between those in the United States who favored aiding the democracies and the antiwar faction known as the isolationists. Roosevelt did what he could to nudge public opinion toward intervention. The United States joined Canada in establishing the Permanent Joint Board on Defense and aligned with the Latin American republics in extending collective protection to the nations of the Western Hemisphere.

Congress, confronted with the mounting crisis, voted immense sums for rearmament, and in September 1940 passed the first peacetime conscription bill ever enacted in the United States. In that month also, Roosevelt concluded a daring executive agreement with British Prime Minister Winston Churchill. The United States gave the British Navy 50 “overage” destroyers in return for long-term leases on British air and naval bases in Newfoundland, Bermuda, and the Caribbean.

The 1940 presidential election campaign demonstrated that the isolationists, while vocal, were a minority. Roosevelt’s Republican opponent, Wendell Willkie, leaned toward intervention. Thus the November election yielded another majority for the president, making Roosevelt the only U.S. chief executive ever elected to a third term.

In early 1941, Roosevelt got Congress to approve the Lend-Lease Program, which enabled him to transfer arms and equipment to any nation (notably Great Britain, later the Soviet Union and China) deemed vital to the defense of the United States.  Total Lend-Lease aid by war’s end would amount to more than $50 billion.

Most remarkably, in August, he met with Prime Minister Churchill off the coast of Newfoundland. The two leaders issued a “joint statement of war aims,” which they called the Atlantic Charter. Bearing a remarkable resemblance to Woodrow Wilson’s Fourteen Points, it called for these objectives: no territorial aggrandizement; no territorial changes without the consent of the people concerned; the right of all people to choose their own form of government; the restoration of self-government to those deprived of it; economic collaboration between all nations; freedom from war, from fear, and from want for all peoples; freedom of the seas; and the abandonment of the use of force as an instrument of international policy.

America was now neutral in name only.

JAPAN, PEARL HARBOR, AND WAR

While most Americans anxiously watched the course of the European war, tension mounted in Asia. Taking advantage of an opportunity to improve its strategic position, Japan boldly announced a “new order” in which it would exercise hegemony over all of the Pacific. Battling for survival against Nazi Germany, Britain was unable to resist, abandoning its concession in Shanghai and temporarily closing the Chinese supply route from Burma. In the summer of 1940, Japan won permission from the weak Vichy government in France to use airfields in northern Indochina (North Vietnam). That September the Japanese formally joined the Rome-Berlin Axis. The United States countered with an embargo on the export of scrap iron to Japan.

In July 1941 the Japanese occupied southern Indochina (South Vietnam), signaling a probable move southward toward the oil, tin, and rubber of British Malaya and the Dutch East Indies. The United States, in response, froze Japanese assets and initiated an embargo on the one commodity Japan needed above all others – oil.

General Hideki Tojo became prime minister of Japan that October. In mid-November, he sent a special envoy to the United States to meet with Secretary of State Cordell Hull. Among other things, Japan demanded that the United States release Japanese assets and stop U.S. naval expansion in the Pacific. Hull countered with a proposal for Japanese withdrawal from all its conquests. The swift Japanese rejection on December 1 left the talks stalemated.

On the morning of December 7, 1941, Japanese carrier-based planes executed a devastating surprise attack against the U.S. Pacific Fleet at Pearl Harbor, Hawaii.

Twenty-one ships were destroyed or temporarily disabled; 323 aircraft were destroyed or damaged; 2,388 soldiers, sailors, and civilians were killed.  However, the U.S. aircraft carriers that would play such a critical role in the ensuing naval war in the Pacific were at sea and not anchored at Pearl Harbor.

American opinion, still divided about the war in Europe, was unified overnight by what President Roosevelt called “a date which will live in infamy.” On December 8, Congress declared a state of war with Japan; three days later Germany and Italy declared war on the United States.

MOBILIZATION FOR TOTAL WAR

The nation rapidly geared itself for mobilization of its people and its entire industrial capacity. Over the next three-and-a-half years, war industry achieved staggering production goals – 300,000 aircraft, 5,000 cargo ships, 60,000 landing craft, 86,000 tanks. Women workers, exemplified by “Rosie the Riveter,” played a bigger part in industrial production than ever before. Total strength of the U.S. armed forces at the end of the war was more than 12 million. All the nation’s activities – farming, manufacturing, mining, trade, labor, investment, communications, even education and cultural undertakings – were in some fashion brought under new and enlarged controls.

As a result of Pearl Harbor and the fear of Asian espionage, Americans also committed what was later recognized as an act of intolerance: the internment of Japanese Americans. In February 1942, nearly 120,000 Japanese Americans residing on the West Coast were removed from their homes and interned behind barbed wire in 10 wretched temporary camps, later to be moved to “relocation centers” outside isolated Southwestern towns.

Nearly 63 percent of these Japanese Americans were American-born U.S. citizens. A few were Japanese sympathizers, but no evidence of espionage ever surfaced. Others volunteered for the U.S. Army and fought with distinction and valor in two infantry units on the Italian front. Some served as interpreters and translators in the Pacific.

In 1983 a federal commission concluded that the internment had been a grave injustice, and in 1988 the U.S. government formally apologized and authorized limited payments to those Japanese Americans of that era who were still living.

THE WAR IN NORTH AFRICA AND EUROPE

Soon after the United States entered the war, the United States, Britain, and the Soviet Union (at war with Germany since June 22, 1941) decided that their primary military effort was to be focused in Europe.

Throughout 1942, British and German forces fought inconclusive back-and-forth battles across Libya and Egypt for control of the Suez Canal. But on October 23, British forces commanded by General Sir Bernard Montgomery struck at the Germans from El Alamein. Equipped with a thousand tanks, many made in America, they defeated General Erwin Rommel’s army in a grinding two-week campaign. On November 8, American and British armed forces landed in French North Africa. Squeezed between forces advancing from east and west, the Germans were pushed back and, after fierce resistance, surrendered in May 1943.

The year 1942 was also the turning point on the Eastern Front. The Soviet Union, suffering immense losses, stopped the Nazi invasion at the gates of Leningrad and Moscow. In the winter of 1942-43, the Red Army defeated the Germans at Stalingrad (Volgograd) and began the long offensive that would take them to Berlin in 1945.

In July 1943 British and American forces invaded Sicily and won control of the island in a month. During that time, Benito Mussolini fell from power in Italy. His successors began negotiations with the Allies and surrendered immediately after the invasion of the Italian mainland in September. However, the German Army had by then taken control of the peninsula. The fight against Nazi forces in Italy was bitter and protracted. Rome was not liberated until June 4, 1944.  As the Allies slowly moved north, they built airfields from which they made devastating air raids against railroads, factories, and weapon emplacements in southern Germany and central Europe, including the oil installations at Ploesti, Romania.

Late in 1943 the Allies, after much debate over strategy, decided to open a front in France to compel the Germans to divert far larger forces from the Soviet Union.

U.S. General Dwight D. Eisenhower was appointed Supreme Commander of Allied Forces in Europe. After immense preparations, on June 6, 1944, a U.S., British, and Canadian invasion army, protected by a greatly superior air force, landed on five beaches in Normandy. With the beachheads established after heavy fighting, more troops poured in, and pushed the Germans back in one bloody engagement after another. On August 25 Paris was liberated.

The Allied offensive stalled that fall, then suffered a setback in eastern Belgium during the winter, but by March the Americans and British were across the Rhine and the Russians were advancing irresistibly from the east. On May 7, 1945, Germany surrendered unconditionally.

THE WAR IN THE PACIFIC

U.S. troops were forced to surrender in the Philippines in early 1942, but the Americans rallied in the following months. General James “Jimmy” Doolittle led U.S. Army bombers on a raid over Tokyo in April; it had little actual military significance, but gave Americans an immense psychological boost.

In May, at the Battle of the Coral Sea – the first naval engagement in history in which all the fighting was done by carrier-based planes – a Japanese naval invasion fleet sent to strike at southern New Guinea and Australia was turned back by a U.S. task force in a close battle. A few weeks later, the naval Battle of Midway in the central Pacific resulted in the first major defeat of the Japanese Navy, which lost four aircraft carriers. Ending the Japanese advance across the central Pacific, Midway was the turning point.

Other battles also contributed to Allied success. The six-month land and sea battle for the island of Guadalcanal (August 1942-February 1943) was the first major U.S. ground victory in the Pacific. For most of the next two years, American and Australian troops fought their way northward from the South Pacific and westward from the Central Pacific, capturing the Solomons, the Gilberts, the Marshalls, and the Marianas in a series of amphibious assaults.

THE POLITICS OF WAR

Allied military efforts were accompanied by a series of important international meetings on the political objectives of the war. In January 1943 at Casablanca, Morocco, an Anglo-American conference decided that no peace would be concluded with the Axis and its Balkan satellites except on the basis of “unconditional surrender.” This term, insisted upon by Roosevelt, sought to assure the people of all the fighting nations that no separate peace negotiations would be carried on with representatives of Fascism and Nazism and there would be no compromise of the war’s idealistic objectives. Axis propagandists, of course, used it to assert that the Allies were engaged in a war of extermination.

At Cairo, in November 1943, Roosevelt and Churchill met with Nationalist Chinese leader Chiang Kai-shek to agree on terms for Japan, including the relinquishment of gains from past aggression. At Tehran, shortly afterward, Roosevelt, Churchill, and Soviet leader Joseph Stalin made basic agreements on the postwar occupation of Germany and the establishment of a new international organization, the United Nations.

In February 1945, the three Allied leaders met again at Yalta (now in Ukraine), with victory seemingly secure. There, the Soviet Union secretly agreed to enter the war against Japan three months after the surrender of Germany. In return, the USSR would gain effective control of Manchuria and receive the Japanese Kurile Islands as well as the southern half of Sakhalin Island. The eastern boundary of Poland was set roughly at the Curzon line of 1919, thus ceding to the USSR nearly half of Poland’s prewar territory. Discussion of reparations to be collected from Germany – payment demanded by Stalin and opposed by Roosevelt and Churchill – was inconclusive. Specific arrangements were made concerning Allied occupation in Germany and the trial and punishment of war criminals. Also at Yalta it was agreed that the great powers in the Security Council of the proposed United Nations should have the right of veto in matters affecting their security.

Two months after his return from Yalta, Franklin Roosevelt died of a cerebral hemorrhage while vacationing in Georgia. Few figures in U.S. history have been so deeply mourned, and for a time the American people suffered from a numbing sense of irreparable loss. Vice President Harry Truman, a former senator from Missouri, succeeded him.

WAR, VICTORY, AND THE BOMB

The final battles in the Pacific were among the war’s bloodiest. In June 1944, the Battle of the Philippine Sea effectively destroyed Japanese naval air power, forcing the resignation of Japanese Prime Minister Tojo. General Douglas MacArthur – who had reluctantly left the Philippines two years before to escape Japanese capture – returned to the islands in October. The accompanying Battle of Leyte Gulf, the largest naval engagement ever fought, was the final decisive defeat of the Japanese Navy. By February 1945, U.S. forces had taken Manila.

Next, the United States set its sights on the strategic island of Iwo Jima in the Bonin Islands, about halfway between the Marianas and Japan. The Japanese, trained to die fighting for the Emperor, made suicidal use of natural caves and rocky terrain. U.S. forces took the island by mid-March, but not before losing the lives of some 6,000 U.S. Marines. Nearly all the Japanese defenders perished. By now the United States was undertaking extensive air attacks on Japanese shipping and airfields and wave after wave of incendiary bombing attacks against Japanese cities.

At Okinawa (April 1-June 21, 1945), the Americans met even fiercer resistance. With few of the defenders surrendering, the U.S. Army and Marines were forced to wage a war of annihilation. Waves of kamikaze suicide planes pounded the offshore Allied fleet, inflicting more damage than at Leyte Gulf. Japan lost 90,000-100,000 troops and probably as many Okinawan civilians. U.S. losses were more than 11,000 killed and nearly 34,000 wounded. Most Americans saw the fighting as a preview of what they would face in a planned invasion of Japan.

The heads of the U.S., British, and Soviet governments met at Potsdam, a suburb outside Berlin, from July 17 to August 2, 1945, to discuss operations against Japan, the peace settlement in Europe, and a policy for the future of Germany. Perhaps presaging the coming end of the alliance, they had no trouble agreeing on vague matters of principle or on the practical issues of military occupation, but they reached no agreement on many tangible issues, including reparations.

The day before the Potsdam Conference began, U.S. nuclear scientists engaged in the secret Manhattan Project exploded an atomic bomb near Alamogordo, New Mexico. The test was the culmination of three years of intensive research in laboratories across the United States. It lay behind the Potsdam Declaration, issued on July 26 by the United States and Britain, promising that Japan would neither be destroyed nor enslaved if it surrendered. If Japan continued the war, however, it would meet “prompt and utter destruction.” President Truman, calculating that an atomic bomb might be used to gain Japan’s surrender more quickly and with fewer casualties than an invasion of the home islands, ordered that the bomb be used if the Japanese did not surrender by August 3.

A committee of U.S. military and political officials and scientists had considered the question of targets for the new weapon. Secretary of War Henry L. Stimson argued successfully that Kyoto, Japan’s ancient capital and a repository of many national and religious treasures, be taken out of consideration. Hiroshima, a center of war industries and military operations, became the first objective.

On August 6, a U.S. plane, the Enola Gay, dropped an atomic bomb on the city of Hiroshima. On August 8, the USSR declared war on Japan and attacked Japanese forces in Manchuria. On August 9, a second atomic bomb was dropped, this time on Nagasaki. The bombs destroyed large sections of both cities, with massive loss of life. On August 14, Japan agreed to the terms set at Potsdam. On September 2, 1945, Japan formally surrendered. Americans were relieved that the bombs hastened the end of the war. The realization of the full implications of nuclear weapons’ awesome destructiveness would come later.

Within a month, on October 24, the United Nations came into existence following the meeting of representatives of 50 nations in San Francisco, California. The charter they drafted outlined a world organization in which international differences could be discussed peacefully and common cause made against hunger and disease. In contrast to its rejection of U.S. membership in the League of Nations after World War I, the U.S. Senate promptly ratified the U.N. Charter by an 89 to 2 vote. This action confirmed the end of the spirit of isolationism as a dominating element in American foreign policy.

In November 1945 at Nuremberg, Germany, the criminal trials of 22 Nazi leaders, provided for at Potsdam, began. Before a panel of distinguished jurists from Britain, France, the Soviet Union, and the United States, the Nazis were accused not only of plotting and waging aggressive war but also of violating the laws of war and of humanity in the systematic genocide, known as the Holocaust, of European Jews and other peoples. The trials lasted more than 10 months. Of the 22 defendants, 19 were convicted, 12 of them sentenced to death. Similar proceedings would later be held against Japanese war leaders.


War, Prosperity, and Depression

U.S. triumphs in World War I, suffers through downturn

Depression era soup line, 1930s. (The American History Slide Collection, © (IRC))

(The following article is taken from the U.S. Department of State publication, Outline of American History.)

“The chief business of the American people is business.”
‑ President Calvin Coolidge, 1925

WAR AND NEUTRAL RIGHTS

To the American public of 1914, the outbreak of war in Europe – with Germany and Austria-Hungary fighting Britain, France, and Russia – came as a shock. At first the conflict seemed remote, but its economic and political effects were swift and deep. By 1915 U.S. industry, which had been mildly depressed, was prospering again with munitions orders from the Western Allies. Both sides used propaganda to arouse the public passions of Americans – a third of whom were either foreign-born or had one or two foreign-born parents. Moreover, Britain and Germany both acted against U.S. shipping on the high seas, bringing sharp protests from President Woodrow Wilson.

Britain, which controlled the seas, stopped and searched American carriers, confiscating “contraband” bound for Germany. Germany employed its major naval weapon, the submarine, to sink shipping bound for Britain or France. President Wilson warned that the United States would not forsake its traditional right as a neutral to trade with belligerent nations. He also declared that the nation would hold Germany to “strict accountability” for the loss of American vessels or lives. On May 7, 1915, a German submarine sank the British liner Lusitania, killing 1,198 people, 128 of them Americans. Wilson, reflecting American outrage, demanded an immediate halt to attacks on liners and merchant ships.

Anxious to avoid war with the United States, Germany agreed to give warning to commercial vessels – even if they flew the enemy flag – before firing on them. But after two more attacks – the sinking of the British steamer Arabic in August 1915, and the torpedoing of the French liner Sussex in March 1916 – Wilson issued an ultimatum threatening to break diplomatic relations unless Germany abandoned submarine warfare.  Germany agreed and refrained from further attacks through the end of the year.

Wilson won reelection in 1916, partly on the slogan: “He kept us out of war.”  Feeling he had a mandate to act as a peacemaker, he delivered a speech to the Senate, January 22, 1917, urging the warring nations to accept a “peace without victory.”

UNITED STATES ENTERS WORLD WAR I

On January 31, 1917, however, the German government resumed unrestricted submarine warfare.  After five U.S. vessels were sunk, Wilson on April 2, 1917, asked for a declaration of war. Congress quickly approved.  The government rapidly mobilized military resources, industry, labor, and agriculture. By October 1918, on the eve of Allied victory, a U.S. army of over 1,750,000 had been deployed in France.

In the summer of 1918, fresh American troops under the command of General John J. Pershing played a decisive role in stopping a last-ditch German offensive.  That fall, Americans were key participants in the Meuse-Argonne offensive, which cracked Germany’s vaunted Hindenburg Line.

President Wilson contributed greatly to an early end to the war by defining American war aims that characterized the struggle as being waged not against the German people but against their autocratic government. His Fourteen Points, submitted to the Senate in January 1918, called for: abandonment of secret international agreements; freedom of the seas; free trade between nations; reductions in national armaments; an adjustment of colonial claims in the interests of the inhabitants affected; self-rule for subjugated European nationalities; and, most importantly, the establishment of an association of nations to afford “mutual guarantees of political independence and territorial integrity to great and small states alike.”

In October 1918, the German government, facing certain defeat, appealed to Wilson to negotiate on the basis of the Fourteen Points.  After a month of secret negotiations that gave Germany no firm guarantees, an armistice (technically a truce, but actually a surrender) was concluded on November 11.

THE LEAGUE OF NATIONS

It was Wilson’s hope that the final treaty, drafted by the victors, would be even-handed, but the passion and material sacrifice of more than four years of war caused the European Allies to make severe demands. Persuaded that his greatest hope for peace, a League of Nations, would never be realized unless he made concessions, Wilson compromised somewhat on the issues of self-determination, open diplomacy, and other specifics. He successfully resisted French demands for the entire Rhineland, and somewhat moderated that country’s insistence upon charging Germany the whole cost of the war. The final agreement (the Treaty of Versailles), however, provided for French occupation of the coal- and iron-rich Saar Basin, and a very heavy burden of reparations upon Germany.

In the end, there was little left of Wilson’s proposals for a generous and lasting peace but the League of Nations itself, which he had made an integral part of the treaty.  Displaying poor judgment, however, the president had failed to involve leading Republicans in the treaty negotiations.  Returning with a partisan document, he then refused to make concessions necessary to satisfy Republican concerns about protecting American sovereignty.

With the treaty stalled in a Senate committee, Wilson began a national tour to appeal for support.  On September 25, 1919, physically ravaged by the rigors of peacemaking and the pressures of the wartime presidency, he suffered a crippling stroke.  Critically ill for weeks, he never fully recovered.  In two separate votes – November 1919 and March 1920 – the Senate once again rejected the Versailles Treaty and with it the League of Nations.

The League of Nations would never be capable of maintaining world order.  Wilson’s defeat showed that the American people were not yet ready to play a commanding role in world affairs.  His utopian vision had briefly inspired the nation, but its collision with reality quickly led to widespread disillusion with world affairs.  America reverted to its instinctive isolationism.

POSTWAR UNREST

The transition from war to peace was tumultuous. A postwar economic boom coexisted with rapid increases in consumer prices.  Labor unions that had refrained from striking during the war engaged in several major job actions.  During the summer of 1919, race riots occurred, reflecting apprehension over the emergence of a “New Negro” who had seen military service or gone north to work in war industry.

Reaction to these events merged with a widespread national fear of a new international revolutionary movement.  In 1917, the Bolsheviks had seized power in Russia; after the war, they attempted revolutions in Germany and Hungary.  By 1919, it seemed they had come to America.  Excited by the Bolshevik example, large numbers of militants split from the Socialist Party to found what would become the Communist Party of the United States.   In April 1919, the postal service intercepted nearly 40 bombs addressed to prominent citizens. Attorney General A. Mitchell Palmer’s residence in Washington was bombed.  Palmer, in turn, authorized federal roundups of radicals and deported many who were not citizens.   Major strikes were often blamed on radicals and depicted as the opening shots of a revolution.

Palmer’s dire warnings fueled a “Red Scare” that subsided by mid-1920.  Even a murderous bombing in Wall Street in September failed to reawaken it.  From 1919 on, however, a current of militant hostility toward revolutionary communism would simmer not far beneath the surface of American life.

THE BOOMING 1920s

Wilson, distracted by the war, then laid low by his stroke, had mishandled almost every postwar issue. The booming economy began to collapse in mid-1920.  The Republican candidates for president and vice president, Warren G. Harding and Calvin Coolidge, easily defeated their Democratic opponents, James M. Cox and Franklin D. Roosevelt.

Following ratification of the 19th Amendment to the Constitution, women voted in a presidential election for the first time.

The first two years of Harding’s administration saw a continuance of the economic recession that had begun under Wilson.  By 1923, however, prosperity was back.  For the next six years the country enjoyed the strongest economy in its history, at least in urban areas.  Governmental economic policy during the 1920s was eminently conservative. It was based upon the belief that if government fostered private business, benefits would radiate out to most of the rest of the population.

Accordingly, the Republicans tried to create the most favorable conditions for U.S. industry. The Fordney-McCumber Tariff of 1922 and the Hawley-Smoot Tariff of 1930 brought American trade barriers to new heights, guaranteeing U.S. manufacturers in one field after another a monopoly of the domestic market, but blocking a healthy trade with Europe that would have reinvigorated the international economy.  Occurring at the beginning of the Great Depression, Hawley-Smoot triggered retaliation from other manufacturing nations and contributed greatly to a collapsing cycle of world trade that intensified world economic misery.

The federal government also started a program of tax cuts, reflecting Treasury Secretary Andrew Mellon’s belief that high taxes on individual incomes and corporations discouraged investment in new industrial enterprises. Congress, in laws passed between 1921 and 1929, responded favorably to his proposals.

“The chief business of the American people is business,” declared Calvin Coolidge, the Vermont-born vice president who succeeded to the presidency in 1923 after Harding’s death, and was elected in his own right in 1924. Coolidge hewed to the conservative economic policies of the Republican Party, but he was a much abler administrator than the hapless Harding, whose administration was mired in charges of corruption in the months before his death.

Throughout the 1920s, private business received substantial encouragement, including construction loans, profitable mail-carrying contracts, and other indirect subsidies. The Transportation Act of 1920, for example, had already restored to private management the nation’s railways, which had been under government control during the war. The Merchant Marine, which had been owned and largely operated by the government, was sold to private operators.

Republican policies in agriculture, however, faced mounting criticism, for farmers shared least in the prosperity of the 1920s. The period since 1900 had been one of rising farm prices.  The unprecedented wartime demand for U.S. farm products had provided a strong stimulus to expansion.  But by the close of 1920, with the abrupt end of wartime demand, the commercial agriculture of staple crops such as wheat and corn fell into sharp decline. Many factors accounted for the depression in American agriculture, but foremost was the loss of foreign markets. This was partly in reaction to American tariff policy, but also because excess farm production was a worldwide phenomenon.  When the Great Depression struck in the 1930s, it devastated an already fragile farm economy.

The distress of agriculture aside, the Twenties brought the best life ever to most Americans.  It was the decade in which the ordinary family purchased its first automobile, obtained refrigerators and vacuum cleaners, listened to the radio for entertainment, and went regularly to motion pictures. Prosperity was real and broadly distributed.  The Republicans profited politically, as a result, by claiming credit for it.

TENSIONS OVER IMMIGRATION

During the 1920s, the United States sharply restricted foreign immigration for the first time in its history.  Large inflows of foreigners long had created a certain amount of social tension, but most had been of Northern European stock and, if not quickly assimilated, at least possessed a certain commonality with most Americans.  By the end of the 19th century, however, the flow was predominantly from Southern and Eastern Europe.  According to the census of 1900, the population of the United States was just over 76 million.  Over the next 15 years, more than 15 million immigrants entered the country.

Around two-thirds of the inflow consisted of “newer” nationalities and ethnic groups – Russian Jews, Poles, Slavic peoples, Greeks, southern Italians.  They were non-Protestant, non-“Nordic,” and, many Americans feared, nonassimilable.  They did hard, often dangerous, low-paying work – but were accused of driving down the wages of native-born Americans.  Settling in squalid urban ethnic enclaves, the new immigrants were seen as maintaining Old World customs, getting along with very little English, and supporting unsavory political machines that catered to their needs.  Nativists wanted to send them back to Europe; social workers wanted to Americanize them.  Both agreed that they were a threat to American identity.

Halted by World War I, mass immigration resumed in 1919, but quickly ran into determined opposition from groups as varied as the American Federation of Labor and the reorganized Ku Klux Klan.  Millions of old-stock Americans who belonged to neither organization accepted commonly held assumptions about the inferiority of non-Nordics and backed restrictions.  Of course, there were also practical arguments in favor of a maturing nation putting some limits on new arrivals.

In 1921, Congress passed a sharply restrictive emergency immigration act.  It was supplanted in 1924 by the Johnson-Reed National Origins Act, which established an immigration quota for each nationality.  Those quotas were pointedly based on the census of 1890, a year in which the newer immigration had not yet left its mark.  Bitterly resented by Southern and Eastern European ethnic groups, the new law reduced immigration to a trickle.  After 1929, the economic impact of the Great Depression would reduce the trickle to a reverse flow – until refugees from European fascism began to press for admission to the country.

CLASH OF CULTURES

Some Americans expressed their discontent with the character of modern life in the 1920s by focusing on family and religion, as an increasingly urban, secular society came into conflict with older rural traditions. Fundamentalist preachers such as Billy Sunday provided an outlet for many who yearned for a return to a simpler past.

Perhaps the most dramatic demonstration of this yearning was the religious fundamentalist crusade that pitted Biblical texts against the Darwinian theory of biological evolution. In the 1920s, bills to prohibit the teaching of evolution began appearing in Midwestern and Southern state legislatures. Leading this crusade was the aging William Jennings Bryan, long a spokesman for the values of the countryside as well as a progressive politician.  Bryan skillfully reconciled his anti-evolutionary activism with his earlier economic radicalism, declaring that evolution “by denying the need or possibility of spiritual regeneration, discourages all reforms.”

The issue came to a head in 1925, when a young high school teacher, John Scopes, was prosecuted for violating a Tennessee law that forbade the teaching of evolution in the public schools.  The case became a national spectacle, drawing intense news coverage.  The American Civil Liberties Union retained the renowned attorney Clarence Darrow to defend Scopes.  Bryan wrangled an appointment as special prosecutor, then foolishly allowed Darrow to call him as a hostile witness.  Bryan’s confused defense of Biblical passages as literal rather than metaphorical truth drew widespread criticism.  Scopes, nearly forgotten in the fuss, was convicted, but his fine was reversed on a technicality.  Bryan died shortly after the trial ended.  The state wisely declined to retry Scopes.  Urban sophisticates ridiculed fundamentalism, but it continued to be a powerful force in rural, small-town America.

Another example of a powerful clash of cultures – one with far greater national consequences – was Prohibition. In 1919, after almost a century of agitation, the 18th Amendment to the Constitution was enacted, prohibiting the manufacture, sale, or transportation of alcoholic beverages.  Intended to eliminate the saloon and the drunkard from American society, Prohibition created thousands of illegal drinking places called “speakeasies,” made intoxication fashionable, and created a new form of criminal activity  – the transportation of illegal liquor, or “bootlegging.” Widely observed in rural America, openly evaded in urban America, Prohibition was an emotional issue in the prosperous Twenties.  When the Depression hit, it seemed increasingly irrelevant.  The 18th Amendment would be repealed in 1933.

Fundamentalism and Prohibition were aspects of a larger reaction to a modernist social and intellectual revolution most visible in changing manners and morals that caused the decade to be called the Jazz Age, the Roaring Twenties, or the era of “flaming youth.”  World War I had overturned the Victorian social and moral order.  Mass prosperity enabled an open and hedonistic life style for the young middle classes.

The leading intellectuals were supportive.  H.L. Mencken, the decade’s most important social critic, was unsparing in denouncing sham and venality in American life.  He usually found these qualities in rural areas and among businessmen.  His counterparts of the progressive movement had believed in “the people” and sought to extend democracy.  Mencken, an elitist and admirer of Nietzsche, bluntly called democratic man a boob and characterized the American middle class as the “booboisie.”

Novelist F. Scott Fitzgerald captured the energy, turmoil, and disillusion of the decade in such works as The Beautiful and the Damned (1922) and The Great Gatsby (1925).  Sinclair Lewis, the first American to win a Nobel Prize for literature, satirized mainstream America in Main Street (1920) and Babbitt (1922).  Ernest Hemingway vividly portrayed the malaise wrought by the war in The Sun Also Rises (1926) and A Farewell to Arms (1929).  Fitzgerald, Hemingway, and many other writers dramatized their alienation from America by spending much of the decade in Paris.

African-American culture flowered.  Between 1910 and 1930, huge numbers of African Americans moved from the South to the North in search of jobs and personal freedom.  Most settled in urban areas, especially New York City’s Harlem, Detroit, and Chicago.  In 1910 W.E.B. Du Bois and other intellectuals had founded the National Association for the Advancement of Colored People (NAACP), which helped African Americans gain a national voice that would grow in importance with the passing years.

An African-American literary and artistic movement, called the “Harlem Renaissance,” emerged. Like the “Lost Generation,” its writers, such as the poets Langston Hughes and Countee Cullen, rejected middle-class values and conventional literary forms, even as they addressed the realities of African-American experience.  African-American musicians – Duke Ellington, King Oliver, Louis Armstrong – first made jazz a staple of American culture in the 1920s.

THE GREAT DEPRESSION

In October 1929 the booming stock market crashed, wiping out many investors.  The collapse did not in itself cause the Great Depression, although it reflected excessively easy credit policies that had allowed the market to get out of hand.  It also aggravated fragile economies in Europe that had relied heavily on American loans.  Over the next three years, an initial American recession became part of a worldwide depression.  Business houses closed their doors, factories shut down, banks failed with the loss of depositors’ savings. Farm income fell some 50 percent.  By November 1932, approximately one of every five American workers was unemployed.

The presidential campaign of 1932 was chiefly a debate over the causes and possible remedies of the Great Depression.  President Herbert Hoover, unlucky in entering the White House only eight months before the stock market crash, had tried harder than any other president before him to deal with economic hard times.  He had attempted to organize business, had sped up public works schedules, established the Reconstruction Finance Corporation to support businesses and financial institutions, and had secured from a reluctant Congress an agency to underwrite home mortgages.  Nonetheless, his efforts had little impact, and he was a picture of defeat.

His Democratic opponent, Franklin D. Roosevelt, already popular as the governor of New York during the developing crisis, radiated infectious optimism.  Prepared to use the federal government’s authority for even bolder experimental remedies, he scored a smashing victory – receiving 22,800,000 popular votes to Hoover’s 15,700,000. The United States was about to enter a new era of economic and political change.


Discontent and Reform: Late-19th century American farmers experienced recurring periods of hardship

Suffragists march on Pennsylvania Avenue, Washington, D.C., March 3, 1913. (Library of Congress)

(The following article is taken from the U.S. Department of State publication, Outline of American History.)

“A great democracy will be neither great nor a democracy if it is not progressive.”
— Former President Theodore Roosevelt, circa 1910

AGRARIAN DISTRESS AND THE RISE OF POPULISM

In spite of their remarkable progress, late-19th century American farmers experienced recurring periods of hardship.  Mechanical improvements greatly increased yield per hectare.  The amount of land under cultivation grew rapidly throughout the second half of the century, as the railroads and the gradual displacement of the Plains Indians opened up new areas for western settlement. A similar expansion of agricultural lands in countries such as Canada, Argentina, and Australia compounded these problems in the international market, where much of U.S. agricultural production was now sold.  Everywhere, heavy supply pushed the price of agricultural commodities downward.

Midwestern farmers were increasingly restive over what they considered excessive railroad freight rates to move their goods to market.  They believed that the protective tariff, a subsidy to big business, drove up the price of their increasingly expensive equipment. Squeezed by low market prices and high costs, they resented ever-heavier debt loads and the banks that held their mortgages.  Even the weather was hostile.  During the late 1880s droughts devastated the western Great Plains and bankrupted thousands of settlers.

In the South, the end of slavery brought major changes.  Much agricultural land was now worked by sharecroppers, tenants who gave up to half of their crop to a landowner for rent, seed, and essential supplies. An estimated 80 percent of the South’s African-American farmers and 40 percent of its white ones lived under this debilitating system.   Most were locked in a cycle of debt, from which the only hope of escape was increased planting. This led to the over-production of cotton and tobacco, and thus to declining prices and the further exhaustion of the soil.

The first organized effort to address general agricultural problems was by the Patrons of Husbandry, a farmers’ group popularly known as the Grange movement. Launched in 1867 by employees of the U.S. Department of Agriculture, the Granges focused initially on social activities to counter the isolation most farm families encountered. Women’s participation was actively encouraged. Spurred by the Panic of 1873, the Grange soon grew to 20,000 chapters and one-and-a-half million members.

The Granges set up their own marketing systems, stores, processing plants, factories, and cooperatives, but most ultimately failed.  The movement also enjoyed some political success. During the 1870s, a few states passed “Granger laws,” limiting railroad and warehouse fees.

By 1880 the Grange was in decline and being replaced by the Farmers’ Alliances, which were similar in many respects but more overtly political.  By 1890 the alliances, initially autonomous state organizations, had about 1.5 million members from New York to California.  A parallel African-American group, the Colored Farmers National Alliance, claimed over a million members.  Federating into two large Northern and Southern blocs, the alliances promoted elaborate economic programs to “unite the farmers of America for their protection against class legislation and the encroachments of concentrated capital.”

By 1890 the level of agrarian distress, fueled by years of hardship and hostility toward the McKinley tariff, was at an all-time high. Working with sympathetic Democrats in the South or small third parties in the West, the Farmers’ Alliances made a push for political power.  A third political party, the People’s (or Populist) Party, emerged. Never before in American politics had there been anything like the Populist fervor that swept the prairies and cotton lands. The elections of 1890 brought the new party into power in a dozen Southern and Western states, and sent a score of Populist senators and representatives to Congress.

The first Populist convention was in 1892.  Delegates from farm, labor, and reform organizations met in Omaha, Nebraska, determined to overturn a U.S. political system they viewed as hopelessly corrupted by the industrial and financial trusts. Their platform stated:

We are met, in the midst of a nation brought to the verge of moral, political, and material ruin. Corruption dominates the ballot-box, the legislatures, the Congress, and touches even the ermine of the bench [courts]. … From the same prolific womb of governmental injustice we breed the two great classes – tramps and millionaires.

The pragmatic portion of their platform called for the nationalization of the railroads; a low tariff; loans secured by non-perishable crops stored in government-owned warehouses; and, most explosively, currency inflation through Treasury purchase and the unlimited coinage of silver at the “traditional” ratio of 16 ounces of silver to one ounce of gold.

The Populists showed impressive strength in the West and South, and their candidate for president polled more than a million votes. But the currency question soon overshadowed all other issues. Agrarian spokesmen, convinced that their troubles stemmed from a shortage of money in circulation, argued that increasing the volume of money would indirectly raise prices for farm products and drive up industrial wages, thus allowing debts to be paid with inflated currency. Conservative groups and the financial classes, on the other hand, responded that the 16:1 ratio valued silver at nearly twice its market price.  A policy of unlimited purchase would denude the U.S. Treasury of all its gold holdings, sharply devalue the dollar, and destroy the purchasing power of the working and middle classes.  Only the gold standard, they said, offered stability.

The financial panic of 1893 heightened the tension of this debate. Bank failures abounded in the South and Midwest; unemployment soared and crop prices fell badly. The crisis and President Grover Cleveland’s defense of the gold standard sharply divided the Democratic Party. Democrats who were silver supporters went over to the Populists as the presidential elections of 1896 neared.

The Democratic convention that year was swayed by one of the most famous speeches in U.S. political history. Pleading with the convention not to “crucify mankind on a cross of gold,” William Jennings Bryan, the young Nebraskan champion of silver, won the Democrats’ presidential nomination. The Populists also endorsed Bryan.

In the epic contest that followed, Bryan carried almost all the Southern and Western states.  But he lost the more populated, industrial North and East – and the election – to Republican candidate William McKinley.

The following year the country’s finances began to improve, in part owing to the discovery of gold in Alaska and the Yukon. This provided a basis for a conservative expansion of the money supply.  In 1898 the Spanish-American War drew the nation’s attention further from Populist issues.  Populism and the silver issue were dead.  Many of the movement’s other reform ideas, however, lived on.

THE STRUGGLES OF LABOR

The life of a 19th-century American industrial worker was hard. Even in good times wages were low, hours long, and working conditions hazardous.  Little of the wealth that the growth of the nation had generated went to its workers.  Moreover, women and children made up a high percentage of the work force in some industries and often received but a fraction of the wages a man could earn. Periodic economic crises swept the nation, further eroding industrial wages and producing high levels of unemployment.

At the same time, technological improvements, which added so much to the nation’s productivity, continually reduced the demand for skilled labor. Yet the unskilled labor pool was constantly growing, as unprecedented numbers of immigrants – 18 million between 1880 and 1910 – entered the country, eager for work.

Before 1874, when Massachusetts passed the nation’s first legislation limiting the workday of women and child factory workers to 10 hours, virtually no labor legislation existed in the country.  It was not until the 1930s that the federal government would become actively involved. Until then, the field was left to the state and local authorities, few of whom were as responsive to the workers as they were to wealthy industrialists.

The laissez-faire capitalism that dominated the second half of the 19th century and fostered huge concentrations of wealth and power was backed by a judiciary that time and again ruled against those who challenged the system.  In this, the courts were merely following the prevailing philosophy of the times.  Drawing on a simplified understanding of Darwinian science, many social thinkers believed that both the growth of large business at the expense of small enterprise and the wealth of a few alongside the poverty of many were “survival of the fittest” and an unavoidable by-product of progress.

American workers, especially the skilled among them, appear to have lived at least as well as their counterparts in industrial Europe.  Still, the social costs were high.  As late as the year 1900, the United States had the highest job-related fatality rate of any industrialized nation in the world. Most industrial workers still worked a 10-hour day (12 hours in the steel industry), yet earned less than the minimum deemed necessary for a decent life. The number of children in the work force doubled between 1870 and 1900.

The first major effort to organize workers’ groups on a nationwide basis appeared with the Noble Order of the Knights of Labor in 1869. Originally a secret, ritualistic society organized by Philadelphia garment workers and advocating a cooperative program, it was open to all workers, including African Americans, women, and farmers. The Knights grew slowly until its railway workers’ unit won a strike against the great railroad baron, Jay Gould, in 1885.  Within a year they added 500,000 workers to their rolls, but, not attuned to pragmatic trade unionism and unable to repeat this success, the Knights soon fell into a decline.

Their place in the labor movement was gradually taken by the American Federation of Labor (AFL).  Rather than open membership to all, the AFL, under former cigar union official Samuel Gompers, was a group of unions focused on skilled workers.  Its objectives were “pure and simple” and apolitical: increasing wages, reducing hours, and improving working conditions. It did much to turn the labor movement away from the socialist views of most European labor movements.

Nonetheless, both before the founding of the AFL and after, American labor history was violent.  In the Great Rail Strike of 1877, rail workers across the nation went out in response to a 10-percent pay cut. Attempts to break the strike led to rioting and wide-scale destruction in several cities: Baltimore, Maryland; Chicago, Illinois; Pittsburgh, Pennsylvania; Buffalo, New York; and San Francisco, California. Federal troops had to be sent to several locations before the strike was ended.

Nine years later, in Chicago’s Haymarket Square incident, someone threw a bomb at police about to break up an anarchist rally in support of an ongoing strike at the McCormick Harvester Company in Chicago. In the ensuing melee, seven policemen and at least four workers were reported killed.   Some 60 police officers were injured.

In 1892, at Carnegie’s steel works in Homestead, Pennsylvania, a group of 300 Pinkerton detectives the company had hired to break a bitter strike by the Amalgamated Association of Iron, Steel, and Tin Workers fought a fierce and losing gun battle with strikers.  The National Guard was called in to protect non-union workers and the strike was broken. Unions were not let back into the plant until 1937.

In 1894, wage cuts at the Pullman Company just outside Chicago led to a strike, which, with the support of the American Railway Union, soon tied up much of the country’s rail system. As the situation deteriorated, U.S. Attorney General Richard Olney, himself a former railroad lawyer, deputized over 3,000 men in an attempt to keep the rails open. This was followed by a federal court injunction against union interference with the trains. When rioting ensued, President Cleveland sent in federal troops, and the strike was eventually broken.

The most militant of the strike-favoring unions was the Industrial Workers of the World (IWW). Formed from an amalgam of unions fighting for better conditions in the West’s mining industry, the IWW, or “Wobblies” as they were commonly known, gained particular prominence from the Colorado mine clashes of 1903 and the singularly brutal fashion in which they were put down. Influenced by militant anarchism and openly calling for class warfare, the Wobblies gained many adherents after they won a difficult strike battle in the textile mills of Lawrence, Massachusetts, in 1912. Their call for work stoppages in the midst of World War I, however, led to a government crackdown in 1917 that virtually destroyed them.

THE REFORM IMPULSE

The presidential election of 1900 gave the American people a chance to pass judgment on the Republican administration of President McKinley, especially its foreign policy.  Meeting at Philadelphia, the Republicans expressed jubilation over the successful outcome of the war with Spain, the restoration of prosperity, and the effort to obtain new markets through the Open Door policy. McKinley easily defeated his opponent, once again William Jennings Bryan.  But the president did not live to enjoy his victory. In September 1901, while attending an exposition in Buffalo, New York, he was shot by an assassin, the third president to be assassinated since the Civil War.

Theodore Roosevelt, McKinley’s vice president, assumed the presidency. Roosevelt’s accession coincided with a new epoch in American political life and international relations. The continent was peopled; the frontier was disappearing. A small, formerly struggling republic had become a world power. The country’s political foundations had endured the vicissitudes of foreign and civil war, the tides of prosperity and depression. Immense strides had been made in agriculture and industry. Free public education had been largely realized and a free press maintained. The ideal of religious freedom had been sustained. The influence of big business was now more firmly entrenched than ever, however, and local and municipal government often was in the hands of corrupt politicians.

In response to the excesses of 19th-century capitalism and political corruption, a reform movement arose called “progressivism,” which gave American politics and thought its special character from approximately 1890 until the American entry into World War I in 1917. The Progressives had diverse objectives.  In general, however, they saw themselves as engaged in a democratic crusade against the abuses of urban political bosses and the corrupt “robber barons” of big business. Their goals were greater democracy and social justice, honest government, more effective regulation of business, and a revived commitment to public service.   They believed that expanding the scope of government would ensure the progress of U.S. society and the welfare of its citizens.

The years 1902 to 1908 marked the era of greatest reform activity, as writers and journalists strongly protested practices and principles inherited from the 18th‑century rural republic that were proving inadequate for a 20th‑century urban state. Years before, in 1873, the celebrated author Mark Twain had exposed American society to critical scrutiny in The Gilded Age. Now, trenchant articles dealing with trusts, high finance, impure foods, and abusive railroad practices began to appear in the daily newspapers and in such popular magazines as McClure’s and Collier’s. Their authors, such as the journalist Ida M. Tarbell, who crusaded against the Standard Oil Trust, became known as “muckrakers.”

In his sensational novel, The Jungle, Upton Sinclair exposed unsanitary conditions in the great Chicago meat-packing houses and condemned the grip of the beef trust on the nation’s meat supply. Theodore Dreiser, in his novels The Financier and The Titan, made it easy for laymen to understand the machinations of big business.  Frank Norris’s The Octopus assailed amoral railroad management; his The Pit depicted secret manipulations on the Chicago grain market.  Lincoln Steffens’s The Shame of the Cities bared local political corruption. This “literature of exposure” roused people to action.

The hammering impact of uncompromising writers and an increasingly aroused public spurred political leaders to take practical measures. Many states enacted laws to improve the conditions under which people lived and worked. At the urging of such prominent social critics as Jane Addams, child labor laws were strengthened and new ones adopted, raising age limits, shortening work hours, restricting night work, and requiring school attendance.

ROOSEVELT’S REFORMS

By the early 20th century, most of the larger cities and more than half the states had established an eight-hour day on public works. Equally important were the workmen's compensation laws, which made employers legally responsible for injuries sustained by employees at work. New revenue laws were also enacted, which, by taxing inheritances, incomes, and the property or earnings of corporations, sought to place the burden of government on those best able to pay.

It was clear to many people – notably President Theodore Roosevelt and Progressive leaders in the Congress (foremost among them Wisconsin Senator Robert M. La Follette) – that most of the problems reformers were concerned about could be solved only if dealt with on a national scale. Roosevelt declared his determination to give all the American people a "Square Deal."

During his first term, he initiated a policy of increased government supervision through the enforcement of antitrust laws.  With his backing, Congress passed the Elkins Act (1903), which greatly restricted the railroad practice of giving rebates to favored shippers.  The act made published rates the lawful standard, and shippers equally liable with railroads for rebates.  Meanwhile, Congress had created a new Cabinet Department of Commerce and Labor, which included a Bureau of Corporations empowered to investigate the affairs of large business aggregations.

Roosevelt won acclaim as a “trust-buster,” but his actual attitude toward big business was complex.  Economic concentration, he believed, was inevitable.  Some trusts were “good,” some “bad.”  The task of government was to make reasonable distinctions.  When, for example, the Bureau of Corporations discovered in 1907 that the American Sugar Refining Company had evaded import duties, subsequent legal actions recovered more than $4 million and convicted several company officials. The Standard Oil Company was indicted for receiving secret rebates from the Chicago and Alton Railroad, convicted, and fined a staggering $29 million.

Roosevelt's striking personality and his trust-busting activities captured the imagination of the ordinary individual; approval of his progressive measures cut across party lines. In addition, the abounding prosperity of the country at this time led people to feel satisfied with the party in office. Roosevelt won an easy victory in the 1904 presidential election.

Emboldened by a sweeping electoral triumph, Roosevelt called for stronger railroad regulation.  In June 1906 Congress passed the Hepburn Act.  It gave the Interstate Commerce Commission real authority in regulating rates, extended the commission’s jurisdiction, and forced the railroads to surrender their interlocking interests in steamship lines and coal companies.

Other congressional measures carried the principle of federal control still further. The Pure Food and Drug Act of 1906 prohibited the use of any “deleterious drug, chemical, or preservative” in prepared medicines and foods.  The Meat Inspection Act of the same year mandated federal inspection of all meat-packing establishments engaged in interstate commerce.

Conservation of the nation's natural resources, managed development of the public domain, and the reclamation of wide stretches of neglected land were among the other major achievements of the Roosevelt era. Roosevelt and his aides were more than conservationists, but given the helter-skelter exploitation of public resources that had preceded them, conservation loomed large on their agenda. Whereas his predecessors had set aside 18,800,000 hectares of timberland for preservation and parks, Roosevelt increased the area to 59,200,000 hectares. His administration also began systematic efforts to prevent forest fires and to re-timber denuded tracts.

TAFT AND WILSON

Roosevelt’s popularity was at its peak as the campaign of 1908 neared, but he was unwilling to break the tradition by which no president had held office for more than two terms. Instead, he supported William Howard Taft, who had served under him as governor of the Philippines and secretary of war.  Taft, pledging to continue Roosevelt’s programs, defeated Bryan, who was running for the third and last time.

The new president continued the prosecution of trusts with less discrimination than Roosevelt, further strengthened the Interstate Commerce Commission, established a postal savings bank and a parcel post system, expanded the civil service, and sponsored the enactment of two amendments to the Constitution, both adopted in 1913.

The 16th Amendment, ratified just before Taft left office, authorized a federal income tax; the 17th Amendment, approved a few months later, mandated the direct election of senators by the people instead of by state legislatures. Yet balanced against these progressive measures was Taft's acceptance of a new tariff with higher protective schedules; his opposition to the entry of the state of Arizona into the Union because of its liberal constitution; and his growing reliance on the conservative wing of his party.

By 1910 Taft’s party was bitterly divided.  Democrats gained control of Congress in the midterm elections. Two years later, Woodrow Wilson, the Democratic, progressive governor of the state of New Jersey, campaigned against Taft, the Republican candidate – and also against Roosevelt who ran as the candidate of a new Progressive Party.  Wilson, in a spirited campaign, defeated both rivals.

During his first term, Wilson secured one of the most notable legislative programs in American history. The first task was tariff revision. “The tariff duties must be altered,” Wilson said. “We must abolish everything that bears any semblance of privilege.” The Underwood Tariff, signed on October 3, 1913, provided substantial rate reductions on imported raw materials and foodstuffs, cotton and woolen goods, iron and steel; it removed the duties from more than a hundred other items. Although the act retained many protective features, it was a genuine attempt to lower the cost of living. To compensate for lost revenues, it established a modest income tax.

The second item on the Democratic program was a long overdue, thorough reorganization of the ramshackle banking and currency system.  “Control,” said Wilson, “must be public, not private, must be vested in the government itself, so that the banks may be the instruments, not the masters, of business and of individual enterprise and initiative.”

The Federal Reserve Act of December 23, 1913, was Wilson’s most enduring legislative accomplishment.  Conservatives had favored establishment of one powerful central bank.  The new act, in line with the Democratic Party’s Jeffersonian sentiments, divided the country into 12 districts, with a Federal Reserve Bank in each, all supervised by a national Federal Reserve Board with limited authority to set interest rates.  The act assured greater flexibility in the money supply and made provision for issuing federal-reserve notes to meet business demands.  Greater centralization of the system would come in the 1930s.

The next important task was trust regulation and investigation of corporate abuses. Congress authorized a Federal Trade Commission to issue orders prohibiting “unfair methods of competition” by business concerns in interstate trade.  The Clayton Antitrust Act forbade many corporate practices that had thus far escaped specific condemnation: interlocking directorates, price discrimination among purchasers, use of the injunction in labor disputes, and ownership by one corporation of stock in similar enterprises.

Farmers and other workers were not forgotten. The Smith-Lever Act of 1914 established an “extension system” of county agents to assist farming throughout the country.  Subsequent acts made credit available to farmers at low rates of interest. The Seamen’s Act of 1915 improved living and working conditions on board ships. The Federal Workingman’s Compensation Act in 1916 authorized allowances to civil service employees for disabilities incurred at work and established a model for private enterprise. The Adamson Act of the same year established an eight-hour day for railroad labor.

This record of achievement won Wilson a firm place in American history as one of the nation’s foremost progressive reformers. However, his domestic reputation would soon be overshadowed by his record as a wartime president who led his country to victory but could not hold the support of his people for the peace that followed.
