7.1 A Return to Normalcy

Some elements of prewar society persisted into the 1920s, including concerns over private economic power and government responsibility for social problems. Racial and ethnic divisions and tensions that had grown in earlier decades endured and even intensified. Overall, though, the decade following World War I represented a shift in temperament and culture for the United States. The idealism and reform impulse of the Progressive era were replaced by conservatism, materialism, and a rising consumer culture. Americans turned away from imperialism and involvement in foreign affairs and back toward isolationism. Among the most striking changes of the 1920s was the state of American politics (Cooper, 1990).

Harding and Coolidge

With his health failing at the end of his second term and struggles over the League of Nations continuing, Woodrow Wilson had ceased to be a viable leader for the Democratic Party by 1920. In the election that year, the Democrats nominated Ohio governor James M. Cox for president, with Franklin D. Roosevelt for vice president. The other commanding national political presence, former president Theodore Roosevelt, had died in his sleep on January 5, 1919. On the 10th ballot held at the convention, the Republicans nominated conservative Ohio senator Warren G. Harding. Harding’s running mate, Calvin Coolidge, had most recently served as the governor of Massachusetts.

Newly enfranchised female voters swelled the electorate, so that 8 million more people voted in the 1920 election than had in 1916. They cast their ballots for Harding by a large margin because he was seen as sympathetic to their concerns. During the campaign he sent a personal letter to Carrie Chapman Catt endorsing suffrage, and he sent a campaign staffer to be on hand for the Tennessee legislature vote that ratified the 19th Amendment. The election was a landslide, with Harding earning 16 million votes to Cox’s 9 million. Campaigning from federal prison, Socialist Eugene V. Debs claimed just over 3% of the vote, demonstrating that more than a million American voters did not find representation of their interests in the dominant parties.

Harding’s administration represented a turn away from reform and toward conservative policies. He argued that the nation needed “not heroism but healing, not nostrums but normalcy, not revolution but restoration,” by which he meant an emphasis on economic growth that would result in community and harmony. He offered America a normalcy that marked an end to reform and war, replacing them with a small-town simplicity steeped in nostalgia and tradition (Payne, 2009).

In international affairs, Harding opposed not only Wilson’s advocacy for membership in the League of Nations but also Theodore Roosevelt’s arguments for military leadership, pacts, and alliances, even within the Western Hemisphere. Harding largely avoided discussing the growing interconnections between nations and economies, although he knew well that it was impossible to insulate the United States from the world economy and global politics. Instead, he dealt with international issues quietly while publicly advocating a return to indifference toward foreign affairs, giving Americans the impression that they could accept or reject involvement in world concerns when and where they pleased.

On the domestic front, Harding supported the efforts of conservative Republicans to court big business and subvert the gains made by labor during the war. Harding’s probusiness orientation faced some challenges at the state level when Progressive Republican governors were elected in Wisconsin and Montana. For the most part, however, conservative Republican leaders surged forward with their agenda (Cooper, 1990).

A series of scandals also characterized Harding’s presidency. He appointed his close friends and allies to important political positions, and several members of the so-called Ohio Gang took advantage of their place in the Harding administration to advance their own agendas. It is unclear if Harding was fully aware of the actions of his appointees, since many of the scandals came to light only after his death.

The Teapot Dome affair, involving the lease of navy petroleum reserves in Wyoming and California to private companies without public bidding, was the subject of a congressional investigation. The scandal resulted in the bribery conviction of Harding’s secretary of the interior, Albert B. Fall, who had negotiated the leases. Other Harding administration scandals involved corruption in the Justice Department, perpetrated by his attorney general and former campaign manager Harry M. Daugherty, and in the Veterans’ Bureau, where director Charles R. Forbes was accused of putting his own economic gains ahead of the needs of returning veterans.

A New Economic Vision

In 1921 the nation’s economy was in a severe slump. Demobilization resulted in high unemployment, and investments fell below the rate of inflation, leaving all Americans with less buying power. The end of wartime production resulted in thousands of layoffs, and the nation entered a period of economic adjustment that required intervention. Even Americans still employed found that their incomes did not stretch far enough to cover household needs, and the purchase of extra consumer goods was out of the question for most households.

To deal with the economic concerns, Harding called a President’s Conference on Unemployment. Its participants recommended a controversial public works expansion and a bonus bill to reward veterans for their service, but both failed in Congress. Instead, the administration cut taxes and created a budget bureau to oversee and limit the spending of government funds. Once the Federal Reserve slashed interest rates, investment recovered, and by 1923 many industries actually faced a labor shortage (Perrett, 1982).

Harding’s approach to the presidency was in many ways the opposite of his Progressive predecessor’s (McGerr, 2005). He supported more individual freedom and greater limits on government activism, and he was far more favorable to and tolerant of big business. He demonstrated his convictions by appointing officials to the Interstate Commerce Commission and the Federal Reserve Board who he believed would make those agencies’ policies much more supportive of business.

He also strove to enact legislation that gave corporations more power. He signed legislation to restore a higher tariff that supported American production, and he encouraged federal agencies such as the Federal Trade Commission and Interstate Commerce Commission to cooperate with businesses rather than merely regulate them. Harding also supported business by taking a more hands-on approach to breaking labor strikes.

Challenges for Labor

Using both the “carrot” and the “stick,” business in the 1920s sought to erode worker protections and union membership. The stick, or punitive tactics, included forcing newly hired workers to sign so-called yellow dog contracts in which they agreed not to join unions; workers who did join could be fired.

More employers engaged in an open shop movement, arguing that they wanted to give their employees the ability to decide on their own whether to join a union. Mobilizing under what they called the American Plan, these employers declared that the open shop was consistent with American values, freedom, and patriotism. By contrast, they charged unions with limiting freedom by creating closed shop workplaces, where only union members could be employed. They argued that unions restricted production, made unreasonable wage demands, and kept capable workers from reaching their full earning potential. In reality, employers promoted the American Plan to rid their industries of union organization and were successful in holding back the number of workers who could enjoy the benefits of collective bargaining (Goldberg, 1999).

To further discourage unionization, industrialists devised as a carrot the system of welfare capitalism. Designed to instill worker loyalty and encourage efficiency, welfare capitalism was practiced by the largest employers, including Goodyear, International Harvester, and General Electric. The programs included company unions that could bargain for limited workplace improvements but not for wage increases. Some created grievance committees to hear worker complaints. Other features could include profit sharing, life insurance, and company baseball teams.

Labor journalist Louis Francis Budenz, a reporter for Labor Age, railed against the practices of company unions, considering them the gravest threat to workers. In one case, he reported on a construction job whose purported company union promised, but failed to pay, trained carpenters $12 a day, nearly double the wage union carpenters earned. Budenz asserted that company unions were disingenuous organizations that aimed to draw in unsuspecting workers (Grant, 2006). By the mid-1920s only about 4 million workers were employed by firms that practiced welfare capitalism, but the concept grew throughout the decade (Dumenil, 1995).

Both the American Plan and welfare capitalism accelerated during the postwar recession and caused considerable strife between labor and employers. Although the decade saw many strikes across multiple industries, the probusiness climate assured that organized labor made few gains.

The Triumph of Big Business

The U.S. economy rebounded from the postwar recession by 1922, thanks largely to a consumer revolution and growth in industries that manufactured automobiles and other durable goods like refrigerators and radios. Following Harding’s sudden death in 1923, Calvin Coolidge succeeded to the presidency. A Republican attorney from Vermont, Coolidge began his political career in Massachusetts, first in the state legislature and then as the commonwealth’s governor. He gained a national reputation as an opponent of organized labor when he helped break the Boston police strike of 1919.

Coolidge was elected in his own right in 1924 and extended a series of policies favorable to business expansion. He appointed probusiness men to the Federal Trade Commission and the Interstate Commerce Commission and supported a move to raise tariffs to offer more protection for business. Under his watch, Congress also passed three revenue acts, greatly reducing income taxes for most Americans.

In contrast to the Progressive era’s push to regulate large corporations and make them more responsive to environmental and societal problems, the 1920s political climate supported business mergers and did little to restrict or influence business practices. The U.S. Supreme Court and Justice Department protected businesses from organized labor through a series of injunctions and limitations on union organization and strike activity.

The economy grew considerably for the remainder of the decade. Industrial output rose 64%, and the production of automobiles grew from 1.5 million in 1919 to 4.8 million in 1929. Industries incorporated new technologies, including mechanization, assembly lines, and electricity to boost production. Worker productivity grew 43%, and overall output grew 70% (Murphy, 2012).

Henry Ford’s motor company stands as a clear example of the business ethos of the 1920s. Initially operating one plant outside Detroit, Michigan, Ford introduced the moving assembly line and applied Frederick Winslow Taylor’s scientific management to the manufacture of his Model T automobiles. The process reduced the time and cost to produce a car but also created a monotonous and challenging work environment that initially led to massive turnover.

Ford countered by paying workers $5 per day (roughly $15 an hour in today’s money) and reducing the workday to 8 hours. Soon workers were lined up for jobs at the Ford plants. The Ford Motor Company was also one of the first to apply the principles of welfare capitalism, offering workers profit sharing to discourage unionization. Ford also implemented a so-called sociology department to ensure worker loyalty, patriotism, and moral values (Drowne & Huber, 2004).

Ford’s sociology department, also known as the education department, aimed to guide his workers in living moral and upright lives and to embrace a new identity as a “Ford Man.” Ford expected his workers to refrain from using tobacco and alcohol and to avoid interaction with unions, political radicals, and socialists. Immigrant workers received instruction in English and endured a plan of Americanization as a condition of continued employment. Those who demonstrated clean and wholesome habits were likely to see a wage increase. Those who did not want their employer intruding in their personal lives were invited to look elsewhere for a job (Hooker, 1997).

Sick Industries

Although some workers such as those at Ford plants made wage gains in the 1920s, most corporate profits were not passed along to employees. Nor did all segments of the economy benefit from the government’s new probusiness orientation. Although most of the economy recovered from the postwar recession fairly quickly, railroads, coal, textiles, and agriculture continued to struggle. Workers in those industries experienced stagnant wages (Murphy, 2012). Employees at a Gastonia, North Carolina, textile mill, for instance, averaged 70 hours per week. Despite the long hours, men’s wages were a mere $18 per week, and women earned a paltry $9 per week (St. Germain, 1990). Many workers also faced unemployment or underemployment.

The coal industry was another “sick” industry struggling to recover in the postwar decade. Coal was once the main fuel for American factories and mills, but competition from cleaner and abundant oil and hydroelectric power contributed to falling coal prices. The price of coal fell from a high of $3.74 a ton in 1920 to a mere $1.78 in 1929. Mines reduced production or shut down altogether, leaving remote communities unable to participate in the growing consumer economy.

Farmers likewise struggled to find prosperity. Mechanization in the form of tractors, combines, and disc plows increased production, which in turn drove down the prices of staple crops like wheat and corn. Coolidge vetoed congressional proposals to aid the farm crisis, arguing that the government had no constitutional power to intervene in private business (St. Germain, 1990). The agricultural sector continued to limp along well into the 1930s, when the Great Depression reversed attitudes toward government interference in the economy.

Economic Growth and Foreign Policy

America’s emergence as the world’s dominant economic power drew the nation into a host of international affairs during the 1920s. The nation officially sought a foreign policy that aimed to reduce the risk of international conflict and ensure the safety of trade and investment. In practice, however, U.S. foreign interactions often undermined those very goals.

U.S. investment overseas made America the world’s leading creditor nation, and its continued economic success depended on the ability of other nations, especially those in Europe, to repay their war debts of approximately $10 billion. However, Harding and the Congress, focused on nurturing U.S. business development, enacted a series of policies that showed little concern for European recovery following the war’s devastation. Higher tariff rates made it difficult for Britain and France to profit from exports. At the same time, the United States flooded European markets with American manufactured goods. Instead of providing relief and encouraging the commerce needed to reduce the debt, the United States continued on a path that produced further restrictions.

In this climate the United States hosted the first conference aimed at world disarmament. Held in Washington, D.C., from November 1921 through February 1922, the conference brought together leaders from nine nations to consider interests in the Pacific Ocean and east Asia. Among those attending were representatives from China, Japan, Britain, France, Italy, Belgium, and Portugal. Neither Germany nor the new Soviet Union was invited. Supported by peace advocates in America and abroad, the conference resulted in the Washington Naval Treaty, in which the major naval powers agreed to reduce the size of their fleets and limit the production of new warships (Goldstein & Maurer, 2012).

The Harding and Coolidge administrations also sought to retreat from involvement in Latin American affairs unless economic ties there forced the United States to intervene. American business interests sought investment in the rich oil fields in South America and encouraged a foreign policy favorable to their plans. The Senate ratified a treaty apologizing to Colombia for American intervention in Panama in 1903 and offered a payment of $25 million in amends. This paved the way for U.S. investment in Colombian and eventually Venezuelan oil fields.

To further cement relations in Latin America, Secretary of State Charles Evans Hughes used the centennial of the Monroe Doctrine in 1923 to assure the nations of the region that the United States intended to be a good neighbor, although at that moment the United States still occupied and controlled the governments of Haiti and the Dominican Republic (see Chapter 6) (Goldberg, 1999).

7.2 The Culture of Modernity

Modernity, or the bureaucratic, industrial, and consumer-oriented society of early 20th-century America, was characterized by an evolving and distinct culture. Following the postwar recession, the nation saw unprecedented prosperity and industrial productivity. The United States stood as the world’s dominant economic power, and at home most Americans enjoyed a higher standard of living and more leisure time. Although some segments of society, such as farmers, coal miners, and African Americans, did not experience as much prosperity, all participated in an emerging culture of modernity.

The Boom of the Consumer Culture and the Consumer Economy

Beginning with the growth of American capitalism and industrialization in the 19th century, a new consumerism began to emerge. Linked to the expanding market economy, consumer culture celebrated the worth of goods and services in terms of their financial value. A significant part of modernity in the 1920s was the expansion of a consumer-oriented culture that prioritized acquisition and consumption. It associated happiness with accumulating material goods and made monetary value the most important measure of worth. Consumption rather than hard work came to measure an individual’s worth in society (Leach, 1994).

Drawing more Americans into the consumer culture was key to maintaining the nation’s economic prosperity. Goods produced required a market, and many looked to the American consumer as an important outlet for manufactured products. Businesses soon realized that consumers simply did not have enough money in their pockets to afford everything that they wanted to buy. Therefore, businesses devised ways for consumers to enjoy products immediately but pay for them later. This technique for immediate gratification became known as buying on credit, and it ran counter to the Victorian ethos of the 19th century, which held that upstanding citizens did not incur debts.

Turning this idea on its head, purchasing on credit in the 1920s signaled that one was a strong consumer, and many aspired to purchase modern conveniences to demonstrate their rising economic status. The trend began with more expensive items like automobiles, and it soon extended to other durable goods such as refrigerators and washing machines and even to small consumer goods.

Edward Filene, owner of an upscale Boston department store, recalled handing a doll to a little girl with whom he had been speaking in the toy department of his store. As Filene looked for the girl’s reaction, her mother prompted her: “What are you going to say to the gentleman?” The girl looked Filene in the eye and said, “Charge it!” (as cited in Benson, 1988, p. 100). Soon more and more people began filling their homes with the latest devices, even as they owned fewer and fewer of them outright.

Those with charge accounts were likely to spend more than customers paying cash, especially at department stores such as Filene’s. The U.S. Department of Commerce surveyed the use of store charge accounts in 1928 and found that although charges accounted for a small percentage of total transactions, they often represented as much as 20% of overall sales. Managers treated charge customers well and courted their repeat business (Benson, 1988).

The use and availability of charge accounts continued to increase. Although installment buying (consumer credit), or individual borrowing for consumable goods or services, was evident before World War I, in the 1920s household debt nearly doubled. Due to manufacturing techniques, the prices for appliances, automobiles, and other household products generally declined across the decade. At the same time a relaxation on the qualifications for credit saw the amount of goods purchased on time increase enormously (Olney, 1991).

Modern Advertising

New approaches to advertising fueled the consumer economy. The volume of advertising increased tremendously, with popular magazines being the most common marketing medium. Later in the decade, radio joined magazines as an important venue for ads. Moving beyond the utilitarian display of products that previously characterized product marketing, modern advertising firms created colorful ads showing individuals enjoying products.

The ads glorified consumption and leisure, such as in a car manufacturer’s depiction of a lush countryside and the slogan “You find a Road to Happiness the day you buy a Buick” (as cited in Dumenil, 1995, p. 89). A growing number of advertising agencies associated their clients’ products with the modern era, fashion, and progress. They pushed the necessity of owning new household products, including refrigerators, vacuum cleaners, toasters, and radios (Olney, 1991).

The new advertising also played on consumers’ fears and anxieties, brought on by the changes in modern society. Capitalizing on the stress of modern life, Post Bran Flakes promoted a cure for those “Too Busy to Keep Well” (as cited in Dumenil, 1995, p. 89). The makers of Listerine, formerly used only as an antiseptic, advertised its ability to cure halitosis, more commonly known as bad breath. Another technique aimed for a personal approach by including the word you in the text of an advertisement.

Ads connected to other elements of popular culture through celebrity endorsements, linking movie stars and sports figures to products. Finally, the introduction of company spokespersons such as Betty Crocker helped humanize corporations and their products (Dumenil, 1995).

The Automobile in 1920s Culture

The automobile was the most expensive and most desirable durable good of the era, and it became increasingly available to average Americans. Ford’s Model T, or “Tin Lizzy,” remained the best-selling and least expensive car. Aiming to make his vehicles affordable for the company’s assembly line workers, Henry Ford pursued efficiencies that allowed him to reduce prices continually. First priced at $850 in 1909, the Tin Lizzy could be purchased for $260 in the 1920s (Flink, 1998).

Along with other consumer products, the automobile was an important factor in the postrecession boom. By the 1920s multiple manufacturers produced thousands of cars each year using the assembly line and other efficiency techniques first used at the Ford Motor Company.

By the end of the 1920s, more than half of American families owned a car, making a whole new culture possible. Like George Babbitt, the car-proud protagonist of Sinclair Lewis’s 1922 novel Babbitt, Americans were fascinated with automobile-related gadgets and other developments of the car culture. Gas stations, motels, diners, and other businesses sprang up to serve car-owning individuals and families.

The automobile became the ultimate symbol of leisure, promising owners freedom and mobility. Many passed on more traditional pastimes such as Sunday church services to take a drive in the country. In rural areas especially, having a car opened new options for shopping and leisure (Goldberg, 1999).

Morals, Movies, and Amusement

Motion pictures and other forms of mass media transformed American culture in the 1920s. Movies created and spread a set of common American values, attitudes, and experiences. Beginning around the turn of the 20th century, theaters known as nickelodeons offered one-reel silent films to mostly working-class audiences. By the 1920s the movie industry, located in and near Hollywood, California, expanded to include elaborate multireel feature films. Cities constructed ornate movie theaters or “palaces,” and films began attracting a middle-class audience.

Movie actors such as Mary Pickford, her husband Douglas Fairbanks, Buster Keaton, and Charlie Chaplin became national stars, setting trends for dress and style. One film star in particular, Clara Bow, popularized the flapper image of the so-called Roaring Twenties. Starring in multiple silent movies, including It (1927)—from which she became known as the “It Girl”—Bow was the sex symbol of the age. Copying the stars’ style, young women bobbed their hair, wore short skirts, smoked, and listened to jazz music. The modern, emancipated young women of the 1920s drew the ire of more conservative Americans, who believed that their behavior challenged women’s traditional roles in society (Stenn, 2000).

Working-Class Leisure and Culture

The freedom of middle-class culture did not extend to the nation’s working class. Blue-collar workers, who toiled in skilled or unskilled manual work in manufacturing, mining, or other heavy industries, enjoyed less time for leisure activities. Although the working class earned less money and worked more hours, they did participate in a variety of pastimes. The expanding commercialization of leisure saw many participate in more sedentary activities. Instead of playing sports, they were more likely to attend a semiprofessional baseball game or listen to a band or music on the radio. Amusement parks, nickel theaters, and 10-cent museums catered to men and women and may have created a more homogeneous and less ethnically divided working class.

At the same time, various elements of different ethnic cultures spread among the working class. Whereas some leisure activities divided along racial and ethnic lines, others were adopted more fluidly. White musicians adopted African American music and musical instruments such as the banjo and then fused them into mainstream culture. Jews made up a disproportionate number of entertainers, many of whom were among the most important pioneers in the film industry but worked closely with non-Jewish actors.

Other forms of entertainment reinforced ethnic identities. Films aimed at particular ethnic groups were shown at “race” theaters in African American or Mexican neighborhoods. The growing music recording industry similarly emerged to serve particular audiences. “Race” records were sold at stores in cities with large African American populations such as Chicago, where one owner reported lines forming around the block to purchase the latest release. “Hillbilly” music similarly aimed for an audience of rural southern Whites (Dumenil, 1995).

The Jazz Age

Music was a central part of the 1920s, and jazz was the soundtrack of the decade. It combined traditional African American styles such as the deep soulful feeling of blues with the rhythmic beats of ragtime and, in the process, became a unique American musical form (Burns & Ward, 2000). The improvisational aspect of jazz let musicians spontaneously explore new sonic boundaries, and audiences listening on the radio or dancing in front of big bands experienced the newness and sexual openness of the Jazz Age.

Jazz started in New Orleans but soon spread to Chicago and New York (Martin & Waters, 2006). However, it did not go mainstream until the Original Dixieland Jazz Band, an all-White group that was by no means the “original” jazz band, became popular. This was one of many examples in the 20th century of White Americans capitalizing on and mass marketing African American culture. That mass marketing did, however, also pave the way for African American acts like Joe “King” Oliver’s Creole Jazz Band and many others to tour the United States.

Jazz became a central unifying cultural phenomenon among the youth of the 1920s. Featuring improvisation over structure, jazz broke musical rules, and the way it made racial mingling seem normal challenged the dividing line between Whites and African Americans. From its big-city origins, jazz soon spread and was played in dance halls, roadhouses, and illegal speakeasies across the nation. Radio and phonograph records helped spread the jazz craze to even the most remote towns and farms. It was the music of a younger generation coming of age in modern America, and it sparked a backlash among traditionalists who called it the “devil’s music” and worried that youth would lose their appreciation for classical music.

7.3 Traditionalism’s Challenge to the New Order

A large segment of American citizens pushed back against the march of cultural modernity and sought a more conservative vision for the nation. During their presidencies, Harding and Coolidge presided over a return to economic and political conservatism. A movement to regulate morality accompanied these values.

Although the impulse to dictate moral values was not new to Americans, the Progressive era had strengthened the belief in society’s right to regulate personal behavior (McGirr, 2001). The conservative movement of the 1920s banned alcohol sales and production and fed the rise of militant and fundamentalist Christianity. The 1920s also saw a rebirth of the Ku Klux Klan and a virulent anti-immigrant movement. All of these groups participated in a struggle that pitted a preservationist-oriented Protestantism on one side against modernism, secularism, immigration, and urbanization on the other (McGirr, 2001).

Prohibition

American efforts to restrict alcohol consumption stretch back into the 19th century. Temperance activists from groups such as the Anti-Saloon League and the Woman’s Christian Temperance Union had long argued that stopping liquor sales and consumption would make America a well-ordered and industrious society. They also claimed it would reduce domestic violence and increase worker productivity. During the Progressive era, the Prohibition movement to end liquor trafficking gained considerable ground as many towns, counties, and cities voted to go dry.

With the outbreak of World War I in 1914, Prohibitionists gained more ground as anti-German hysteria made the dominant German-owned breweries suspect. In 1916 lawmakers in Congress took up the Anti-Saloon League’s call for a constitutional amendment banning liquor traffic, finally passing the 18th Amendment in December 1917. By that time 19 states had already outlawed alcohol. The states ratified the amendment, and it went into effect January 17, 1920. It banned the production, sale, and transportation of intoxicating liquors (Okrent, 2010).

Representing a triumph for conservative values, the amendment was almost impossible to enforce. A follow-up law, the Volstead Act, provided for enforcement and defined intoxicating liquor as any beverage containing more than 0.5% alcohol, a standard that banned beer and wine along with rum, whiskey, and other hard liquor. Congress was unable to appropriate the funds necessary to enforce the law, and many distilleries simply moved their operations across the Canadian or Mexican borders and continued production. Young people and even middle-class men and women skirted the law, taking pleasure in frequenting illegal saloons known as speakeasies. Illicit drinking became fashionable, representing a modern form of leisure and entertainment (Goldberg, 1999).

“Wets,” who supported responsible alcohol consumption, advocated the controversial amendment’s repeal. In 1923 the New York state legislature repealed that state’s enforcement law. Even where enforcement was funded and supported, it became evident that it was impossible to fully eliminate liquor trafficking. The Prohibition era gave rise to organized crime syndicates that illegally manufactured and sold liquor on a wide scale. These criminal syndicates gained notoriety for violence and frequently made headlines for their grisly activities.

Clearly, Prohibition was not working. The opposition movement grew throughout the decade, and in 1933 the states ratified the 21st Amendment, which repealed the 18th. It is the only constitutional amendment approved specifically to repeal another one.

Fundamentalism and the Scopes Trial

American religious life altered significantly as modernity advanced in the early 20th century. Some groups, including some Presbyterians and Roman Catholics, moved toward a more scientific interpretation of the Christian Bible that incorporated and accepted such conceptions as evolution and natural selection. Other Protestants insisted on a literal, or fundamental, interpretation of the events depicted in the Bible as historical fact.

The strongest reaction against the new morality of the 1920s also came from these conservative religious groups. They worried that religious modernists would continue to push cultural changes like the acceptance of evolution and biblical criticism. They were also concerned with social changes in American society, including the recent influx of immigrants and what they perceived as the loose morals of many Americans. The term fundamentalism was thus coined to describe a movement to restore traditional values in the face of modern indulgences and relaxed morals.

Leaders of many Protestant denominations grew gravely concerned that the churches themselves stood in danger of being altered by modernists. A series of articles published under the title “The Fundamentals” outlined the fears of leading Protestant theologians that their principles were threatened by unorthodoxy (Dumenil, 1995). The fundamentalist movement aimed to bring lapsed Christians back into the fold and to promote and protect a conservative dogma.

In California, Minister Aimee Semple McPherson used modern technology to achieve conservative ends. She combined fundamentalist ideas with charismatic radio broadcasts, becoming a model for later televised evangelists. McPherson and others, like former professional baseball player and Christian evangelist Billy Sunday, spread their conservative Christian message to millions.

Supporting a literal reading of the Bible—especially Genesis, which says that God created the heavens and the earth in 6 literal days—fundamentalists began to argue that evolution (which unfolds over billions of years) should not be taught in public schools. The modern and growing acceptance of Darwinian principles such as evolution and natural selection represented a serious threat to conservative Protestant beliefs.

The conflict between fundamentalism and modernity came to a head in 1925 in the so-called Scopes Monkey Trial. John T. Scopes, a Tennessee high school science teacher, had taught theories of evolution to his students despite the fact that the Tennessee legislature had passed a law forbidding it. Scopes had volunteered to violate the law to test the state’s willingness to enforce the ban, and the city officials of Dayton, Tennessee, supported his actions because they hoped that it would bring national attention to their small town. The American Civil Liberties Union had agreed to defend anyone willing to violate the law so that the principles could be tested in a public court.

In his trial Scopes was represented by prominent attorney Clarence Darrow, a staunch advocate for civil liberties. Darrow was pitted against famous orator and politician William Jennings Bryan, an outspoken supporter of fundamentalism. Once Bryan was brought in, the trial became a national spectacle, sparking heated debate about science, religion, and the place of humans in the world. The proceedings were reported daily in the national press, and it was the first trial broadcast on radio.

On the seventh day of the trial, Darrow famously called Bryan himself to the stand as an expert on the Bible. Questioning Bryan on the historical accuracy of biblical events, including whether Bryan believed that Eve was actually created from Adam’s rib, Darrow aimed to use scientific evidence to prove that many biblical stories were metaphorical. Bryan accused Darrow of casting ridicule on Christians.

The end result was more anticlimactic than the media-frenzied buildup. The jury found Scopes guilty of teaching evolution, and he had to pay a $100 fine. But the trial initiated the rift between religious fundamentalists and scientific modernists that continues to this day (Larson, 2006).

Immigration Restriction

Other cultural conflicts of the 1920s were cast in ethnic terms. The flood of eastern and southern European immigrants that began arriving in the 1880s made the nation’s industrial growth possible but also sparked recurring patterns of nativism, or anti-immigrant sentiment. Nativism was particularly strong during economic downturns, such as the post–World War I recession.

Some came to apply the term melting pot to the diverse groups of ethnicities and nationalities among the immigrant communities. Rather than blending diverse people into a new type of American, however, the expectation was that immigrants should conform to dominant White Protestant culture. Intense Americanization campaigns that included English education and discouraged the persistence of ethnic culture sought to mold White ethnics into proper citizens, but African Americans and other non-Whites were not deemed capable of assimilation (Dumenil, 1995).

The fact that many of the recent arrivals were Catholic, Jewish, or of some other non-Protestant religion also inflamed both mainstream and fundamentalist Christians, who assumed that to be fully American, one must be Protestant. A movement to restrict and qualify the numbers and types of immigrants began in the Gilded Age with the exclusion of Chinese immigrants. A number of groups favored other restrictions such as literacy tests but generally lost ground to strong business interests, which wanted to keep the door as open as possible to fill their labor needs.

The tide turned to favor immigration restriction during World War I and strengthened afterward. Many questioned the loyalty of German Americans and other Europeans thought to be radical Socialists, Communists, or anarchists. Although the tide of immigrants receded during the war, European refugees began to flood into the United States in the spring of 1920, with as many as 5,000 arriving each day (Dumenil, 1995).

Nativists urged Congress to act immediately, and in 1921 a temporary law placed a quota on the number of immigrants to be admitted from each nation. Under the measure only a maximum of 357,803 European immigrants could enter the United States each year. Each nation’s quota was set at 3% of the number of its foreign-born nationals counted in the 1910 U.S. census. The law favored immigrants from western Europe and severely limited the numbers of new arrivals from southern or eastern Europe. It was designed to last for a single year but was not replaced until 1924.

Nativists feared that a reopening of immigration would erode American culture and society, and so they fought to make the restriction both permanent and more exclusive. One fear was that the dominant culture based on Protestant values might be replaced or challenged due to the large numbers of Roman Catholics and Jews among the new immigrants. Another strain of nativist thought argued that the inclusion of southern and eastern Europeans, many of whom had a dark complexion, would lead to race mixing and the “mongrelization” of the American people.

Proponents of eugenics, a pseudoscientific movement, defined immigrants, African Americans, and those with disabilities as hereditarily inferior. Eugenicists argued that immigration restriction was necessary to protect the White race in America from being polluted through mixing with inferior peoples.

In 1924 Congress responded to all these voices with the National Origins Act, which set permanent national quotas at 2% of each nationality’s foreign-born population counted in the 1890 census and cut the overall number of foreign nationals allowed to enter the United States in any year to 167,667 and eventually to 150,000. Asian immigrants were denied entry altogether. The law did not apply to countries in the Western Hemisphere, so Canadians and Latin Americans remained free to emigrate without restriction. Although some nativists expressed a desire to include Mexico in the quota system, agricultural interests reliant on inexpensive immigrant labor lobbied against the measure (Goldberg, 1999).

With just a few revisions, the National Origins Act remained the nation’s primary immigration law until 1965, despite sparking backlash and indignation from ethnic organizations in the United States and formal protests from foreign governments.

A Second Ku Klux Klan

During the 1920s a revival of the Ku Klux Klan gained strength by appealing to anti-immigrant and especially anti-Catholic White Protestants, as well as to those who wanted stronger enforcement of Prohibition. In the Reconstruction era the Klan had been a southern-based terrorist group that suppressed African American civil and political rights. Wearing long white gowns and hoods, the late 19th-century Klan was a secretive organization that performed its work under cover of darkness using illegal methods.

The successor organization embraced 100% Americanism, the notion that dominant White culture and Protestant traditions formed the only acceptable American values, and styled itself as a fraternal organization on the order of the Elks, the Masons, or the Odd Fellows. In addition to the South, it also had prominent chapters in the North and particularly the Midwest (Chalmers, 1965).

The Klan of the 1920s gained inspiration from D. W. Griffith’s 1915 film, The Birth of a Nation, which depicted the triumph of the White supremacist organization over the forced imposition of racial equality during Reconstruction. Capitalizing on the film’s popularity, Methodist minister William J. Simmons, known as Colonel Simmons, and several other men created the organization that they proclaimed to be the successor to the original Klan. Membership was limited to White Protestants and remained small until wartime nationalism launched a backlash against immigrants, Catholics, and radicals.

The Klan represented the decade’s strongest pushback against the changes of modern American society. In the South, where it still proclaimed to be primarily a White supremacist organization, members whipped African Americans for voting, refusing to ride in segregated rail cars, or seeking a wage increase. In many cases they burned a fiery cross in the yard of offenders. In California the group targeted the Jewish influence in the growing motion picture industry. In northern cities, the Klan attacked Catholics and ethnic immigrants for clinging to their native culture. The group also railed against the slack morals of the younger generation and women who wore short skirts, bobbed hair, and engaged in public smoking or drinking (Goldberg, 1999).

The Klan’s membership swelled after Simmons contracted with publicists Edward Clarke and Elizabeth Tyler, who promoted the organization using modern subscription, or membership, techniques and clever advertising. Recruiters received a hefty percentage of every membership fee they collected. New chapters and even women’s and children’s auxiliary groups spread widely. The organization portrayed itself as the defender of “pure womanhood” and touted its opposition to strong drink, wife beaters, and adulterers, in addition to its promotion of American values. In late 1922 a Dallas dentist named Hiram Wesley Evans ousted Simmons and the publicists, and the organization continued to grow (Goldberg, 1999).

Klan member lists were rarely made public, but scholars estimate that the organization had more than 5 million members between 1920 and 1925 (Dumenil, 1995). The Klan’s fraternalism offered support for local businesses and openly endorsed candidates for office. Many retail outlets advertised their connection by displaying signs that proclaimed “Trade with Klan,” whereas Jewish and Catholic businesses were often subjected to boycotts. The Klan also offered a platform for newly enfranchised women, who could combine nativist political views with support for Prohibition and women’s rights (Dumenil, 1995).

The Klan declined after 1925. The passage of the National Origins Act eliminated many supporters’ fears of a mongrelized America. In 1925 a sex scandal involving a prominent Indiana Klan leader led to a public trial and helped discredit the organization. The previous year, at a rally in Niles, Ohio, organized Irish and Italian immigrants clashed physically with Klan members, earning the organization more negative national press.

A public parade of Klansmen and women down Washington, D.C.’s Pennsylvania Avenue in August 1925 marked the fraternal organization’s last major public appearance. Although the Klan persisted with a much-reduced membership, its national influence waned by 1926 (Goldberg, 1999).