THE ROARING TWENTIES

 

 

Wars have the effect of accelerating change.  In addition to its political consequences, the First World War had an enormous effect upon technology and social attitudes.  The 1920s in the United States was a decade of sharp contrasts, as new attitudes, often engendered by the war, clashed with a longing by many Americans to return to a simpler, more traditional way of life.  The clash between modern and traditional beliefs was accompanied by an unprecedented degree of economic progress, fueled primarily by new technology.  This progress, however, tended to further divide Americans, as many were left behind and did not share in the prosperity.  In short, the 1920s was a transition decade, filled with dramatic contrasts and enormous public controversies.

Wilson and the Treaty

·        Explain the factors that led the U.S. Senate to reject the Treaty of Versailles.

 

President Wilson returned to the United States from France convinced that the Treaty of Versailles offered the best hope for preventing future wars.  Even before his return, he was aware that Republican Senators, led by Henry Cabot Lodge of Massachusetts—the powerful chairman of the Foreign Relations Committee—had strong objections to the treaty and America’s participation in the League of Nations.  Wilson had failed to involve any of the Republican leaders in the treaty negotiations, and many were determined to oppose the ratification of the treaty without significant revisions (changes).  Their most important objection was based upon the fear that joining the League of Nations would commit the United States to continual involvement in European wars.  Many Americans were beginning to believe that the country’s involvement in the war had brought little benefit, as the victorious Allies had ignored Wilson’s Fourteen Points and pursued a vindictive settlement against Germany.  The treaty was also opposed by Irish-Americans, angered by its lack of support for Irish independence; Italian-Americans, disappointed that Italy did not receive more territory for its participation in the Allied cause; and German-Americans.

Wilson presented the Treaty of Versailles to the Senate on July 10, 1919, asking: “Dare we reject it and break the heart of the world?”  The United States Constitution required that the treaty be ratified by two-thirds of the Senate, and many of Wilson’s aides urged him to compromise with the treaty’s opponents in order to secure its passage.  Wilson stubbornly refused to make any form of compromise, insisting instead that the United States had a moral obligation to respect the terms of the agreement precisely as they had been negotiated in Paris.  If the United States insisted upon revisions, Wilson argued, other countries would as well, delaying the formation of the League of Nations.

When it became apparent that the Senate would never ratify the treaty without a guarantee that Congress would have to approve the use of any U.S. troops in support of the League, Wilson vowed to take his case directly to the people.  He traveled more than 8,000 miles across the country, delivering as many as four hour-long speeches per day.  Finally, the president’s health broke under the strain of this exhausting schedule.  After speaking at Pueblo, Colorado, on September 25, 1919, he collapsed with a severe headache.  Canceling the rest of his itinerary (schedule of events), he rushed back to Washington.  There, a few days later, Wilson suffered a major stroke.  For two weeks he was close to death, and for six more weeks he was so seriously ill that he could conduct virtually no public business.  His wife, Edith Galt Wilson, and his doctor prevented the public from receiving any accurate information about the gravity of his condition.  During her husband’s incapacity, Mrs. Wilson essentially ran the government by controlling access to the president.

The president ultimately recovered enough to resume a limited official schedule, but remained an invalid for the eighteen remaining months of his presidency.  His left side was partially paralyzed, but more importantly his mental and emotional state was unstable.  When the Senate Foreign Relations Committee finally reported the treaty to the full Senate, recommending nearly fifty amendments and reservations, Wilson refused to consider any of them.  When the Senate voted in November to accept fourteen of the reservations, Wilson gave stern directions to his Democratic allies that they must vote only for a treaty with no changes.  On November 19, 1919, the Senate refused to ratify the treaty.  There were sporadic efforts to revive the treaty over the next few months.  In March of 1920, the Senate voted 49 to 35 in favor of the treaty, but this was still 7 votes short of the necessary two-thirds majority.  As a result, the United States never joined the League.
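The arithmetic behind that final vote, assuming the two-thirds requirement applies to the senators present and voting, can be checked directly from the totals given above:

```latex
% 49 senators voted in favor, 35 opposed: 84 senators voting in all
% Two-thirds threshold: (2/3) of 84 = 56 votes required for ratification
% Shortfall: 56 required - 49 in favor = 7 votes short
\frac{2}{3}\times(49+35)=56, \qquad 56-49=7
```

This is why a 49 to 35 result, although a clear majority, still left the treaty 7 votes short of ratification.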

Postwar Turmoil

·        Describe the economic problems that followed the end of the war.

·        Identify the factors that led to widespread strikes in 1919.

·        Describe the racial violence that occurred in 1919.

 

The war had ended earlier than most people anticipated.  During the war American industries had prospered enormously, first by filling military contracts from Britain and France and later by supplying the United States’ own military buildup.  Suddenly, the defense contracts stopped.  Wartime price controls also ended, and the demand for consumer goods, unavailable during the war, drove prices up dramatically.  Through most of 1919 and 1920, prices rose at an average rate of more than 15% per year.

Soon the entire economy crashed as businesses struggled to convert from a wartime to a peacetime economy.  Between 1920 and 1921, the gross national product (GNP: the total of all goods and services produced by the country) declined nearly 10%, the index of wholesale prices fell from 228 to 151, 100,000 businesses went bankrupt, 453,000 farmers lost their land, and nearly 5 million Americans lost their jobs.

During the war, a temporary labor shortage allowed workers to achieve many of the demands that they had struggled for in the previous decades.  Wages rose and many unions were finally able to win recognition of their right to collective bargaining.  After the war, workers saw this progress come under attack by employers.  As unemployment rose, employers attempted to cut wages and refused to bargain with union representatives.  As a result, an unprecedented wave of strikes erupted in 1919, involving over four million workers.  In Seattle, Washington, a strike by shipyard workers soon evolved into a general strike (a strike that involves workers in most businesses) that paralyzed the entire city.  The mayor requested help from the military to keep city services running, and the strike was eventually broken.  More serious was a strike by the entire Boston police force in an attempt to win recognition of their union.  Violence and looting erupted, and finally Governor Calvin Coolidge was forced to call in the National Guard to restore order.[1]  Eventually, Boston officials dismissed the entire police force and hired replacements.  In September, 350,000 steelworkers across the country went on strike, demanding the recognition of their union and an eight-hour workday.  The strike lasted four months and was marked by frequent outbursts of violence.  In Gary, Indiana, eighteen workers were killed in a battle with the police.  The strike finally collapsed, as steel companies were able to keep their plants open with non-union replacement workers.

Racial disruptions also struck many cities after the war.  During the war almost half a million rural southern blacks migrated to northern cities, enticed by job openings created by the war.  Almost overnight, large black communities sprang up in most northern cities.  For the first time, northern whites came into contact with large numbers of blacks.  Many whites feared that black workers would compete with them for jobs and lower their wages.  As early as 1917, serious race riots flared in cities as diverse as Houston, Philadelphia, and East St. Louis (where 49 people, 39 of them black, were killed).  When the war and plentiful jobs ended, racial tensions increased.  In Chicago, a black teenager swimming in Lake Michigan on a hot July day in 1919 happened to drift toward a white beach.  Whites on shore stoned him unconscious until he sank and drowned.  The incident ignited severe racial tensions in the city.  For more than a week, Chicago was virtually at war.  White mobs roamed into black neighborhoods, shooting, stabbing, and beating passers-by and destroying homes and property.  Blacks fought back and inflicted destruction of their own.  In the end, 38 people died (15 whites and 23 blacks), 537 were injured, and over 1,000 people were left homeless.  The Chicago race riots were the worst but not the only racial violence during the summer of 1919.  In all, 120 people died in such outbreaks in the space of little more than three months.

The Red Scare

·        Identify the factors that led to the “Red Scare” of 1919-1920.

·        What were the Palmer Raids?

·        Who were Sacco and Vanzetti?

 

The Bolshevik Revolution of 1917 in Russia sent a tremor of fear throughout the entire capitalist world.  Radical political ideas had sporadically emerged in the labor movements of most industrialized countries; however, the capture of a major country by Marxists determined to create a socialist utopia gave greater credibility to radicals in other countries.  This was one of the major reasons that the United States joined with Britain and France in sending troops to Russia in an unsuccessful attempt to overturn the communist revolution.  Fear that Bolshevism would spread increased after the Russian government formed the Communist International (or Comintern) in 1919 with the express goal of exporting its revolution throughout the world.  American business leaders saw the spread of Bolshevik-style radicalism in the strikes and racial turmoil of 1919.  The mayor of Seattle claimed that the general strike was an attempt by revolutionaries “to establish a Soviet government.”  The leaders of the steel industry insisted that “radical agitators” had stirred up trouble among their employees, who were, they claimed, otherwise content with things as they were.

In April 1919, the post office intercepted several dozen parcels addressed to leading businessmen and politicians that were triggered to explode when opened.  Several others reached their destinations, one of them severely injuring the servant of a Georgia public official.  Two months later, eight bombs exploded in eight cities within minutes of one another, suggesting a nationwide conspiracy.  One of them damaged the home of Attorney General A. Mitchell Palmer in Washington.  Palmer ordered the Justice Department to take steps to quell what he later called the “blaze of revolution . . . sweeping over every American institution of law and order.”  On January 1, 1920, he and his assistant, J. Edgar Hoover, orchestrated a series of raids on alleged radical centers throughout the country and arrested more than 6,000 people.[2]  Palmer had hoped the raids would uncover huge stores of weapons and explosives, but they netted a total of only three pistols and no dynamite.  Nevertheless, many of those arrested spent days and weeks in jail with no formal charges filed against them.  Most were ultimately released, but about 500 who were not American citizens were deported.

The Palmer Raids were just one aspect of a general fear of radicalism that swept the country in the years after the war.  During this “Red Scare,” any socialist, labor organizer, or critic of the current system was subject to attack.  A mob of off-duty soldiers in New York City ransacked the offices of a socialist newspaper and beat up its staff.  In Washington, a mob dragged an IWW (Industrial Workers of the World—a socialist labor union) organizer from jail and castrated him before hanging him from a bridge.

The ferocity of the Red Scare soon ended, but its effects lingered well into the 1920s, most notably in the celebrated case of Sacco and Vanzetti.  In May of 1920, two Italian immigrants, Nicola Sacco and Bartolomeo Vanzetti, were charged with the murder of a paymaster in Braintree, Massachusetts.  The evidence against them was questionable; but because both men were known anarchists, they faced a widespread public presumption of guilt.  The judge in their trial, Webster Thayer, was openly prejudiced (later it was reported that the judge had boasted to a friend, “Did you see what I did to those anarchist bastards?”); and it was not surprising under the circumstances that they were convicted and sentenced to death.  Over the next several years, public support for Sacco and Vanzetti grew, but all requests for a new trial or a pardon were denied.  On August 23, 1927, amid widespread protests around the world, Sacco and Vanzetti, still proclaiming their innocence, died in the electric chair.  Throughout the saga, the media followed the fate of the two men closely:

The World War lasted four years and was duly chronicled as an international episode.  The case of Sacco and Vanzetti is seven years old and is still an international episode.  It is a tale filled with blood and tears, with Reds and bigwigs, with bombs and laws...

July 14, 1921

A jury found Messrs. Sacco and Vanzetti guilty of the South Braintree murders on the following evidence: Factory-window witnesses, who had previously identified other Italians as participants in the crime, swore that Messrs. Sacco and Vanzetti were the killers.  But, 20 Italians said they had purchased eels from Mr. Vanzetti at the hour of the crime, and the Italian consul in Boston swore that Mr. Sacco had been in his presence at that time.  However, the police who arrested them swore that they had drawn guns.  This was interpreted as “evidence of guilt.”  The jury was asked to do its duty as “did our boys in France”—an effective plea, considering the fact that Messrs. Sacco and Vanzetti were pacifists (against war) as well as radicals.

1921-1927

Motions for a new trial were repeatedly turned down while radicals flung bombs at many a U.S. embassy, while liberals protested against the injustice being done to the fish peddler and the shoemaker.

But there is more in the Sacco-Vanzetti case than is contained in a bare recital of its facts and dates.  During one morning last week the first mail alone brought to Governor Fuller 57 letters, some urging intercession, others protesting against intercession for Shoemaker Sacco, for Fish-peddler Vanzetti.  Twenty-two members of the British Parliament cabled Governor Fuller, demanding a new trial, viewing with horror the approaching executions of two men whose guilt they question.  Last week 7,000 New Yorkers gathered in Union Square, roared: “Stop the murder of Sacco & Vanzetti.”  In London, in Paris, in The Hague police guard U.S. embassies and consulates, fearing that European radicals will let bombs express their disapproval of Massachusetts justice.  And a long list of liberal intelligentsia, including Jane Addams of Hull House, Romain Rolland (French novelist), Felix Frankfurter of the Harvard Law School faculty, Albert Einstein (relativity theorist) and many another have enrolled themselves with the Sacco-Vanzetti sympathizers.

August 15, 1927

In their cells in the death house of the Massachusetts State Prison, Messrs. Sacco & Vanzetti heard last week that they were to die.  Mrs. Sacco and two advisers brought the news.  For an hour and a half, they talked together, while prison guards listened and looked. Mr. Sacco (then on the 19th day of a hunger strike) mumbled over and over, “I told you so, I told you so,” as if in rhythm with his throbbing, withered arteries.  Said Mr. Vanzetti: “I don't believe it.”

August 29, 1927

After seven years of premeditation, blood was shed beside a so-called cradle of American liberty, Boston…

Guilty or not, justly or not, Nicola Sacco, clean-shaven factory worker and Bartolomeo Vanzetti, mustachioed fish-peddler, were informed last Monday evening that they must die that midnight for the murders—which to the end they denied committing—of a paymaster and guard at South Braintree, Mass., in 1920.

Prisoners Sacco and Vanzetti died in the order that their names had long been coupled, seven minutes apart.[3]

Return to “Normalcy”

·        What was the purpose of the eighteenth and nineteenth amendments to the constitution?

·        Why did the Election of 1920 indicate the end of the Progressive Era?

·        Describe the presidency of Warren G. Harding.

 

For two decades prior to 1920 America had embraced the concept of change.  The Progressive Era addressed many of the worst aspects of industrialization, and an unprecedented (never before) flurry of laws was passed during the Roosevelt, Taft, and Wilson administrations limiting the power of private businesses and expanding the role of citizens in their government.  World War I was seen by many as another manifestation of the idealism of the Progressive Era.  America had entered the war not as a result of a direct attack by a hostile power, nor out of a desire to add territory.  President Wilson’s call to fight “a war to end all wars” and to make the world “safe for democracy” had been in the finest progressive tradition of idealism.  In 1920, two additional progressive amendments were added to the constitution.  The eighteenth prohibited the manufacture, transport, or sale of alcohol.  The nineteenth guaranteed the right of women to vote.  Both prohibition and women’s suffrage had been primary progressive goals.  Progressives believed that the votes of women would guarantee that the period of reform and idealism would continue into the 1920s.

In retrospect, 1920 marked the end of the Progressive Era.  Disillusioned by a war that accomplished little for the U.S. and the economic turmoil of the postwar years, Americans turned away from idealism and reform in the election of 1920.  The victorious candidate, Republican Warren G. Harding, promised a return to “normalcy” and reminded voters, “all human ills are not curable by legislation.”

Harding was an obscure Ohio senator whose only real asset seemed to be his ability to please everyone (his father had once commented, “If you were a girl, Warren, you’d be in the family way all the time.  You can’t say no.”).  Party leaders had settled on him late one night at the Republican Convention in a “smoke-filled room” in a Chicago hotel, confident that he would follow their instructions once in office.  In the course of his campaign, Harding offered few new ideas, only a vague and comfortable reassurance of stability.  Harding and Massachusetts Governor Calvin Coolidge defeated the Democratic candidates, Ohio Governor James M. Cox and Assistant Secretary of the Navy Franklin D. Roosevelt, in a landslide.[4]  The Republicans received 61% of the popular vote and carried every state outside the South.  Republicans made major gains in Congress as well.  The results of the election seemed to show that the American people had grown tired of idealism, reform, controversy, and instability.  For decades, they had been living in turbulent times.  Many now yearned for tranquility.

Harding was elected to the presidency having spent many years in public life doing little of note.  He had advanced from the editorship of a newspaper in his hometown of Marion, Ohio, to the state legislature by virtue of his good looks, polished speaking style, and friendliness.  He had moved from there to the United States Senate as a result of the support of Ohio’s Republican political bosses.  And he had moved from Congress to the White House as a result of a political agreement among leaders of his party who considered him, as one noted, a “good second-rater.”

The new president had few illusions about his own qualifications for office.  Awed by his new responsibilities, he made a sincere effort to perform them with distinction.  Even as he attempted to rise to his office, he exhibited a sense of bafflement about his situation, as if he recognized his own unfitness.  “I am a man of limited talents from a small town,” he reportedly told friends on one occasion.  “I don’t seem to grasp that I am president.”  Unsurprisingly, Harding soon found himself delegating much of his authority to others: members of his cabinet, political cronies (friends), Congress, and Republican Party leaders.  In the meantime, the nation’s press, overwhelmingly Republican, portrayed him as a wise and effective leader.

Harding’s personal weaknesses as much as his political naiveté (lack of sophistication) eventually led historians to rank him as one of the country’s most ineffective presidents.  He realized the importance of capable subordinates in an administration in which the president himself was reluctant to act.  At the same time, however, he lacked the strength to abandon the party hacks who had helped create his political success.  One of them, Harry Daugherty, the Ohio party boss principally responsible for Harding’s rise, was appointed Attorney General.  Another, Albert B. Fall, became Secretary of the Interior.  Members of the so-called “Ohio Gang” filled important offices throughout the administration.  It was widely known within the government that the president’s cronies (friends) led active, illicit social lives and gathered nightly at the famous “House on K Street” to drink illegal alcohol, play poker, and entertain women.  The president himself often joined in all these activities.  In 1963, over one hundred love letters were discovered that Harding had written to Carrie Phillips, the wife of a close friend who was his neighbor in Marion, Ohio.  When Harding became a presidential nominee, the Republican National Committee packed Mrs. Phillips and her husband off on a slow boat to Japan with some $20,000 in spending money.  After Harding’s death, his longtime mistress, Nan Britton, published a book entitled The President’s Daughter in which she alleged that he began an affair with her when she was sixteen, fathered her illegitimate daughter in 1919, and frequently made love to her in a White House closet.  At the time, however, the public was totally unaware of these indiscretions.

The Economic Boom

·        Explain the factors that led to the business boom of the Twenties.

·        What new industries began during the decade?

 

Harding’s failings as a leader made little difference to the country as the United States soon embarked on a period of economic prosperity that was to last until the end of the decade.  The nation's manufacturing output rose by more than 60% during the decade; the gross national product increased at an average of 5% a year, and output per worker rose by more than 33 %.  Per capita income grew by a third, while inflation remained virtually non-existent.

The economic boom was a result of many things.  The most obvious immediate cause was the destruction of the major European powers during the World War, which left the United States for a time the only vigorous industrial power in the world.  During the decade the U.S. controlled 40% of the world’s wealth.  More important, however, were the technological advances and new products that fueled the growth of American industry.  By the middle of the decade, industry accounted for 80% of the GNP, up from 50% in 1880.

The most significant industrial development of the 1920s was the growth of the American automobile industry.  The first cars had appeared on American roads in the 1890s, and by 1920 the number of cars in the country had grown to three million.  Americans bought an additional one and a half million cars in 1921.  By 1929, car sales had reached more than five million per year.  The greatest reason for the growth of automobile sales was the drop in prices made possible by assembly-line production techniques.  These techniques were not new to American industry, but were first utilized in automobile production by Henry Ford.  A former employee of Thomas Edison, Ford began the Ford Motor Company in 1903 and set out to build a simple car that every American could afford.  His “Model T” sold for $850 in 1908, but by 1924 he had managed to lower the price to only $300.  Eventually, Ford sold over fifteen million Model T’s before he introduced a new model in 1927.

The growth of the automobile industry greatly spurred progress in other related industries.  Auto manufacturers purchased the products of steel, rubber, glass, and tool companies.  Auto owners also bought large amounts of gasoline from oil companies (U.S. oil production went from fifty million barrels in 1920 to one billion in 1928).  Road construction became an important industry, and the increased mobility that the automobile afforded increased the demand for suburban housing, creating a boom in the construction industry.

Other new industries benefiting from technological innovations also contributed to the economic growth.  Commercial radio became a booming industry within a few years of its debut in 1920.  By 1922, there were three million radio sets in operation.  The motion picture industry expanded dramatically, especially after the introduction of sound in 1927.  Aviation, electronics, and home appliances all helped sustain the unprecedented American economic growth.  The invention of new plastics and synthetic fibers helped the chemical industry become an important force in the nation’s economy.

Middle-class families rushed to purchase new appliances as radios, phonographs, electric refrigerators, washing machines, and vacuum cleaners became standard items in most middle-class homes (there was a 500% increase in the number of homes wired with electricity between 1920 and 1930).  Men and women wore wristwatches and smoked cigarettes.  Women purchased cosmetics and mass-produced fashions.  Clarence Birdseye revolutionized the way Americans ate when he discovered the first effective way to freeze foods.  Americans in every part of the country ate commercially processed foods distributed nationally through chain stores and supermarkets, such as A & P, which increased from 400 to 15,000 stores during the decade.

Increasing the demand for consumer products were the first professionally organized advertising campaigns.  The advertising industry, which had its origin in wartime propaganda efforts, had become a one-and-a-half-billion-dollar industry by the end of the decade.  One of the most creative forms of advertising was developed by Burma-Shave, which placed its advertisements along roads, divided into a series of signs that people read as they passed by:

A peach looks good

with lots of fuzz

but man’s no peach

and never was

Burma-Shave

Henry the Eighth

Prince of Friskers

lost five wives

but kept his whiskers

Burma-Shave

 

Prices remained low throughout the decade (the Consumer Price Index fell from 115 in 1920 to 100 in 1928) due to mass production and low labor costs.  Even products that buyers could not immediately afford could be purchased on generous terms of credit, known as “paying on installment.”[5]  By 1928, an incredibly high percentage of goods were being bought on credit (85% of all furniture, 80% of phonographs, 75% of dishwashers, and 70% of refrigerators).

“The Business of America is Business”

·        What was the Teapot Dome Scandal?

·        In what ways was President Coolidge different from President Harding?  In what ways were the two men similar?

·        How did the federal government support the development of big business?

·        What is “trickle-down” economics?

 

The economic prosperity of the Harding years masked the fact that many of the members of his administration were engaged in a widespread pattern of fraud and corruption.  They sold government offices and favors, bribed congressmen and senators to support legislation favorable to their interests, and plundered the agencies and departments in which they worked.

The most spectacular scandal involved the rich naval oil reserves at Teapot Dome, Wyoming, and Elk Hills, California.  At the urging of Albert Fall, his Secretary of the Interior, Harding transferred control of those reserves from the Navy Department to the Interior Department.  Fall then secretly leased them to two wealthy businessmen and received in return nearly half a million dollars in “loans” to ease his private financial troubles.  Fall was ultimately convicted of bribery and sentenced to a year in prison.  Harding’s trusted friend, Harry Daugherty, barely avoided a similar fate for his part in another scandal.

For several years Harding remained generally unaware of the corruption infecting his administration.  But by the summer of 1923, only months before Senate investigations and press revelations brought the scandals to light, he began to realize how desperate his situation had become.  Tired and depressed, the president left Washington for a speaking tour in the West and a visit to Alaska.  In Seattle, late in July, he suffered severe pain, which his doctors wrongly diagnosed as food poisoning.  A few days later, he seemed to recover and traveled on to San Francisco.  There, on August 2, he died after suffering two major heart attacks.

Calvin Coolidge succeeded Harding in the presidency.  During his years in Massachusetts politics, Coolidge had won a reputation as a safe, trustworthy figure, and largely as a result of that, he had been elected governor in 1919.  His response to the Boston police strike won him national attention and his party’s vice-presidential nomination.  Three years later, news of Harding’s death reached him in Vermont; and there, by the light of a kerosene lamp on a kitchen table, he took the oath of office from his father, a justice of the peace.  In matters of personality and style, Harding and Coolidge could not have been more dissimilar.  Whereas Harding was outgoing and friendly, Coolidge was introverted and almost silent.  Whereas Harding embraced a loose, even immoral lifestyle, Coolidge lived like a monk.  And while Harding, if not personally corrupt, was tolerant of corruption in others, Coolidge was honest beyond reproach.  The image of stolid respectability he projected was so convincing that the Republican Party managed to avoid any lasting damage from Teapot Dome and related scandals.  In other ways, however, Harding and Coolidge were similar figures.  Both represented an unadventurous conservatism.  Both took an essentially passive approach to their office, and both strongly favored the interests of big business.  As president, Coolidge often said, “The business of America is business.”

Coolidge was an even less active president than Harding, partly as a result of his conviction that government should interfere as little as possible in the life of the nation, and partly as a result of his own personality.  He took long naps every afternoon (often sleeping over twelve hours per day).  He kept official appointments to a minimum and engaged in little conversation with those who did manage to see him.  He proposed no significant legislation and took little part in the running of the nation's foreign policy.  “He aspired,” wrote one of his contemporaries, “to become the least president the country ever had and attained his desire.”

However motionless Harding and Coolidge may have been, much of the federal government was working effectively and efficiently during the 1920s to adapt public policy to the widely accepted goal of the time: helping business and industry operate with maximum profitability and productivity.  Congress raised the tariff to protect American industry, reducing European imports by 50%.  In the executive branch, the most active efforts came from members of the cabinet.  Secretary of the Treasury Andrew Mellon, a wealthy steel and aluminum tycoon, devoted himself to working for substantial reductions in taxes on corporate profits, personal incomes, and inheritances.  Largely because of his efforts, Congress cut all of these taxes by more than half, with the highest tax rate on personal income dropping from 73% to 25%.  Because the affluent (wealthy) paid most of these taxes, the cuts benefited only a small minority of the population.  Mellon claimed, however, that their effect would be to stimulate investment and that the benefits would “trickle down” to the rest of the population.  Mellon also worked closely with Coolidge on a series of measures to trim dramatically the already modest federal budget.  The administration even managed to retire half the nation’s World War I debt.

The Supreme Court in the 1920s further confirmed the business orientation of the federal government, particularly after the appointment of former president William Howard Taft as chief justice in 1921.  The Court struck down federal legislation regulating child labor (Bailey v. Drexel Furniture Company, 1922), nullified a minimum wage law for women in the District of Columbia (Adkins v. Children's Hospital, 1923), and weakened antitrust legislation in several of its decisions.

The Other Side of Prosperity

·        What groups did not share in the prosperity of the decade?

·        Explain why union membership declined during the 1920s.

·        Explain the difficulties that farmers experienced during the Twenties.

 

Remarkable economic growth was only one side of the American economy in the 1920s.  Another was the unequal distribution of wealth and purchasing power that persisted during the decade.  The prosperity of the decade was real enough, but most of its benefits flowed to only a minority of the population.  More than two-thirds of the American people in 1929 lived at no better than what one study described as the “minimum comfort level.”  Sixty percent of the nation’s wealth was controlled by the richest 6% of the population, and the wealth of the richest 27,000 Americans was equal to that of the twelve million poorest.  Unskilled workers, in particular, saw their wages increase very slowly (only a little over 2%) between 1920 and 1926.  Many workers, moreover, enjoyed little security in their jobs.  Between 5 and 7 percent of workers, on average, were unemployed between 1923 and 1929.

One of the most significant factors in the economic inequity of the decade was the weakness of labor unions.  Union membership suffered a serious decline in the 1920s, falling from more than five million in 1920 to less than three million in 1929.  Corporate leaders worked hard after the labor turmoil of 1919 to spread the doctrine that unionism was subversive and un-American and that a crucial element of democratic capitalism was the protection of the open shop (a business in which no worker could be required to join a union).  When such tactics proved insufficient to counter union power, government assistance often made the difference.  In 1921, the Supreme Court upheld a ruling that declared picketing illegal and supported the right of lower courts to issue injunctions (court orders) against strikers.  In 1922, the Justice Department intervened to subdue a strike by 400,000 railroad workers.  In 1924, the courts refused protection to members of the United Mine Workers Union when mine owners launched a violent campaign in western Pennsylvania to drive the union from the coalfields.

Some employers in the 1920s, eager to avoid disruptive labor unrest and forestall the growth of unions, adopted more benevolent (kindly) techniques that came to be known as “welfare capitalism.”  Industrialists such as Henry Ford shortened the workweek for employees and instituted paid vacations.  Manufacturers such as U.S. Steel spent millions of dollars installing safety devices and improving sanitation in the workplace.  Most importantly, many employers offered their workers substantial raises in pay and other financial benefits (Ford workers earned five dollars per day while most industrial workers earned less than two).  By 1926, nearly three million industrial workers were eligible for pensions upon retirement.  In some companies, employees were permitted to buy stock at below-market value.  These efforts were, however, limited to only a few companies and still gave workers very little control over their own fate.  Without union representation, what companies gave workers could be easily taken away at a later date.  Regardless, many industrial workers showed little interest in joining unions during the decade.

Unions also ignored two important groups of workers during the 1920s.  A growing proportion of the work force consisted of women who were concentrated in what have since become known as “pink-collar” jobs (low-paying, service occupations).  Large numbers of women worked as secretaries, salesclerks, and telephone operators.  Because such positions were technically not industrial jobs, the American Federation of Labor (AFL) and other labor organizations were uninterested in organizing these workers.  Black workers were another group that received little attention from the unions.  The half-million blacks who had migrated from the rural South into the cities during the Great Migration after 1914 constituted a small but significant proportion of the unskilled work force in industry, but as unskilled workers they had few opportunities for union representation.  The skilled crafts represented in the AFL often worked actively to exclude blacks from their trades and organizations.  Most urban blacks worked as janitors, dishwashers, garbage collectors, and in other service positions, which the AFL made no effort to unionize.

In the end, American workers remained in the 1920s a relatively impoverished and powerless group.  Their wages rose, but the average annual income of a worker remained below $1,500 a year at a time when $1,800 was considered necessary to maintain a minimally decent standard of living.  Only by relying on the earnings of several family members at once could many working-class families make ends meet.  In some industries, such as coal mining and textiles, hours remained long and wages hardly rose at all.  Nor could workers do very much to counter the effects of technological unemployment.  Total factory employment hardly increased at all during the 1920s, even while manufacturing output was soaring.

American farmers also failed to join in the prosperity of the decade.  Traditional problems of overproduction and low prices continued during the decade.  Advancements in technology also played a role in this area.  The number of tractors at work on American farms quadrupled during the 1920s, helping to open 35 million new acres to cultivation.  The result was a dramatic decline in food prices and a severe drop in income for farmers as farm prices dropped 50% during the decade.  The per capita annual income for Americans not engaged in agriculture in 1929 was $870, but for farmers it was only $223.  In 1920, farm income had been 15% of the national total; by 1929, it was 9%.  According to the Department of Agriculture, the average price for an acre of farmland in 1920 was $108; by 1926 it had fallen to $76.  More than three million people left agriculture altogether in the course of the decade.  Of those who remained, many were forced into tenancy, losing ownership of their lands and having to rent instead from banks or other landlords.

Some industries also declined as a result of technological changes.  Competition from oil and natural gas hurt the coal industry and over a thousand mines closed during the decade.  Many textile industries left their traditional home in New England for cheaper labor costs in the South.  Railroads also suffered due to the growth of the automobile and trucking industries.

Social Criticism

·        What did the term “Lost Generation” mean?

·        What were the main themes of A Farewell to Arms, The Great Gatsby, and the novels of Sinclair Lewis?

·        Who was H. L. Mencken?

·        What was the Harlem Renaissance?

 

Gertrude Stein once referred to the young Americans emerging from World War I as a “Lost Generation.”  For many writers and intellectuals, at least, it was an apt description.  At the heart of the Lost Generation's viewpoint was a complete rejection of the values and mores (code of right and wrong) of the pre-war world.  The idealism and belief in progress of the pre-war period now appeared, especially to the young, as hopelessly naive.  As the British writer D. H. Lawrence summarized, “All the great words were cancelled out for that generation.”

This disillusionment had its roots in many things, but in nothing so deeply as the experience of World War I.  To those who had fought in France and experienced the horror and savagery of modern warfare, and even to those who had not fought but who nevertheless had been aware of the appalling costs of the struggle, the aftermath of the conflict was shattering.  Nothing, it seemed, had been gained from the enormous sacrifice.  The purpose of the war had been a fraud; the suffering and the dying in vain.  Ernest Hemingway, one of the most commercially successful of the new breed of writers, expressed his generation's contempt for the war in his novel A Farewell to Arms (1929).  Its hero, an American officer fighting in Europe, decides that there is no justification for his participation in the conflict and deserts the army with a nurse with whom he has fallen in love.

As disillusioning as their war experiences was the character of the nation these young intellectuals found on their return home at war's end.  It was, they believed, a society utterly lacking in vision or idealism, obsessed with materialism, and steeped in outmoded and hypocritical morality.  Worst of all, it was one in which individuals had lost any control over their own lives.  One result of this alienation was a series of savage critiques of modern society by a wide range of writers, often self-described as “debunkers.”  Particularly influential was Baltimore journalist H. L. Mencken.  In the pages of his magazines, first the Smart Set and later the American Mercury, he delighted in ridiculing everything Americans held dear: religion, politics, the arts, even democracy itself.  He found it impossible to believe, he claimed, that “civilized life was possible under a democracy,” because it was a form of government that placed power in the hands of the common people, whom he ridiculed as the ignorant “booboisie.”  Echoing Mencken's contempt was the novelist Sinclair Lewis, the first American to win a Nobel Prize in literature.  Lewis's novels lashed out at one aspect of modern society after another.  In Main Street (1920), he satirized life in a small midwestern town.  Babbitt (1922) ridiculed life in the modern city, Arrowsmith (1925) attacked the medical profession, and Elmer Gantry (1927) satirized fundamentalist religion.  Intellectuals of the 1920s turned their backs on the traditional goals of their parents.  F. Scott Fitzgerald ridiculed the American obsession with material success in The Great Gatsby (1925).  The novel's hero, Jay Gatsby, spends his life accumulating wealth and social prestige in order to win the woman he loves.  The world to which he aspires, however, turns out to be one of pretension, fraud, and cruelty, ultimately destroying Gatsby.

To another group of intellectuals, the solution to contemporary problems lay neither in escapism nor in progressivism, but in an exploration of their own cultural origins.  In New York City a new generation of black intellectuals created a flourishing African-American culture, described as the “Harlem Renaissance.”  The Harlem poets, novelists, and artists drew heavily from their African roots in an effort to prove the richness of their own racial heritage.  The poet Langston Hughes captured much of the spirit of the movement in a single sentence: “I am a Negro and I am beautiful.”  In addition to literature, Harlem developed its own form of entertainment and music during the decade.  Young people of both races flocked to Harlem to hear the latest jazz music played by Duke Ellington, Count Basie, and Louis Armstrong.  Many white artists copied the sound of black musicians.

A Decade of Contrasts

·        What factors tended to break down regionalism and form a national culture during the twenties?

·        What impact did motion pictures and radio have on the country?

·        How did the role of women change during the decade?

·        Describe the country’s main source of cultural conflict.

 

One popular image of the 1920s is that of an endless party.  “The Jazz Age” is a nickname often applied to the decade, implying a period in which there were few social inhibitions.  In reality, the 1920s were a decade of contrasts, and no area exhibited these contrasts as strongly as the country’s social development.

One major development was the breakdown of regionalism.  Increased communications tended, for the first time, to create a nationwide culture, especially in the nation’s cities.  The number of local newspapers was rapidly shrinking; those that survived often became members of great national chains, which meant that readers in widely scattered parts of the country were reading the same material in their various newspapers.  In addition, there were a growing number of national, mass-circulation magazines: Time, Reader's Digest, The Saturday Evening Post, and others aimed at the widest possible audience.

The most important communications vehicle of all, however, was the only one that was truly new to the 1920s—radio.  The first commercial radio station in America, KDKA in Pittsburgh, began broadcasting on election night in 1920; and the first national radio network, the National Broadcasting Company (NBC), took form in 1927.  By 1923 there were more than 500 radio stations, covering virtually every area of the country.  At the end of the decade, more than twelve million families owned radio sets.  Broadcasting became the most important vehicle for linking the nation together, providing Americans everywhere with instant access to a common source of information and entertainment.

Even more influential in shaping the culture of the 1920s was the growing popularity of the movies.  Over a hundred million people saw films in 1930, as compared to only forty million in 1922.  The addition of sound to motion pictures, beginning with the first “talkie,” The Jazz Singer (1927), starring Al Jolson, created nationwide excitement.  All across the nation Americans were watching the same films, idolizing the same screen stars, and absorbing the same set of messages and values.  Actors and actresses became overnight celebrities.  Rudolph Valentino became famous for movies in which he played an Arab lover, “the Sheik of Araby.”  When he died suddenly in 1926, several women committed suicide (two Japanese women were said to have jumped into a volcano hand in hand).

One result of the communications revolution was that America became a society in which fads and obsessions could emerge suddenly and quickly sweep the country.  Radio helped elevate sports, and in particular professional baseball and college football, from the level of limited local activities to that of a national craze.  Athletes like Babe Ruth, Red Grange, Bobby Jones, Jack Dempsey, Bill Tilden, and Johnny Weissmuller became nationwide heroes.  Men and women across the country enjoyed the same popular stunts: flagpole sitting, marathon dancing, and goldfish swallowing.  They also shared an interest in national sensations, such as the famous murder trial of Leopold and Loeb or the tortuous progress of the Sacco-Vanzetti case.[6]  Frederick Lewis Allen, the celebrated chronicler of the 1920s, referred to the decade as the “ballyhoo years.”

The greatest stunt of the decade produced its greatest hero when twenty-five-year-old pilot Charles A. Lindbergh became the first man to fly across the Atlantic alone in 1927:

Late one evening last week Capt. Charles A. Lindbergh studied weather reports and decided that the elements were propitious for a flight from New York to Paris.  He took a two-hour sleep, then busied himself with final preparations at Roosevelt Field, Long Island. Four sandwiches, two canteens of water and emergency army rations along with 451 gallons of gasoline were put into his monoplane, Spirit of St. Louis.

He entered the cockpit.  At 7:52 a.m. he was roaring down the runway, his plane lurching on the soft spots of the wet ground.  Out of the safety zone, he hit a bump, bounced into the air, quickly returned to earth.  Disaster seemed imminent; a tractor and a gully were ahead.  Then his plane took the air, cleared the tractor, the gully; cleared some telephone wires.  Five hundred onlookers believed they had witnessed a miracle.  It was a miracle of skill.

Captain Lindbergh took the shortest route to Paris—the great circle—cutting across Long Island Sound, Cape Cod, Nova Scotia, skirting the coast of Newfoundland.  He later told some of his sky adventures to the aeronautically alert New York Times for syndication: “Shortly after leaving Newfoundland, I began to see icebergs...  Within an hour it became dark.  Then I struck clouds and decided to try to get over them.  For a while I succeeded at a height of 10,000 feet.  I flew at this height until early morning.  The engine was working beautifully and I was not sleepy at all.  I felt just as if I was driving a motor car over a smooth road, only it was easier.  Then it began to get light and the clouds got higher... Sleet began to cling to the plane.  That worried me a great deal and I debated whether I should keep on or go back.  I decided I must not think any more about going back...

Captain Lindbergh then told how he crossed southwestern England and the Channel, followed the Seine to Paris, where he circled the city before recognizing the flying field at Le Bourget.  Said he: “I appreciated the reception which had been prepared for me and had intended taxiing up to the front of the hangars, but no sooner had my plane touched the ground than a human sea swept toward it.  I saw there was danger of killing people with my propeller and I quickly came to a stop.”[7]

Lindbergh completed his 3,600-mile conquest of the Atlantic in 33 hours, 29 minutes, at an average speed of 108 miles per hour.  New Yorkers later dropped 1,750 tons of confetti on Lindbergh’s welcome home parade and he received three and a half million letters from admirers in the month following his flight (75% were from women, many of whom enclosed their picture).

Enrollment in colleges and universities increased threefold between 1900 and 1930, with much of that increase occurring after World War I.  Between 1918 and 1930 the number of students in college doubled, including nearly twenty percent of the college-age population.  An increasing number of students saw school as a place not just for academic training but for social activities as well.  Organized athletics, clubs, and fraternities and sororities allowed young people to define themselves less in terms of their families and more in terms of their peer group.  College was seen as a time for social enjoyment as well as academic training.

In some senses, the changes of the postwar years offered women a form of liberation.  College-educated women were no longer pioneers in the 1920s.  They were forming the second and third generations of graduates of women’s or coeducational colleges and universities; and they were occasionally making their presence felt in professional areas that in the past they had rarely penetrated.  A substantial group of women now attempted to combine marriage and careers; 25% of all women workers were married in the 1920s.  Most middle-class women, however, still generally had to choose between work and family.  Professional opportunities remained limited by society’s assumptions about what were suitable occupations for women.  Although there were notable success stories about women business executives, journalists, doctors, and lawyers, most professional women remained confined to such fields as fashion, education, social work, nursing, and the lower levels of business management.

The “new professional woman” was a vivid and widely publicized image in the 1920s.  In reality, however, most employed women were nonprofessional, lower-class workers.  Most middle-class women remained largely in the home.  The number of employed women rose by several million in the 1920s, but the percentage of women employed rose scarcely at all.  Society as a whole still had little tolerance for the idea of combining marriage and a career, and most women found little support for their ambitions.  Nevertheless, women’s organizations and female political activities grew in many ways in the 1920s.  The General Federation of Women’s Clubs, the YWCA, and other female philanthropic (charity) and reform groups expanded.  Responding to the suffrage victory, women organized the League of Women Voters and the women’s auxiliaries of both the Democratic and Republican parties.  Female-dominated consumer groups grew rapidly and increased the range and energy of their efforts.

The 1920s saw important new advances in the creation of a national birth control movement.  The pioneer of American birth control was Margaret Sanger, who had spent most of her adult life promoting and publicizing new birth control techniques.  At first, she had been principally concerned with birth control for working-class women, believing that large families were among the major causes of poverty and distress in poor communities.  By the 1920s (partly because she had limited success in persuading working-class women to accept her teachings), she was becoming more concerned with persuading middle-class women of the benefits of birth control.  Birth-control devices began to find a large market among middle-class women, even though some techniques remained illegal in many states (abortion remained illegal nearly everywhere).  The declining birth rate meant that many women had to spend fewer years caring for children.  The introduction of laborsaving appliances (washing machines, refrigerators, vacuum cleaners) in the home reduced some of the burdens of housework (although not always the amount of time devoted to housework, since standards of cleanliness rose simultaneously).  Many middle-class women experienced a significant increase in their leisure time.

The new view of womanhood had its greatest effect on young women who were working or in college.  Many of these women concluded that it was no longer necessary to maintain a rigid, Victorian standard of female “respectability,” and that women could adopt less inhibited life styles.  They could smoke, drink, dance, wear seductive clothes and make-up, engage in premarital sex, and attend lively parties.  The popular image of the “flapper,” the modern woman whose liberated life style found expression in dress, hairstyle, speech, and behavior, became one of the most widely discussed features of the era.[8]  The flapper lifestyle had a particular impact on lower middle-class and working-class single women, who were flocking to new jobs in industry and the service sector.  At night, such women, clad in short skirts, flocked to clubs and dance halls in search of excitement and companionship.

The modern culture of the 1920s was not unchallenged.  It grew up alongside an older, more traditional culture, with which it continually and often bitterly competed.  The new culture reflected the values and aspirations of an affluent, largely urban middle class, committed to a new, increasingly uninhibited lifestyle, linked to a national cultural outlook.  The older culture expressed the outlook of generally less affluent, less urban, more provincial (isolated or rural) Americans, men and women who continued to revere traditional values and customs and who feared and resented the modernist threats to their way of life.  Beneath the apparent stability of the decade raged a series of harsh cultural controversies.  Three areas where the cultural clash was most apparent were prohibition, religious fundamentalism, and the re-emergence of the Ku Klux Klan.

Prohibition

·        What factors led to the failure of prohibition?

·        How did prohibition contribute to the cultural division of the country?

·        What was the twenty-first amendment?

 

When the prohibition of the sale and manufacture of alcohol went into effect in January 1920, it had the support of most members of the middle class and most of those who considered themselves progressives.  Within a year, however, it had become clear that the “noble experiment,” as its defenders called it, was not working well.  Prohibition did substantially reduce drinking, at least in some regions of the country; however, it also produced conspicuous (open and obvious) and growing violations that made the law an almost immediate source of ridicule and controversy.  The first prohibition commissioner promised rigorous enforcement of the new law, but violations were soon so rampant that the resources available to him proved ludicrously insufficient.  The government hired only 1,500 agents to do the job.  Before long it was almost as easy to acquire illegal alcohol in much of the country as it had once been to acquire legal alcohol.  A prohibition agent toured major American cities posing as a would-be drinker to find out how difficult it was to purchase alcohol:

New Orleans                      35 seconds

Detroit                                3 minutes

New York                          3 minutes

Boston                                11 minutes

Chicago                              21 minutes

Washington                         2 hours, 10 minutes[9]

Between 1921 and 1925, the government seized 696,933 stills from home distillers.  More significant than the ineffectiveness of the law, however, was the role prohibition played in stimulating organized crime.  As an enormous and lucrative industry was now barred to legitimate businessmen, underworld figures quickly and decisively took it over.  In Chicago, Al Capone built a criminal empire based largely upon illegal alcohol and guarded it against competitors with an army of as many as 1,000 gunmen whose zealousness contributed to the violent deaths of more than 250 people in the city between 1920 and 1927.  Other regions produced gangsters and gang wars of their own.  On Valentine’s Day 1929 Al Capone eliminated seven members of rival “Bugs” Moran’s gang:

It was 10:20 o’clock on St. Valentine’s morning.  Chicago brimmed with sentiment and sunshine.  Peaceful was even the George (“Bugs”) Moran booze-peddling depot on North Clark Street, masked as a garage of the S.M.C. Cartage Co., where lolled six underworldlings, waiting for their breakfast coffee to cook.  A seventh, in overalls, tinkered with a beer vat on a truck. 

Into the curb eased a car, blue and fast, like the Detective Bureau’s.  Through the office door strode four men.  Two, in police uniforms, swung sub-machine guns.  Two, in plain clothes, carried stubby shotguns.

The gangsters in the office raised their hands.  Their visitors marched them back into the garage, prodding their spines with gun muzzles.  Tin coffee cups clattered to the stone floor.  Snarled orders lined the six gangsters up along the north wall, their eyes close to the white-washed brick.  The visitors booted the overalled mechanic into the line and “frisked” away hidden guns.

One of the men at the wall said: “What is this, a...”

“Give it to ‘em!” was the answer.  The garage became a thunder-box of explosions.

From the four guns streamed a hundred bullets.  Only eight of them ever reached the brick wall behind the seven targets.  One man, all blood, tried to crawl away.  A volley at six inches ripped away his head above the ears.  The others toppled over into the careless postures of death.

A Mrs. Alphonsine Morin, across the street, saw two men, hands over head, walk out of the garage, followed by two uniformed policemen with leveled guns.  Obviously a raid and an arrest.  She watched captors and captives enter the blue car, which flashed down the street, passed a trolley on the wrong side, melted away in traffic.  Real police came jostling through the gabbling crowd that quickly collected.  They counted the neat row of bodies by the wall—six dead, one dying.  It was a record, even for Chicago.

“Bugs” Moran, the proprietor of the garage, was not among the dead.

Gangster Capone was reported to be lolling innocently in Miami Beach, Fla., on St. Valentine’s Day.[10]

When Capone finally went to jail, for tax evasion, among his comments were “All I ever did was to supply a demand that was pretty popular,” and “Public service is my motto.”

Prohibition, in short, became not only a national joke but also a national scandal.  Nevertheless, it survived throughout the decade.  The middle-class progressives who had originally supported prohibition may have lost interest; but an enormous constituency of largely rural, overwhelmingly Protestant Americans continued fervently to defend it.  To them, drinking and the general sinfulness with which they associated it were an assault on their conservative code of morality.  Prohibition had always carried implications far beyond the issue of drinking itself.  It represented the effort of an older, rural America to maintain its dominance in a society that was moving forward in spite of it.  As the decade proceeded, opponents of prohibition (or “wets,” as they came to be known) gained steadily in influence.  Not until 1933, however, when the Great Depression added weight to their appeals, were they finally able effectively to challenge the “drys” and win repeal of the eighteenth amendment with the passage of the twenty-first amendment.

Religious Fundamentalism

·        What was fundamentalism?

·        Describe the Scopes Monkey Trial.

 

Another great cultural controversy of the 1920s revealed even more starkly the growing gulf between the new culture and the old.  It was a bitter conflict over questions of religious doctrine and, even more, over the place of religion in contemporary society.  By 1921, American Protestantism was already divided into two warring camps.  On one side stood the modernists, mostly urban, middle and upper class people who had attempted to adapt religion to the teachings of modern science and to the realities of their modern, secular (non-religious) society.  On the other side stood the fundamentalists, largely (although not exclusively) rural men and women, fighting to preserve traditional faith and to maintain the central position of religion in American life.  The fundamentalists looked with horror at the new morality of the modern city.  They expressed outrage at the abandonment of traditional beliefs in the face of scientific discoveries, insisting that the Bible be interpreted literally (as the absolute truth or word of God).  Above all, they opposed the teachings of Charles Darwin, whose theory of evolution had openly challenged the biblical story of the Creation.  Human beings had not evolved from lower orders of animals, the fundamentalists insisted; God, as described in the Book of Genesis, had created them.

Urban modernists usually looked on fundamentalism with condescension (disdain) and amusement.  By the mid-1920s, however, fundamentalists were gaining political strength with their demands for legislation to forbid the teaching of evolution in the public schools in many states.  To the modernists, such laws were almost unthinkable.  Darwinism had to them become indisputable scientific fact; to forbid the teaching of evolution, they believed, would be like forbidding teachers to tell their students that the world was round.  Yet they watched with disbelief as one state after another seriously considered the fundamentalists’ demands.

In Tennessee, the legislature adopted a measure in March 1925 making it illegal for any public school teacher “to teach any theory that denies the story of the divine creation of man as taught in the Bible.”  The result was one of the most celebrated events of the decade.  When the American Civil Liberties Union (ACLU) offered free counsel (legal assistance) to any Tennessee educator willing to defy the law and become the defendant in a test case, a twenty-four-year-old biology teacher in the tiny town of Dayton, John T. Scopes, arranged to have himself arrested.  When the ACLU decided to send the famous attorney Clarence Darrow to defend Scopes, the aging William Jennings Bryan (now an important fundamentalist spokesman) announced that he would travel to Dayton to assist the prosecution.  Journalists from across the country, among them H. L. Mencken, flocked to Tennessee to cover the “Monkey Trial,” which opened in an almost circus atmosphere.  Scopes had, of course, clearly violated the law; and a verdict of guilty was a foregone conclusion, especially when the judge refused to permit “expert” testimony by evolution scholars.  Scopes was fined $100, and the case was ultimately dismissed in a higher court because of a technicality.  Nevertheless, Darrow scored an important victory for the modernists by calling Bryan himself to the stand to testify as an “expert on the Bible.”  In the course of the cross-examination, which was broadcast by radio to much of the nation, Darrow made Bryan’s stubborn defense of a literal interpretation of the Bible appear foolish and finally tricked him into admitting the possibility that not all religious dogma was subject to only one interpretation.  The Scopes trial did not resolve the conflict between fundamentalists and modernists.  Indeed, four other states soon proceeded to pass anti-evolution laws of their own.  The issue continued to smolder for decades until it emerged once again in the form of the creationist movement of the 1980s.

Nativism and the Klan

·        Describe the efforts to limit immigration during the 1920s.

·        What factors led to the rebirth of the Ku Klux Klan?  How did the new Klan differ from the earlier Klan?

 

In the years immediately following the war, immigration began to be associated with political radicalism.  As a result, popular sentiment on behalf of restricting immigration grew rapidly.  In 1921 Congress passed an emergency immigration act, establishing a quota system by which annual immigration from any country could not exceed three percent of the number of persons of that nationality who had been in the United States in 1910.  The new law cut immigration from 800,000 to 300,000 in any single year, but the nativists (those opposed to immigration) remained unsatisfied.  In 1924 Congress enacted an even harsher law, the National Origins Act, which banned immigration from East Asia entirely (deeply angering Japan) and reduced the quota for Europeans from three to two percent.  The quota would be based not on the 1910 census but on the census of 1890, a year in which there had been far fewer southern and eastern Europeans in the country.  What immigration there was, in other words, would heavily favor northwestern Europeans, people of “Nordic” stock.  The 1924 Act cut the yearly flow of immigrants almost in half, to 164,000.  Five years later a further restriction set a rigid limit of 150,000 immigrants a year.  In the years that followed, immigration officials seldom permitted even half that number actually to enter the country.

Some supporters of these measures were progressives who hoped to cure unemployment and the nation’s urban slums by stopping the flow of immigrants, but many Americans had other motives.  To defenders of an older, more provincial America, the growth of large communities of foreign peoples, alien in their speech, habits, and values, came to be seen as a direct threat to their own embattled way of life.  This provincial nativism took a number of forms; the most prominent was the rebirth of the Ku Klux Klan as a major force in American society.

The first Klan was the product of the years following the Civil War.  That organization had died in the 1870s.  But in 1915, shortly after the premiere of the film The Birth of a Nation, which celebrated the early Klan, a new group of Southerners gathered on Stone Mountain outside Atlanta, Georgia, to establish a modern version of the society.  At first the new Klan, like the old, was largely concerned with intimidating blacks, who were, Klan leader William J. Simmons claimed, becoming dangerously insubordinate.  After World War I, however, concern about blacks gradually became secondary to concern about Catholics, Jews, and foreigners.  The Klan would devote itself, its leaders proclaimed, to purging American life of impure, alien influences.

It was during the 1920s that the modern Klan experienced its greatest growth.  Membership in the small towns and rural areas of the South soon expanded dramatically.  More significantly, the Klan was now spreading northward, establishing a strong foothold particularly in the state of Indiana and the major industrial cities of the Midwest.  By 1923, there were reportedly 3 million members; by 1924, 4 million.  In some communities, where Klan leaders came from the most “respectable” segments of society, the organization operated much like a fraternal society, engaging in nothing more dangerous than occasional political activities.  Most Klan units (or “klaverns”) tried to present themselves as patriots and defenders of morality.  Many established women’s and even children’s auxiliaries to demonstrate their commitment to the family.  Often, however, the Klan also operated as a brutal, even violent, opponent of “alien” groups and as a defender of traditional, Protestant morality.  Klansmen systematically terrorized blacks, Jews, Catholics, and foreigners, boycotting their businesses, threatening their families, and attempting to drive them out of their communities.  Occasionally, they resorted to violence: public whipping, tarring and feathering, arson, and lynching (political or racial murder).  In Indiana, the Klan briefly achieved greater power than in any other state.  Under the leadership of D.C. Stephenson, the KKK was able to dominate the Republican Party, dictating the choice of mayors and securing the election of a Klansman, Ed Jackson, to the governorship.

What the Klan most deeply feared, it soon became clear, was not simply “foreign” or “racially impure” groups; it was anyone who posed a challenge to its view of traditional standards.  Klansmen persecuted Catholics, immigrants, and blacks, as well as those white Protestants they considered guilty of irreligion, sexual promiscuity, or drunkenness.  The Klan worked to enforce prohibition, institute compulsory Bible reading in schools, and punish divorce.  The Ku Klux Klan, in short, was fighting not just to preserve racial homogeneity but also to defend a traditional culture against the values and morals of modernity.  The organization itself began to decline in influence after 1925, when a series of internal power struggles and several sordid scandals discredited some of its most important leaders.  The issues it had raised, however, retained strength among some Americans for many years.

Republican Government

·        How did the Democratic Party suffer from the country’s cultural division?

 

For twelve years, beginning in 1921, both the presidency and the Congress rested securely in the hands of the Republican Party, a party in which the power of reformers had greatly dwindled since the heyday of Teddy Roosevelt and progressivism before the war.  During the decade, the federal government expressed a profound conservatism and enjoyed a warm and supportive relationship with the American business community.

One of the major factors in the Republican domination of government was the profound division of the Democratic Party as a result of tensions between its urban and rural factions.  Far more than the Republicans, the Democrats consisted of a diverse coalition (political alliance) of interest groups, linked to the party more by local tradition than common beliefs.  Among these interest groups were prohibitionists, Klansmen, and fundamentalists on one side and Catholics, urban workers, and immigrants on the other.

In 1924, the tensions between the two groups of Democrats exploded in a bitter convention fight.  At their national convention in New York that summer, the party’s urban wing attempted to win approval for the repeal of prohibition and a condemnation of the Klan.  Both proposals narrowly failed.  More serious was a deadlock in the balloting for a presidential candidate.  Urban Democrats supported Alfred E. Smith, the Irish Catholic product of Tammany Hall who had risen to become a progressive governor of New York.  Rural Democrats, hating Smith and all he represented, backed William McAdoo, Woodrow Wilson’s Treasury Secretary (and son-in-law).[11]  For 103 ballots, the convention dragged on, until finally both Smith and McAdoo withdrew and the party settled on a compromise, corporate lawyer John W. Davis.  Davis won less than 30% of the vote in the general election, crushed by President Coolidge, who won 382 of the 531 electoral votes.  Robert La Follette, the candidate of the reincarnated Progressive Party, received 16% of the popular vote.  Four years later Coolidge could easily have won reelection.  Instead, in characteristically curt fashion, he walked into a pressroom one day and handed each reporter a slip of paper containing a single sentence: “I do not choose to run for president in 1928.”

In 1928, Al Smith finally managed to secure the Democratic Party’s nomination for president after another bitter convention battle.  He was not, however, able to unite his divided party, largely because of strong anti-Catholic sentiment throughout much of Protestant America, especially in the South.  The anti-Smith campaign was one of the meanest in American history.  Smith was welcomed to Billings, Montana, by the KKK with a gigantic burning cross and leaflets that claimed if Smith were elected, “bootleggers and harlots would dance on the White House lawn.”  Another leaflet warned:

When the Catholics rule the United States

And the Jew grows a Christian nose on his face

When Pope Pius is head of the Ku Klux Klan

In the land of Uncle Sam

Then Al Smith will be our president

And the country not worth a damn.[12]

Smith became the first Democrat since the Civil War to fail to carry the South, winning only six of the eleven states of the former Confederacy.  Elsewhere, although he did well in the large cities, he carried only the states of Massachusetts and Rhode Island.

Smith’s opponent, and the victor in the presidential election, was a man who perhaps more than any other personified the modern, prosperous, middle-class society of the decade: Herbert Hoover, widely regarded as the most progressive member of the Harding and Coolidge administrations.  Hoover had coordinated massive European relief efforts after the war that had saved millions from starvation.  As Commerce Secretary, Hoover was active in so many areas that he often seemed to be running the entire federal government single-handedly.  He used his position to promote a better-organized, more efficient national economy.  Only thus, he claimed, could the nation hope to fulfill its most important task—the elimination of poverty.  Hoover entered office promising bold new efforts to solve the nation’s remaining economic problems, but he had little opportunity to demonstrate his commitment to extending American prosperity to those who had not shared in it.  In less than half a year after his inauguration, the nation plunged into the most prolonged economic crisis in its history, the Great Depression.[13]

 

Jeffrey T. Stroebel, The Sycamore School, 1995. Revised 2001.



[1] Coolidge’s statement that “there is no right to strike against the public safety by anybody, anywhere, any time,” attracted national acclaim and eventually won him a place as the Republican vice-presidential candidate the following year.

[2] Hoover later became the long-time director of the FBI, serving from 1924 to 1972.

[3] All statements are taken from Time magazine.

[4] Socialist candidate Eugene Debs won almost a million votes despite being incarcerated in a federal prison for opposing American participation in the World War.  Less than a year after his defeat, Franklin Roosevelt was stricken with polio.

[5] Credit cards did not, however, appear until the 1950s.

[6] Leopold and Loeb were two brilliant college students who kidnapped and murdered a fourteen-year-old boy to see if they could commit “the perfect crime.”  The brilliant attorney Clarence Darrow defended them and was able to avoid the death penalty by pleading that the boys were mentally ill.

[7] Time 30 May 1927.

[8] The term “flapper” emanated from the fact that these young women wore unbuttoned galoshes that flapped as they walked.

[9] Barrington Boardman, Flappers, Bootleggers, “Typhoid Mary” and the Bomb. New York: Harper & Row, 1989, p. 11.

[10] Time. 25 February 1929.

[11] Democratic Party rules required that a nominee have the support of 2/3 of the convention.

[12] Boardman, Flappers, Bootleggers, “Typhoid Mary” and the Bomb, p. 92.

[13] Portions of this chapter have been adapted from Compton’s Encyclopedia of American History. Compton’s New Media, Inc., 1994.