Throughout the 19th century, America expanded its control of the continent westward to the Pacific Ocean.
By 1880, many American leaders felt the U.S. should join the European nations in establishing colonies overseas.
Thus began America's foray into imperialism: the policy by which stronger nations extend control over weaker nations.