Answer:
Yes.
Explanation:
World War II and the Birth of an Empire: The United States became an empire in 1945. It is true that in the Spanish-American War the United States intentionally took control of the Philippines and Cuba, and that it began to think of itself as an empire at that time, but it really was not one until after World War II.
Hope it helps!
Follow for more answers.