Answer:

The United States became an empire in 1945. It is true that in the Spanish-American War the United States deliberately took control of the Philippines and Cuba, and it is also true that it began to think of itself as an empire at that time, but in practice it was not yet one.