Answer:

Generally speaking, World War I made the American people more bitter and isolationist in the years leading up to World War II, since many felt that American lives had been wasted in a European conflict.

Answer:

-build large military forces

-expand their colonial empires

-form military alliances

-expand their economies
