At the end of WWII, the United States was the only nation that was financially better off than it had been at the start of the war.