Answer:
World War I changed the lives of women in America by opening up a wide range of jobs to them.
What was WWI?
WWI began after the assassination of Archduke Franz Ferdinand in 1914 and ended in 1918.
During WWI, women were able to take up the jobs left behind by men who went off to war. They moved into many professions, such as teaching, tailoring, clerical work, and official duties. More than two million women were working during the WWI era.
Therefore, the increase in employment opportunities drastically transformed the lives of American women during WWI.
Learn more about WWI at the related link:
https://brainly.com/question/1449762