
World War II

March 20, 2019



Resources

On December 7, 1941, the United States was pulled into the war that had already engulfed most of Asia and Europe. Fighting alongside the Allies, America worked to defeat the totalitarian governments of Japan, Italy, and Nazi Germany. American society was transformed during the war: many women entered the workforce for the first time, and much of the economy shifted production to support the war effort. World War II not only toppled fascist governments overseas, but also helped pull the United States out of the Great Depression and permanently changed parts of American culture.