How did American involvement in imperialism and war affect the American "home front"? The effects of World War I on America were wide-ranging, covering political, economic, and social life. World War II was the worst war in American history because of the European dictators, but many positive effects came out of that war. American imperialism was economic, military, and cultural in nature. World War I changed America; some of the changes during this war were good, and some were bad. American involvement in World War I also had some worrisome indirect effects on the country.

Unlike the countries of Europe, the machinery, factories, and homes of the US were not destroyed. Manufacturing, production, and productivity had expanded out of necessity during the Great War. The US emerged as the world's industrial leader, and the American economy was booming; profits were rising, which ushered in the period in American history called the Roaring Twenties, with a huge rise in consumerism for the well-off.

World War II forced women into the workplace. Women went to work and became spenders. They did many things for the war effort, both at home and in uniform. Women did not just help their families; they gave it their all, and some even gave up their lives. As you can see, women played a very important role during this period.

World War II also drew America into a great deal of new activity and new ideas: art, music, and intellectual life all changed. Unfortunately, these changes cost many lives; World War II had many positive effects, but a great many of them came at the price of human life. American colonialism, whether direct or indirect, influenced and controlled other nations and their policies, and such influence is often closely associated with expansion into foreign territories. All of these developments had both positive and negative effects on the American home front.
In sum, the effects of World War I on America were wide-ranging, touching political, economic, and social life. World War II was the worst war in American history because of the European dictators, but many positive effects nevertheless came out of it.