World War I lasted for four years, from 1914 to 1918. Ironically, it was known as the “war to end all wars,” a war that would create peace and democracy.
But it became almost the exact opposite. Citizens who had supported the war effort at the outset, believing it would all be over by Christmas, were horrified by the violence and destruction it unleashed.
America was a late entry into the war, joining on April 6, 1917. Yet even though it joined near the very end, WWI shaped America, as it did the rest of the world, in many ways. To mark the 100th anniversary of America's entry into WWI, museums across the country are holding exhibits and displays.
World War I, In Brief
To understand what led to World War I, we need to go back to the early 20th century, a time when European countries were expanding their colonies in parts of Asia, Africa, and South America. They were also building up their military and naval power and weaponry. And back home in these European countries, there was a growing sense of national pride and a desire to dominate.
While each of the above factors was creating a tense situation, the last straw was the assassination of Archduke Franz Ferdinand, heir to the Austro-Hungarian throne. Even though the murder had been committed by a Serbian nationalist group, Austria-Hungary blamed Serbia itself. A month later, Austria-Hungary declared war on Serbia, and other countries began taking sides. The two sides became the Central Powers, consisting of Germany, Austria-Hungary, and the Ottoman Empire, and the Allies, consisting of Serbia, Russia, France, and Britain.
WWI was a total war, which meant all facets of society contributed to the war effort. It was unrestricted in terms of firepower, gave rise to horrors such as trench warfare (fighting from networks of dug-in trenches) and poison gas attacks, and became the bloodiest conflict in history up to that point.
America In The War
The US was neutral for most of the war and joined in only after German submarines repeatedly sank ships, killing American civilians. The US declared war on Germany on April 6, 1917. With fresh American troops, the stalemate between the Central Powers and the Allies was broken. Germany agreed to an armistice (an agreement to stop fighting) in November 1918 and was later forced to sign the Treaty of Versailles. The treaty's harsh terms bred deep resentment in Germany, helping set the stage for WWII two decades later.
World War I helped shape America politically, socially, and economically. Because the US didn’t suffer as many losses as some of the other countries, Americans came home to find a rapidly growing nation. Factories and industries were booming, and people were moving from small farms into major cities. Rising wealth meant growing demand for goods, and people increasingly borrowed money (bought on credit) to pay for them.
In addition, women, who had taken over the jobs of men while they were away at war, seized the chance to fight for women’s suffrage. In 1920, the 19th Amendment gave American women the right to vote. The end of the war also led to some wartime industries closing down. With few jobs for returning soldiers and unions unable to protect workers from rising costs, economic troubles mounted, eventually contributing to the Great Depression that began in 1929.