Friday, February 26, 2010

The Treaty of Versailles

The Treaty of Versailles was signed on June 28, 1919, officially ending WWI. The treaty stripped away much of what Germany had worked so hard to attain: it required Germany to give up 10% of its land and all of its colonies, and to accept sole responsibility for causing the war. These terms crippled Germany's economy, which indirectly led to the rise of Fascism and Adolf Hitler. The country was weak and in need of a powerful leader, and Hitler presented himself as exactly that, promising to restore the country and help its people. The German people put Hitler in power, and by the time they saw his true, sick intentions it was already too late; he was a dictator.
