World War I was the main cause of World War II

World War I was the main cause of World War II. The victors of World War I, chiefly France and Britain, placed the blame for the war on Germany. The Treaty of Versailles, the peace treaty signed at the end of World War I, ended the state of war between the Allied Powers and Germany. Under its terms, Germany was legally required to pay reparations for the war. These payments contributed to Germany's economic collapse: the country fell into a depression well before the Great Depression began in the United States, and as bad as conditions became in the United States, they were far worse in Germany.