A Brief History of the United States: After the War
"It must be a new world, a far better world than any that came before: a world in which the eternal dignity of human beings is respected."
Consensus and change
In the years immediately after World War II, the United States dominated world affairs. Victorious in that great conflict, with a mainland that had escaped the ravages of war, the nation was confident of its mission at home and abroad. U.S. leaders wanted to maintain the democratic institutions they had defended at great sacrifice, and they wished to spread the benefits of prosperity as widely as possible. For them, as for publisher Henry Luce of Time magazine, this was the "American Century."
In the years that followed, most Americans did not question this self-confident attitude. They accepted the need to maintain a firm stance in the Cold War, which took ever clearer shape year by year. They supported the expansion of government authority and accepted the outlines of the rudimentary welfare state formed during the New Deal era.
Copyright (C) 2021 Maki Igarashi .... All rights reserved.