Thursday, April 23, 2009

Do Americans take America for Granted?

America, on average, I believe has become more and more self-centered. Americans are so in love with themselves, their precious possessions, and their personal lives that they have taken for granted what truly matters. They do not realize how good they have it in the land of the free. They see the news, with its countless wars, oppression, hunger, and death, yet it does not faze them. They are too worried about what makes them happy and how things will benefit them.

Most know American history and the struggles the colonists endured to secure their freedom and liberty from their oppressors, but now that all the fighting is over and done with, Americans take for granted the liberties they have and the freedoms that were bought with blood by their ancestors. They do not truly love their nation. Most Americans, except obviously those in the military, do not know what it means to fight and die for something they believe in. They are not ready to put their lives on the line to protect their way of life and the things they hold dear.

Americans need to stop living in this dream world of theirs, get off their lazy butts, and do something for their country. If they would simply do that, America would not be such a horrible place that other countries look down on and call fat and self-centered, but instead a nation that is forthright and strives to do what is best for the common good. Then, finally, Americans would not take America for granted; they would have a part in shaping it with their own hands.
