What has happened to America? This used to be a country where, if you worked hard and put in your time, you could make a decent living and have a family. Not anymore. The rich are getting richer and the poor are getting poorer. Pretty soon there will be no middle class left. Nothing but peasants doing the work for their wealthy business overlords.
This is a bit of a rant on my part, born of my own situation. I am 26 years old, a Marine Iraq combat veteran, and a college graduate who finished magna cum laude. I paid off my college tuition with the money I saved from going to war... and yet the best job I can get is still a heavy labor job that pays little and offers few benefits. I can't afford to live on my own, and I can barely afford health insurance. I am trying, believe me, to get a decent job, but there is little out there. I can't even afford a girlfriend, much less a family. What happened?
Are you rich people at the top so greedy and sociopathic that you must squeeze every last penny out of the working man's pocket just to keep growing? Is it necessary to outsource every single American job until there is nothing left? Does the government exist to serve the people, or do the people exist to serve it?
I am not a Democrat or a Republican. Both parties make me sick. Politicians make me sick. Big businessmen make me sick. Special interest groups make me sick. The media makes me sick. You know what else? This country is beginning to make me sick.