You know, lately, I really dislike being an American.
Yes, yes, I know. I have heard about and studied American history. I know what we have gone through as a country to earn our name, our essence. I admire and believe in everything that America stands for: freedom of speech; equality for every human, regardless of race, gender, or orientation; living without fear of religious oppression... But we are all talk!
We claim to be all of the above. We are equal... but if you are gay, you can't marry the person you love, which results in the denial of hundreds of rights. We support women's equality, but not only do women make 75 cents to a man's dollar, they must also leave their bodies up to the government's discretion.
It makes me sick that congressional hearings on birth control can occur without the participation of any women. It makes me sick that our country thinks it is constitutional to deny rights to any human being because of sexual orientation. It makes me sick that in a country founded on a desire for religious freedom, we base many of our laws primarily on one religion. We speak highly of the separation of church and state -- yet we are still "One Nation, Under God." (Please don't get me wrong. This is not an argument against religion. I fully support anyone's right to practice any religion they choose, as long as it doesn't compromise the freedom or safety of any other person.)
We will not be what we claim to be until these things change.
I hate living in a country where appearance and beauty are more revered than intelligence and compassion. Every cover of a gossip magazine only confirms it. Any time a female celebrity is caught in an unflattering photograph, it's front-page news. Really, who cares if (insert celebrity here) gained a few pounds?
I hate knowing that the cast of Jersey Shore makes more money than teachers do. The distribution of wealth in this country is bananas. (I am not saying that the destructive and endangering actions of the Occupy movement are justifiable, however.) I understand that idiots make for good entertainment, but do they really deserve millions of dollars a year when hard-working, educated people struggle to make ends meet?
I hate that the people who are able to contribute to causes don't. Instead, they own multiple cars, pay mortgages on million-dollar, thirteen-bedroom mansions to house their four-person families, and look down on those less fortunate. I recently read an article about a wealthy businessman who dined at a restaurant, spent over $100 on the meal, and left a 1% tip along with a nasty piece of advice. Another photo showed no tip at all, and some even nastier advice. I'd like to think that these disgusting people took that tip money and put it toward something beneficial to their community, or used it to buy food to donate - but we know that isn't the case.
Please know that this is not my pledge of anarchy. No, I am not going to burn the American flag or move to Canada. (However, if they ban the use of tampons like they have been discussing...) I am just very upset with my country right now. I can't wait for the US to become what it already claims to be.
We will not be free until every human has the right to marry whomever they want.
We will not be equal until there is no financial discrimination against women, and until the government removes itself from a woman's vagina and, furthermore, her personal business.
We will not be free from religious oppression until Biblical statements are no longer used to define laws and rights in this country.