I don't see the Declaration of Independence as a document that defines "Americans." It was written by rich white men, and if you take a look at our country and our government today, you'll find that that "America" no longer exists (though there's definitely room to argue it does).
I look at the document as, yeah, laying out some great things: that every man is created equal (how are we doing on that?), that we are all endowed with certain unalienable rights (which are what, exactly? Because going back to the whole "equal" thing, I think there would be some discrepancies), that we're all allowed to pursue our happiness and whatnot, and that any government that impedes that MUST be overthrown.
Well. I don't know about you, but it looks like I have a government to overthrow.
I could mention a number of things. I could mention the fact that I'm not allowed to adopt in some states, marry in most, or pursue a career in the military. "Americans" years upon years before me were slighted by this Declaration, this most "American" of documents.
Why is it that, despite the patriotism I should feel from this document, I feel nothing but a sense that it's really nothing more than a self-serving bitch-fest for men who were well-off and didn't like seeing their money taken away (despite having been ever so graciously defended by the British in a little tussle known as the French and Indian War)?
If this document truly held relevance to the present, I think there would be more people referencing it as a reason things need to get done around here.
But I won't hold my breath.