Now wait... are we not the United States of America that live in North America? Or are we just the United States? Doesn't it say "the United States of America" right on our Declaration of Independence?
I don't mind learning something new or looking at things in different ways, but why am I all of a sudden racist, classist, and arrogant? Perhaps I'm ignorant, plain fucking stupid, or just misinformed. I invite discussion and expanding my mind, but I loathe doing it from the standpoint of "didn't you know you suck?".
I still adore you, Jen, and respect you immensely; we're just talking here.
~~~shark~~~~~~~~
__________________
take a fish boating