Hey all,
If anyone read my previous thread, y'all know what I think about it.
The fact is the U.S. is obsessed with race and many believe in myths (no, I won't go there this time).
The impression of the U.S. where I'm from is that they're generally quite arrogant. Someone here said how Africa isn't a country...yeah, it ain't, but you know what? Nor is America. Whenever "America" is mentioned everyone assumes it's the U.S.; the rest of us have to say "South America" or "North America", as if the U.S. is the center...more like the center of the Universe...
And what's with American sports, and how virtually no one outside the U.S. plays them? Baseball? The NFL? (and what's with calling the NFL "football"...they hardly use their feet) The NHL? And they're about the only nation to refer to the world's best sport as..."soccer"? WTF? Where did they get that word from anyway?
My personal impression is that they don't consider football "masculine" and therefore reserve it for the women, which is why the U.S. Women's team is so good...they should try playing rugby, btw.
Then there's the obsession with large dicks...at least in the porn industry. And there's the mispronunciation of the English language...then there's the death penalty, the fact that WW II didn't interest them until Pearl Harbor...the "war on terrorism", referring to Barack Obama as "black" when he's mixed-race, the obsession with the ***, the fact that you have to pay to go to college and pay for health treatment, the obsession with fast food, the fact that they love the sound of their own voices, the fact that all their films must have a "happy ending"...and that's all I can think of at the moment.
Oh and the "cuckold".