Why do people always want the "bad people" to die and always cheer for the "good guys"? Life is never black and white, and everybody is good and evil to some degree.
That's why I like GOT, EVERYBODY suffers...
I just think some people watch too many Hollywood films and somehow expect reality to reflect the ideals of those films.
I especially hate "feel good" films...people who appear to be happy all the time are almost always the most miserable deep down.
Off-Topic rant
The things I blame the western media for are:
1) The idea that there is one ideal person for each of us ("the one"), whom we will instantly fall in love with and should marry.
People get married for a whole host of reasons, and none of them should ever be "love". This, IMO, is the main reason why there are more divorces now than there have ever been.
2) The idea that we should all strive for "happiness", that life is about "living" (whatever that means) and that we should smile and laugh every day.
Does anyone really think life is about happiness? Try telling that to people in the third world, especially those who, as children, watched their father get shot and their mother raped, and were themselves either used as slaves or forced to marry the very perpetrator. Life isn't about "enjoyment" or "happiness"; life is about growth (i.e. learning from our own and other people's mistakes and genuinely becoming better people).
3) The idea that men are useless, clumsy and subordinate to their wives.
Most American sitcoms that focus on families portray the "husband" or "father" as stupid and incapable of functioning without his wife. This is also the case in adverts, which portray men in general as idiots. And if you think this is BS, try to find a single advert where only a woman or women are portrayed as stupid.
I could post more but I've got things to do.
Off-Topic rant over....