I think so... telling us how to look, our place in society and who we are destined to be.
A woman is supposed to be all about looks: an object for sex, a passive creature, a fragile specimen.
A man needs to be tough, barely show emotion, run a household, be financially successful, and have that cold, hard body.
Anything outside of this is deemed failure...
Is anyone else annoyed by this, or do you simply accept it and move on?
Most Helpful Guy
Yeah, I do think the media affects us too much. What you see on TV plays a big role for women (and men too, though not as much): how your makeup needs to be, what you should wear, how to act, etc. I think it's mostly because of celebrities. It makes people feel like they're not good enough in any category when they're perfectly fine. I don't like the role the media plays in our lives.