Are there any TV shows that portray negative body image for men?
I know this pressure exists, because a lot of my guy friends say they're expected to be buff, muscular, tall, strong, etc., or else they're not considered a "man". I'm sure there's a show out there that portrays this, but I can't think of any specifically. Can any of you?