Is telling girls to smile a form of objectification?
Basically, the article claims that it's "sexist" to tell women to smile, and even goes so far as to call it a "very common form of harassment that objectifies women." So I'm wondering: do you agree or disagree that telling women to smile objectifies them and is offensive? What do you think?
Most Helpful Opinions