Why do people always tell women to smile? Why do they think it's ok?

Like seriously? Why should we give a fuck if you like our face with a smile or not, and why do people feel entitled to tell others how to look and what to do "so they look better"... like wtf? People sometimes tell me this and it's seriously annoying. I asked my brother and he says no one ever tells him that... sad
I am guilty of doing this and understand that I need to stop
Vote A
I don't ever do this
Vote B
I'm a woman who has experienced this and I find it extremely annoying as well
Vote C
I'm a woman and no one ever says this to me, but I can see how it can be annoying
Vote D
I do this and see no problem with it
Vote E