When will people stop telling women to smile?

Somehow, a woman's neutral expression bothers many men, and even some women. If she is upset or angry, her chances of being told to smile are even higher.

This is sexist, because men are never told they look better when they smile, and it is really annoying for many reasons:

1. If a woman's neutral expression bothers you, then the problem is yours, not hers. You are insecure.
2. Telling her to smile won't make her happy if she is upset.
3. You're basically telling her to act in a way that pleases you, so why would she do it?
4. Fake smiles are creepy, and if you prefer them, then you clearly have a problem.
5. Telling people to act as if their only emotions are positive ones and to suppress the rest is not okay. Wearing a mask is not okay; at most, it suits a professional environment.
6. Since it is directed at women only, it is clearly about the woman looking as if she is inviting you to spend time with her. So again, you have some insecurities; she is not the problem.