Wearing make-up, in some form, is something many, if not most, women do, depending on culture and other factors.
This question is open to both men and women and what they think.
A younger relative of mine said she felt 'less of a woman' because she doesn't wear make-up often, if at all. I realised I too have often felt this way, because I have actively decided not to wear it (except for special occasions, where it is minimal, or in dire need, e.g. acne, bad skin), despite it being so prominent in media and advertising.
So, do you think all women should be expected to wear make-up? And do you agree that not wearing it can make a woman 'less of one'?
- Yes, it should be expected of all women to wear it. Vote A
- No, it should not be expected. Vote B
Most Helpful Girl
Honestly, society prefers people without makeup. They say you should naturally have high cheekbones, clear skin, rosy cheeks and long black lashes.
But I understand no one looks like that, so I don't care if they don't have those features. I have acne and I honestly feel like it makes me less of a beautiful/pretty woman because I choose to wear makeup.