Girls, what do you think they mean?
What did women have before that women nowadays don't have anymore? I think women's nature, such as empathy and compassion, never changes.
Please don't give the generic answer "feminism," and try not to be vague.
Thaaanks <3