Should women be educated about the ruthless nature of men very early on in their lives?

Why don’t mothers (or parents in general) educate their daughters about men? We’re raised by society and the entertainment industry to believe men truly want love. Then we’re blindsided by men’s true nature, which is simply to use and misuse a woman until she’s roughly 30, her prime fertile years gone, and she finally knows, without a shadow of a doubt, the truth about men. She’s left feeling bitter and decides that using men in return is her only option. At that point, she may find a man who wants children and is therefore willing to be in a relationship with her (but for a man, it’s never about love).