1. treat men like utter sh*t,
2. hate all men everywhere (feminists),
3. treat men like walking ATMs and pay-pigs,
or 4. are just queer (gay, bi, lesbian, etc.; all the same thing).
No, I'm NOT hating on women or anything; just being honest.
Are there still any women who hold men in high regard and want to impress them and be a good woman for them (and themselves)?
Like this song they constantly play at the gym for all the ladies. (It's 3am and I'm kinda sleepy at the moment.)