Kind of an oddly worded question, I know. I'm not sure of the right syntax or words to phrase this question, but I thought about this on my way home tonight.
Do (straight) women still give a damn about what men think, or ever try to do anything to impress them? Because it seems like they don't. Online and offline, it seems like the majority of women under 35 these days have these sh*tty, entitled, selfish attitudes: that they deserve to be wined and dined, and that men can either "step up" and be grateful little paypigs and fin-dom slaves, or "not waste my time," as they'd put it. It's such a sh*tty attitude, but it seems like the norm for young women nowadays.
Are women still modest, anxious, or even nervous around guys they like? Women who want guys to like them and date them, not just for resources or money, but for the man himself? And no, I specifically don't mean 10-out-of-10s, perfect Chads, millionaires, and celebrities, but working-class, quality, non-elite men that they happen to catch genuine feelings for.
I guess what I'm asking is: do straight, monogamous, adult women still care about romance and making a good impression on men? (No, I don't care what queer women and sluts think about this.) If they do, I sure as f*ck haven't seen it in years. I would've made this a poll as well, but I want people to actually answer the question.