Women just want a physical relationship, what the hell is wrong with women these days?

Am I the only guy who is seeing a trend where women just want to have a physical relationship and nothing else?

Updates:
Sorry for posting this dumb question; I was angry, and in this short period of time I've come to see some of my immaturity. I still have more to grow. Thank you for your answers.

Most Helpful Girl

  • Sooo.. most men complain that women are too uppity and prudish and play hard to get, etc. But once women start acting in a way that is seen as more masculine, the way you say you want it, suddenly it's a horrible thing.

    Most women I know and have ever known DO want relationships. Maybe not ALL the time (everyone should experience life as a single adult), but everyone I know wants to end up in love. I personally have had trouble getting other women to understand my ability to separate sex from love and enjoy them as distinct (but obviously complementary) things. I'm often told that I'm like a guy. And being in a long-term relationship that is okay emotionally but completely lacking in the bedroom, while having no problem getting my needs fulfilled elsewhere, makes me seem even more like the stereotype of a guy. But just because right now I'm listening more to my sexuality than my heart, that doesn't mean I don't want to eventually settle down and be faithful - same thing for MANY men!

    • I'm not most men. The women in my age group (I am 40) are who I was referring to. They have been burned badly, are leery of getting serious, and just want a f*** buddy.

    • Thank you for your honesty.