Sooo... most men bitch that women are too uppity and prude and play hard to get, etc. But once women start acting in a way that's seen as more masculine, the way you say you want it, suddenly it's a horrible thing.
Most women I know and have ever known DO want relationships. Maybe not ALL the time (everyone should experience life as a single adult), but everyone I know wants to end up in love. I personally have had trouble getting other women to understand my ability to separate sex from love and enjoy them as distinct (but obviously complementary) things. I'm often told that I'm like a guy. And being in a long-term relationship that's okay emotionally but completely lacking in the bedroom, while having no problem getting my needs fulfilled elsewhere, makes me seem even more like the stereotypical guy. But just because I'm listening more to my sexuality than my heart right now doesn't mean I don't want to eventually settle down and be faithful, same as MANY men!
Here is my story of why I think guys only want sex from me. I can't speak for other girls.
Ever since school, guys would try to "take" what they wanted. For instance, boys would always try to lift my skirt, dry hump me, or even kiss me when I didn't like them or want to be touched, and the nice guys always acted like they didn't like me. So over time it just developed into me thinking guys only want sex from me. I'm working on it.
I think that a lot of females these days are adopting the idea that men can go out and get all kinds of sexual favors, so why shouldn't they be able to?! So they go out and hook up with guys, not just for the pleasure but often to validate their worth and gain some confidence by feeling desired, because they can please a guy sexually (even though it's not rocket science, lol, it's not difficult to please a penis).
I think that casual sex in general has decreased the value of pure intimacy on different levels. Nowadays, people just want orgasms, they just want to receive oral sex because it feels good... it's all just about the physical aspect of feeling good, and that's taken so much away from what sex should be. Nowadays it seems like people are so busy trying to rationalize everything: "Well, sex is natural. We, as human beings, are created with sexual organs, and sex feels good, so why not have it?" Totally ignoring the fact that sex is not just a physical act.
So yeah, it's sad. So many girls out here seem to be under the impression that it's okay to just give their body away for the sake of feeling pleasure, and then they get STDs, get bad reputations, loosen their vaginas, and give away one of the most powerful pieces of themselves. All for what? An orgasm? A few seconds of bliss?