There is even a study showing that women are more turned on by naked pictures of female bodies than by naked pictures of male bodies.
Now, I have heard some people say that women do like the male body, they just don't admit it. But I find it hard to believe that women are part of a conspiracy to make men feel crap about their bodies.
So why is this? Why do women even bother dating men if they are more turned on by female bodies?
If women really do like the male body more, then why are they so dishonest? Are they out to make men feel depressed and give them low self-esteem?
What is it?