Do women actually like men or do they only want them for their resources, work, and authority in society?

Do women even find men desirable, or do they only care about their resources, work, and power?
It seems to me that the only people who actually find men desirable are gay men.