I think it may be kind of a psychological thing. Women tend to be brought up a certain way: that it's not "right" to touch ourselves, that it's a "forbidden" zone that should be left alone, that it's dirty and something to be embarrassed by. That extends even to just looking, naturally. So many women just never take the time to really get to know their stuff. Many are embarrassed. With guys, your equipment is right there in your way, you can't miss it. You handle it every day. Ours is tucked away. We don't even have to interact with that area all that much. So it's kind of a fear-of-the-unknown type thing. We're not used to it, it's foreign to us, so it looks alien or something lol, and we dislike it.
I think once a woman gets more comfortable with herself and accepts her body and all its juicy workings, it's less of an... eyesore, I guess you could say.
I think that may also be how it works for some men. The more they see and learn and accept, the more they like 'em lol
Love me, love my vagina. Embrace it. :)