So, just curious about everyone's opinion: do women have the upper hand in society and relationships, or do men, or is it equal? Or does one group have it better in one area while the other does in the other? Please explain why you believe it one way or another. All opinions welcome.