
What is your view on traditional financial and relationship roles being reversed? Is it okay for a woman to earn most of the money and be the dominant partner in the family, with the man taking a more submissive role?