My personal opinion is that men are becoming more feminized, and not because of clothing or anything like that; that's silliness in my opinion. I own skinny jeans and even men's thongs, and I still see myself as a manly man: combat tours overseas, I used to work on oil drilling rigs, a girlfriend, the whole nine yards.
I feel men are becoming more feminized for a whole set of reasons, which I won't get into outright.
Just wondering what others think on the matter.