My personal opinion is that men are becoming more feminized, and not because of clothing or anything like that; I think that's silly, since I own skinny jeans and even men's thongs and still see myself as a manly man: combat tours overseas, time working on oil drilling rigs, a girlfriend, the whole nine yards.
I feel as if men are becoming more feminized for a whole set of reasons that I won't get into outright.
Just wondering what others think about the matter.