Do you feel as if men are being feminized?

Simple question; I just found it interesting. I see a lot of people complaining about men becoming more feminine, whether it's wearing skinny jeans and whatnot, or no longer being able to do things men were traditionally good at.
My personal opinion is that men are becoming more feminized, but not because of clothing or anything like that; that's silliness in my opinion. I own skinny jeans and even men's thongs, and I still see myself as a manly man: combat tours overseas, time working on oil drilling rigs, a girlfriend, the whole nine yards.
I feel as if men are becoming more feminized for a whole set of reasons that I won't get into outright.
Just wondering what other people's opinions are on the matter.