Guys, When did you realize being nice to women was a bad idea?

I am not talking about being nice to women who care about you or women who are attracted to you. I am talking about how society tells you at a young age to treat women with respect. But as you grow up, you start to realize that respect needs to be earned by women.

My dad told me never to be nice to a woman unless I knew she was attracted to me. The teachers at school tried to drill that feminist bs into my head; I did not care because I was getting all the girls at school and did not see any reason to change.

My dad told me women would think you are trying to get too friendly with them and would spread rumors about you. So lucky for me, I was never a nice guy who kissed women's asses.

I am happy I had a good dad, and I am still living the good life.