It seems like in today's culture, admitting when you're wrong is seen as a weakness; you can't even say sorry without "giving ground." I always thought admitting when you're wrong was the gentlemanly thing to do, but these days people would rather storm away than deal with the truth, even when they're obviously wrong. Meanwhile, the people with the guts to actually admit they're wrong get laughed at, made fun of, and seen as stupid, as if it's bad to not know something and learn from someone else.
So is it better to admit you're wrong, even if the person who called you out started being a total dick about it? Or does admitting you're wrong mean you're stupid, and the dickhead laughing at you is the real "man"? This applies whether you're wrong about a personal opinion, an assumption, your actions, past actions, political opinions, etc.