Are the far left and far right ruining the country?

It seems like nothing can just be enjoyed anymore. The far left is offended by everything; they have absolutely no sense of humor and label anything they disagree with "racist" or "QAnon".

On the other hand, you have the far right, who accuse everything of being "woke", from movies to TV shows, never mind that the vast majority of sitcoms from the '70s, '80s, and '90s had a liberal message and nobody gave a shit. If Diff'rent Strokes aired today, it would be accused of being too woke.
