Do you think America supports its teachers?

I was reading an article about the Democratic debate from a few days ago (I didn't watch it myself), and apparently the candidates all agreed that teachers should be paid more. I scrolled through the comments and found a comment with the following response:
[screenshot of the comment and its response]
The responder also stated in a later comment that everyone they listed had a Bachelor's degree, in reference to teachers needing to get their Master's.

I was always under the impression that teachers got paid enough and were whining and complaining about nothing. Then I found the following screenshot, which has been circulating in my area for the past couple of days. It's a message a teacher sent out to the parents of her students this school year concerning homework:
[screenshot of the teacher's message to parents]
What do you think? Does America as a country and society do enough to support teachers and education? My husband thinks teachers get paid plenty and are respected enough for what they do, and that only the ones who care about the check leave. His family agrees with him, and they think that if teachers ask for more money, then they do not care about the kids who need them. I agree to an extent, but I don't think it's so cut and dried. Caring for kids doesn't pay your bills, so I think they should make enough to support themselves, especially since they're often responsible for buying their own supplies.
Updates
So... I see lots of people talking about teacher pay, when all I asked was whether we support teachers. Support does not necessarily come in the form of a paycheck 🤷🏽‍♀️