Yet this profession is often looked down upon in American society and isn't glamorized the way other professions are.
A common saying goes: "Those who can, do; those who can't, teach."
It's a rather sad way to think about a profession that has a HUGE impact on how young people develop into adulthood.
----
Why do you think the teaching profession is less respected in America than other professions that have far less societal impact?