Teaching is the only profession I know of where people outside the field think they should tell those doing the job how to do it.
Let's face reality: a lot of the criticism leveled at teachers is sexist bullshit, pure and simple. You will even hear this crap from women in other fields who think they are so fucking superior to those working in vital, female-dominated careers like this one.
The arrogance absolutely makes me puke.
Fake Democrats and union moles make the perfect combination to destroy the teaching profession in the United States.