Here is the breakdown. Tell me if you see some causes for concern.
30% of the rating was through student and parent surveys. Surveys were completed by NICHE users (NICHE being the company conducting this "study," BTW).
25% of the ranking was determined by academic grades using some sort of NICHE-designed system.
10% was determined by teacher absenteeism.
20% was determined by teacher salaries, both real and indexed.
10% was determined by the number of first and second year teachers present on the faculty.
5% was determined by student to teacher ratios.
Sigh.
This "story" will make the rounds. Some districts will use it for bragging rights. Some district administrators may even use it to berate their employees or, worse, it is as guidelines to improve the schools in a district.
Here is a case of a journalist NOT doing her or his job. Yes, there are links to the measures. But where is the commentary about how accurate any of this is in determining the schools with the BEST teachers? This sort of measurement is suspect from the get-go. It is about as valid and reliable as Renaissance Learning's annual reports about books and reading. Why do we have this obsession with ratings? It seems to have bled from the worlds of finance and sports into education. Somehow we must be able to rank kids, teachers, schools, programs, etc.
Let's celebrate good teaching and not put teachers in some sort of swimsuit and talent competition, please. Let's keep the focus on the student.