Wednesday, September 4, 2013

LDOE Report on Educator Evaluations

The Louisiana Department of Education issued a press release yesterday basically claiming that the new Act 54 evaluation system has greatly improved the teacher evaluation system in Louisiana. You can view the press release here. The full report is here.

The report claims that the new system is much better than the old system, which gave too many teachers a satisfactory evaluation. The new system spreads teacher evaluation results across a wider range using four rating levels, from "ineffective" to "highly effective". The report also claims that teachers will now get much more guidance from their evaluators about how to improve their teaching.

The DOE report on the 2012-13 teacher and administrator evaluation results also compares each local school system's evaluation results to data showing the progress school systems have made in reducing the number of non-proficient students as measured by state tests. The Department claims that this data shows the new evaluation system generally aligns with actual results in the classrooms of the state. That is, teachers in parishes where student performance has not improved are more likely to get low evaluations, while teachers in parishes where student performance has improved most get the higher ratings. However, when studying the tables we find all sorts of strange results that are difficult to explain. These will be covered in future posts on this blog.

But there are some very important details and results buried in the statistics that the Department chose not to include in the press release or the written narrative. This blog will attempt to bring these issues to light over the next several weeks. For now, here are the most glaring concerns raised by just looking at the tables of data that accompany the report.
  1. The VAM portion of the evaluation originally had a mandatory 10% quota of teachers that had to be found "ineffective" no matter how well students did statewide or locally on state tests. Moreover, that 10% "ineffective" on VAM takes precedence over the observation by the principal and overrules the principal's evaluation, forcing at least that percentage of VAM-rated teachers to be found "ineffective". So there was a preconceived assumption that 10% of teachers are bad each year, no matter how much improvement is occurring in student performance and no matter what the educational leader of the school (the principal) found in his/her observation of the teachers. This is a vicious system that ignores the professional judgments of our education leaders.
  2. The actual results for VAM-rated teachers in the state report show that somehow only 8% of VAM-rated teachers were found to be ineffective. This is good news, but the report does not tell us how the quota dropped from 10% to 8%, which means roughly 20% of the teachers destined for an "ineffective" somehow avoided that fate. We need to know more about how the DOE modifies its own quotas.
  3. Sure enough, as predicted, teachers of gifted and talented students were more likely than other teachers to receive an "ineffective" rating. The DOE reports that 11.6% of such teachers received an "ineffective" VAM. This is apparently even after a handful of teachers of high-performing students statewide were personally exempted from VAM by Superintendent White and allowed to use their SLTs for their student performance rating. We know of no BESE policy that allows such an exemption.
  4. Correction: This post originally stated that Special Education teachers got higher than average "ineffective" ratings from the VAM. That was not correct. Special Education teachers received approximately the average percentage of "ineffectives" and a higher than average share of "highly effective" ratings. The higher rate of 10.2% I reported was actually for teachers of high-poverty students. Their greater than average "ineffective" VAM scores should be an area of concern.
  5. The greatest disparity or inequity in ratings occurs when comparing teachers rated using SLTs to those subjected to VAM. Only 3% of teachers rated by SLTs were rated "ineffective", while 8% of VAM-rated teachers were. That's almost three times as great a chance of being rated "ineffective" if you are a VAM-rated teacher.
  6. The final result for all teachers is that approximately 4% were rated "ineffective" and 32% were rated "highly effective."
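For readers who want to check the arithmetic behind points 2 and 5, here is a quick sketch using only the percentages reported in the DOE tables (the 10% quota, the 8% actual VAM rate, and the 3% SLT rate):

```python
# Arithmetic behind points 2 and 5, using only the percentages
# reported in the DOE tables (10% quota, 8% actual, 3% SLT).

vam_quota = 0.10    # mandatory share of VAM teachers to be rated "ineffective"
vam_actual = 0.08   # share actually rated "ineffective" under VAM
slt_actual = 0.03   # share rated "ineffective" under SLTs

# Point 2: fraction of the teachers "destined" for an ineffective
# rating who somehow avoided it (the unfilled part of the quota).
avoided = (vam_quota - vam_actual) / vam_quota
print(f"Share of the quota that went unfilled: {avoided:.0%}")  # 20%

# Point 5: relative chance of an "ineffective" under VAM vs. SLTs.
ratio = vam_actual / slt_actual
print(f"VAM teachers were {ratio:.1f}x as likely to be rated ineffective")  # 2.7x
```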
The DOE report on the evaluation results seems to indicate that only the VAM results are reliable in determining the true ratings of teachers. The report indicates that the DOE would like to see more teachers covered by VAM ratings. I hope that educators continue to strongly object to the assumption that VAM should dominate the evaluations. Remember that the 10% ineffective figure is an arbitrary quota; it is not based on any scientific study showing that any particular percentage of teachers is ineffective. The DOE bosses have been trying all year to bully our principals into making the observation part of the evaluation agree with, or "align" with, the VAM. That is not what the law specifies. The law makes the observation portion of the evaluation an independent assessment of educator effectiveness, one that is not supposed to be influenced by the VAM. This effort by the DOE to make VAM the dominant factor is just wrong and not supported by any real science.

My concern at this point is that teachers who are evaluated using the VAM are going to want to transfer to other subjects or areas where they do not constantly live their professional lives under the gun of VAM. That will make it harder than ever to find dedicated teachers who are willing to teach the basic skills.


Jeremy said...

The problem with all of this (or at least one part) is that there are too many variables. If you're doing a science experiment, how many variables do you change at one time? Only one. If you change more than that, you can't figure out what actually caused the change. Well, when we're dealing with kids who can't read anywhere near grade level, no parental support, and kids who don't study, we end up with all of these variables at once.

Anonymous said...

And the DOE keeps changing the variables and data, so no one really knows what they are doing. You can't infer anything from the VAM about any teacher, because no one can validate any of the scores or any of the computations, and the DOE manipulates all the data to show what they want.

Anonymous said...

My problem with SLTs comes in the form of a teacher-made pre/post test (which could even be an observational checklist for music, PE, foreign language, etc.). My principal required VAM teachers to create an SLT assessment just in case our VAM came back negative, so we would have the pre/post test results as "evidence" our kids learned throughout the year (which would REALLY be influential in the event a previous VAM was Proficient/Highly Effective). The objective would be to use our own assessment that could not be manipulated by the DoE like VAM can (the distrust is disheartening).

My experience with SLTs gets more interesting! When I gave the pretest "SLT assessment," my kids asked me, "If this is a pretest, can we assume we really won't know much of this stuff? And we won't be graded (have it held against us) on what we don't know, right?" The obvious/logical/truthful answers to these questions are, "Yes, you probably won't know much of the info... and no, it would be wrong of me to hold a grade against you for something no teacher to this point should have covered yet." It took them all of 10 minutes to answer a 45-question test, and I was reprimanded for not having one question per benchmark/standard, but that test would have been over 80 questions (multiple class periods to finish)!

Now let's fast-forward to the end of the year, after I've taught this "stuff" all year, as I get ready to end the year with my post-test final exam (held against them). I could have supplied them with a study guide (I didn't, in order to TRY to maintain some semblance of integrity for the test), but even the parents expect such a study guide at the end of a course! The temptation would be parent-pressured! Now that the post test would "count," it took TWO days for the students to complete it! Guess what REMARKABLE growth they showed for my SLT assessment? I came out fine on my VAM, but if I didn't...does the DOE REALLY believe this SLT business is "fair/just"?
If we are going to remove the bottom 10% of teachers, the question I would pose is, "Would my job be on the line if the SLT teachers (subjective teacher evaluations) were viewed as strictly as the VAM (objective teacher evaluations)?"

kellysuch said...

Anonymous: I've preached the same sermon on SLTs and VAM! I'm so glad to read this tonight, affirming I'm not going crazy!

Michael Deshotels said...

Sorry, but I made a mistake on the % ineffective for SPED teachers. The 10.2% ineffective rate was really for teachers of high-poverty students. The SPED teachers were close to the average for all VAM teachers.

Lance Tankmen said...

Where did teacher evaluation come from? What started it?

Anonymous said...

Trying to understand this... what determines whether VAM or SLT is the evaluation type for each individual teacher? Subject area? Years teaching?