I watched the video presentation this week in which the DOE explained the COMPASS changes approved by BESE last week. I concluded that the changes could have been explained just as well in about 3 minutes instead of a boring 34.
That's it. The changes in COMPASS could have been explained in 3 minutes.
I was struck by the fact that some of the most important questions about the validity and effectiveness of the program were not asked or answered. Here are some that popped into my head as I watched the video:
- What has been done to correct the lack of stability of the VAM? That is, the fact that VAM can give a teacher two completely different ratings from one year to the next even though the teacher teaches in exactly the same way both years. Wayne Free of the LAE and Dr. Mercedes Schneider (an independent researcher), for example, found that of the teachers rated in the top 10 percent one year, 54% will fall below the top 10 percent the following year even if they teach exactly the same! There are similar stability problems with the VAM instrument for all subjects and levels of performance. The only answer I have gotten on this issue (which was not in this video) is that for teachers of high-achieving students, the VAM formula has been adjusted to expect less growth than before. This change will help teachers of such students get a higher VAM score. But nothing has been done for all the other teachers who may also get erroneous VAM scores.
- Is there still a 10% guaranteed failure rate for teachers who get a VAM rating? When you watch the video, you get the impression that in order to get a good evaluation, teachers should strive to get their students to achieve at or above the average projected VAM score for their class. What if classrooms across the state generally do better than the VAM projections predicted? Does that mean more teachers will get satisfactory VAM scores? I don't think so. You see, the VAM system is set up to rate teachers in relation to each other. Therefore the bottom 10 percent of the teacher ranking will always be rated as ineffective no matter how students do statewide. If I am wrong about this, I hope some education official will notify me right away so I can correct this information for my readers. My understanding is that this ranking system is designed to give the same overall distribution of teacher evaluations no matter how well students do across the state. So overall teacher scores can never improve as long as BESE keeps this system, which ranks teachers on a bell curve. Over a period of several years, I wonder how many teachers will be put on a path to dismissal.
- What about the two-thirds of all teachers who will not get a VAM score? Will a certain percentage of them be required to fail the evaluation? The last I heard from White, the answer is no: only VAM-rated teachers have a required failure quota. Apparently the State wants to push mostly for the removal of Math and ELA teachers.
- I have concluded that in the long run, VAM will count for 100% of a teacher's evaluation, not the 50% originally specified in the law. Here is my logic: we already know that for the bottom-rated 10 percent of teachers, the VAM overrules the principal's evaluation and the teacher must get an ineffective rating. So everyone in this group is rated entirely by VAM. But what about the other 90%? In the question-and-answer portion of the video, someone asks: how can a teacher be assured that her principal's evaluation is fair? The answer they give is amazing! The state will analyze all teacher evaluations and compare each principal's observation rating with the VAM rating to see how well they agree. If the teacher got a good VAM score but a poor observation score, we are led to believe the observation score will be corrected upward. But we are also told that if the principal's observation score is too generous compared to the VAM, the score can be adjusted downward. That tells me that everything depends on your VAM score. Remember, that's the score that can vary immensely from year to year even if you teach exactly the same way each year. It is amazing that the VAM score, with all its inherent error, is still considered superior to the principal's judgment. In the end, the VAM counts for 100%.
- Will principals still be punished if their teachers score poorly on the VAM? I am guessing the answer is yes. According to the accountability plan submitted as part of Louisiana's ESEA waiver, the principal's evaluation will be based partly on how many teachers on his or her faculty are rated ineffective. In theory, this requirement is intended to coerce principals into firing teachers who score low on the VAM. Once again, everyone in education is tied to student test scores. That is, everyone except John White. Maybe he will get a bonus from Jindal for firing a lot of teachers and principals.
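For readers who want to see the instability problem in concrete terms, here is a small illustrative simulation (the numbers are hypothetical, not the state's actual data or formula). If a VAM score is part genuine teacher skill and part statistical noise, a large share of teachers in the top decile one year falls out of it the next year even though their teaching never changed:

```python
# Purely illustrative simulation, not the DOE's VAM formula:
# each year's score = fixed teacher skill + random classroom noise.
import random

random.seed(0)
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]  # teaching quality, unchanged year to year

def vam_year(skill, noise_sd=1.0):
    """One year's simulated VAM scores: skill plus noise."""
    return [s + random.gauss(0, noise_sd) for s in skill]

year1, year2 = vam_year(skill), vam_year(skill)
cut1 = sorted(year1)[int(0.9 * N)]   # top-decile cutoff, year 1
cut2 = sorted(year2)[int(0.9 * N)]   # top-decile cutoff, year 2
top1 = [i for i in range(N) if year1[i] >= cut1]
fell_out = sum(1 for i in top1 if year2[i] < cut2)
print(f"{fell_out / len(top1):.0%} of year-1 top-decile teachers "
      f"fell below the top decile in year 2")
```

With noise as large as the skill differences themselves, well over half the top-decile teachers drop out the following year, in the same ballpark as the 54% figure Free and Schneider reported.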
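The forced-curve point can also be sketched in a few lines (again, a hypothetical illustration, not the state's actual procedure). Rating teachers against each other means the bottom 10 percent is labeled ineffective no matter how high everyone's scores climb:

```python
# Illustrative sketch of a forced curve: the bottom 10 percent
# by rank is always "ineffective", regardless of absolute scores.

def forced_curve_ratings(vam_scores):
    """Label the bottom 10 percent of scores 'ineffective'."""
    n = len(vam_scores)
    cutoff_rank = max(1, n // 10)  # bottom 10 percent by rank
    ranked = sorted(range(n), key=lambda i: vam_scores[i])
    ineffective = set(ranked[:cutoff_rank])
    return ["ineffective" if i in ineffective else "satisfactory"
            for i in range(n)]

# Even if every teacher's score rises by 20 points statewide,
# the same number of teachers is still rated ineffective.
scores = [55, 60, 62, 70, 71, 75, 80, 85, 90, 95]
improved = [s + 20 for s in scores]
print(forced_curve_ratings(scores).count("ineffective"))    # 1
print(forced_curve_ratings(improved).count("ineffective"))  # 1
```

That is the whole objection in miniature: under a relative ranking, statewide improvement cannot reduce the number of teachers the system fails.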