District Judge Tim Kelly ruled Friday that it violates the Louisiana Constitution for MFP funds to pay for students to attend private schools. The ruling came in lawsuits filed by the LAE, the LFT, and the Louisiana School Boards Association. It prohibits the funding of the public-to-private school vouchers now in operation and of the course choice program scheduled to start next school year. See the Baton Rouge Advocate story.
This ruling will be appealed to the Louisiana Supreme Court by the Jindal administration and BESE. But for now it is a great win for public education, for taxpayers, and for children. The voucher program was sure to grow over the years because of heavy advertising and student recruitment by private schools both small and very large. Under the voucher system, millions of our tax dollars would have been diverted from the classroom to these advertising campaigns. In some states, almost half of the education funds captured by virtual school giant K12 were used for advertising. Virtual schools can do this because of pupil-teacher ratios as high as 250 to one; they also have no buildings to maintain and no buses to run. They provide such a poor education that they must constantly recruit new students to replace their massive dropouts!
Educators and school board members must prepare for a continued fight with the Jindal administration. Even if the voucher decision is upheld, his new "presidential" sized ego will compel him to fight public education in every way possible. These recent events demonstrate that there is now an all-out war against public education. But this war is being waged by privatization interests, not by the average taxpayer or public school parent. Remember, 98% of eligible parents did not apply for vouchers!
It is more important than ever that all teachers join and pay dues to their unions/professional organizations. That is, join either the LAE or the LFT. Stop being paralyzed by the bogus anti-union propaganda! For the first time in our history there is a strong coalition uniting teacher organizations and school boards. This coalition made the win possible in the voucher lawsuit and will be even more important in the upcoming legislative battles. Teachers need relief from the extremely unfair VAM, and both principals and teachers are needlessly burdened by the dog-and-pony show required by COMPASS. We need strong organizations representing teachers, administrators, and school board members to fight the continuing Jindal privatization campaign. Remember that this court decision has no effect on the addition of more charter schools, which are being pushed harder than ever before.
In addition to joining your union/professional organization, you can join my Defenders of Public Education for free! Just send me an email at louisianaeducator@gmail.com. Tell me who your state Representative and Senator are, along with the email address you want me to use to send you updates on important bills before your legislators vote. Together we can make a big difference!
Saturday, December 1, 2012
Tuesday, November 27, 2012
SPS Scoring Bias
Dear Readers:
I am very pleased to publish a guest post this week by Dr. Mercedes Schneider, a Louisiana educator who has a doctoral degree in statistics. It is an excellent analysis of a major flaw in our Louisiana school grading system.
The grades that our schools in Louisiana have been assigned by our LDOE have almost no relation to the actual quality of education offered by each school. By far the dominant factor measured by the SPS scores and the resultant grade of "A" through "F" is the level of poverty of the students attending the school. In addition, Dr. Schneider's analysis explains how the present school grading system favors high schools and combination schools over middle and elementary schools.
Next year the letter grading system will be based on 150 points instead of 200, but the letter-grade cutoffs are not proportionally adjusted. In addition, schools will receive bonus points when more than one third of their low-performing students make achievement gains greater than those predicted by the new value-added formula. The LDOE can and does change the VAM formulas to produce whatever results it wants. But the primary result of all this manipulation has been an attempt to convince parents and voters that a large portion of our public schools are failures and that we should instead entrust our children and our tax dollars to private, for-profit education providers that have almost no accountability for how they spend our taxes.
By: Dr. Mercedes Schneider
On Sunday, November 18, I emailed John White and asked one question: “Who calculated the 2012 school and district performance scores?” The next afternoon, he emailed back, “The Department of Education’s Division of Assessment and Accountability performs that calculation.” I immediately responded, “What person or people specifically calculated the 2012 school and district performance scores? I would like to know their names please.”
I have yet to receive a response.
I realize that mine is not an easy question for White to answer. He cannot write that he doesn’t know. He cannot write that such information is secret. Given the importance of the scores, he certainly cannot provide names of people without stats/measurement credentials. And given the problems with the scores, White cannot even provide the name of a credentialed individual. The entire process of scoring, from the setting of the criteria to its application, is a psychometric sinkhole.
In addition, I think by now White knows my credentials; I believe this is why he went silent on me. I am a Louisiana public school teacher who holds a doctorate in applied statistics and research methods. I have been reading and examining the 2012 school performance scores and LDOE Bulletin 111, the Louisiana School, District, and State Accountability System, and I am not exaggerating when I write that from a statistics and measurement perspective, the entire process is a pristine example of how one should not conduct and apply high-stakes educational assessment.
Reading the contents of Bulletin 111 can be an intimidating task. First of all, it reads like a legal document. Indeed, it does have the force of law. Second, the document concerns the details related to calculating school and district performance scores, including numerous alterations, contingencies, labelings, and consequences. However, for all of its seeming complexity, the document is chaotic and is undeniably poor measurement procedure. Don’t let its daunting appearance fool you.
One sure way to ascertain whether or not Bulletin 111 is poor is to examine its outcomes: the scores it produces. I was able to view two spreadsheets related to the 2012 school and district performance scores, one made available to the public on the DOE website, and another sent to BESE members before the 2012 scores were made public. Examination of these scores reveals scoring bias in favor of high/combination schools and against elem/middle schools. The presence of scoring bias means that what is being measured is not school performance or growth; what is being measured is the quality of Bulletin 111. In the presence of bias, the quality of the bulletin is demonstrated to be poor, for the bulletin cannot deliver what it purports; namely, school performance and growth.
I calculated three ways in which this bias is evident in the 2012 scores, and on November 21, I composed a letter including my findings and sent it to Mr. White and members of BESE. I will briefly summarize my findings here.
First, the number of schools with score increases of 10+ points is 190. Only 22 are elementary/middle schools; 168 are high/combination schools. What is important to note is that the elementary/middle schools outnumber high/combination schools 3 to 1. Thus, in the absence of bias (i.e., in the presence of equal measurement opportunity), if 168 high/combination schools showed baseline score gains of 10+ points, one would expect approximately 168 x 3 = 504 elem/middle schools to show the same increases. The difference between the observed 22 and expected 504 leaves no doubt as to scoring bias evident in the stipulations set forth in Bulletin 111.
Second, the number of schools labeled as "schools in decline," as determined by a decrease of more than 2.5 SPS points from one year to the next (Bulletin 111, pg. 13), shows a disproportionate number of elem/middle schools. The ratio of elem/middle to high/combination schools with scores decreasing by more than 2.5 SPS points is more than 5.5 to 1 (78 elem/middle schools to 14 high/combination schools). In the absence of bias, the ratio should be closer to the ratio of elem/middle-to-high/combination schools in the population: 3 to 1.
Finally, schools that have a growth score of 7 to 10 are labeled as having "entered Academic Assistance" (Bulletin 111, pg. 15). The proportion of such schools exceeds the proportion expected in the population of elem/middle vs. high/combination schools (observed 3.8 to 1 vs. expected 3 to 1). However, the bias against elem/middle schools is evident in the proportion of elem/middle vs. high/combination schools that met the growth target of 7 to 10 points. The proportion favors high/combination schools and is only 1.3 to 1 (actual numbers: 105 elem/middle schools vs. 79 high/combination schools). Over 50% of the high schools met the growth goal (79/151 = .52); however, fewer than 20% of the elementary schools met the growth goal (105/574 = .18).
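The expected-count reasoning behind these three checks can be sketched in a few lines of Python. The counts are taken directly from the findings above; the 3-to-1 population ratio of elem/middle to high/combination schools serves as the no-bias baseline, and this is only an illustration of the arithmetic, not the LDOE's own calculation.

```python
# No-bias baseline: elem/middle schools outnumber high/combination roughly 3 to 1.
POP_RATIO = 3.0

# Check 1: schools gaining 10+ points.
high_gainers = 168
elem_gainers_observed = 22
# Under equal measurement opportunity, gains should scale with the population ratio.
elem_gainers_expected = high_gainers * POP_RATIO  # 504, vs. only 22 observed

# Check 2: "schools in decline" (drop of more than 2.5 SPS points).
decline_ratio = 78 / 14  # elem/middle vs. high/combination, about 5.6 to 1 vs. 3 to 1 expected

# Check 3: schools meeting the 7-to-10-point growth target.
high_met_rate = 79 / 151   # about 52% of high/combination schools
elem_met_rate = 105 / 574  # about 18% of elem/middle schools

print(f"expected elem/middle gainers: {elem_gainers_expected:.0f} (observed 22)")
print(f"decline ratio: {decline_ratio:.1f} to 1 (expected 3 to 1)")
print(f"growth goal met: high {high_met_rate:.0%}, elem {elem_met_rate:.0%}")
```

In each case the comparison is the same: if scoring opportunity were equal across school types, outcomes should track the 3-to-1 population ratio, and none of the three do.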
These numbers are not the result of “less effective” elementary/middle schools vs. “more effective” high/combination schools, although from Mr. White’s silence on the subject, I think he is hoping the public will view it that way. This one will require a powerful PR blender for its DOE/BESE “spin.”
To those elementary and middle schools with scores artificially suppressed by a crippled scoring system: Thank you for your efforts. I realize that you have taken a “separate and not so equal” hit in your 2012 performance scores.
To those high/combination schools that are benefiting from this biased system: Thank you for your work, as well. Please do not allow the unfairness in the performance scores to show itself in any “better than Thou” behavior toward your colleagues working in elementary/middle school settings.
To those district administrators who realize the error but who are keeping quiet in order to preserve the appearance of what is really an inflated district score: Please speak up on the side of decency and fairness. Please don’t succumb to the “spin.”
Respectfully submitted,
Mercedes K. Schneider, Ph.D.
Applied Statistics and Research Methods