Dear Readers:
I am very pleased to publish a guest
post this week by Dr. Mercedes Schneider, a Louisiana educator who has
a doctoral degree in statistics. It is an excellent analysis of a
major flaw in our Louisiana school grading system.
The grades that the LDOE has assigned to our Louisiana schools have
almost no relation to the actual quality of education offered by each
school. By far the dominant factor measured by the school performance
scores (SPS) and the resulting letter grade of “A” through “F” is the
level of poverty of the students attending the school. In addition,
Dr. Schneider's analysis explains how the present school grading
system favors high schools and combination schools over middle and
elementary schools.
Next year the letter grading system will be based on 150 points
instead of 200, but the letter-grade cutoffs are not adjusted in
exact proportion. In addition, schools will receive bonus points when
more than one third of their low-performing students make achievement
gains greater than those predicted by the new value-added model (VAM)
formula. The LDOE can and does change the VAM formulas to produce
whatever results it wants. But the primary result of all this
manipulation has been an effort to convince parents and voters that a
large portion of our public schools are failures and that we should
instead entrust our children and our tax dollars to private,
for-profit education providers that have almost no accountability for
the expenditure of our taxes.
By: Dr. Mercedes Schneider
On Sunday, November 18, I emailed John
White and asked one question: “Who calculated the 2012 school and
district performance scores?” The next afternoon, he emailed back,
“The Department of Education’s Division of Assessment and
Accountability performs that calculation.” I immediately responded,
“What person or people specifically calculated the 2012 school and
district performance scores? I would like to know their names
please.”
I have yet to receive a response.
I realize that mine is not an easy
question for White to answer. He cannot write that he doesn’t know.
He cannot write that such information is secret. Given the
importance of the scores, he certainly cannot provide names of people
without stats/measurement credentials. And given the problems with
the scores, White cannot even provide the name of a credentialed
individual. The entire process of scoring, from the setting of the
criteria to their application, is a psychometric sinkhole.
In addition, I think by now White knows
my credentials; I believe this is why he went silent on me. I am a
Louisiana public school teacher who holds a doctorate in applied
statistics and research methods. I have been reading and examining
the 2012 school performance scores and LDOE Bulletin 111,
The Louisiana School, District, and State Accountability System,
and I am not exaggerating when I write that from a statistics and
measurement perspective, the entire process is a pristine example of
how one should not conduct and apply high-stakes educational
assessment.
Reading the contents of Bulletin 111
can be an intimidating task. First of all, it reads like a legal
document. Indeed, it does have the force of law. Second, the document
concerns the details related to calculating school and district
performance scores, including numerous alterations, contingencies,
labelings, and consequences. However, for all of its seeming
complexity, the document is chaotic and embodies undeniably poor
measurement procedure. Don’t let its daunting appearance fool you.
One sure way to ascertain whether or
not Bulletin 111 is poor is to examine its outcomes: the scores it
produces. I was able to view two spreadsheets related to the 2012
school and district performance scores, one made available to the
public on the
DOE website, and another sent to
BESE members before the
2012 scores were made public.
Examination of these scores reveals scoring bias in favor of
high/combination schools and against elem/middle schools. The
presence of scoring bias means that what is being measured is not
school performance or growth; what is being measured is the quality
of Bulletin 111. In the presence of bias, the quality of the bulletin
is demonstrated to be poor, for the bulletin cannot deliver what it
purports to measure; namely, school performance and growth.
I documented three ways in which this
bias is evident in the 2012 scores, and on November 21, I composed
a letter including my findings and sent it to Mr. White and members of
BESE. I will briefly summarize my findings
here.
First, the number of schools with score
increases of 10+ points is 190. Only 22 are elementary/middle
schools; 168 are high/combination schools. Importantly,
elementary/middle schools outnumber high/combination schools 3 to 1.
Thus, in the absence of bias (i.e., in the presence
of equal measurement opportunity), if 168 high/combination schools
showed baseline score gains of 10+ points, one would expect
approximately 168 x 3 = 504 elem/middle schools to show the same
increases. The difference between the observed 22 and the expected
504 leaves no doubt as to the scoring bias embedded in the
stipulations set forth in Bulletin 111.
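For readers who want to check this comparison themselves, here is a
minimal sketch in Python. It assumes only the 3-to-1 population ratio
stated above; the chi-square goodness-of-fit test is a standard way
to formalize an observed-versus-expected comparison and is offered as
an illustration, not as the analysis in the letter. The letter
conditions on the 168 high/combination schools to get an expected
504; the sketch instead splits the 190 total by the population ratio,
and both framings expose the same imbalance.

```python
# Illustrative check of the first finding: do the 190 schools with
# gains of 10+ points split the way a 3-to-1 population ratio predicts?
from scipy.stats import chisquare

observed = [22, 168]                    # elem/middle, high/combination
total = sum(observed)                   # 190 schools with 10+ point gains

# Under no bias, the split should mirror the stated population ratio:
# elem/middle : high/combination = 3 : 1, i.e., 75% vs. 25%.
expected = [total * 0.75, total * 0.25]     # [142.5, 47.5]

stat, p = chisquare(observed, f_exp=expected)
print(f"expected counts under no bias: {expected}")
print(f"chi-square = {stat:.1f}, p-value = {p:.2g}")
```

The resulting p-value is vanishingly small: a split this lopsided is
not plausibly a chance fluctuation around the 3-to-1 baseline.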
Second, the schools labeled as “schools in decline,” as determined by
a drop of more than 2.5 SPS points from one year to the next
(Bulletin 111, pg. 13), include a disproportionate number of
elem/middle schools. The ratio of elem/middle to high/combination
schools whose scores decreased by more than 2.5 SPS points is more
than 5.5 to 1 (78 elem/middle schools to
14 high/combination schools). In the absence of bias, the ratio
should be closer to the ratio of elem/middle-to-high/combination
schools in the population: 3 to 1.
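The same kind of check applies here. Below is a minimal sketch, again
illustrative rather than taken from the letter, that asks whether the
elem/middle share of the 92 declining schools (78 of 92, about 85%)
is larger than the roughly 75% share a 3-to-1 population ratio would
predict, using an exact binomial test.

```python
# Illustrative check of the second finding: among the 92 "schools in
# decline," is the elem/middle share larger than the ~75% implied by
# the 3-to-1 population ratio?
from scipy.stats import binomtest

k = 78                         # declining elem/middle schools
n = 78 + 14                    # all declining schools (92)
population_share = 3 / 4       # elem/middle share expected under no bias

result = binomtest(k, n, population_share, alternative="greater")
print(f"observed elem/middle share: {k / n:.2f}")   # about 0.85
print(f"one-sided p-value: {result.pvalue:.3g}")
```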
Finally, schools that have a growth score of 7 to 10 are labeled as
having “entered Academic Assistance” (Bulletin 111, pg. 15). Among
these schools, the ratio of elem/middle to high/combination schools
exceeds the ratio expected from the population (observed 3.8 to 1 vs.
expected 3 to 1). The bias against elem/middle schools is also
evident in the proportion of elem/middle vs. high/combination schools
that met the growth target of 7 to 10 points. Here the ratio is only
1.3 to 1 (actual numbers: 105 elem/middle schools vs. 79
high/combination schools), far below the expected 3 to 1, and so it
favors high/combination schools. Over 50% of the high/combination
schools met the growth goal (79/151 = .52); fewer than 20% of the
elem/middle schools did (105/574 = .18).
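To put the two growth-goal rates side by side, here is one more
illustrative sketch (not the letter's analysis): a 2x2 chi-square
test of independence on the counts quoted above, asking whether
meeting the 7-to-10-point target is independent of school type.

```python
# Illustrative check of the third finding: is meeting the 7-to-10-point
# growth target independent of school type?
from scipy.stats import chi2_contingency

table = [
    [105, 574 - 105],   # elem/middle: met the target, did not (105 of 574)
    [79, 151 - 79],     # high/combination: met the target, did not (79 of 151)
]

stat, p, dof, expected = chi2_contingency(table)
print(f"elem/middle rate: {105 / 574:.2f}")         # about 0.18
print(f"high/combination rate: {79 / 151:.2f}")     # about 0.52
print(f"chi-square = {stat:.1f}, dof = {dof}, p-value = {p:.2g}")
```

A gap of .18 versus .52 on samples this large yields a chi-square
statistic far beyond any conventional threshold. That says only that
the two rates genuinely differ; the letter's argument is that biased
scoring rules, not school effectiveness, explain why.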
These numbers are not the result of
“less effective” elementary/middle schools vs. “more effective”
high/combination schools, although from Mr. White’s silence on the
subject, I think he is hoping the public will view it that way. This
one will require a powerful PR blender for its DOE/BESE “spin.”
To those elementary and middle schools
with scores artificially suppressed by a crippled scoring system:
Thank you for your efforts. I realize that you have taken a
“separate and not so equal” hit in your 2012 performance scores.
To those high/combination schools that
are benefiting from this biased system: Thank you for your work, as
well. Please do not allow the unfairness in the performance scores
to show itself in any “better than Thou” behavior toward your
colleagues working in elementary/middle school settings.
To those district administrators who
realize the error but who are keeping quiet in order to preserve the
appearance of what is really an inflated district score: Please
speak up on the side of decency and fairness. Please don’t succumb
to the “spin.”
Respectfully submitted,
Mercedes K. Schneider, Ph.D.
Applied Statistics and Research Methods