Sunday, July 27, 2014

White Refuses to Release Raw LEAP Cut Scores

A press statement accompanying the release of the Spring 2014 LEAP and iLEAP testing results announced that the percentage of students receiving a rating of "mastery" on the LEAP had improved and that the percentage rated "basic" remained steady, despite the inclusion of more "rigorous" Common Core aligned questions on this year's tests. The press release from the LDOE stated:

"The Department of Education today announced that on LEAP and iLEAP tests aligned to more challenging learning standards, the percentage of students performing at the state’s 2025 expectation of “mastery” (level 4 out of 5) increased in both English Language Arts and math, while the percentage of students performing at the state’s expectation level established in 1999, “basic” (level 3 out of 5), remained steady."

Using critical thinking skills to decipher the real meaning of the above statement, I began to ask myself: "Does this press release mean that more students got a higher percentage of answers correct on this year's test than they did last year, even though the test was supposed to be more difficult?" I also wondered: "Does performance at a level of 4 out of 5 mean that students got 80% of the questions on the test correct? Does a rating of 'basic' mean that a student got at least 60% (3 out of 5) of the questions right?" But after studying the technical explanations on the LDOE website, I concluded that the press release tells us nothing about what percentage of correct answers is represented by the ratings of "basic" and "mastery". It also tells us nothing about whether students got more or fewer right answers on this year's test compared to last year's. To figure that out, we would have to know the raw scores equivalent to those ratings . . . and John White is not telling us the raw scores: the percentage of correct answers required to produce a rating of basic or mastery.

You see, it turns out that the raw scores, or percentages of correct answers, for the ratings of "basic" and "mastery" can be changed from year to year based upon judgments made by the LDOE and the testing company employed by the Department to design and grade the tests. The policy of the Department is that if the test for a particular year contains more difficult questions (in the opinion of the LDOE and the testing company), the decision can be made to lower the raw cut score (the percentage of correct answers) for a rating of either "basic" or "mastery" to "adjust" for the greater difficulty of the new test. The following is the technical explanation given by the LDOE for adjusting or resetting the raw cut scores from one test form to the next (from the LDOE Technical Summary Report, page 6):
 
"Equivalency is established by first building the forms to be equated according to tight content specifications. Then the form scores are placed on the same scale, such that students performing on an assessment at the same level of (underlying) achievement should receive the same scaled-score, although they may not receive the same number-correct score (or raw score).(emphasis added) The raw-to-scaled-score relationship performs this leveling function based on form equating studies. Theoretically, differences in the raw-to-scaled-score relationship between the two forms can be partially due to differences in the samples utilized for calibration and the differences in item difficulty."
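The equating language quoted above can be made concrete with a small sketch. All numbers below are invented for illustration (the actual raw cut scores are exactly what the LDOE has not released); the point is only that a fixed scaled cut score can correspond to different raw cut scores on test forms of different difficulty, which is why the raw numbers matter:

```python
# Hypothetical illustration of form equating: the scaled cut score for
# "basic" stays fixed across years, but the raw score (number of items
# correct) needed to reach it can differ between test forms judged to
# be of different difficulty. Every number here is made up.

def percent_correct(raw_cut, total_items):
    """Express a raw cut score as a percentage of items correct."""
    return 100.0 * raw_cut / total_items

# Suppose both forms have 60 items and the scaled cut for "basic" is 301.
forms = {
    "2013 form (easier items)": {"raw_cut_basic": 38, "total_items": 60},
    "2014 form (harder items)": {"raw_cut_basic": 33, "total_items": 60},
}

for name, f in forms.items():
    pct = percent_correct(f["raw_cut_basic"], f["total_items"])
    print(f"{name}: {f['raw_cut_basic']}/{f['total_items']} correct "
          f"({pct:.0f}%) maps to the same scaled cut score of 301")
```

Under these hypothetical numbers, a student answering 35 of 60 questions correctly would rate "basic" on the 2014 form but not on the 2013 form, even though the reported scaled scores and rating labels look directly comparable.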

So until we know how the raw cut scores compare this year with previous years, we really don't know whether or not student performance at the basic level "remained steady" or whether the percentage of students performing at a level of mastery has improved.

That's why on June 10th I made a public records request of John White, as the custodian of public records for the LDOE, to provide me with the percentage of correct answers needed for students to receive a rating of basic and mastery for this year compared to the previous year. I also asked for a copy of any communications between the LDOE and the contracted testing company concerning any adjustments in test scores from last year to this year. But after more than a month of wrangling with the attorney representing John White and the LDOE, I was informed Friday that the Department is not in possession of the information I requested. How can the Department not have the information it used to assign ratings on the LEAP and iLEAP to approximately 500,000 Louisiana students?

What is the definition of a public record anyway? According to the Public Affairs Research Council which for years has advised the public in Louisiana on the meaning of the public records laws, the definition is: "Generally anything having been used, being in use or prepared for use in the conduct of public business is a public record, regardless of physical form." Based on this definition, I believe that the raw cut scores for a rating of basic and mastery on the LEAP tests are public records and should be provided to any citizen requesting them.

Before coming to Louisiana, our state superintendent, John White, worked in the New York education system. The New York state agency was notorious for manipulating the cut scores used to determine the performance of students and schools in New York. It has been revealed recently that the raw cut scores were changed drastically over a 10-year period, first to make it seem that student performance had improved dramatically, and then last year to show a drop in performance when probably no real change had occurred. Is something similar now happening in Louisiana? We won't know unless John White provides us with the raw percentage scores for the ratings of basic and mastery over a period of years. We have a right to know if data is being improperly manipulated. We need to know if moving to Common Core testing is going to cause our students to perform higher or lower on the state tests.

I have offered to meet with White or his staff to resolve this matter amicably, but that is not happening, and it seems my only option now is legal action once again simply to get White to follow state law.

 

6 comments:

Anonymous said...

Keep kicking ant piles, Mike!!

Seriously, when I curve a test...it is usually curved depending on who made the highest test score. If the highest percentage was an 88%, I'll add 12 points to everyone's test grade. However, if the next highest score was a 75%...I might give the 88% kid bonus points when I add 25 points to everyone's score. (This also explains why a student of mine scored a perfect score on the 8th grade LEAP, but only got 84% of his civics questions correct.) I will write the added points on the tests themselves and explain what happened to the kids, parents, and my administration, if needed.

This openness has caused much improvement either in the way I teach/test...because at least one of them is not providing the end result I'm looking for...or in the way the students approach my tests/instruction, because I'm giving them what they need but they are not transferring that to adequate test scores.

Knowing the raw test score and how it was "curved to secure palatable passing scores" is very important information to have in order to improve the overall education landscape.

Anonymous said...

The results are always standardized after the scores come in, during Standards Setting. Teachers serve on this committee. Near the end, when we had just about decided we were a "rubber stamp" committee, I asked why the percentage of advanced scores was higher for math than for English. The testing company said that was because the Louisiana math teacher committee set it that way. "You mean WE can change that???" And they said yes. So we changed it! Also, you can't expect the "old test" results to show improvement if the teachers were teaching CC. That test wasn't made to test CC. IF we teach CC, we need a CC test!

Michael Deshotels said...

To the second anonymous commenter above, please send me a confidential email at mikedeshot@aol.com so that you can provide me with first hand information on the setting of cut scores for this year. I promise not to reveal your name to anyone. This is an extremely important matter affecting thousands of students and teachers. It can even make a difference in the grades our schools receive. Some of our BESE members are just as much in the dark on these matters as are regular teachers and citizens. A similar debate is happening in New York State.
Thanks in advance for your help,
Michael Deshotels

Candyce Watsey said...

Here is a link about the cut scores on the CC$$ tests in New York. Guess who set the cut scores? Pearson. http://curmudgucation.blogspot.com/2014/07/pearson-set-cut-scores-for-nys.html

Anonymous said...

So all of our jobs are based on kids' LEAP scores, and the final score of that child can be basic or advanced or whatever depending on how White allows the cut scores to be set, using a decision-making process cloaked in secrecy, using numbers we aren't allowed to see. REALLY!!!!!! Makes me want to just stop working on my room, shut the door, and walk into the sunset while the music fades.
