Thursday, September 11, 2014

Higher Student Expectations?

What is an acceptable level of performance for an 8th grade math student? All public school teachers are required to set student learning targets (SLTs) each year for each classroom of students. Where should the passing scores be set, and what is an acceptable level of student failure? One good point of reference should be the state-administered 8th grade LEAP math test. Let's examine what is expected by our Department of Education for 8th grade math.

Superintendent John White has repeatedly suggested that the state should have high expectations of all students, and that if we set high expectations on state tests, students will achieve more. Is he really following his own advice in designing and grading state tests?

Do you think Superintendent White would accept a passing score of slightly over 40% on a teacher-made final test? If the teacher set 40.1% as the minimum score for passing, would it be acceptable for 35% of a classroom of students to score below 40.1%? That's a pretty low expectation, isn't it? Yet that is the result of the state-wide performance of our students on the new Common Core aligned 2014 LEAP given this Spring. The official scale score for a level of Basic on the 8th grade math LEAP remained at exactly the point total used in the past (321 out of a total of 500 points). But the actual percentage of correct answers needed for a score of Basic (the minimum percentage) was lowered this Spring to only 40.1%. And even with that low expectation, 35% of students state-wide failed to reach the level of Basic. Yet John White, in releasing the LEAP results in May, announced that student performance on the new Common Core aligned LEAP tests was "steady" even though the test was more challenging.
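To put the two numbers side by side, here is a quick back-of-the-envelope calculation. The 321-of-500 scale cut and the 40.13% raw cut are the Department's own figures; the total of raw points used at the end is just a made-up example to show what 40% means in actual questions answered.

# Comparing the scale cut score with the raw cut percentage for the
# 2014 8th grade math LEAP. The scale cut (321 of 500) and the raw cut
# (40.13%) come from the figures discussed above; the 76-point total is
# a hypothetical example, not the actual length of the test.

scale_cut, scale_max = 321, 500
raw_cut_pct = 40.13

apparent_pct = 100 * scale_cut / scale_max
print(f"The scale cut looks like {apparent_pct:.1f}% of the points")   # 64.2%
print(f"The actual raw cut is {raw_cut_pct}% of the points correct")

assumed_raw_points = 76   # hypothetical total raw points on the test
print(f"On a {assumed_raw_points}-point test, Basic would take about "
      f"{raw_cut_pct / 100 * assumed_raw_points:.0f} raw points")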

It is also interesting that in an earlier press release the LDOE commented on a survey of students taking the new Common Core LEAP, and concluded that students did not find the new tests to be very difficult compared to their classroom work. Here is the quote:

"Of those students taking the computer-based test during the first phase, nearly 70 percent said the test was easier or about the same as their current school work. And, when asked if there were questions about things they have not learned this school year nearly 85 percent said there were none or few questions."

That's a strange conclusion, because most of the students I taught could always tell when a particular test was more difficult than usual. But apparently this time, students had no idea that the passing score would have to be lowered to 40.1% in order for 65% of the students state-wide to pass it. Similar results occurred on another LEAP test measuring ELA performance for 4th graders.

I believe the new Common Core aligned LEAP tests this year were poorly designed and resulted in abnormally low scores for students across the state, but that those low scores were hidden from the public by rigging the scale scores to make it look like students had no big problems with the new Common Core aligned tests. That's why I decided to write to the Accountability Commission to express my concerns. The Accountability Commission is a committee composed of teachers, superintendents, principals, parents, and business representatives who are supposed to advise the LDOE and BESE on the proper implementation of accountability policy. I am hoping that the Commission members will demand real accountability from our LDOE for the construction and grading of the most recent LEAP and iLEAP tests. Next year the state will be using the PARCC tests, and the results of those tests may not look so good. I would hate to see teachers and schools punished because of another drop in test scores on tests that may not be appropriate for our students.

And by the way, I do not agree with John White that higher expectations of students somehow magically result in higher performance. When standards are not well designed, when standards are not age-appropriate, and when tests are poorly designed, higher expectations alone mean nothing. They may just result in disappointment and further condemnation of our public education system. Here is my letter to the members of the Accountability Commission:

Dear Commission members:

My name is Michael Deshotels and I am writing to the Commission as a retired educator and a father of three children who all attended public schools, and a grandfather of 10 grandchildren who are in various stages of attending public schools today.

My reason for writing to the Commission is that in my research for my blog The Louisiana Educator, I have observed some apparent problems with test validity and possible double standards in the testing and grading of students using this year's LEAP tests compared to other tests. I am calling upon the members of the Accountability Commission to look into these matters and possibly recommend better policies. Here are my concerns:

In reviewing the results of the recent LEAP testing from the Spring of 2014, I focused primarily on the English Language Arts (ELA) and Math sections because these were the portions of LEAP that were changed to increase alignment with the new Common Core standards. I noticed that the scale scores for LEAP had not changed this year compared to previous years, but I wondered if the underlying percentages of correct answers corresponding to the various achievement levels had changed. I made a public records request in May of this year for the minimum percentage of correct answers required for the achievement levels of Basic and Mastery on the recent LEAP tests. To date I have received only the results for the Basic level, which are shown in the table below. I am still waiting for the minimum percentages for Mastery, and a court order has now been issued as a result of my lawsuit requiring the Education Department to produce those records by September 22.



Percentage of Total Raw Score Points Required to Earn “Basic”

Grade   Year    ELA       Math      Science   Social Studies
4th     2014    44.62%    47.22%    58.93%    53.03%
4th     2013    51.54%    50.00%    56.90%    56.06%
4th     2012    53.85%    52.78%    62.07%    57.58%
8th     2014    58.70%    40.13%    58.93%    50.00%
8th     2013    57.97%    48.61%    56.90%    52.63%
8th     2012    57.97%    55.26%    56.90%    52.63%


From this data, I made two observations. First, the raw percentage of correct answers required for a rating of Basic varies quite a bit from year to year even though the scale score for Basic remains the same. Second, the percentage of correct answers needed for a student to receive a rating of Basic dropped significantly in 2014, compared to previous years, in three of the four ELA and Math categories (4th and 8th grade). For example, a student taking the 8th grade math test needed to get only 40% of the answers right this year to earn a rating of Basic. There is a similar result for 4th grade ELA, which required only 44.6% of the answers right for a rating of Basic.

My concern is this: The LEAP scale score for a rating of Basic in 4th grade ELA remained at 301 points out of a possible 500, which to the average parent looks like about 60% of the possible points, yet the actual percentage of correct answers needed for Basic was only 44.6%. The minimum scale score for Basic in 8th grade math stayed at 321 out of 500, which looks like 64% to the average parent, but the real cut percentage is only 40.13%. In addition, the cut percentages have been lowered significantly even though the public is given the impression that the cut scores have remained the same. At the time the scores from the Spring LEAP testing were announced, the Department of Education stated that the average performance of our students at the level of Basic had remained steady even though the new Common Core aligned tests were more difficult. This statement appears to be misleading.

I want to assure the Commission that I am well aware that the official scale scores should not be viewed as proportionate to the number of correct answers on the test, but this is not clear to the public. I am also aware of the process called “leveling,” in which the testing company, with the approval of the Department, routinely adjusts the cut percentages on new forms of the LEAP test to take into account the comparative difficulty of different forms and to ensure that students are not penalized or rewarded when tests get more difficult or easier. This is apparently the explanation for the drastic lowering of the cut percentages on this year's LEAP tests for ELA and Math. But I still find the statement that our students' performance has remained “steady” even though the tests were “aligned to more challenging learning standards” to be misleading.

I am also concerned that when a cut score has to be lowered to as low as 40% for a rating of Basic, the test itself comes close to being invalid as a measure of learning. Since the majority of the questions are still multiple choice, a student can get close to a passing score by guessing on every question he or she does not know and combining those guesses with a very few known answers. I believe the Commission should seriously question the validity of tests that result in such low cut scores.
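To illustrate with a rough example (assuming four answer choices per question, which is only an approximation of the actual mix of item types): a student who truly knows just 20% of the answers and guesses blindly on the remaining 80% can expect to score about 20% + (80% × 25%) = 40% correct, right at the Basic cut, largely by chance.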

Such low cut scores also raise the issue of a double standard in state testing of students compared to acceptable Student Learning Targets (SLTs). I do not know of any school that approves student learning targets with a passing score as low as 40 to 45%, or that would accept 35% of the students in a classroom scoring below 40% on the final test. Yet this is the situation we find in the setting of the 8th grade LEAP math standard for Basic for the entire state. I believe there was something wrong with the 8th grade LEAP math test, and also with some of the other tests, yet the real results were covered up by the Department's pronouncement of “steady” performance of our students. Let me make this clear: I do not blame the students or the teachers for this dismal result; I blame the test and the test designers. I also blame the LDOE for using the false stability of scale scores as a smokescreen to hide the real performance on this year's LEAP tests.

Finally, I must object to the low level of transparency exhibited by the Department in making this vital information available to educators and the public. Has the information above been produced for review by the Accountability Commission? When I first requested this information in May of this year, I was told that it might be available in November. But of course, the cut percentages have been known to the Department since the scores were issued in April. Why is it necessary to withhold such information from the public until November? Shouldn't at least the Accountability Commission be allowed to review this critical information? This matters because such scoring methods will become even more consequential as the state fully transitions to PARCC testing. Will the percentage cut scores be kept secret on the PARCC tests as well?

I am requesting that the Accountability Commission look into these matters and recommend a more transparent policy on releasing results and scoring changes and also demand a thorough analysis of the validity and appropriateness of the tests which will be used to grade our students, our teachers and our schools. This analysis should be conducted by someone independent of the State Department of Education and the testing company.

Finally I would appreciate an opportunity to address the next meeting of the Accountability Commission to further clarify my concerns. Please feel free to contact me with any questions or comments.

Sincerely,
Michael Deshotels
Phone 225-235-1632

Monday, September 8, 2014

Holding Our Legislators Accountable

2014 LAE Legislative Report Card Now Available. Click Here.

There is no question that by far the most important decision makers affecting public education and the welfare of public school employees are our Louisiana state senators and representatives. The legislature considers dozens of bills each year that affect our public education system, so much so that one could say the future of public education is primarily controlled by our state legislators. In addition, the benefits and employment rights of teachers and other educators can change drastically in a single legislative session.

In the single 2012 legislative session, teachers and parents saw the introduction of vouchers and a great expansion of charter schools that siphon millions of dollars from public schools and damage the retirement system for public school teachers. Some of these expanded charter schools threaten to degrade public schools by competing for high performing students without the constraint public schools have of serving all students. (Look at this article today about damage to the Baker school system.) Some of the charter schools simply dump low performers and discipline problems right back into public schools in order to raise their performance scores at the expense of public schools.

In the same 2012 session, Governor Jindal pushed through changes in teacher benefits that basically destroyed teacher tenure and all seniority rights, and firmly tied teacher employment decisions to the extremely unreliable VAM system. In just a few weeks teachers lost all job protections and often found themselves potential victims of conditions over which they had no control. Oh, and at the same time, the legislature removed all education credentials needed for college graduates to teach any subject in any charter school. Our legislators voted on every one of these extremely damaging bills. Are we holding them accountable for damaging public education and destroying educator protections and benefits?

I have pointed out in this blog many times that teachers and other public educators could take control of their own destiny by becoming much more active in expressing their opinions to their local legislators about important education bills. Educators could have more influence on legislators because each educator can influence many votes of relatives and friends in each legislative election. Educators could have tremendous influence if they would simply take the time and exert the effort to lobby their legislators each legislative session.

But many teachers admit that they know almost nothing about legislation before it happens and almost nothing about how their own legislators vote on such bills. Well, there is an easy way to remedy that. First, every educator should be a paying member of one of the two teacher unions in the state. (Yes, there are only two real unions, LAE and LFT.) Then, when the legislative session approaches, simply go to the web sites updated each day by both groups and get informed about the bills that affect education. Next, contact the two legislators who represent each of us (our state senator and state representative) and tell them how we want them to vote. At the end of each legislative session both teacher unions issue a report card (here is the LAE report card) evaluating how each legislator voted on important education issues. Educators can then use this information to either thank their legislators or turn up the heat for the next session, and for making decisions in the next election.

It is quite simple. If public education is to survive this time of extreme crisis, and if professional educators expect to be treated as professionals in the future, every one of us must get involved in the legislative process. Sure, you can get the information for free from this blog and from the two union web sites, but isn't it important for each of us to pay our fair share of the cost of representing public education with professional lobbyists, and often of paying for lawsuits to fight bad legislation? Recent big wins in costly lawsuits brought by the two teacher unions have had a major impact in correcting some of the worst legislation passed under Governor Jindal.

This year the LAE legislative report card graded our legislators on their votes on issues such as the teacher retirement age, revisions to the teacher evaluation system, authorization of new charter schools, adoption of a new teacher due process system negotiated by the LAE and LFT, exemption of teachers from VAM in cases of excessive student absences, and many others.

I am an LAE member and I like to use the LAE legislative report card to evaluate my legislators because I find this legislative report card to be extremely comprehensive and sensitive to the nuances of legislative maneuvering. The report card is even color coded to help us to identify our best friends in the legislature as well as the dangerous enemies of public education. Please find the time to get informed, then get active in the legislative process. It's your democratic right and responsibility!

Wednesday, September 3, 2014

New Revelations About LEAP and NAEP

The LA State Dept. of Education yesterday attempted to explain the setting of cut scores on LEAP. There is a section of the September 2 LDOE newsletter addressing the testing issue. Readers may remember that this blog demonstrated that the percentage of correct answers needed for a rating of Basic in 2014 dropped drastically in three of the four ELA and Math categories compared to 2013.

According to the LDOE, the scale scores for a rating of Basic have not changed since they were created. It still takes a scale score of 301 to earn a rating of Basic in 4th grade ELA, and it still takes a score of 321 to earn a rating of Basic in 8th grade math, in 2014 just as it did in 2013. Even though the tests were made more difficult by moving to the more rigorous Common Core aligned questions, the scale cut scores remained the same according to the LDOE. But the Department also admits that the percentage of correct answers equating to those cut scores can change from year to year because of a process used by the testing company called “test equating.” That is an adjustment made mostly to ensure that students are not penalized or rewarded as a new test form gets either harder or easier.
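To make the idea concrete, here is a small sketch of the simplest form of equating (the linear, mean-sigma kind) using made-up numbers. The testing company's actual procedure is more sophisticated, so treat this only as an illustration of why the raw cut drops when a form gets harder while the scale cut never moves.

# Simplified illustration of test equating with made-up numbers.
# The real LEAP equating is done by the testing company and is more
# complex than this, but linear (mean-sigma) equating shows the idea:
# if the new form is harder, the raw score that maps to the same scale
# cut score goes down.

def linear_equate(old_cut, mean_old, sd_old, mean_new, sd_new):
    """Map a raw cut on the old form to the equivalent raw cut on the new form."""
    z = (old_cut - mean_old) / sd_old      # relative position of the cut on the old form
    return mean_new + z * sd_new           # same relative position on the new form

# Hypothetical 60-point forms: the new form is harder, so the average
# raw score drops from 30 to 25 while the spread stays the same.
old_raw_cut = 29.0                         # raw cut for Basic on the old form (about 48%)
new_raw_cut = linear_equate(old_raw_cut, mean_old=30, sd_old=9, mean_new=25, sd_new=9)
print(f"Equated raw cut on the harder form: {new_raw_cut:.0f} of 60 "
      f"({new_raw_cut / 60:.0%})")         # about 24 of 60, or roughly 40%
# The scale cut score (321 in the real case) stays exactly where it was.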

In my blog post of August 17, I suggested that the LDOE's statement, that the percentage of Louisiana students scoring Basic had remained steady even though the tests had gotten harder, was misleading. My point was that the “test equating” process had adjusted the minimum raw percentage scores precisely to keep the percentage of students scoring Basic steady. It was a rigged result, and no one really knows whether our students' learning remained steady from 2013 to 2014.

But the LDOE's description of the LEAP design process made another statement that, in my opinion, further destroys their credibility in approving LEAP cut scores. Here is the statement from the LDOE website that concerns me:

“The scaled scores and cut points for LEAP – what it takes to earn Basic, Mastery, Advanced – were set in 1999 when Louisiana first created the LEAP assessments; the scaled score ranges for iLEAP were set in 2006. To ensure rigorous achievement levels, Louisiana set these cut scores using the National Assessment of Educational Progress (NAEP) as guidance. Thus, Basic on LEAP roughly equates to Basic on NAEP and Mastery on LEAP roughly equates to Proficient on NAEP.”

I decided to check whether the LEAP scores really do equate to the NAEP and also to study the trend in NAEP scores for Louisiana compared to LEAP over a period of years. There were a few problems to overcome, however. Since John White came in as State Superintendent, much of the data from previous years has disappeared from the Louisiana Believes website; in addition, NAEP provides reading and math scores rather than ELA and math, and its data only run through 2013. So I got LEAP data as far back as I could and used the NAEP reading score to compare to the LEAP ELA score. Here are the results of my comparison of the percentage of students attaining Basic or above on LEAP and NAEP from 2005 to 2013:

2005 LEAP Results compared to NAEP (percentage of students at Basic or above)

4th grade ELA (LEAP): 66.9%      4th grade reading (NAEP): 53%
4th grade math (LEAP): 63.4%     4th grade math (NAEP): 74%
8th grade ELA (LEAP): 53.2%      8th grade reading (NAEP): 64%
8th grade math (LEAP): 54.9%     8th grade math (NAEP): 59%

2013 LEAP Results compared to NAEP (percentage of students at Basic or above)

4th grade ELA (LEAP): 77%        4th grade reading (NAEP): 56%
4th grade math (LEAP): 71%       4th grade math (NAEP): 75%
8th grade ELA (LEAP): 69%        8th grade reading (NAEP): 68%
8th grade math (LEAP): 66%       8th grade math (NAEP): 64%

It does not look to me like LEAP and NAEP scores are very consistent. Comparing the 2005 results, the 4th grade ELA LEAP percentage is a lot higher than the 4th grade reading NAEP percentage, but the 8th grade ELA LEAP percentage is significantly lower than the 8th grade reading NAEP percentage.

Looking at math in 2005, the 4th grade LEAP percentage attaining Basic or above is a lot lower than the NAEP percentage, while the 8th grade LEAP math percentage is pretty close to the NAEP figure.

But look at the change in LEAP and NAEP results for Basic from 2005 to 2013. The LEAP scores in both ELA and math went up dramatically over that 8-year period, but the NAEP scores went up only a little. The average increase in LEAP across the four categories reviewed above was about 11 percentage points, while the average increase in NAEP was only about 3.25 percentage points. Since both tests are supposed to measure the same thing, and the LDOE has told us that Basic on LEAP equates to Basic on NAEP, the two sets of scores should have gone up by roughly the same amount.
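For anyone who wants to check the arithmetic, here is the calculation from the numbers listed above; it simply averages the four grade/subject gains for each test.

# Average gains from 2005 to 2013 in the percentage of students at Basic
# or above, using the LEAP and NAEP figures listed above.
# Order of the categories: 4th ELA/reading, 4th math, 8th ELA/reading, 8th math.

leap_2005 = [66.9, 63.4, 53.2, 54.9]
leap_2013 = [77.0, 71.0, 69.0, 66.0]
naep_2005 = [53.0, 74.0, 64.0, 59.0]
naep_2013 = [56.0, 75.0, 68.0, 64.0]

leap_gain = sum(b - a for a, b in zip(leap_2005, leap_2013)) / len(leap_2005)
naep_gain = sum(b - a for a, b in zip(naep_2005, naep_2013)) / len(naep_2005)

print(f"Average LEAP gain: {leap_gain:.2f} percentage points")   # 11.15
print(f"Average NAEP gain: {naep_gain:.2f} percentage points")   # 3.25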

So now in addition to obvious test score manipulation, we have test score inflation on LEAP!

Our LDOE (or the testing company) has artificially inflated the percentage of students passing the LEAP to make it look like the gut-wrenching deforms to our public education system and the obsessive teaching to the tests have all been worth it. The truth is that, using the NAEP as a more objective measure of our real progress, Louisiana has gained an average of only 3.25 percentage points in students reaching Basic. Not only that: by the NAEP comparisons, the gap between Louisiana and the national average has actually widened slightly.

So in the last 10 years, Louisiana has spent millions upon millions of dollars on a testing system that manipulated and inflated scores while our students have lost ground in comparison to the rest of the states. In addition, we have lied to the rest of the nation, telling everyone that our students in the Louisiana Recovery District have made tremendous progress in attaining grade level performance when that measurement of grade level (Basic on LEAP) was inflated to the point of making a mockery of the Louisiana accountability system.

It pains me as a public education advocate to point out these false claims of test measured progress of our educational system in Louisiana, but the truth always eventually comes out. We should always base our education policies on the truth.

I believe that all this emphasis on testing is wrong, both because scores are manipulated and because these so-called standardized tests are not a fair and accurate measure of our students and our schools in the first place. In my opinion, all the time our teachers were forced to spend drilling students for the LEAP could have been far better spent teaching kids to enjoy and appreciate reading, math, science, history, and music. We are killing the joy of learning for both students and teachers, and we have almost nothing to show for it.