Saturday, December 1, 2018

Another Fake School Conning Parents and Donors

Why did it take a team of New York Times reporters traveling to Louisiana to expose this fraud of a school? Where were the reporters of the Lafayette Advertiser and The Advocate while innocent children were being preyed upon and their parents conned into paying huge monthly tuitions to a "school" where their children were being beaten, humiliated, and taught to lie on their résumés so they could end up flunking out of elite colleges? The Times report exposes T.M. Landry near Breaux Bridge as just another fake "miracle school".

It's easy to promote miracle schools because we want to believe.
Why were so many so-called "important people" taken in by these con artists? Why were TV personalities like those on the "Today" show and "Ellen" taken in by this scam that conned their viewers into making contributions to these crooks?

The so-called education reforms, in which standardized test scores, graduation rates, "credit recovery," and labels like "high performing" and "failing" have taken the place of real education, have set the stage for a deluge of phony miracle schools.

Why are our public schools being forced to spend part of their budgets on television and radio ads in an attempt to stop the bleeding of school funds to these fake schools?

Our politicians are on the take from education con artists
Our legislature and the federal Congress have opened the floodgates, allowing our public tax dollars to flow to con artists who are getting rich by faking school test results, graduation rates, and college admissions rates, just to get our school tax dollars. The politicians are handing over our tax dollars to various non-educator con artists because they themselves are "on the take," receiving political contributions and various favors from these crooks who claim to be running miracle schools that can turn all students into elite college goers.

Taxpayers, wake up! We have a system for ensuring that our school tax dollars are used to educate our children. We elect local school board members every four years who can be replaced if they don't spend our tax dollars wisely. It's not perfect, but it's much better than allowing charter school operators to appoint their own boards so that they can pay themselves exorbitant salaries while hiring the cheapest teachers, who often are not properly certified. But our politicians who are on the take from these charlatans have passed laws that make it easy for these privatizers to simply hijack education funds from our local school boards.

Campbell's law tells us that if a school appears too good to be true, it probably is.
There is an important concept that applies to the ratings of public institutions such as public schools. It's called Campbell's law. This principle, first articulated by social psychologist Dr. Donald Campbell, tells us this: when a particular statistical or social measure is used to gauge progress in any enterprise that carries high stakes, rewards, and punishments, that measure is sure to be corrupted and/or misreported to the point that it becomes useless as a measure of success. If you make the graduation rate the measure of a school's success and make the future of school administrators depend on that graduation rate, you will soon see a dramatic improvement in the graduation rate, mostly accomplished by handing out worthless diplomas. If you make high standardized test scores the criterion for keeping a takeover charter school open, you will get higher standardized test scores, by hook or by crook.

The education of children is a complex, difficult, but intrinsically rewarding job that has been done very well for many years in our country by dedicated professional educators. That job has changed and gotten more difficult over the last 50 years, as our country set a goal of educating all children, including those with disabilities, and of closing the achievement gap between rich and poor, and white and brown students. That's when the myths about "good" schools and "bad" schools started proliferating, diverting our school tax dollars to privatization efforts by many non-educators who saw an opportunity for big profits. One of the most damaging myths is that anyone should be allowed to run a school, even with no education credentials, as long as they can produce high student test scores. We have all been taken in by con men who used these myths about education to extract our tax dollars. This is what has created the new rash of phony "miracle schools".

The con-man phenomenon is being modeled at the top.
It seems that we are living in a time of unbridled con men and schemes to fleece the public that go all the way to the top office in our government! Remember: if it looks too good to be true, it probably is; if a politician promises to "make our country great again" while playing on people's prejudices and fears, he is probably just a con man. He probably needs to be locked up.

Thursday, November 15, 2018

The New School Performance Scores and Grades Ignore Critical Factors

By omitting critical factors affecting test performance, the new public school grading system promotes the myth that all schools operate on a level playing field.
The new school grading system is very complex, requires many arcane calculations, and promotes the myth that schools are being rated in a fair manner. Unfortunately, the new school rating system amounts to a useless Rube Goldberg ratings machine that results in the continued stigmatization of schools serving disadvantaged student populations. It operates with several serious flaws. (See the analysis below by Herb Bassett.) The school grading system tells us what we already knew without these complex calculations: schools with high percentages of economically disadvantaged students and students with disabilities produce lower test score averages year after year, while schools that select their students for high academic ability and economic advantage produce higher average test scores year after year.

But when we look at the extremely detailed formula for assigning grades to schools, it seems to include almost everything but the factors that really matter. The addition of progress points changes little in the final result. It is my opinion that schools with high percentages of disadvantaged students are just as handicapped in winning progress points as they are in attaining high average test scores. All children can make academic progress, but at different rates according to their varying environments and talents.

We know that academically selective magnet schools will always get high performance scores and an "A" letter grade almost without regard to the quality of instruction. That's because students selected for good past academic performance will generally produce high test scores in the near future.

We also know that lab schools that admit mostly wealthy students will produce high average test scores and will also get rated as high performing schools. 

Alternative schools which receive mostly low performing and troubled students get the lowest grades year after year. 

Simply noting the designation "magnet" or "alternative" in the SPS tables has no effect on the calculation of the scores.

The correlation of poverty and school scores in the new grading system could be made visible by adding a column listing the percentage of economically disadvantaged students in each school, alongside the "magnet" or "alternative" designations. But our LDOE has always failed to include this economic data next to the school ratings. Here is a sample of such data for the Natchitoches Parish school system:

The 3 "A" schools in Natchitoches Parish don't just happen to be magnet and lab schools. The 4 "F" schools include one alternative school and three schools that serve some of the most impoverished students in the school system. Student selection profoundly affects the school scores and letter grades. (Note: high-poverty populations do have some high performing students, but these are often siphoned off to magnet schools, further depressing the scores of their original schools.)

It is not my intention to unfairly spotlight the Natchitoches school system. The same trends can be observed in any of the public school systems in Louisiana. In addition, the school systems that earn the highest district grades are invariably those with the lowest levels of poverty. The LDOE conveniently fails to provide the data that would allow the public to make this comparison.

So what is the purpose of reporting to the public that such schools repeatedly have low average test performance? How does it help to add insult to injury for struggling schools? How does it help a school to attract good teachers when it is repeatedly rated as a "D" or an "F" no matter how hard the teachers work? If the ultimate goal is to help students perform better, then it must be obvious that the school rating system is counterproductive.

This grading system continues to perpetuate the myth that "A" rated schools have the best teachers and "F" rated schools have the worst. A great way to destroy this myth would be to swap the teachers of an "A" rated school with those of an "F" rated school for a few years. I would be willing to make a sizable wager that there would be little change in the performance of either school. But to my knowledge, this has never been done. Why do we continue to perpetuate the myth that school performance is mostly determined by teacher quality when we have so much evidence that much more powerful factors are at work?

But Superintendent White and his allies are irrevocably committed to the rating, blaming and shaming of schools for factors over which they have no control. The facts don't matter. The only thing that matters is the mistaken ideology that such tactics will somehow produce improved results.

We have shown in this earlier report that LEAP test results have been inflated in recent years to give the illusion of progress, while this report shows that Louisiana is falling further behind on the National Assessment of Educational Progress, which compares Louisiana to all other states.

In Louisiana, this policy, along with the imposition of the unteachable Common Core standards, has only resulted in a lower ranking compared to other states on the National Assessment of Educational Progress.

Monday, October 29, 2018

Interpreting Louisiana's Growth Scores for Schools:

Editor's note: The following post is by Herb Bassett, a teacher who has a commanding understanding of the complex school rating and grading systems imposed upon our schools by the Louisiana Department of Education. His analysis of the latest school rating system to be released by the LDOE on November 6 should be very helpful to parents and teachers in avoiding false conclusions about their schools. It points out serious flaws in the rating system that produces high and low scores often in an arbitrary manner.

Over the last six years, Bassett has documented the effects of changes to the methods of computing Louisiana's school performance scores and commented on the use of growth measures in school and principal evaluations. (Links to SPS articles: 2012, 2013 predictions, 2013, 2016 history.) This analysis reviews the latest changes in the school rating system.

Interpreting Louisiana's Growth Scores for Schools:
A guide for parents, educators, and legislators.
by Herb Bassett, Grayson, LA

Growth scores will be a new feature of Louisiana's school performance evaluations this fall. In the past, the Louisiana Department of Education (LDOE) graded schools primarily on how many students had reached the levels of basic, mastery, and advanced. Schools whose students arrived at their doorsteps as high achievers were more likely to receive higher letter grades than schools who started with struggling students. In some schools, many students started far behind but made remarkable gains; however the schools were penalized because the students did not progress all the way to the higher achievement levels in a single year. Growth scores are an attempt to reward and recognize schools whose students make outstanding improvements regardless of the achievement level they attain.

Growth measures are simple in concept, but they are difficult to implement. In 2013, LDOE introduced a growth measure into its school accountability system. Called Bonus Points, they applied only to students scoring below basic. The initial implementation of the Bonus Points was problematic. Some schools that year were penalized because they moved too many students from below basic to basic. Had those students scored lower on the tests, those schools would have received a higher letter grade. LDOE later modified the rules to address that problem.

We now have a preliminary indication that the new growth scores will also present problems. 

LDOE released "Top Growth" rates in late August. While they are not the final growth scores, they indicate how many students earned an "A" level of points for their schools. Students qualified either by being in the top 40 percent of a value-added model (VAM) ranking[i] or by meeting a "growth-to-mastery/advanced" (GTM) target. Since the top 40 percent of the VAM ranking qualify, the statewide "Top Growth" rate is guaranteed to be at least 40 percent. While many students qualified by both measures, a significant number qualified strictly through GTM. Any "Top Growth" rate over 40 percent can be attributed solely to GTM.

Since the "Top Growth" rates indicate that GTM will introduce substantial biases, I offer the following advice to parents, educators, and legislators to help them interpret the new growth scores.

1) You should not compare junior high schools and high schools to elementary schools. If you have children in both elementary and junior high schools, do not be alarmed if the junior high has a lower growth score than the elementary school. This does not necessarily mean that the junior high is not helping students improve. GTM gives elementary schools a significant advantage over junior high schools and high schools.

A student's year-end GTM target is the number of points the student was below Mastery in the prior year divided by the years remaining to the end of the eighth grade. An eighth grader has only one year to the end of the eighth grade; a fourth grader has five years. Given the same baseline score, GTM requires an eighth grader to improve five times as much as a fourth grader to meet the target. [ii]
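The arithmetic above can be sketched in a few lines of Python. This is a simplified model of the stated formula only; the function name and the assumption that the gap is measured from the 750 Mastery cut are mine:

```python
# Sketch of the GTM target arithmetic as described in the text.
# Simplified assumption: the annual target is the point gap to Mastery
# divided by the years remaining until the end of grade 8.

MASTERY_CUT = 750  # minimum LEAP 2025 score for Mastery (from the text)

def annual_gtm_target(prior_score: int, grade: int) -> float:
    """Points of growth required this year to stay on track to Mastery
    by the end of grade 8."""
    years_remaining = 8 - grade + 1  # a 4th grader has 5 years; an 8th grader, 1
    gap = MASTERY_CUT - prior_score
    return gap / years_remaining

# Same baseline score of 725 (25 points below Mastery):
print(annual_gtm_target(725, 4))  # 5.0  points required of the 4th grader
print(annual_gtm_target(725, 8))  # 25.0 points required of the 8th grader
```

With identical starting scores, the eighth grader's target is five times the fourth grader's, which is the disadvantage described above.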

This puts junior high schools at a clear disadvantage and explains why elementary schools have significantly higher "Top Growth" rates.[iii]

"Top Growth" average rates:

                              ELA       Math
Elementary (no grade 8)       52.9%     45.3%
Junior High                   44.7%     40.6%

2) You should not compare the growth scores of schools with many high achievers to those with many low achievers. If your child's school was an "A" school last year, do not be alarmed if the new growth score is not an "A". "A" and "B" schools have many students already at Mastery or above; GTM targets require them to make roughly twice as much improvement as lower achievers.

The minimum score for Basic is only 25 points below Mastery; to reach Mastery from Basic, the most a student has to improve is 25 points by the end of the eighth grade. Students already at Mastery have to reach Advanced to earn points through GTM. A student at low Mastery has to progress about 50 points to reach Advanced by the end of the eighth grade.[iv]
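Using the cut scores given in the text, the back-of-the-envelope arithmetic behind the "roughly twice as much" claim looks like this (a sketch, not LDOE's actual calculation):

```python
# Rough arithmetic behind the "roughly twice as much" claim, using the
# cut scores quoted in the text: Mastery = 750; eighth-grade Advanced
# on LEAP 2025 is 801 in Math (794 in ELA).

MASTERY = 750
ADVANCED_MATH_G8 = 801

# A student at the Basic floor (25 points below Mastery) must gain at
# most 25 points by the end of grade 8 to meet a GTM target.
basic_student_gain = MASTERY - (MASTERY - 25)      # 25 points

# A student already at low Mastery must reach Advanced instead.
mastery_student_gain = ADVANCED_MATH_G8 - MASTERY  # 51 points

print(mastery_student_gain / basic_student_gain)   # about 2x the required growth
```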

Compared to schools with lower letter grades, "A" and "B" schools have many more students starting from Mastery or higher; such students were only half as likely to earn "Top Growth" status solely through GTM as those who scored Basic in the prior year.[v] (However, LDOE will partially compensate for this bias through its final scoring method.[vi])

3) Growth measures are unstable, and GTM yields erratic results depending on the pattern of student growth. Your child's school's growth score will contain a lot of statistical noise.

LDOE will base its scores on growth data from two years rather than a single year. This will mitigate some year-to-year swings, but it will also create winners and losers among schools based on how and when in the two-year cycle students show growth.

Slow and steady wins the race, even when students finish behind. GTM penalizes schools if students reach Mastery too soon. A student who crosses the threshold for Mastery in the first year is then assigned a much more ambitious target for the second year, while a student who gets close to Mastery without going over is given an easier-to-meet target. Slow but steady growth earns more points than growth in spurts, even when the two-year total of growth is greater through a spurt. In this way, schools may even be rewarded for holding students below Mastery until they reach the highest grade in the school.

The table below shows the uneven scoring of different growth styles; Bobby had the least growth over two years but earned the most points:
The LEAP 2025 score range is 650-850; minimum scores are: Mastery, 750; Advanced (eighth grade), 801 in Math and 794 in ELA. LDOE awards 150 points for meeting a GTM target. If the target is not met, points are awarded based on the VAM ranking: 150 points for the top 20 percent, 115 points for the next 20 percent, then 85, 25, and 0 for the succeeding 20 percent cuts.
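The effect can also be illustrated with a hypothetical pair of seventh graders scored under the rules just described. The students, their scores, and the assumption that the spurter's missed year lands in the 25-point VAM band are all invented for illustration:

```python
# Hypothetical illustration of the "slow and steady" effect under the
# scoring rules quoted above: Mastery = 750, eighth-grade Advanced in
# ELA = 794, 150 points for meeting a GTM target. The students and the
# spurter's assumed 25-point VAM award are invented.

MASTERY, ADVANCED_ELA_G8 = 750, 794

def gtm_target(score, years_left):
    """Annual growth target: remaining gap divided by years remaining.
    Students already at Mastery are targeted toward Advanced instead."""
    goal = ADVANCED_ELA_G8 if score >= MASTERY else MASTERY
    return (goal - score) / years_left

# Two 7th graders, both starting at 740 (two years to the end of grade 8).
# "Steady" inches up; "Spurt" jumps past Mastery in year one.
steady = [740, 746, 751]   # gains: +6, +5  (total +11)
spurt  = [740, 752, 760]   # gains: +12, +8 (total +20)

def score_student(scores):
    points = 0
    for year, (before, after) in enumerate(zip(scores, scores[1:])):
        target = gtm_target(before, years_left=2 - year)
        if after - before >= target:
            points += 150  # GTM target met
        else:
            points += 25   # assumed VAM-band award for a missed target
    return points

print(score_student(steady))  # 300: met the modest target both years
print(score_student(spurt))   # 175: year-two target jumped to Advanced pace
```

The spurter grew 20 points to the steady student's 11, yet earned far fewer points, because crossing Mastery in year one reset the target to the much steeper Advanced pace.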

There is a corresponding but opposite effect for students who decline over two years. A student who declines slowly but steadily can earn fewer points than one who drops precipitously then rebounds, even if the total decline of the latter is greater.

Some elementary/junior high school pairs will be affected. If student achievement spikes in the highest grade of an elementary school, the junior high is disadvantaged the next year, especially if the elementary school spikes the rate of students at Mastery; those students would then receive harder-to-reach targets in junior high, based on reaching Advanced.

Further study is needed to quantify this effect. I have found several examples of extreme swings in growth measures from feeder school to receiving school, but I do not have enough data to determine if the examples are part of a clear trend.

4) Public schools significantly outscored voucher schools in "Top Growth". If students are to be given school choices, then growth measures provide the most relevant information about which schools are the most effective, especially for struggling students. In the past, school letter grades essentially indicated how likely it was that the student in the next seat scored Mastery. It would be better to know the likelihood of a student in the school making significant improvement. That is what growth scores are meant to tell us.

The "Top Growth" rates indicate that voucher schools overall made significantly less progress with students than the average public school.[viii]

While GTM introduces biases into the growth scores, the biases do not account for the lower overall "Top Growth" rates of voucher schools. Indeed, if voucher schools attract struggling students, the growth scores should be biased in their favor as described in the second point above. Since the rates are below 40%, they indicate that voucher schools must perform poorly on the VAM measure as well.

Thus, the data contradict the state policy of awarding students vouchers to leave public schools for voucher schools. In voucher schools overall, students are less likely to make substantial gains.

5) Finally, read any LDOE claims about growth with a critical eye. Because letter grades are tied to school choice options, LDOE sometimes spins data to support the policy of allowing students in "C", "D", or "F" schools to switch to "A" or "B" schools. If school choice is really about optimizing student growth, "A" and "B" schools are not necessarily the best choices. Read correctly, the existing growth measures indicate that "D" schools overall produce more student growth. However, in the recent past, LDOE has sorted data so as to skew the "A" schools' average growth measures upward and the "F" schools' average growth measures downward.

From one year to the next, schools with high growth may move up a letter grade; those with low growth may drop a letter. When averaging "A" schools' growth, LDOE has taken out schools that started the year as "A" schools if they dropped to a "B" at the end of the year. However, schools that had high growth and rose from a "B" to an "A" by the end of the year were put into the average. By systematically including high growth and excluding low growth from the average, the growth average for "A" schools was exaggeratedly high.

For "F" schools the effect was the opposite. High growth schools that moved up to a "D" over the course of the year were excluded from the average while low growth schools that dropped to an "F" were included. This depressed the average to an exaggerated low.
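The sorting effect described in the last two paragraphs is easy to reproduce with invented numbers; the schools and growth scores below are purely illustrative:

```python
# Toy illustration of the sorting effect: averaging growth by END-of-year
# letter grade systematically includes risers and excludes fallers.
# All schools and numbers are invented.

schools = [
    # (prior_grade, end_grade, growth_score)
    ("A", "A", 5),   # stayed an "A", average growth
    ("A", "B", 1),   # low growth: dropped out of the "A" group
    ("B", "A", 9),   # high growth: rose into the "A" group
    ("F", "F", 1),   # stayed an "F"
    ("F", "D", 9),   # high growth: rose out of the "F" group
    ("D", "F", 1),   # low growth: dropped into the "F" group
]

def avg_growth(grade, key):
    """Average growth for schools whose prior- or end-of-year grade matches."""
    vals = [g for p, e, g in schools if (p if key == "prior" else e) == grade]
    return sum(vals) / len(vals)

# Sorted by end-of-year grade, "A" schools look strong and "F" schools weak:
print(avg_growth("A", "end"), avg_growth("F", "end"))      # 7.0 vs 1.0
# Sorted by the prior-year grade parents actually see, the gap collapses:
print(avg_growth("A", "prior"), avg_growth("F", "prior"))  # 3.0 vs 5.0
```

The same schools, averaged two ways, tell opposite stories, which is why the choice of sorting matters so much in LDOE's presentations.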

It is important to note that parents make school choices based on a school's prior-year letter grade, not the one it will get at the end of the year. The data should reflect that reality.

In a proposal for the new growth measures, LDOE provided data to the accountability commission that exaggerated the average growth score of "A" schools. I cited that data in a public comment regarding ESSA and requested that the data be run again. LDOE complied and provided the following data table. Shown below are VAM-only growth score averages by the school letter grade of prior-year (2015) and end-of-the-year (2016). LDOE had provided the data only in the end-of-the-year (2016) sorting prior to my request. (This proposed measure, which was not the final method adopted, used the VAM ranking only - without GTM.)

When sorted by prior-year letter grades (2015), the span between the growth scores of "A" and "F" schools was reduced from 12.6 points to 5.2 points, and "D" schools were shown to outperform "B" and "C" schools. Thus, the data do not support the current state policy of using school letter grades as the determining factor for allowing students to transfer from "D" to "B" schools.

I urge you to bear in mind the biases described above when considering the new growth scores. For any given school, the growth scores are expected to vary more from year-to-year than the proficiency rates which have been the basis of school letter grades for many years now. Prudence dictates adopting a wait-and-see attitude before making significant course changes in response to these new and problematic measures.

[i]           VAM is an elaborate statistical process that competitively ranks students. It considers student characteristics beyond mere test scores. Now, in any ranking, there will be a top 40 percent, so VAM guaranteed a "Top Growth" rate for the state of 40 percent. However, some students qualified for "Top Growth" solely through GTM. The amount above 40 percent can be attributed solely to GTM.
[ii]          Students who scored Mastery or Advanced in the prior year receive similar targets with a goal of reaching Advanced. For high school students in grades nine and ten, targets are set based on reaching the goals by the tenth grade.
[iii]         LDOE classifies schools as Elementary (only grades K-8), High School (only grades 9-12), and Combination schools (schools with at least one K-8 grade and one 9-12 grade). By cross-referencing LDOE classifications with multi-stats files from LDOE, it is possible to establish what grade levels a school contains. Here, I define a junior high school as an elementary school that contains grade 8 but no grade lower than 5. Combination schools and elementary schools that contain all grades 4-8 were excluded.
[iv]  The level for advanced is approximately 800, but varies from test to test. On the current LEAP 2025, the level for advanced in the eighth grade is 801 in Math and 794 in ELA. 
[v]  Slide 11 of the presentation. The amount above 40% can be attributed solely to growth-to-mastery.

[vi]  For the growth score system, LDOE will award any student ending the year at Mastery a minimum of 85 points out of a possible 150 even if the VAM ranking would have only awarded zero points.
[vii]       About eight percent of students who would have scored 85 points or less from the VAM rankings will earn 150 points through GTM. See this Accountability Commission presentation from January 2017, slide 40.
[viii]       I identified these schools as scholarship program (voucher) schools in the "Top Growth" file and excluded those with NR data: