CHICAGO SCHOOL OFFICIALS ALTER CHARTER SCHOOL TEST DATA

UPDATE: The day before this article was posted (December 9, 2014), CPS officials assured us they would give us the data files explaining the changes in charter school test scores. As of today, they have not lived up to that assurance. The files remain closed to the public.

by Troy LaRaviere

Background

The members of the Administrator’s Alliance for Proven Policy and Legislation in Education (AAPPLE) have once again, unfortunately, found it necessary to attempt to hold Chicago Public Schools officials accountable. Ironically, this time it is the CPS Office of Accountability that needs to answer to students, teachers, parents, and school leaders across Chicago.

This office is responsible for developing and executing CPS’s five-level School Quality Rating Policy (SQRP), the system used to assign ratings to schools. However, an AAPPLE comparative analysis indicates the Office of Accountability altered the school-level growth data used to assign those ratings. This basic finding was confirmed by an independent analysis conducted by the Chicago Sun-Times.

Investigation

In August, AAPPLE analyzed NWEA MAP student growth results and discovered students in public schools were learning far more than their peers in charter schools. Our findings were published in a Chicago Sun-Times op-ed. In addition, the Sun-Times published its own independent analysis, which affirmed our findings. Our analysis was based on a file containing the school-level results of the Northwest Evaluation Association (NWEA) Measures of Academic Progress (MAP) assessment. This file was released by the CPS Office of Accountability in early August. That original file is no longer available on the Office of Accountability website. At some point between the publication of our findings and the release of school ratings, CPS removed the original file containing school growth data and replaced it with a different version. There is no indication or acknowledgement on the site that the data in the file have been changed.

Fortunately, we saved the original version.

An analysis of both versions indicates massive changes were made to the student growth data for charter schools at some point during the last few months as CPS delayed the release of school quality ratings.

Findings

We found that for certain schools, the changes lowered the average pretest scores while leaving the posttest scores as they were, creating the appearance of greater academic growth. For other schools, the changes raised the pretest scores and once again left the posttest scores as they were, giving the impression of less student growth.

The changes were made to the data for nearly every charter school while affecting fewer than 20 public schools. Charter school scores were changed by more than 50 percentile points in some cases, while most of the public school changes were 2 points or less.

Winners and Losers

Two charter schools had their student growth percentile raised by 50 points or more. In general, the CICS chain of charter schools was made to appear much higher performing, in terms of student growth, than the original unchanged results indicated. The chain received two score increases of 30 or more percentile points, six of 20 points or more, and ten of 10 points or more. On the other hand, schools like Shabazz Charter sustained a 20-point loss and a 10-point loss as a result of the changes. An Excel spreadsheet detailing these changes is linked in the Data Files section below. It is interesting that many of the schools whose growth was changed upward are in gentrifying areas of Chicago.

Action Taken by AAPPLE

We requested an immediate meeting with the CPS Accountability Office and got an immediate reply. The request was made on Sunday evening, and the meeting was scheduled for Tuesday afternoon. We then informed the leadership of the Chicago Principals and Administrators Association (CPAA), the parent group Raise Your Hand, and the Chicago Teachers Union (CTU). These organizations represent principals, teachers, and parents, and we received valuable input and feedback from each group. The CPAA and CTU expressed interest in attending the meeting with CPS and each sent representatives.

CPS Response

CPS representatives—led by John Barker—acknowledged the scores had indeed been changed and told us there were two broad reasons for the changes.

Justification #1: Differing pre-test dates

The first was because students took the tests at different times and the growth numbers were not entirely comparable as a result. Public school students take the test each spring and student growth is measured by how much scores increased from spring 2013 to spring 2014. To better understand this, it helps to think of the 2013 MAP assessment as a pre-test and the 2014 MAP assessment as the post-test given to see how much a student learned over the course of a year.

However, many charter students did not take the pre-test in spring 2013 and instead took it in fall 2013, so their student growth was measured by how much scores increased from fall 2013 to spring 2014.  CPS stated it needed to change the scores to reflect this.

AAPPLE’s Response to Justification #1: Differing pre-test dates

Whether a test was administered in spring or fall does not provide an advantage or disadvantage, since NWEA already takes the testing term into account when it provides student growth projections to CPS for both spring and fall administrations. Making any kind of adjustment therefore does not make sense.

If there were an advantage, it would be charter schools that had the technical testing advantage. Students who took the pre-test in spring rather than the following fall typically score higher on the pre-test because they took the test before the learning loss that occurs over the summer months. However, when measuring growth, students who score higher on the pre-test are at a competitive disadvantage. For example, suppose “John,” a public school student, took the pretest in spring 2013 and scored a 203. If John had been a charter school student, he would have taken the pre-test in fall 2013, after the summer learning loss, and would have earned a lower pre-test score, such as a 200. With a post-test score of 210, and all else being equal, John’s public school growth would have been 7 points, while his charter growth would have been 10 points, simply because of the timing of the pre-test in relation to the summer learning loss. Again, that’s with all else being equal.
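To make that arithmetic concrete, here is a minimal sketch of the “John” example in Python. The growth-projection numbers are hypothetical placeholders, not actual NWEA norms; the point, as noted above, is that NWEA’s projections are keyed to the testing term, so timing is already accounted for.

```python
# A minimal sketch of the "John" example. All RIT scores and projection
# values below are hypothetical placeholders, not actual NWEA norms.

# Hypothetical term-specific growth projections (in RIT points) for John's
# grade. NWEA publishes separate growth norms for spring-to-spring and
# fall-to-spring windows, so the testing term is already built in when
# growth is judged against projection.
PROJECTED_GROWTH = {
    ("spring", "spring"): 6,  # placeholder spring-to-spring projection
    ("fall", "spring"): 9,    # placeholder fall-to-spring projection
}

def observed_growth(pretest_rit, posttest_rit):
    """Raw growth is simply the posttest score minus the pretest score."""
    return posttest_rit - pretest_rit

def met_projection(pretest_rit, posttest_rit, pre_term, post_term):
    """Compare raw growth against the term-appropriate projection."""
    return (observed_growth(pretest_rit, posttest_rit)
            >= PROJECTED_GROWTH[(pre_term, post_term)])

# John as a public school student: spring 2013 pretest of 203, posttest of 210.
print(observed_growth(203, 210))                     # 7 RIT points of growth
print(met_projection(203, 210, "spring", "spring"))  # judged against spring norm

# John as a charter student: fall 2013 pretest of 200, same posttest of 210.
print(observed_growth(200, 210))                     # 10 RIT points of growth
print(met_projection(200, 210, "fall", "spring"))    # judged against fall norm
```

Under term-specific projections like these, the extra raw growth that comes with a fall pretest is offset by a higher fall-to-spring projection, which is why no after-the-fact alteration of scores is needed.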

All things between public schools and charter schools, however, are not equal. Despite charter schools having such a massively deceptive technical advantage, the MAP results demonstrated conclusively that charter school students learned far less than students in public schools, especially in reading. It is frightening to think how much worse charter school student growth scores would have been had those students not started with such a competitive advantage.

All of this, however, is moot, since NWEA already provides CPS with aligned projections that take into account when students take the assessment. Nevertheless, CPS decided to move forward and change the scores anyway, on the claim that the scores needed to reflect the different testing dates. I told CPS officials that if that were the case, then charter school results should have gone down. Indeed, about half of the changes were negative, but all charter growth scores should have gone down when adjusted for different pre-test schedules. Why did some of them go up? That was when CPS officials articulated a far more puzzling justification for changing scores.

Justification #2: Students took different versions of the same test

The MAP assessment comes in several different formats. Among them is a version aligned to the Illinois State Standards and another aligned to the Common Core State Standards. All public schools took the Common Core-aligned MAP assessment. Apparently, CPS allowed charter schools to choose the version aligned with the Illinois standards. CPS officials then told us that these schools are likely the ones whose growth scores were adjusted upward.

AAPPLE’s response to justification #2: Different versions of the same test

CPS’s changes had significant effects on school growth percentiles as well as on the percentage of students who made national average growth. Perhaps the most egregious example is CICS Irving Park. Before the scores were altered, the percentage of students making national average reading growth at CICS Irving Park was 40; after the changes, it was 63. Before the changes, CICS Irving Park had a national growth percentile of 34; after the changes, it was 86. What statistical model is CPS using to estimate that the school’s results would have been so vastly different had its students simply taken a different version of the same basic test?

Defending this action puts CPS in an interesting no-win situation. On one hand, CPS would need to acknowledge that changes like the ones made to CICS Irving Park are incredibly implausible, if not impossible. On the other hand, if CPS insists that the tests were so vastly different that they would change the growth results so drastically, then CPS may need to consider invalidating the ratings of any schools whose scores were changed as a result of taking different tests, since applying the same rating metrics to different tests is extremely poor practice.

If CPS officials stand by their assertion that CICS’s results would have been so vastly different had its students just taken a different version of the same test, then they have effectively acknowledged that a school system can get the results it wants based on the test it selects. This calls into question the validity of the testing process itself. More importantly, it raises unavoidable doubts about the wisdom of using such wildly inconsistent assessments to make high-stakes decisions about schools: decisions such as their “levels” in a system of school choice, their probationary status, or, in many cases, their very existence.

Conclusion and Implications

Summary of Above Responses

First, the scores should not have been altered at all. The original scores should have remained, with an asterisk explaining the different testing dates and differing test versions. However, since the scores were changed, logic dictates that all charter school scores should have shown even less growth, given the deceptive technical advantages those schools had in both testing dates and test versions. This, however, is not the case, and it raises serious questions about both the methodology and the motivations behind the changes in charter school scores. In addition, the massive variance in scores based on CPS’s hypothesis about what schools would have scored had they simply taken a different version of the same test raises serious concerns about the use of these tests for high-stakes decisions such as school ratings, probation, turnaround, and closure.

CPS Creates a Non-Level Playing Field

I have no doubt charter schools chose to ignore the CPS practice of taking the pre-test in the spring in order to give themselves a technical advantage over public schools when it comes to measuring growth. I also have no doubt some chose to take the Illinois standards-based assessment because they assumed it would be easier and their students would appear to score higher than public school students as a result. Despite having both of these deceptive technical advantages, the results demonstrated the monumental failure of our city’s irrational experiment with charter schools. However, to the point of this report, the question must be: “Why did CPS allow them to have these deceptive technical advantages in the first place?”

Continuing Lack of Transparency Increases Likelihood of Tampering with Scores

We asked CPS for a technical document that explains and illustrates precisely how adjustments were made for those students whose scores were altered as a result of different testing dates, as well as for those whose scores were altered based on taking different versions of the test. CPS also needs to explain why it chose to alter scores for a factor (test dates) that NWEA already takes into account when it determines student growth expectations and grade-level growth percentiles.

Furthermore, the question needs to be asked: Why did CPS go to the trouble of actually changing the pretest scores, making it appear that the new growth scores were based on real scores rather than on a statistical model of what growth “would have been”?

These score changes happened without any public disclosure. Despite having dozens of opportunities to tell the public about their actions, CPS remained silent. For several months, CPS delayed the release of school ratings. During this time, several principals, union representatives, and media representatives continually pressed CPS officials about what was causing the delay. At no time did those officials state or acknowledge what they were doing with the scores. I suppose “We’re changing the charter school scores” would not have made for a good media sound bite in their minds. It would, however, have been an accurate one.

Moving Forward

In a system based on “choice,” parents and other stakeholders must be provided with accurate indicators of school quality. CPS’s school rating system is intended to serve as this primary indicator. It cannot serve this purpose if there are clouds of suspicion about tampering with the data used to determine these ratings.

To be clear, this rating system and school choice have done nothing to improve the overall quality of education in our city. In fact, the achievement gap between black and white students has widened as a direct result of the poor quality of the charter school options that have infiltrated our district. Again, the rating system and “choice” have failed our parents and our students. Some of us are principals of schools that received the highest “1+” rating. Yet we believe the rating system should be abolished and CPS should take the resources it puts into maintaining this system and put them into efforts to improve all of its schools.

However, as long as the rating system exists, the data used to inform it must be accurate, and the people who prepare it must operate with transparency and integrity. The changes in the charter school data, made with no public disclosure or explanation whatsoever, call both into question.

*************************

DATA FILES

Original file

The original file is no longer posted on CPS’s site, but can be viewed at:

https://drive.google.com/file/d/0BytSj0QyFz1ea2dta1dVU3ZvbE0/view?usp=sharing

Altered file

The altered file now posted on the Office of Accountability’s site can be downloaded at http://cps.edu/Performance/Documents/DataFiles/NWEAreportSchool_2014.xls

The website is http://cps.edu/SchoolData/Pages/SchoolData.aspx (see “Assessment Reports”).

An AAPPLE version of the altered file, which contains additional columns showing the original results side-by-side with the altered results, can be viewed at the following link.

https://drive.google.com/file/d/0BytSj0QyFz1eU1BjMW83MkY5T3c/view?usp=sharing

File for quick and easy at-a-glance analysis

Click the link below for an AAPPLE document that contains only those schools with altered information. It includes additional columns that show the original results and the altered results side-by-side, and it calculates the difference between the original scores and the altered scores. Lastly, it contains small tables alongside the data for certain charter chains that indicate how each chain was affected by the changes to the data.

https://drive.google.com/file/d/0BytSj0QyFz1eb1l6LVlZNkRMRmM/view?usp=sharing
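For readers who want to reproduce the comparison themselves, here is a rough sketch using Python and pandas. The file names and column headers below are assumptions for illustration; open the files linked above and substitute the actual names.

```python
# A rough sketch of reproducing the side-by-side comparison. The column
# headers ("School ID", "Growth Percentile") are assumptions; check the
# actual spreadsheets and substitute the real ones.
import pandas as pd

original = pd.read_excel("NWEAreportSchool_2014_original.xls")  # saved August file
altered = pd.read_excel("NWEAreportSchool_2014.xls")            # file now posted by CPS

# Line up each school's original and altered rows.
merged = original.merge(altered, on="School ID",
                        suffixes=("_original", "_altered"))

# How much each school's growth percentile moved between versions.
merged["Change"] = (merged["Growth Percentile_altered"]
                    - merged["Growth Percentile_original"])

# Schools whose data changed, largest increases first.
changed = merged[merged["Change"] != 0].sort_values("Change", ascending=False)
print(changed[["School ID", "Growth Percentile_original",
               "Growth Percentile_altered", "Change"]])
```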

19 thoughts on “CHICAGO SCHOOL OFFICIALS ALTER CHARTER SCHOOL TEST DATA”

  1. Thank you very much for speaking truth to power again and again, and for your willingness to attend to the details and show us that the devil is in the details. And for never overestimating the true intentions of the ed ‘reformers.’

  2. Yes, the hypocrisy of the Chicago Board of Education is stupendous. This is only the tip of the iceberg in this system and across the country with educational reform. Education, once it leaves the classroom, has become less and less ethical as the years have gone by. It’s become a dirty, lying business that has hurt our school systems and students. Politicians and big business: the deeper they get into it, the deeper the deceit gets.

  3. Thank you so much for compiling and sharing the data changes for all of the schools. I work at a charter where we were not given (by NWEA) the option of switching to the Common Core test until the summer of 2013 (after we had finished spring testing). Taking the (easier) Illinois Standards test in the spring of 2013 as our “pre-test” definitely hurt our growth scores compared to our fall-to-spring or fall-to-fall data (although the “adjusted” scores that CPS shared as our final scores were even worse—go figure).

    I think you are definitely right that the huge changes in scores depending on the date and format of tests used should make us all very skeptical about the validity of the testing measures. CPS’s lack of transparency in this process is incredibly problematic and having them make changes that help some schools and hurt others (and looking at your list, more schools were hurt than helped in both math and reading) is patently unfair.

  4. A few comments.

    First, as to the adjustment based on test date, I don’t think you’re correct in assuming that the fall scores are uniformly lower than for the preceding spring. E.g., the 50th percentile math score for K in the spring is 159 and the 50th percentile for 1st grade in the fall is 163. There are also certainly examples where the RIT score decreases from spring to fall. It varies by subject, grade, and percentile (and does not vary in the straightforward way I would have expected). Just eyeballing things, I think it’s largely a wash and wouldn’t be the source of major changes. So I wouldn’t make much of it.

    Second, depending on how the growth expectations were generated, it could be appropriate to make an adjustment. Using your “John” example, a spring score of 203 would generate a certain distribution for John’s growth. If John actually scored 200 on a fall test (in your hypothetical), then it would be wrong to treat that 200 as a spring score and generate a growth distribution from it. I don’t know that that’s how the growth distributions were generated, but it’s plausible. I think you could do this fairly reliably, as you have a mapping by grade and subject of percentiles from fall back to spring. And it would not have the bias in the direction you claim, because your assumption that scores generally decline from spring to fall is incorrect. But as I said, the spring-to-fall changes are, I think, largely a wash (though I could be proven wrong with a more careful assessment), so this discussion is largely academic.

    Third, I am at the moment agnostic on whether CPS did something improper and/or unreliable in adjusting for different versions of the test. If you had reliable nationally normed percentiles for both versions, it seems to me not unreasonable to map one onto the other, as sketched below. E.g., if you had a student scoring at the 50th percentile in reading on the IL state version, you could look up what the 50th percentile was for reading on the CC version (adjusting as needed for the spring-fall issue), then get a growth expectation from that score and compare it to the student’s actual performance on the CC version the subsequent spring. I am not saying this rough approach is necessarily exactly right, but something along these lines should be considered. If the IL version is indeed “easier,” as claimed in the comments, then it would be legitimate to think about an adjustment. There may be further considerations to be assessed, but to dismiss an adjustment out of hand is not warranted.
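    A rough sketch in Python of that percentile mapping, with made-up norm tables (the RIT values are placeholders, not real NWEA norms):

    ```python
    # Placeholder norms: percentile -> RIT score, one table per test version.
    IL_NORMS = {10: 185, 25: 192, 50: 200, 75: 208, 90: 215}
    CC_NORMS = {10: 183, 25: 190, 50: 198, 75: 207, 90: 214}

    def score_to_percentile(score, norms):
        """Find the percentile whose norm score is closest to the observed score."""
        return min(norms, key=lambda pct: abs(norms[pct] - score))

    def map_il_to_cc(il_score):
        """Map an IL-version score onto the CC scale via its percentile rank."""
        return CC_NORMS[score_to_percentile(il_score, IL_NORMS)]

    # A student at RIT 200 (50th percentile on the IL version) maps to the
    # CC-version score at the same percentile: RIT 198 under these placeholders.
    print(map_il_to_cc(200))
    ```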

    I therefore think your conclusion that the scores should not have been adjusted (you use the more pejorative “altered”) is likely wrong or, at a minimum, premature. There can be legitimate reasons to adjust scores. I have suggested some above. I do agree with you that CPS has to date been lacking in transparency and I applaud your efforts to push them in this regard.

  5. My son attends CICS Irving Park, and it is difficult to say if the school is doing its job or not. This is the only school he has attended so far. I can say it’s not what I expected. We were told that since they began Common Core, the students’ test results would drop. But his did not. He has told me he doesn’t really do much in class besides take some notes, which I never see; they only do worksheet homework and reading logs. Oh yes, and their science fair project, which they are basically walked through step by step. Spanish is now learned through Rosetta Stone, probably because after the 6 years he has been there he still doesn’t speak Spanish. The teachers are nice, but I don’t see that much effort to challenge these students. Do you know they push us to have the students use IXL and myOn almost daily? It’s homework. But don’t think that it’s only happening at charter schools; my daughter’s public school, Norwood, also pushes IXL and says “it is in line with their curriculum.” Overall, I don’t think a charter school or public school student is learning as much as they should. It is as if the teachers are not allowed to teach; they are being held back, and in turn holding back our kids. The only solution to the learning gap for us was to enroll him in a tutoring program, and he is doing better, but he asked why we give him more homework than school does. All I could say was, “Because I care.” I wish we could compare my sixth grader to a public school sixth grader; I want to see the difference.

  6. FYI, Dr. Elizabeth Purvis will be at the school Tuesday at 8 for a “coffee hour” to answer any questions about the school. The info you shared was not mentioned to any of us, so I’m guessing not many will attend.

  7. Shades of New Orleans. The charter school movement is BS, as is much, if not all, of the high-stakes testing fiasco. In my opinion, none of this has ever been about kids. It is all about adults and politics, pure and simple.
