With those questions rolling around in my weak brain, I went back in and spent a bit more time comparing 2005 test scores with 2006 (the testing is done in the Spring of a school year, so 2005 is "last year's numbers" and 2006 is the just-released figures that made the papers, etc.).
I have all these figures scribbled on various pieces of scrap paper like I'm taking a Math test myself, but I'll try not to overburden you with too many numbers, dear reader. To get to something approximating an answer to the questions, let's first look at scores for entire school populations, then return to last week's topic of identified school sub-groupings like "Students with Disabilities" and "English Language Learners".
Overall School Scores
- 21 APS schools flunked at least one of the standardized testing components (Math/Reading) in 2006. This is up from 12 in 2005.
- One of the many reasons for this, and perhaps the biggest, was that the necessary goal percentage was raised by around four percentage points.
In 2005, Polk's overall school percentage in Math was 8.24%. In 2006, it was 9.9%. Yet Polk "passed" with its 8.24% in 2005, and "failed" in 2006 with its higher 9.9%. That's because the test evaluators raise the "Adequate Yearly Progress" (AYP) goals every year. Polk's Math expectation went from 10.58% in 2005 to 15% this year.
Careful readers might be asking, "Well, why did Polk 'pass' with an 8.24% in 2005 when the expected goal was 10.58%?" Good question. That's where we get back to the statistical work I droned on about in my last post. Namely, Polk's "Lower Bound Confidence Interval" for Math in 2005 was only 7.02%. They got 8.24%, so they passed. This might lead to a few follow-up questions (a rough sketch of the confidence interval math follows this list):
- Isn't a goal of only 7.02% (or 10.58%, or 15%, for that matter) of students showing proficiency in Math ridiculously low?
- How do the test evaluators come up with the rising expectation percentages every year?
- Will Polk be publicly recognized for the well-over-10% rise in its Math proficiency scores, even though it "failed" this year after passing with a lower score the year before?
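For the numerically curious, here's a minimal sketch of how a "Lower Bound Confidence Interval" can let a school pass below the nominal goal. The exact formula the test evaluators use isn't in my scrap-paper notes, so this assumes a one-sided normal-approximation interval at 95% confidence and a hypothetical tested population of 200 students; with those assumptions, the math happens to land right around Polk's 7.02%.

```python
import math

def effective_threshold(goal, n_tested, z=1.645):
    """Lower bound of a one-sided 95% confidence interval around the goal.

    Assumes a normal approximation; the state's actual method may differ.
    """
    margin = z * math.sqrt(goal * (1 - goal) / n_tested)
    return goal - margin

goal_2005 = 0.1058  # Polk's 2005 Math goal (10.58%)
threshold = effective_threshold(goal_2005, n_tested=200)  # n is hypothetical
print(f"effective passing threshold: {threshold:.2%}")  # ~7.00% with n=200
print("Polk passes?", 0.0824 >= threshold)              # True: 8.24% clears it
```

Note that the smaller the tested population, the wider the interval, so a small school can "pass" with a score well below the published goal.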
By the way, almost exactly the same thing happened with Polk's Reading scores. The scores were roughly flat across the two years at 28/29 percent, but the rising expectation goal caught them, and they went from "pass" to "fail" with roughly the same score.
- This rising expectation goal was not the only reason for the jump in schools failing overall. For instance, James Monroe MS failed Math across the board because it was caught offering an unauthorized overnight break during the Math test.
- Other schools like La Luz ES, Lowell ES, and Pajarito ES just flat-out scored dramatically lower in Reading. Lowell, for instance, dropped from 50% proficient to 30%. You might wonder why such a drop should occur, and that brings me back to those school subgroups.
- Speaking of subgroups, the single strongest correlation with a school passing the test overall is the percentage of "English Language Learner" (ELL) students it has (a back-of-the-envelope check of this is sketched below).
Given the obvious obstacles an "English Language Learner" has to overcome to become equally proficient with English-speaking children, it's utterly unsurprising that this correlation exists. But no one will talk about it in terms of these test scores. One wonders if the racial components and layered nuances of such a correlation make it too dangerous to bring up in public. The same can be said about the "Students with Disabilities" issue.
By the way, I didn't break it down on a complete "elevation/ELL student" basis, but it's worth pointing out that the sole middle school to pass this year, Desert Ridge MS, has only 45 ELL students in its testing population of 1,146. La Cueva, the only high school to pass, had an overall tested population of 1,034 and 35 ELL students. In fact, La Cueva didn't even report a pass/fail score for its ELL population because the number tested was below the threshold of statistical significance.
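If you wanted to check that correlation yourself, here's a rough sketch of how. Only the Desert Ridge and La Cueva figures come from the numbers above; the other schools' ELL shares are made up purely for illustration.

```python
def pearson(xs, ys):
    """Plain Pearson correlation; with a 0/1 variable it's the point-biserial."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# ELL share of tested population vs. pass (1) / fail (0).
# First two entries are Desert Ridge and La Cueva; the rest are hypothetical.
ell_share = [45 / 1146, 35 / 1034, 0.22, 0.31, 0.18, 0.02, 0.40]
passed = [1, 1, 0, 0, 0, 1, 0]
print(f"correlation: {pearson(ell_share, passed):.2f}")  # strongly negative
```

A strongly negative number here says exactly what the real figures seem to: the higher a school's share of ELL students, the less likely it is to "pass."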
Testing Population Subgroups
Those who soldiered on and finished my last meandering post might remember that I focused much attention on "Students with Disabilities", i.e. Special Education students. As I mentioned at that time, many schools were identified as flunking overall because their SpEd subgroup failed. Let's get a bit more specific:
- In a quick count, I show 21 schools that passed overall and with every subgroup except "Students with Disabilities". Because of this, those schools are designated as failing.
- 8 schools passed every barometer except "English Language Learners" (ELL).
- Many smaller schools weren't counted on subgroups like those above because the number of students in each subgroup was too small to be statistically significant (see the sketch after this list).
- Remember: the goal expectations were also raised for these subgroups in tandem with the school overall.
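That "too small to count" rule is simple to picture. I don't have the state's actual minimum group size in front of me, so the threshold of 25 below is a hypothetical stand-in:

```python
MIN_GROUP_SIZE = 25  # hypothetical; New Mexico's actual cutoff may differ

def accountable_subgroups(subgroup_counts, min_n=MIN_GROUP_SIZE):
    """Keep only the subgroups large enough to count toward AYP."""
    return {name: n for name, n in subgroup_counts.items() if n >= min_n}

small_school = {
    "All Students": 180,
    "Students with Disabilities": 12,
    "English Language Learners": 9,
}
print(accountable_subgroups(small_school))
# {'All Students': 180} -- both small subgroups drop out of the calculation
```

This is also why two schools doing identical work can land on opposite sides of the pass/fail line: one is just big enough for a subgroup to count against it, the other isn't.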
Scot Divergently Rants A Bit
Keep in mind that when talking about these groups I am not ranting that we shouldn't count these subgroups, or that I have something against either population. I am instead ranting that not enough 30-second soundbite news is delivered about the fact that many of the 83 (or whatever) schools shown to be failing are doing so only because of the scores of these subgroups. And now more questions:
- Do I think the scores are too low in these subgroups and that much work needs to be done? Yes, absolutely.
- Do I, in my heart of hearts, think it's fair that SpEd and English Language Learner students be held to the same expectation goal as the overall student population? No.
- Am I shocked that reading scores in particular are very, very low for English Language Learner students? Again, you can't be serious.
Scot Tries To Conclude By Getting Back to a More Dry, Objective Analysis
So, if I remember correctly, there were two questions:
- Did APS schools' test scores go down, or not?
- What constitutes a good or bad APS school?
If the reader can take anything from these two long diatribes on the subject, your humble blogger would ask that it be this: determining whether schools are doing better or worse is far more complicated than a simple list of "those who failed". And I haven't even gotten into factors like transient school populations, the fact that testing follows the school and not the kids who actually take the tests, etc. And as a punchline, I reiterate that APS has already said it is changing the test for Special Education kids next year. So the rules change, the parameters change, the test subjects change, and the test itself is always in "beta". It all changes, yet this standardized test is considered the only true accountability measure we have for school performance.
So, to finally answer the question: some schools did better, some did the same, and some did worse. Overall, it's actually a wash, but for the factors listed above it will be seen by the media and those who make hay from pontificating about how "public schools suck" as worse overall. Period.
Question Two:
It might sound flippant, but if a parent were to look at these test scores as a determinant of where in Albuquerque to move from out of town (a question asked by a respondent to the last post), the obvious answer would be near a school at 5,200 feet or higher with low numbers of English Language Learner students. Of course, parents shouldn't merely look at these scores, just as they should have better criteria than elevation and avoiding foreigners.
The fact is that there are good schools in all parts of Albuquerque, whether the test scores indicate it or not. And yes, there are bad schools in Albuquerque, run by unimaginative administrators and full of teaching burnouts.
My suggestion to parents looking at schools is to be somewhat like someone looking for a good restaurant. Like schools, there are plenty of good restaurants in the "bad" part of town, and vice-versa. To find a good new dining spot, one has to research beyond the health safety report, reading reviews and checking out the number of cars in the parking lot. Just as restaurants might have a line out the door, some APS schools have a waiting list of kids wanting to transfer in. Finding out information like that can guide school/location choice far more successfully than a simple newspaper list of "failing" schools. In fact, as pointed out in a recent Journal story, many of the most in-demand schools for transfer are schools that didn't "pass" this year's standardized test.
In the interest of full disclosure, I admit that I teach at such a school. Jefferson MS didn't pass this year (our "Students with Disabilities" scores were too low), and we have a waiting list. I also admit that part of my motivation in writing these thousands of words on the subject of tests was my reading of the Journal story with my school listed as "failing". Given that fact, I hope I have been objective enough to demonstrate not simply that the "tests are stupid", as we teachers and other Bush/No Child Left Behind detractors are wont to scream, but that the situation is much more complicated and thought-provoking than a pithy newspaper headline and 500-word story can adequately present.
2 comments:
An excellent multi-course meal of a post, while elsewhere it's all fast food. — Coco
Why aren't we seeing anything about how this compares with other cities and states, if the national system is so flawed?
And what about the Charter Schools, which are part of the public schools and must take anyone who applies? I went to the web page you recommended in your Aug. 3 post (NM Public Education Department (PED): http://www.ped.state.nm.us/div/acc.assess/accountability/2006/ayp_dar_2006.html) and noticed that Charter Schools are listed separately for the whole state, without saying which city each is in. I tried several I knew were in Albuquerque (Amy Biehl and S. Valley Academy), and saw that they passed.