Wednesday, March 14, 2012

Thoughts on "Educational Productivity"

Last week, Matthew Ladner produced a stunning chart showing an "implosion" in our nation's educational productivity.  Productivity here seems to be defined as the ratio of per-pupil expenditures on public education to average NAEP test scores.  The former has tripled since 1970 while the latter has essentially remained flat for the upper grades.  I'm not sure of the impetus behind that particular post, but the Bush Center has written a similar post that makes all the same mistakes.

Before I delve into those mistakes, I'll point out that they've also created a nifty website that allows people to compare students' standardized test scores in any district to the scores of other students in the state, nation, and world.*

So, what's wrong with comparing spending to achievement?  Seems straightforward.  And the graph is certainly compelling.  But, alas, the statistics that seem the most straightforward are often the least useful.  Among other issues:

1.) Spending and test scores are on different scales.  Spending can multiply almost infinitely while test scores have a ceiling.  In the chart on the site, the average 17-year-old scored 306 out of 500 on the NAEP math test in 2008.  That means that even if every kid in the country earned a perfect score the next time around, the average score would only increase by about 63%.  Since school spending has tripled, the ratio of spending to achievement would still be far greater now than it was 40 years ago (a quick back-of-the-envelope sketch follows this list).

2.) Why would we assume that it takes the same level of effort for a school to get a student to earn a certain score now as it did in 1970?  A zillion factors other than education spending influence achievement levels.  If parenting ability, economic circumstances, living conditions, and the like improved dramatically, then we shouldn't think it miraculous if scores rise with no additional school spending.  Similarly, if societal conditions worsen in some way, then more effort is necessarily required to achieve the same scores.  I have no idea whether it's harder or easier to get the average 17-year-old to score 306 on the NAEP now than it was in 1970, but we'd need to know that answer to accurately measure educational productivity.

3.) Why would we assume that the same level of spending reflects the same level of effort on the part of districts now as it did in 1970?  The economic and social context of schooling is dramatically different.  Perhaps most importantly, the number of women in the workforce -- particularly in fields outside of education -- has exploded.  Simple economics dictates that it must now cost more to buy the services of an equally qualified teacher.

4.) Test scores were not the sole goal of that increased spending.  Surely we also aimed to increase the number of high school and college graduates (I don't have the HS stats handy, but almost twice as many 25- to 29-year-olds have bachelor's degrees now as did in 1970, though the figure has risen by only about a third since 1975).  I think a reasonable argument could be made that it's increasingly costly to get each additional student to graduate (i.e. moving the HS graduation rate from 50% to 60% is easier than moving it from 80% to 90%), so we might not expect the same returns per dollar on those measures.  And, surely, we also aimed to improve many other skills (e.g. critical thinking, physical/emotional health, social skills, art appreciation, etc.) that aren't measured by the math and reading tests listed.
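
To make that first point concrete, here's a quick back-of-the-envelope sketch (in Python, purely for illustration) using round numbers from the post -- spending roughly tripled, flat scores of about 306 on a 500-point NAEP scale.  The exact figures are placeholders, but the arithmetic shows why the spending-to-score ratio would still look far worse than in 1970 even if every student aced the test:

# Back-of-the-envelope sketch of the ceiling-effect argument in point 1.
# Numbers are taken loosely from the post: spending roughly tripled since
# 1970, the average 17-year-old NAEP math score was about 306 in 2008,
# and the scale tops out at 500.  Illustration only, not data.

spending_1970 = 1.0      # index 1970 spending at 1.0
spending_now = 3.0       # "tripled," per the post
score_1970 = 306         # assume flat scores, per the chart
score_ceiling = 500      # NAEP scale maximum

# Even a literally perfect cohort can only raise the average this much:
max_score_gain = score_ceiling / score_1970 - 1
print(f"Max possible score gain: {max_score_gain:.0%}")   # ~63%

# Spending-to-achievement ratio, 1970 vs. a hypothetical perfect present:
ratio_1970 = spending_1970 / score_1970
ratio_perfect_now = spending_now / score_ceiling
print(f"Ratio now vs. 1970, even with perfect scores: "
      f"{ratio_perfect_now / ratio_1970:.2f}x")            # ~1.84x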

So, should Ladner's alarming chart worry us?  I wouldn't dismiss it out of hand.  I wouldn't be the least bit shocked if our returns to effort and spending have decreased over the past 40 years.  But we can't tell whether productivity has decreased, remained steady, or increased by looking at that chart.  The most compelling figures he presents are the large increases in non-teacher staff in schools, but some unknown number of those support staff are certainly invaluable, so even that doesn't prove all that much.

And, by the way, those same four problems apply to any international comparison of a simple spending-to-test-score ratio.  Were we to eliminate schools entirely, culture, society, and a myriad of other contextual factors would still produce kids in different countries who score much higher or lower on tests; it would be harder or easier, and cheaper or more expensive, to change that in different countries; and each country emphasizes different outcomes to different degrees.

I'll be the last person to argue that our nation's schools are just fine -- we face countless problems with a nearly infinite number of possible solutions -- so please don't interpret my criticism as an argument for the status quo. I hate the status quo.  But also realize that Ladner's chart gives us exactly zero information about what ails our schools.



*The "world" here is 25 developed nations.  On another note, here's a fun game: they don't seem to have compiled the list of the top-performing districts, so go see which ones you think might rank highest.  So far, I've found:

-The average student in Chappaqua, NY outscores 89% of students internationally in reading and 82% in math
-The average student in Chatham, NJ outscores 88% of students internationally in reading
-The average student in Brookline, MA outscores 77% of students internationally in math.


8 comments:

Sherman Dorn said...

Corey, the chart is Andrew Coulson's -- Ladner is just linking to it and using it. Also see my discussion here of why two-vertical-axis charts are misleading.

mmazenko said...

Matt has a clear privatization and school-choice agenda which governs most of his post and his "research" -- that is, his reading of data with an anti-public-education slant.

Three primary drivers of cost which Matt completely ignores are school security, special education (including increased ELA), and technology. It can be incredibly expensive to teach autistic kids, and the numbers of these diagnoses have skyrocketed. Additionally, the incredible expansion of students on medication with IEPs and 504s, which require specialized instruction, individual case workers, and mountains of paperwork, also balloons costs.

The added costs of school security -- particularly in a post-Columbine and post-9/11 world -- are completely ignored by Matt, and these would have no impact on any test score measure. The technology and staff alone are burdensome. And that begins to touch on the primary cost of any organization, which is labor. While salaries are one source of criticism, the bigger worry is health care costs. Health care premiums have skyrocketed, and that is producing a drain on all employers. The greater problem is for organizations like schools, which are essentially non-profit and can't cut staff to minimize costs.

Thus, you correctly identify Matt's "analysis" as incredibly naive and misguided.

Matthew Ladner said...

Coulson offered the following reply to Professor Dorn:

http://www.cato-at-liberty.org/this-one-is-of-the-charts/

Mazenko offers up the tired "blame the special education kids" excuse, which I addressed in the comments section. Just to repeat: let me know when your tribe makes any serious effort to reform the nation's special education laws and practices in a way that improves outcomes and lowers costs. Or any other aspect of school law, for that matter.

In the end, we more than doubled the ratio of school employees per pupil and saw little in the way of improved academic outcomes. No amount of lipstick is going to make a beauty queen out of such a pig.

Corey Bunje Bower said...

Matthew:

Yes, I understand that's only a small piece of the pie. I read Jay's piece that was linked to in the comment thread on your blog. I did not raise that issue.

And also note that I wrote in the original piece "The most compelling figures [you present] are the large increases in non-teacher staff in schools". I'm skeptical that *all* of those staff are necessary. And, actually, I suspect that an equally large piece of the puzzle (which you don't mention) is the number of teachers employed to do things other than teach a general education classroom.

I'm not sure what "tribe" you think I'm a part of, but you don't actually address a single point I raise in the piece. Nor, for that matter, has anybody offered up a good explanation of why the dual axes on the chart should be structured the way they are.

Anyway, I don't doubt that our school system has become less efficient. Nor do I ever argue we shouldn't change our system. What I *do* doubt is that the precise ratio calculated in this chart is at all meaningful.

Roger Sweeny said...

What I *do* doubt is that the precise ratio calculated in this chart is at all meaningful.

I think it's meaningful in setting up some questions. The chart says "we" are spending more and not getting more when it comes to school. As a matter of simple arithmetic, that means something important. If "we" do not get more productive in other areas, we will get poorer; there will be less to spend on other things. So how are we doing in other areas? Not too well recently, and there is a large debate going on about what future prospects are. Should we in the business expect that spending on education will increase like it has in the past? Probably not.

It also pushes the mind toward asking the question that mazenko and Ladner allude to: Is all the extra spending useful? Should some of those laws and policies be changed?

Corey Bunje Bower said...

But, Roger, it doesn't mean that at all. Yes, it begs some questions . . . but it gives us zero answers.

First, we'd need to know why spending increased. Did it simply become more expensive to provide an equal quality education to a student? We lost the effective subsidies granted to schools through restriction of women's occupations -- that certainly raised the price of employing an equally qualified teacher. Prices of health care have skyrocketed, which has the same effect. And what about environmental factors? Far more kids now live in single parent households. Did that make it more difficult to push children to the same achievement level?

Second, we have to know what "more" we're expecting from schools. Clearly test scores are not the sole goal of kids, parents, schools, or many of the programs that have driven up costs. Dropout rates have declined while graduation rates and college attendance have increased considerably. Should we consider that as well? What about other things we expect from schools like developing social skills, teaching critical thinking, building social cohesion, and so on? How have those changed? Schools also offer far more services outside of the realm of general ed (not just special ed, but parent coordinators, ESL/ELL classes, school security, counseling, etc.) whose main purpose is *not* to increase average test scores.

Has educational productivity increased or decreased in the past half-century? I don't know. And we can't know from this chart.

Roger Sweeny said...

Corey,

I think we may be "violently agreeing."

Yes, we should ask why spending increased. How much of it is controllable and how much of it isn't (e.g., increased opportunity for women)? We then need to ask about the things we can control. "Should we, and if so, how?"

That opens up a tremendous amount of questioning that simply hasn't happened in the past.

As you point out, the statement, "We spend three times as much on education for the same results" is largely content-free. It is an assertion, not an explanation--and begs a lot of questions. However, the same can be said of the statement, "It costs three times more to educate students nowadays because we do more and it's harder."

mmazenko said...

Once again, Matt has oversimplified the issue -- and my comments. The gist of my response was in no way "blame the special education kids," and to imply as much is a gross distortion not only of my point, but of my integrity as an educator. The funding of education -- and the myriad factors which lead to increased expenses -- is incredibly complex, and there is no doubt that accommodative services for students increase costs. So do technology, security, inflation, rising health care premiums, and rising construction and cost-of-living expenses. And to argue that increased costs should equate to higher standardized test scores is incredibly naive and myopic. And as a researcher, Matt should know better.

That said, I do not dispute that schools have added significantly to non-educator-related costs. And, like Corey, I question each and every additional position that can't be linked to student achievement. But expanding schools so they have additional counselors and postgrad departments to increase the number of students pursuing post-secondary ed is not necessarily a bad idea. I wouldn't even argue that all public education is becoming less efficient. Scrutiny of individual budgets is certainly reasonable. But statements of blanket criticism about cost and results are inflammatory at best.