
Tuesday, March 18, 2014

Face Validity and the New Ed School Rankings

For those of you outside academia, "face validity" is a fancy term academics use that simply means that something makes sense at first glance (or "on its face").  US News and World Report released its latest grad school rankings last week, and one thing I noticed is their lack of face validity.

First, I should note that I'm a Vanderbilt alum and Vanderbilt dropped to #2 in the education school rankings after ranking #1 for five consecutive years.  I hesitated to write this post lest anybody think it's simply sour grapes.  Or maybe an attempt to draw attention to the fact that Vanderbilt dropped in the rankings as soon as I left . . .

In all seriousness, I really don't care all that much where Vanderbilt ranks.  I don't even know what it actually means to be the top-ranked school of education.  What matters most?  Outcomes of students?  Research of faculty?  Selectivity?  The current rankings measure the latter two but not the first (my biggest criticism of them would be that they make virtually no attempt to measure the education students actually receive).  We could construct the rankings 100 different ways that would all make some amount of sense, so it's a bit ridiculous that the USNWR rankings draw so much attention (and yet, here I am, helping them draw even more attention).

Despite the fact that I claim I have no idea what it means to be the top-ranked school of education, if pressed I'd have to posit that the top four, in some order, are Vanderbilt (Peabody), Columbia (Teachers College), Stanford, and Harvard.  I think those four have the most history and prestige, but I could be wrong.  By one measure, they have the most recognizable scholars (Stanford has 21; Harvard 19; Columbia 12; and Vanderbilt 11 of the 200 scholars ranked) -- so I don't think I'm totally off-base here.  So it's interesting to me that those schools rank second, third, fourth, and eighth.

Another measure would be to look at which schools have the top-ranked programs.  USNWR ranks the top 20 or so programs in 10 different fields, though it does so solely based on nominations by Deans.  I'd consider these rankings measures of face validity because they simply ask knowledgeable people what makes sense to them rather than performing any sort of comprehensive multivariate analysis. By clicking on each school's profile, one can see how many programs that school has that made the cut. Below are the top 25 ranked schools of education and the number of fields in which they were ranked:

2.) Vanderbilt: 9
5.) Wisconsin: 9
8.) Columbia: 9
15.) Michigan St.: 8
16.) Ohio St.: 8
4.) Stanford: 7
8.) Michigan: 7
11.) UCLA: 7
7.) Washington: 6
10.) Texas: 6
22.) Virginia: 6
25.) Indiana: 6
3.) Harvard: 5
14.) UC-Berkeley: 5
5.) Penn: 4
20.) NYU: 4
20.) Minnesota: 4
18.) USC: 3
13.) Oregon: 1
17.) Kansas: 1
22.) Pitt: 1
24.) BC: 1
1.) Johns Hopkins: 0
11.) Northwestern: 0
18.) Arizona St.: 0

A few things stick out here:

-There looks to be only a mild correlation between a school's overall rank and the number of top programs it has.

-A number of schools have quite a few top-ranked programs but are outside the top 10 -- Michigan State, Ohio State, Virginia, and Indiana are particularly notable.

-Meanwhile, Northwestern and Johns Hopkins have exactly zero top-ranked programs and yet rank above all of those schools.

-Yes, you read that right: Johns Hopkins -- the new #1 School of Education -- has exactly zero programs ranked among the top 20 or so in the country.  Now, I freely admit that I have absolutely no idea whether or not Johns Hopkins has the best faculty, students, research, or anything else we try to measure, but that's pretty striking.
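The "mild correlation" eyeballed above can be checked directly. Here's a quick sketch in Python (the pairs are hand-copied from the list above, one entry per school) that computes the Pearson correlation between a school's overall rank and the number of fields in which it has a ranked program:

```python
# (overall USNWR rank, number of fields with a ranked program),
# hand-copied from the list above; one entry per school.
data = [
    (2, 9), (5, 9), (8, 9),               # Vanderbilt, Wisconsin, Columbia
    (15, 8), (16, 8),                     # Michigan St., Ohio St.
    (4, 7), (8, 7), (11, 7),              # Stanford, Michigan, UCLA
    (7, 6), (10, 6), (22, 6), (25, 6),    # Washington, Texas, Virginia, Indiana
    (3, 5), (14, 5),                      # Harvard, UC-Berkeley
    (5, 4), (20, 4), (20, 4),             # Penn, NYU, Minnesota
    (18, 3),                              # USC
    (13, 1), (17, 1), (22, 1), (24, 1),   # Oregon, Kansas, Pitt, BC
    (1, 0), (11, 0), (18, 0),             # Johns Hopkins, Northwestern, Arizona St.
]

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return cov / (sx * sy)

r = pearson(data)
print(f"r = {r:.2f}")
```

On these numbers, r comes out around -0.3: a real but modest association (a better overall rank goes with more ranked programs), which matches the eyeball impression.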

So, something doesn't make sense here.  How could Deans perceive that a school has a slew (or a dearth) of top programs while the overall rankings indicate that the school as a whole is merely quite good (or the cream of the crop)?

One possibility is that the Deans' perceptions are wrong and that the data collected by USNWR are a better indicator of quality.  Another possibility is the opposite -- that the rankings are a sham and that we should listen to the Deans.  A third possibility is that the program rankings are misleading in some way (e.g. some fields are more important than others, important fields are missed, the narrow margins are insignificant, or that a few top-5 programs is better than a bunch of top-20 programs).

If it's the first or second possibility, then I'd argue that either the overall rankings or the program rankings are lacking in face validity.  In reality, it's probably some mixture of all of the above.  But if memory serves, Texas, Oregon, and Johns Hopkins have each ranked second one year and 10th or lower another year just in the past five years.  It's possible that school quality changes that fast, but it seems rather unlikely.

Another way of looking at the rankings would be to look at programs that rank at the very top of their fields.  If we count the number of programs each school has in the top 10 / top five / at #1 in their field, we get a picture that differs both from school to school and from the list above:

Wisconsin: 8/7/1
Vanderbilt: 8/5/2
Michigan State: 7/4/2
Columbia: 6/5/0
Michigan: 6/5/0
Ohio State: 6/1/0
Stanford: 5/5/2
Virginia: 4/1/0
Harvard: 3/2/0
UCLA: 3/1/1
Texas: 3/0/0
Indiana: 3/0/0
UC-Berkeley: 2/0/0
Washington: 2/0/0
Penn: 2/0/0
Minnesota: 2/0/0
Oregon: 1/1/0
Kansas: 1/1/0
USC: 1/1/0

When we look at it this way, we notice that four schools ranked in the top 10 didn't have a single program ranked in the top five in its field (#1 Johns Hopkins, #5 Penn, #7 Washington, and #10 Texas).

Another thing you may notice is that there are only eight #1 rankings across 10 fields.  That's because the schools ranked first in student counseling and personnel services (Maryland) and technical/vocational education (Penn State) didn't make the top 25.  Which may confirm my earlier hypothesis that some fields are viewed as more important than others.  Or not.  If we count the number of programs ranked in their field at all / in the top 10 / in the top five / at #1, we find a few schools outside the top 25 that dwarf most of the top-25 schools:

33.) Penn State: 9/4/2/1
33.) Georgia: 9/6/3/0
26.) Maryland: 7/2/1/1
26.) Illinois: 6/2/1/0

All of which leads to a whole lot of confusion.  I'm not really sure what the rankings are measuring to begin with, but it sure seems odd that their specialty rankings would be so misaligned with their comprehensive rankings.  I'd bet that a lot of the Deans polled for the specialty rankings (and academics who think like they do) probably think the overall rankings are lacking in face validity.

Friday, February 21, 2014

Friday Notes

A few brief notes on some smaller stories this week:

-A lot of people seem to be discussing the new report released yesterday finding that students who didn't submit test scores to colleges performed virtually the same as students with similar GPAs who did.  Most people and most of the coverage (e.g. NPR, The Chronicle, and EdWeek) are interpreting this to mean that SAT/ACT scores don't predict performance in college beyond what we already know from high school grades.  But, as many have pointed out on Twitter, one glance at the tables on pages 47 (below) and 56 shows the error in this interpretation.  In reality, in a number of the groups examined, students with higher test scores earn higher grades in college and are far more likely to graduate than those with lower test scores and identical high school GPAs.  Somehow, though, students who don't submit test scores to test-optional schools do about as well as those who do despite having earned significantly lower scores.  What this says to me isn't that test scores don't matter but, rather, that non-submitters are savvy and that the act of choosing not to submit their test scores tells us something about their abilities and future chances of success.

-Another day, another wild-eyed report about how it's impossible to fire teachers.  Yawn.  Not only do I not buy that administrators can't fire a teacher if they really try, but I'd bet my life savings that at least 90% of teachers who are fired aren't officially "fired".  Looking for a dissertation topic?  Go figure out how Principals go about dismissing teachers . . . most non-educators don't understand how much soft power administrators wield and how persuasive they can be when they suggest that a certain teacher look elsewhere for work.

-A new Pew report illustrates why it worries me so much when people argue that X person/people shouldn't go to college: the widening gap in college v. non-college outcomes (and keep in mind that is despite dramatic increases in college enrollment over this time that have made colleges far less exclusive than they were decades ago).  And income isn't the only thing that's changed; a wide range of social outcomes have as well -- take a look at how the relationship between income and marriage has reversed in the second picture.

-Here's the story on a fascinating chart (below) that makes us all look foolish for debating whether we want our system to be more like Korea's or Finland's.

-Morgan Polikoff discusses some studies that found some positive impacts of standards-based reform.  On the one hand, I'm not sure how one could read the research and not conclude that the U.S. needs more coherent standards like the rest of the developed world.  On the other hand, I remain deeply skeptical that standards matter all that much.  They seem like step 1 in a 10,000 mile journey.

Next week, I'll offer some thoughts on the biggest changes I've seen in the education debate and the following week I'll start a multi-part series examining the ways in which urban poverty impacts students' performance in school . . .

Wednesday, February 19, 2014

Is It OK to be a Public Intellectual?

Nick Kristof's column the other day about the lack of interaction between Academia and the public sure ruffled some feathers.  That's probably partly due to the long list of provocative quotes in the column, including the following:

A basic challenge is that Ph.D. programs have fostered a culture that glorifies arcane unintelligibility while disdaining impact and audience. This culture of exclusivity is then transmitted to the next generation through the publish-or-perish tenure process. Rebels are too often crushed or driven away.

“Many academics frown on public pontificating as a frivolous distraction from real research,” said Will McCants, a Middle East specialist at the Brookings Institution. “This attitude affects tenure decisions. If the sine qua non for academic success is peer-reviewed publications, then academics who ‘waste their time’ writing for the masses will be penalized.”

All the disciplines have become more and more specialized and more and more quantitative, making them less and less accessible to the general public.

Political science Ph.D.’s often aren’t prepared to do real-world analysis.

Many academic disciplines also reduce their influence by neglecting political diversity. Sociology, for example, should be central to so many national issues, but it is so dominated by the left that it is instinctively dismissed by the right.

On the whole, I agree with Kristof.  Faculty aren't expected, encouraged, or rewarded for communicating with the public.  Which, I think, is a big problem (as you could probably guess given that I'm spending my time writing here).  I've written in the past about the personal experience I've had with this, and I continue to dislike the degree to which academics are discouraged from interacting with the public.

It's easy to go off on tangential arguments here about who should write what when and for how long, but let's not miss what I think is the largest problem here: the active discouragement.  There are plenty of good reasons that academics should strive to be "public intellectuals," but I don't think we should expect every Professor out there to spend oodles of time reaching out to the public, nor do I think it should be a tenure requirement to do so.  But I do think academics should be rewarded for doing so.  At the very least, it shouldn't be seen as a negative for one to use his/her time this way.  I cringe every time I hear somebody gasp at the time an academic is wasting writing for a popular audience.

One of the few counter-examples to this trend is Rick Hess's edu-scholar rankings, which seem to receive more attention each year -- this year I noticed press releases from quite a few schools touting the number of Professors in their college who'd made the list.  Overall, though, interaction with the public is still largely discouraged.

I do think the responses from academics were interesting, though.  These include Daniel Willingham, who writes that communicating applications of research isn't the job of most professors and should often be left to others with different skill sets; Corey Robin, who writes that quite a few Professors blog and write in the popular press and that many grad students aspire to do so; and Erik Voeten, who runs down a list of ways in which different Professors communicate with the public.

I think these are all fair points.  Not all faculty need to be out in the public sphere, and pure academic research certainly has value.  But, again, none address the degree to which faculty are actively discouraged from communicating with the public.  The fact that some people do so anyway doesn't change that fact.  Nor does the fact that many grad students want to communicate with the public, since the problem here isn't lack of desire but, rather, lack of opportunity.

And I think the argument that there are outlets within academia to communicate is rather shaky.  Voeten, for example, points to the journal Perspectives on Politics as the new vehicle for political scientists to communicate with the public (which Kristof omits), so I decided to check it out.  Here's an excerpt from the first abstract I read from the current issue:

In an effort to bring empirical clarity and epistemological standards to what has been a deeply-charged, partisan, and frequently anecdotal debate, we use multiple specialized regression approaches to examine factors associated with both the proposal and adoption of restrictive voter access legislation from 2006–2011 . . . Further, we situate these policies within developments in social welfare and criminal justice policy that collectively reduce electoral access among the socially marginalized.

Sorry, but that's academic-speak.  That is not how one communicates with the public.  People don't say "empirical clarity" or "multiple specialized regression approaches" or "situate these policies within" in everyday life.  So I remain unconvinced that many journals speak directly to the public.

Or maybe this just proves Willingham right: many Profs simply don't have the skills to communicate with the public.  I have to admit, though, that his argument just brings to mind the scene in Office Space where Tom Smykowski explains that the company needs his people skills to communicate between the engineers and the customers.

All kidding aside, though, I think the issue merits serious consideration by everybody involved in academia.  All can (I hope) agree that more research needs to be translated to practice, but this could happen in any number of ways.  Maybe academic journals should publish more readable (i.e. ~10 page jargon-free) essays for the public to read.  Maybe public outreach should count in tenure reviews.  Maybe some Professors should be classified as "public intellectuals" and have different expectations.  Or we could try any number of other ideas.  But I don't think that denying the problem exists will get us anywhere.

Ultimately, we need to find a way to make it okay for people to be public intellectuals if they wish to do so.

Tuesday, September 18, 2012

More Liberal Arts for the Least Affluent

I've disagreed with Peter Meyer multiple times in the past, both in posts (here, here, and here) and in comments on his blog posts (which I'm not going to take the time to dredge up).  So I think it's fair that I point out that he recently wrote what I think is an outstanding post last week on college attendance and poverty.  Also, digging up those old posts just made me realize I've been misspelling his name; my sincere apologies.

Anyway, Meyer makes a strong case regarding why, in an ideal world, we should want everybody to attend college -- and how obtaining a broad, liberal education particularly advantages the most disadvantaged.  Among other things, he points out that:

-exposure to new ideas, new institutions, and new styles of thinking is particularly beneficial for those who were exposed to the fewest of these in their childhood

-a college education opens more options for students compared to limitations placed on them by hyper-specific vocational training

-underemployed college grads still make far more than non-college grads in the same field (a college-educated dishwasher makes 83% more, for example)

-increasing college attainment hardly solves our problems, but not sending more kids to college creates more

While zillions of logistical hurdles stand in the way of all students procuring a top-notch liberal college education, Meyer concludes by arguing that:

I personally don’t care if a kid decides not to go to college. I would, however, demand that every high school graduate at least be capable of reading (and understanding) David Leonhardt’s story—i.e., your options are probably pretty constrained if you don’t go to college—and that every district superintendent be judged by the number of his or her truly college-ready graduates. If a student decides not to go to college, fine. But at least he or she would have, I would hope, the option of going if he or she wanted to—which is better, I would assume, than not having that option after twelve years of schooling.

I can only find two small points of contention in the post:

1.) the argument that teaching poor kids "a new kind of thinking -- reflection" is the key to getting them out of poverty is either inartfully expressed or demonstrates a lack of understanding.  I'm leaning toward the former, since he also wrote a pretty good piece explaining the genesis of that quote.  At first glance, it might look like Meyer is arguing that kids are poor because they think wrong.  I think, though I could be mistaken, that this was actually a way of saying that exposing kids to more culture, society, and ideas (e.g. plays, museums, concerts, lectures, etc.) will benefit those who previously had the least exposure.  Indeed, the program driven by this notion was the result of a suggestion from an impoverished prisoner who said that kids needed to get more involved with what was happening downtown in order to interact in new ways with government, society, etc.

2.) While I agree with Meyer that, ultimately, we shouldn't force every kid to get a high-quality college education -- that giving every kid both the option to obtain one and an understanding of how it would benefit them is the better policy goal -- I do hope that he personally does care which path any given student chooses.  Given that he argues that more students obtaining high-quality educations improves the lot of our entire society, I'd certainly hope he would then wish that all students chose to obtain that type of education.

Tuesday, January 31, 2012

Top 50 Endowments Per Pupil

I'm sure I missed a couple of schools (I'm human, I didn't include public schools, and there may be some ultra-small schools I overlooked), but these are all the schools I could find that are able to spend at least $10K per student per year (assuming a 5% endowment spending rate), based on glancing through the rankings and endowment data.

Rank | School | Endowment ($K) | Students | Endowment/Student | 5% Spend/Student | Type & USNWR Rank
1 | Princeton | $17,109,508 | 7,802 | $2,192,964 | $109,648 | Res 1
2 | Yale | $19,374,000 | 11,701 | $1,655,756 | $82,788 | Res 3
3 | Harvard | $31,728,080 | 19,627 | $1,616,553 | $80,828 | Res 1
4 | Pomona | $1,700,454 | 1,560 | $1,090,035 | $54,502 | LAC 4
5 | Swarthmore | $1,508,483 | 1,524 | $989,818 | $49,491 | LAC 3
6 | MIT | $9,712,628 | 10,566 | $919,234 | $45,962 | Res 5
7 | Amherst | $1,641,511 | 1,795 | $914,491 | $45,725 | LAC 2
8 | Grinnell | $1,500,219 | 1,655 | $906,477 | $45,324 | LAC 19
9 | Williams | $1,784,305 | 2,083 | $856,603 | $42,830 | LAC 1
10 | Stanford | $16,502,606 | 19,535 | $844,771 | $42,239 | Res 5
11 | CIT | $1,772,369 | 2,175 | $814,882 | $40,744 | Res 5
12 | Rice | $4,451,452 | 5,879 | $757,178 | $37,859 | Res 17
13 | Cooper Union | $607,135 | 910 | $667,181 | $33,359 | RCN 2
14 | Wellesley | $1,499,872 | 2,411 | $622,095 | $31,105 | LAC 6
15 | Berea | $978,735 | 1,613 | $606,779 | $30,339 | LAC 27
16 | Washington & Lee | $1,218,132 | 2,173 | $560,576 | $28,029 | LAC 12
17 | Dartmouth | $3,413,406 | 6,141 | $555,839 | $27,792 | Res 11
18 | Notre Dame | $6,259,598 | 11,992 | $521,981 | $26,099 | Res 19
19 | Richmond | $1,877,193 | 3,618 | $518,848 | $25,942 | LAC 27
20 | Chicago | $6,575,126 | 12,781 | $514,445 | $25,722 | Res 5
21 | Bowdoin | $904,215 | 1,762 | $513,175 | $25,659 | LAC 6
22 | Smith | $1,429,527 | 3,113 | $459,212 | $22,961 | LAC 19
23 | Claremont McKenna | $543,236 | 1,278 | $425,067 | $21,253 | LAC 9
24 | Emory | $5,400,367 | 13,381 | $403,585 | $20,179 | Res 20
25 | Trinity (TX) | $962,829 | 2,417 | $398,357 | $19,918 | RUW 1
26 | Duke | $5,747,377 | 14,983 | $383,593 | $19,180 | Res 10
27 | Bryn Mawr | $671,103 | 1,755 | $382,395 | $19,120 | LAC 25
28 | Wash U | $5,280,143 | 13,820 | $382,065 | $19,103 | Res 14
29 | Northwestern | $7,182,745 | 19,389 | $370,455 | $18,523 | Res 12
30 | Berry | $752,544 | 2,087 | $360,586 | $18,029 | LAC 121
31 | Middlebury | $907,668 | 2,532 | $358,479 | $17,924 | LAC 5
32 | Hamilton | $657,529 | 1,861 | $353,320 | $17,666 | LAC 17
33 | Columbia | $7,789,578 | 22,283 | $349,575 | $17,479 | Res 4
34 | Haverford | $402,730 | 1,177 | $342,167 | $17,108 | LAC 10
35 | Colby | $611,441 | 1,825 | $335,036 | $16,752 | LAC 21
36 | Vassar | $814,130 | 2,446 | $332,841 | $16,642 | LAC 14
37 | Penn | $6,582,029 | 19,842 | $331,722 | $16,586 | Res 5
38 | Carleton | $653,465 | 2,020 | $323,498 | $16,175 | LAC 6
39 | Macalester | $654,465 | 2,033 | $321,921 | $16,096 | LAC 25
40 | Harvey Mudd | $243,125 | 773 | $314,521 | $15,726 | LAC 18
41 | Davidson | $509,583 | 1,742 | $292,528 | $14,626 | LAC 11
42 | Denison | $654,584 | 2,275 | $287,729 | $14,386 | LAC 49
43 | Brown | $2,496,926 | 8,695 | $287,168 | $14,358 | Res 15
44 | Lafayette | $658,146 | 2,414 | $272,637 | $13,632 | LAC 40
45 | Vanderbilt | $3,414,514 | 12,714 | $268,563 | $13,428 | Res 17
46 | Mount Holyoke | $602,481 | 2,345 | $256,922 | $12,846 | LAC 29
47 | Cornell | $5,059,406 | 20,939 | $241,626 | $12,081 | Res 15
48 | Colgate | $693,436 | 2,903 | $238,869 | $11,943 | LAC 21
49 | Oberlin | $699,895 | 2,974 | $235,338 | $11,767 | LAC 24
50 | Holy Cross | $606,074 | 2,899 | $209,063 | $10,453 | LAC 29

There seem to be four tiers here:

1.) Three schools can spend over $80,000 per student in a given year
2.) Eight schools can spend between $40K and $55K per student
3.) 13 schools can spend between $20K and $40K per student
4.) 26 schools can spend between $10K and $20K per student

I was curious whether national research universities or liberal arts colleges have more resources per student but, outside the top three, they're pretty evenly mixed.
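For anyone who wants to check the arithmetic, here's a minimal sketch of how the per-student spending figures follow from the table, assuming (as the per-student numbers imply) that the endowment column is reported in thousands of dollars and the spending rate is 5%:

```python
# Endowment figures in thousands of dollars, paired with enrollment,
# copied from the first three rows of the table above.
schools = {
    "Princeton": (17_109_508, 7_802),
    "Yale": (19_374_000, 11_701),
    "Harvard": (31_728_080, 19_627),
}

def spend_per_student(endowment_thousands, enrollment, rate=0.05):
    """Annual endowment spending per student at the given payout rate."""
    return endowment_thousands * 1_000 * rate / enrollment

for name, (endowment, students) in schools.items():
    print(f"{name}: ${spend_per_student(endowment, students):,.0f}")
# Princeton: $109,648
# Yale: $82,788
# Harvard: $80,828
```

The printed figures match the "5% Spend/Student" column, which is reassuring evidence that the thousands-of-dollars reading of the endowment column is right.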