Lists and surveys can fail to appraise a student’s personal needs and goals
Americans love to rank everything, from David Letterman’s Top Ten list to the best vacation spots.
College rankings are an especially popular fixture of our society, because they appear to offer simple answers to guide an increasingly complex, expensive and life-changing investment. As consumers, prospective students and parents have a right to know what they are paying for, and deserve the best information available.
Sparking national debate and an editorial in the Sun Journal (“In college choice, information and insight reign,” July 11), the Annapolis Group, whose 124 members include most of the top national liberal arts colleges, recently took on the issue of college rankings.
In a non-binding poll, a majority of college presidents attending the group’s annual meeting signaled their intent to stop cooperating with U.S. News & World Report in particular. Individual colleges will determine the level of participation appropriate for them.
Bates is using the time before the next U.S. News survey deadline to study the issues and alternatives more completely. Until now, we have participated with this and many other college guidebook requests because we have considered doing so a useful service to parents and prospective students.
What frustrates most college officials about the U.S. News rankings is their suspect methodology and oversimplifications. They know that what appears to be so precisely “counted” doesn’t necessarily reflect what really “counts” as academic quality, and they also worry about the influence of the rankings on institutional decision-making.
Many of the 14 factors used by U.S. News to reduce colleges to a single ranking score are not entirely comparable across institutions and offer only rough proxies for what they claim to measure. The percentage of alumni who donate to their colleges, for example, accounts for 5 percent of the total score; U.S. News says this reflects satisfaction with a college, although it may merely reflect the wealth of the student body, and recent articles have shown how easily giving rates may be manipulated.
Minor differences in ranking variables can also give the impression that a school’s quality can change dramatically in a single year, and they seem to magnify the apparent distances between schools of similar quality. Our research suggests, moreover, that up to 70 percent of the overall ranking score for the top 25 liberal arts colleges could be explained by using a single factor – instructional expenditures per student.
This may seem harmless until you ask a simple question: With college costs rising so rapidly, why should the rankings reward higher spending? Many college leaders feel constant pressure to focus on flawed ranking indicators rather than their primary mission – improving student learning.
Far from trying to restrict information, Annapolis Group members are working with other education associations to develop something better: a Web site with a common set of key statistics will soon be available, and it will provide links to other sites that let colleges offer more detail about student outcomes that aren’t easily quantified.
At Bates, we try to provide evidence for what really counts in college. We advise prospective students to remember that what they are looking for in a college experience can’t be reduced to a mere number or ratio – just as we remind them that in our admissions process, their talents won’t be measured by a single standardized test score.
We suggest they place far less emphasis on financial resources and the types of students a college enrolls, and more on actual student and alumni experiences and outcomes in college and beyond. Bates currently publishes far more information on its own Web pages than any of the guidebooks request (www.bates.edu/planning-analysis.xml).
Where statistics alone can’t reliably show how college may change a life, we let students and alumni tell their individual stories. We also encourage parents and students to visit several free interactive Web sites that allow them to use public data to prepare custom comparisons of colleges by whatever factors are most important to them.
Among the best are the U.S. Department of Education’s College Opportunities Online (nces.ed.gov/ipeds/cool/), the Education Trust’s College Results Online (collegeresults.org), and Economic Diversity of Colleges (www.economicdiversity.org).
As with any high-stakes investment, what matters above all is the best fit to personal needs, talents and goals – as financial analysts warn, “individual results may vary, and performance is not guaranteed.” Students must do their homework and make the same careful evaluations when selecting colleges that their professors will expect from them when they enroll.
Like it or not, there are no simple answers to complex questions.
James Fergerson, director of Institutional Planning and Analysis at Bates College, is a former member of the Association for Institutional Research Higher Education Data Policy Committee, which meets annually with U.S. News & World Report and other guidebooks.