The newspaper says SAT scores took the biggest drop in 31 years. Why?
a. the high school class of 2006 is dimmer than its predecessors.
b. there’s something dumb about using a test like this to screen students for college.
While you won’t see that question on the next edition of the SAT, the evidence increasingly points to “b” as the correct answer.
The latest flap over SAT scores – coming a year after a massive scoring scandal – is another black mark for a private firm that makes millions of dollars pretending to serve a legitimate purpose.
Tuesday, The College Board, the company that produces and sells the test, announced that average reading and math scores fell across the country. The firm has its explanation: fewer students are taking the test twice, which usually results in higher scores.
But even that explanation, if valid, helps prove that the test serves little practical value in selecting promising students for college admission.
Say Dick and Jane take the SAT test on the same day and obtain exactly the same score. However, Dick’s family pays for an expensive test-taking class and the next time around his score jumps by 30 points, which the College Board says is typical for a second try.
Does that mean Dick is smarter or better prepared than Jane to attend college?
The answer is plainly “no,” and we need look no further than a well-documented, 22-year test that has been conducted at our own Bates College to confirm that answer.
In 1984, Bates made the SAT test optional for students applying for admission, and made all standardized tests optional for admission in 1990.
In the late 1970s, Bates began to suspect that many potential applicants were not applying because of their SAT scores. The school knew that some students it wanted to recruit often fared poorly on the SAT: minorities, first-generation immigrants and students from bilingual, rural or blue-collar families.
So, the school began concentrating on high school achievement and personal interviews in selecting students.
Then, at the end of five years, the school compared the records of those who had submitted SAT scores and those who had not. The grade point average of test submitters was only 5/100 of a point higher than non-submitters, or virtually identical.
In short, Bates found that the SAT didn’t do a good job of predicting future student achievement.
The crime here is that all but a couple of dozen colleges, plus all graduate, medical and law schools, still rely on standardized tests.
As a result, thousands of students each year take the test, look at their results and decide either that they are not college material or that they are unqualified for some of the best and most challenging schools in the country.
That’s a crime because it prevents some of our best, brightest and hardest-working young people from fulfilling their potential. Is the best test-taker always going to be the best family physician or the best lawyer?
Unfortunately, less easily quantifiable qualities like desire, work ethic and emotional intelligence are too often factored out of the selection equation.
Last year, The College Board had to reissue thousands of SAT results because of scoring errors. This year, students are left pondering a large one-year drop in their scores.
It’s about time schools nationwide follow the lead of Bates College and look at the whole student rather than simply the ability to take standardized tests.