Holding schools accountable for providing a quality education by setting standards and testing schools to see if they meet these standards is a major focus of U.S. educational reform. But most reform efforts target only the local public school systems, from kindergarten to high school.
As the cost of an undergraduate degree skyrockets, parents ought to wonder what they and their children are getting in return for their tuition payments. Adam Smith warned them long ago: "The discipline of colleges and universities is in general contrived, not for the benefit of the students, but for the interest, or more properly speaking, for the ease of masters."
U.S. News and World Report magazine pioneered a system of national rankings for colleges and universities, hotly pursued by Time and Newsweek. Now all three news weeklies have their annual special reports on colleges and the admissions process on the newsstands. They use statistics to evaluate institutions for academic reputation, enthusiasm of the student body and grads, and, most important to their target audience of parents, the degree of difficulty of admission.
SAT scores and high school class rankings indicate the degree of difficulty of getting into a university. The magazines also usually indicate how likely it is that students who get in will graduate. And alumni donation rates are a reasonable indication of how happy the graduates are with the education they received.
But none of these directly measure the learning going on in classrooms. All the magazines ignore the standardized tests that many undergraduates take to move on in their educational careers. In addition to the Graduate Record Exams, which are like the SAT I and II and amount to aptitude and achievement tests for grad school in the arts and sciences, there are the Law School Admission Test, the Medical College Admission Test, and the Graduate Management Admission Test for business school.
If public school reformers are right to insist on standardized tests to measure the academic achievement of students leaving high school, surely a proper measurement of colleges and college students would use these test scores. Unfortunately, a general resistance to accountability and a culture of secrecy permeate college and university administrations.
Information Please
A recent experimental effort to tally GRE and other test scores for a sample of notable universities was an almost complete failure.
The Educational Testing Service did provide a report on GRE scores for the University of Illinois, but quickly followed up with a belligerent "Ooops!" Refusing to provide further reports on other schools, Anne S. N. Gale, administrative director in the ETS president's office, wrote: "I regret to inform you that the University of Illinois report you received recently may have been sent to you in error. The staff member who processed your order form assumed that you were making the request in an official capacity on behalf of the university. If that is not the case, I must ask that you destroy the report you received."
Directly contacting Harvard, MIT, Yale, Princeton, Penn, Davidson, Washington University in St. Louis, Caltech, University of California at Berkeley, UCLA, Stanford, Illinois, Michigan, and Wisconsin was not any more productive. Most of them claimed not to have the report. Only the University of Wisconsin and Davidson College provided it.
Jed Marsh, associate provost at Princeton, shouted back in an email: "In accordance with standing university procedure we will NOT authorize the release of the '2000-2001 Undergraduate Institution Summary Statistics Report' for use in your study." Sarah Wood, executive assistant to the provost at Harvard University, confirmed that summary test results on the GRE, GMAT, LSAT, and MCAT were confidential and that Harvard would not disclose them, even if asked by a prospective student.
Fear of Disclosure
Ellen R. Julian, assistant vice president at the Association of American Medical Colleges and director of the Medical College Admission Test, explained that she was afraid scores would be misused: "Average MCAT scores...are dependent on many variables other than quality of education provided.... The seductive power of numbers is such that we fear rankings and inappropriate comparisons would result from the release of such information, putting pressure on schools and advisors to be less inclusive in their encouragement of potential applicants whose strengths are in the non-academic, but vitally important, arenas." She added that the test makers have a "societal obligation to prevent inappropriate use of test scores." Her sole consolation: "The test data belong to the universities. If they want to release them, they can."
They didn't. The University of Michigan's Freedom of Information Office went furthest, supplying heavily redacted copies of two pages from two official MCAT reports. Other schools gave nothing. Even a high school student who wants to become a doctor won't be given detailed information on how well students do on the MCAT at the undergraduate schools she's considering.
Tough luck, too, for high school students who might be thinking far ahead and aiming for an MBA. "We do not provide such information [i.e., institutional scores on the GMAT] to the public," said Dr. Susan Swayze, director of research studies at the Graduate Management Admission Council, which sends every school monthly, quarterly, and annual reports.
"What the schools do with those reports is their business. The GMAC places no restrictions on the data," she later added. No school offered to release information from these reports.
Budding lawyers are also denied information on how students at prospective undergraduate schools scored on their LSATs, even though, as the University of Southern California states on its website, "The LSAT is commonly considered to be the most important piece of the law school application and statistics indicate that it is very important." Only the University of Michigan's Freedom of Information Office supplied a summary report that included average LSAT scores for its undergraduate students.
Culture of Secrecy
Schools want test data to rank students, but they cringe when the same data is used to rank them.
This culture of secrecy has surrounded national college testing from its earliest days. The first national standardized test for undergraduate admissions was the Scholastic Aptitude Test (SAT), administered by ETS. Until 1957, SAT takers were never told their scores. Later ETS would show test takers their graded answer sheets, but only as the result of a "truth in testing" law passed in New York State in the 1970s.
The national magazines could rock this boat: Their representatives, along with some of the test makers, participate in the Common Data Set initiative, which was set up to reduce the reporting burden for colleges and universities. Robert Morris at U.S. News said that to his knowledge the issue of adding GRE, MCAT, LSAT, and GMAT data was never raised.
The test makers say they have no authority to honor the general public's requests for information on how well a college's or university's students do on these tests. Tom Rochon, executive director of the GRE at ETS, explained that ETS has a written agreement with universities and colleges to keep institutional data secret. "Aggregate data for an institution belong to an institution," he said. "Were the ETS to publish these data, universities might well stop requiring GREs."
In January the "No Child Left Behind Act," intended to improve U.S. education, was signed into law. It demands that America's K-12 schools focus on what each student accomplishes. States must test every student's progress and publish the results.
For this kind of reform to succeed, educators must be prepared to accept test results without hiding or redesigning them when they reveal persistent educational problems. Anything less than full transparency risks making the "No Child Left Behind Act" an expensive generator of meaningless success stories.
Colleges, the training ground of our educators, should be leading the way to full disclosure, not setting their current poor example of avoiding accountability and fearing what test results might reveal about their educational performance.