Monday, May 29, 2006

Marquette Manipulates Reported Freshman SAT & ACT Scores

Marquette University has begun to manipulate the entrance examination scores it reports for entering freshmen. Beginning with the freshmen who entered in the fall of 2005, the University adopted a “new methodology” to inflate the SAT and ACT scores reported to U.S. News and World Report, to the federal government, and to other agencies.

The change appears to be driven by a desire to improve Marquette’s standing in the U.S. News and World Report rankings.

Analysis of Marquette’s 2004 Freshman Profile and 2005 Freshman Profile shows the extent of the manipulation. For example, it’s possible to compare the old methodology (a straightforward reporting of scores) with the “new methodology” for the years 2000-2004.

First, for the American College Testing program (ACT):
  2000: Old 25.2, New 25.7
  2001: Old 25.2, New 25.5
  2002: Old 25.4, New 25.7
  2003: Old 25.3, New 25.6
  2004: Old 26.0, New 26.4
If the level of inflation here doesn’t seem very large, the results for SAT scores (verbal plus math) are more dramatic.

  2000: Old 1158, New 1192
  2001: Old 1151, New 1172
  2002: Old 1169, New 1200
  2003: Old 1166, New 1200
  2004: Old 1177, New 1218
How is this statistical legerdemain achieved? Daniel Gemoll, Marquette’s Director of Institutional Research, explained it to us.

In the first place, a substantial number of students submit both the ACT and the SAT with their admissions application. (24.8% did in the 2004 entering class.) It’s possible to convert an SAT score into an “equivalent” ACT score, using data supplied by the publishers of the SAT. If a student submitted both an ACT score and an SAT score, one of the two scores will be relatively better than the other.

With the 2004 and earlier freshman profiles, both the ACT and the SAT scores of such students were included in the reported averages. Beginning with 2005, only the test on which the student did relatively better was included. Thus if a student scored 30 on the ACT and 1,100 on the SAT, the SAT score (which was relatively worse) was discarded.
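
To make the mechanics concrete, here is a minimal Python sketch of the “keep the relatively better test” step. The concordance values and function names are our own illustrative assumptions; the real conversion uses the concordance tables published by the testing companies.

    # Illustrative sketch only: the concordance values below are
    # invented for the example; the real conversion uses published
    # SAT-to-ACT concordance tables.
    SAT_TO_ACT = {800: 17, 1000: 21, 1100: 24, 1200: 27, 1400: 32, 1600: 36}

    def sat_to_act_equivalent(sat_total):
        """Map a combined SAT score to an 'equivalent' ACT score
        via the nearest entry in the (illustrative) table."""
        nearest = min(SAT_TO_ACT, key=lambda s: abs(s - sat_total))
        return SAT_TO_ACT[nearest]

    def reported_test(act, sat):
        """Keep only the test on which the student did relatively
        better; the other score drops out of the reported averages."""
        if act is None:
            return ("SAT", sat)
        if sat is None:
            return ("ACT", act)
        if act >= sat_to_act_equivalent(sat):
            return ("ACT", act)
        return ("SAT", sat)

    # The example from the text: an ACT of 30 beats an SAT of 1,100
    # (roughly a 24 equivalent), so the SAT score is discarded.
    print(reported_test(30, 1100))  # ('ACT', 30)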

But that’s only half of it.

The SAT consists of two subtests (verbal and math). The “SAT score” reported by the College Board (which administers the test) simply sums the two scores.

But a fair number of students have taken the SAT more than once. This provides Marquette with some room for finagling. If the student did better in math the first time he or she took the test, and better on the verbal portion the second time around, Marquette takes the math score from the first test and the verbal score from the second test and combines them to produce an “SAT score” that doesn’t correspond to any SAT score the student ever actually achieved!
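
In code, the combination amounts to taking the best verbal score and the best math score across all sittings and summing them. A minimal sketch (the data layout is our own assumption):

    def superscore_sat(sittings):
        """Each sitting is a (verbal, math) pair. Take the best verbal
        and the best math across sittings, even from different dates."""
        best_verbal = max(verbal for verbal, math in sittings)
        best_math = max(math for verbal, math in sittings)
        return best_verbal + best_math

    # 600V/550M on the first sitting, 560V/620M on the second: reported
    # as 600 + 620 = 1220, though the best total the student actually
    # achieved on any single date was 1180.
    print(superscore_sat([(600, 550), (560, 620)]))  # 1220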

The ACT consists of four subtests (science, reading, math, and English), and Marquette does exactly the same thing with these. If a student took the ACT twice, Marquette might produce a “score” that is the average of the science and math scores from the first try and the reading and English scores from the second.

And again, if the student also took the SAT, this pseudo-ACT may be entirely thrown out if the student did better on the SAT.
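
Concretely, the ACT version looks like this, subtest by subtest, under the same assumptions as the sketches above (the ACT composite is the average of the four subtest scores, with halves rounding up):

    SUBTESTS = ("english", "math", "reading", "science")

    def superscore_act(sittings):
        """Each sitting is a dict of the four subtest scores. Average
        the best score on each subtest across all the sittings."""
        best = [max(sitting[sub] for sitting in sittings) for sub in SUBTESTS]
        return int(sum(best) / len(best) + 0.5)  # halves round upward

    # Strong math/science on the first try, strong English/reading on
    # the second: composites of 27 and 28 become a "composite" of 29.
    first = {"english": 24, "math": 30, "reading": 25, "science": 29}
    second = {"english": 28, "math": 27, "reading": 29, "science": 26}
    print(superscore_act([first, second]))  # 29

This superscored composite can then be run through the “better test” comparison sketched earlier and thrown out entirely if the student’s SAT looks relatively stronger.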

And What is Wrong With That?

If this sounds like cooking the numbers, that’s because it is. In fact, it’s about as close to just making up data as you can get without actually making up data.

While every SAT or ACT score included in the reported averages reflects actual student performance, the sample of that performance is skewed. Students will, merely through the luck of the draw, do better on one subtest than another, or do better on the day they take the ACT than on the day they take the SAT.

So the procedure is a bit like Major League Baseball calculating batting averages while excluding the games in which a player got no hits.
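
The arithmetic of the analogy, with hypothetical numbers, shows how much such selective exclusion can flatter an average:

    # Hypothetical numbers: 10 hits in 40 at-bats is a .250 average,
    # but dropping the five hitless games (15 at-bats, say) inflates it.
    print(10 / 40)         # 0.250 -- the honest batting average
    print(10 / (40 - 15))  # 0.400 -- "excluding the bad games"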

Gemoll explains that Marquette did an informal survey and found that “several” other colleges use this method of reporting scores. Indeed, it is sometimes called the “Notre Dame method” since that Indiana school supposedly pioneered it. Gemoll could not recall the names of other schools using the method.

As of this writing, two phone calls to the Office of Institutional Research at Notre Dame have not been returned.

We did manage to contact a knowledgeable source at Boston College, who denied any such finagling with scores. The source did say that if a student took the SAT twice, the better score is used and the other discarded, consistent with the school’s admissions practice. (The ACT is not a factor at this East Coast college.) Our source insists there is no fiddling with subtest scores.

We also contacted a knowledgeable source at Santa Clara University (a Jesuit school) who denied any such fiddling with the scores.

Richard Hurst, Director of Institutional Research at Loyola of Chicago, said “we don’t do it that way,” explaining that “the grief is not worth it” and “it’s not terribly beneficial, so why do it?” Hurst said “that kind of chipping away doesn’t really improve your scores.”

Finally, we found one highly regarded Jesuit university, which we are not at liberty to identify, that does exactly what Marquette does with the subtest scores, although this institution does report both ACT and SAT scores for all students who took both tests.

Not Surreptitious

Marquette does acknowledge, but in a rather opaque manner, that it is manipulating the data it reports. For example, the 2005 Freshman Profile contains the following disclaimer on the Contents page:
Note: After extensive analysis, a new methodology was adopted this year for reporting ACT and SAT test scores to the federal government and external agencies, such as U.S. News and World Report. The changes made are consistent with procedures for calculating and reporting test score results used by many other universities and result in higher reported test score averages.
A similar disclaimer goes out whenever scores are reported: to U.S. News and World Report, to the federal government, or to other agencies.

The problem, however, is that the disclaimer quickly gets lost as everything but the reported ACT and SAT averages gets stripped away. Further, nobody to whom these scores are reported knows how to “correct” them, and thus they have to be used pretty much as reported.

The Broader Issue

Regardless of how common this particular fiddling with test scores is, it’s clear that there are many ways of reporting data, and that manipulation is widespread.

Schools can, for example, report scores based on the “census” or on “final enrollment,” whichever looks better.

We have an unconfirmed (but believed to be reliable) report of one school that has a two-step application process. The “second step” is the actual full application with the application fee, while the first is nothing more than a “glorified inquiry.” The school treats all the “first step” inquiries as “applications” so they can report a lower “acceptance rate” and appear to be more “selective.”

Some schools no longer require SAT scores. That, however, doesn’t prevent some students from sending them. Since students who send SAT scores tend to be particularly strong, the institution gets to report “high SAT scores” that overstate its selectivity. This, in some cases, may be an incentive not to require SAT scores.

Further, it is not at all uncommon for schools to exclude certain groups when they report admissions data: scholarship athletes, programs for the “underprivileged” and so on.

Conclusion

The irony of all this is that it may in fact be counterproductive. U.S. News and World Report uses not only freshman test scores in its college rankings but also graduation rates. And when it weighs graduation rates, it takes freshman test scores into account: the higher the freshman test scores, the higher the graduation rate it expects.

So reporting higher freshman test scores has the effect of “raising the bar,” making Marquette’s graduation rate (which is quite good) appear worse than it otherwise would.
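
A toy calculation illustrates the point. The linear model below is an invented stand-in (the actual U.S. News formula is not public); it shows only the direction of the effect, using the 2003 old and new SAT averages from the table above:

    # Invented stand-in for the rankings model: suppose it expects two
    # extra points of graduation rate for every 50 points of average SAT.
    def expected_grad_rate(sat_avg):
        return 60 + (sat_avg - 1000) * (2 / 50)

    actual_rate = 80.0
    # With honestly reported scores, the school beats expectations by more:
    print(actual_rate - expected_grad_rate(1166))  # about +13.4
    # With inflated scores, the same graduation rate looks less impressive:
    print(actual_rate - expected_grad_rate(1200))  # +12.0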

In an academic world of intensely image-conscious universities that incessantly spin the information they release, Marquette hardly stands out on this issue as uniquely sleazy.

Still . . . it has a rather bad odor. Is it too much to expect a supposedly Christian university to tell the truth in the most straightforward and forthright way?
