Of late - with the LARGE exception of COVID-19-related information - the buzz in my Twitter timeline has come almost wholly from the pro-test-optional community. I count myself among those who support that initiative.
(NOTE: here is where I must once again pause to remind you that my words are my words only. These thoughts do not necessarily represent those of my colleagues or my employer.)
So, late last fall, when I saw news that a large state university (and a member of the Big Ten, to boot) was going test-optional, it certainly grabbed my attention:
Students applying to any Indiana University campus may soon have the option of whether to include their scores from standardized tests like the SAT or ACT with their application materials.
During its December meeting, the IU Board of Trustees approved a change in policy allowing each IU campus the option to adopt a test-optional admissions policy. Faculty leadership from each campus will now have its chance to set its own policy.
Academic success at the college level depends on a range of factors, with the greatest importance placed on academic preparation. Research shows that, for many students, high school GPA provides the best prediction of academic success in college.
Okay, so the headline says that IU is "one step closer" to test-optional, with "faculty leadership from each campus to set its own policy." But if you look at IU Bloomington's admissions website (arguably the "flagship" campus), you get a BUNCH of info and FAQs about the test-optional policy:
Screenshot from admissions.iu.edu
And it's all effective for students applying for admission in 2021.
Conversely, it was recently revealed that the University of California system - despite mounting pressure to abolish the use of the ACT - would keep standardized exams as part of its admission requirements:
The Academic Senate of the University of California assembled a task force in 2018 to evaluate the system’s current use of standardized tests. On Monday that task force delivered a much-anticipated report listing several recommendations. Not among the recommendations? Tossing the tests.
While the authors considered what it might look like for the large public university system to go test optional and not require SAT or ACT scores in the admissions process, they ultimately declined to endorse that option.
For EM nerds, you can view/download the full STTF report here. It's a lot.
For me, the most interesting part of the story is the decision to keep tests in play despite earlier signs that the UC system would decide to abolish the requirement. Eric Hoover at The Chronicle of Higher Education breaks it down nicely.
On Monday a panel appointed by the University of California’s Academic Senate delivered a long-awaited report examining the system’s standardized-testing requirements. The bottom-line recommendation: The university, at least for the near future, should continue requiring the ACT or SAT for admission.
Critics of college-entrance exams had hoped that the report would recommend that the system stop requiring them, especially after some of the university’s most prominent leaders publicly questioned their value last year. Instead, the panel pumped the brakes, recommending that the university conduct further research on the possible effects of dropping the requirement.
My emphasis. There were many who thought the system would stop requiring tests for admission to the University of California altogether.
But, to my mind, the most striking part of the report is the assertion that high school grade point average is not as accurate a predictor of first-year collegiate GPA as standardized test scores are:
How well do UC’s current standardized testing practices assess entering high school students for UC readiness? How well do UC current standardized testing practices predict student success in the context of its comprehensive review process?
The STTF [Standardized Testing Task Force] found that standardized test scores aid in predicting important aspects of student success, including undergraduate grade point average (UGPA), retention, and completion. At UC, test scores are currently better predictors of first-year GPA than high school grade point average (HSGPA), and about as good at predicting first-year retention, UGPA, and graduation. For students within any given (HSGPA) band, higher standardized test scores correlate with a higher freshman UGPA, a higher graduation UGPA, and higher likelihood of graduating within either four years (for transfers) or seven years (for freshmen). Further, the amount of variance in student outcomes explained by test scores has increased since 2007, while variance explained by high school grades has decreased, although altogether does not exceed 26%. Test scores are predictive for all demographic groups and disciplines, even after controlling for HSGPA. In fact, test scores are better predictors of success for students who are Underrepresented Minority students (URMs), who are first-generation, or whose families are low-income: that is, test scores explain more of the variance in UGPA and completion rates for students in these groups. One consequence of dropping test scores would be increased reliance on HSGPA in admissions. The STTF found that California high schools vary greatly in grading standards, and that grade inflation is part of why the predictive power of HSGPA has decreased since the last UC study.
Again, my emphasis.
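To make the report's statistical language a bit more concrete: "variance explained" is just the R-squared of a regression, and the within-band claim is a correlation computed separately for slices of high school GPA. Here is a minimal Python sketch on entirely synthetic numbers (invented for illustration; not UC data, and nothing close to the STTF's actual analysis) showing what those two calculations look like:

    import numpy as np

    rng = np.random.default_rng(0)

    # Purely synthetic, illustrative cohort: these are NOT real admissions data.
    n = 5_000
    hs_gpa = np.clip(rng.normal(3.3, 0.4, n), 0.0, 4.0)            # high school GPA
    test_z = 0.6 * (hs_gpa - 3.3) / 0.4 + rng.normal(0.0, 0.8, n)  # test score, z-scaled
    # First-year college GPA driven by both predictors plus noise (assumed model).
    fy_gpa = np.clip(2.0 + 0.25 * hs_gpa + 0.20 * test_z
                     + rng.normal(0.0, 0.35, n), 0.0, 4.0)

    def variance_explained(x, y):
        """R-squared of a one-predictor linear fit: share of variance in y explained by x."""
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        return 1.0 - resid.var() / y.var()

    print("R^2, HSGPA alone:     ", round(variance_explained(hs_gpa, fy_gpa), 3))
    print("R^2, test score alone:", round(variance_explained(test_z, fy_gpa), 3))

    # "Within any given HSGPA band, higher test scores correlate with higher first-year GPA"
    for lo, hi in [(2.5, 3.0), (3.0, 3.5), (3.5, 4.0)]:
        band = (hs_gpa >= lo) & (hs_gpa < hi)
        r = np.corrcoef(test_z[band], fy_gpa[band])[0, 1]
        print(f"HSGPA band [{lo}, {hi}): corr(test, first-year GPA) = {r:.2f}")

With real data, which predictor "wins" is an empirical question; the point of the sketch is only to show the kind of comparison the STTF report is describing.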
I find this information striking because it flies in the face of years of research pointing the other way: how a student performs over four years in high school is the strongest predictor of how that student will perform at a post-secondary institution.
Indeed, there is a large body of evidence showing a positive correlation between high school GPA and SAT scores. When used in conjunction with high school performance, standardized exam scores (such as the ACT or SAT) add prognostic value to whatever method is being used to predict academic success in college.
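That "added prognostic value" has a standard statistical reading: fit one model on high school GPA alone, fit a second on HSGPA plus a test score, and compare how much additional variance in first-year GPA the second model explains (incremental R-squared). Here is a rough sketch of that comparison, again on purely synthetic numbers invented for illustration rather than any real study:

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic, illustrative cohort (assumed relationships, not measured ones).
    n = 5_000
    hs_gpa = np.clip(rng.normal(3.3, 0.4, n), 0.0, 4.0)
    test_z = 0.6 * (hs_gpa - 3.3) / 0.4 + rng.normal(0.0, 0.8, n)
    fy_gpa = np.clip(2.0 + 0.25 * hs_gpa + 0.20 * test_z
                     + rng.normal(0.0, 0.35, n), 0.0, 4.0)

    def r_squared(X, y):
        """Share of variance in y explained by an ordinary least-squares fit on the columns of X."""
        X = np.column_stack([np.ones(len(y)), X])   # prepend an intercept column
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()

    r2_hsgpa_only = r_squared(hs_gpa[:, None], fy_gpa)
    r2_with_test = r_squared(np.column_stack([hs_gpa, test_z]), fy_gpa)

    print(f"R^2, HSGPA only:           {r2_hsgpa_only:.3f}")
    print(f"R^2, HSGPA + test score:   {r2_with_test:.3f}")
    print(f"Incremental R^2 from test: {r2_with_test - r2_hsgpa_only:.3f}")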
But many studies show that high school GPA alone is the best single predictor of academic success (particularly in English and math coursework) for traditional-aged, first-time college students. Here's a great example from 2017:
High school grade point average was consistently predictive of college performance among recent high school graduates regardless of whether they were from rural or urban parts of Alaska. Although the students attended different high schools, their high school grade point average was similarly predictive. High school grades may be more predictive than standardized exam scores and consistently predictive regardless of high school urbanicity because they are a measure of cumulative performance over time and thus quantify other skills or competencies—beyond reading and math proficiency—that are necessary to succeed in college.
The full study is here.
Given the mounting academic evidence in favor of high school GPA/performance, I struggle to see how the UC system can assert that test scores outperform secondary school performance in predicting collegiate success. That argument simply defies logic (and most of the academic literature on the topic that I have seen).
Throw in all of the recent activity related to the novel coronavirus, and you've got quite a slew of institutions joining the growing list of colleges and universities that are going test-optional:
“These scores have always made up just a portion of our evaluation of prospective students, and we don't want our future applicants to feel hamstrung by circumstances far outside their control,” said Peter Shulman, associate professor of history and chair of Faculty Senate Committee on Undergraduate Education.
The change will be effective with those who apply to Case Western in the fall of 2021.
Also going test optional and citing the test cancellations were Concordia University Texas, Mansfield University of Pennsylvania and Westminster College, also of Pennsylvania.
Other colleges are shifting to test optional but not citing the current health crisis. Announcements in recent weeks include Chapman University, Hamline University, St. Bonaventure University and the University of Redlands.
...and we still have about six weeks to go until the end of the spring semester.
For me, no additional evidence is required to understand that how a given student performs over a four-year period in high school (i.e., a longer-term period) is going to be a very strong predictor of how they will perform over a four-year (or, if they are like me, a FIVE-year) period in higher education. The challenge for a large research institution like Penn State is to develop a way to assess an applicant using a multitude of factors in lieu of standardized exam scores. The aforementioned IU model certainly has my interest piqued, and I will be watching very closely as things continue to unfold.