By Chad Aldeman, Guest Blogger
The National Student Clearinghouse, which boasts of “near-census national coverage” of all college students in the country, has released a number of new reports in the last few weeks, most notably a large report yesterday on degree completions. Due to the breadth of its data—indeed, it has quietly become nearly a national student record database—it was able to rely on a sample of 1,878,484 students who began college in 2006. While the Clearinghouse deserves praise for the accuracy and completeness of its data, the headline story touted by both the Clearinghouse and Inside Higher Ed, that the data showed “that America is doing better on college completion than had previously been revealed,” relies on a misreading of the data. Here's why:
- While the data are richer, the outcomes aren't demonstrably better. For students who start at four-year public universities, after six years, the Clearinghouse found that 48.6 percent had completed a degree at the same institution, 8.7 percent had completed a bachelor’s degree at another four-year institution, and 3.2 percent had finished a two-year degree somewhere; the rest had not completed any degree. That's a total of 60.5 percent of students completing any degree, which is lower than the 64.8 percent that NCES found in 2009 for first-time, full-time students at public four-year institutions. The Clearinghouse added in part-time students who aren’t captured in the federal calculations, but the final numbers look about the same.
- In its graduation totals, the Clearinghouse counts students who complete a degree at an institution other than the one where they started. Those students should be removed from the denominator in graduation rate calculations, not added to the numerator. Let’s say you have the following information:
- 100 students start at Institution X
- 40 students complete a degree at Institution X
- 10 students complete a degree at Institution Y
Using the federal graduation rate, Institution X would get credit for 40 graduates out of 100 beginning students, for a total graduation rate of 40 percent. This calculation punishes schools with students who transfer out and fails to capture any students who transfer into the school. It’s not fully accurate, but it’s the best measure we’ve had for a while.
The Clearinghouse shows us another way that would give Institution X full credit for any student who completes a degree at any other school. In this scenario, the numerator is not 40 but 50, and the graduation rate is 50 percent. This makes sense if we’re trying to get a national perspective on higher education, and the latest Clearinghouse report does exactly that. (This method, or a variation of it, may also make sense for a community college with a transfer function where a student who transfers is truly a success, but not for institutions without those incentives).
The problem, however, is that the Clearinghouse already provides completion data on four-year institutions in precisely this misleading way. College Portraits, an attempt by the major higher education trade associations to address calls for transparency and accountability, uses Clearinghouse data to calculate 5- and 6-year “student success and progress rates.” California State-Bakersfield, for example, boosts its 6-year rate to 76.7 percent by adding the 7.1 percent of students who graduated somewhere else and the 15.6 percent who are still enrolled somewhere else. Instead of a little over half of its students completing, the institution suddenly looks 42 percent better!
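To see where that 42 percent figure comes from, here is the arithmetic behind the College Portraits numbers quoted above (a quick Python check; the variable names are mine, not College Portraits'):

```python
# Figures quoted above from the CSU Bakersfield College Portraits example
reported_rate = 76.7             # 6-year "student success and progress rate" (%)
graduated_elsewhere = 7.1        # added for graduating somewhere else (%)
still_enrolled_elsewhere = 15.6  # added while still enrolled somewhere else (%)

# Strip out the two add-ons to recover the underlying figure
remainder = reported_rate - graduated_elsewhere - still_enrolled_elsewhere
boost = reported_rate / remainder - 1

print(f"Without the add-ons: {remainder:.1f}%")   # 54.0% -- "a little over half"
print(f"Apparent improvement: {boost:.0%}")       # 42%
```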
The Cal State Bakersfield example is even worse than it sounds because it doesn’t account for reverse transfers. It treats every degree completion equally, no matter what kind of degree it is or where it was earned, just as it treats every transfer equally, no matter where the student goes. Recent research from Texas suggests that where students transfer to and graduate from matters for their likelihood of finding a job and how much they’re likely to earn.
A more honest way to do this would be to simply remove from Institution X’s denominator those students who successfully transfer to another institution. That way, Institution X would be credited with graduating 40 out of 90 of its students, for a 44 percent completion rate. This is an improvement on the calculation we use now, and it would be more honest than giving institutions credit for any student who ever enrolled there.
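The three approaches above can be sketched with Institution X’s toy numbers (a minimal Python illustration; the function names are mine):

```python
def federal_rate(starters, grads_here, grads_elsewhere):
    """Federal-style rate: only degrees earned at the starting institution count."""
    return grads_here / starters

def clearinghouse_rate(starters, grads_here, grads_elsewhere):
    """Clearinghouse-style rate: a degree earned anywhere counts for the starting institution."""
    return (grads_here + grads_elsewhere) / starters

def transfer_adjusted_rate(starters, grads_here, grads_elsewhere):
    """Proposed rate: remove successful transfers-out from the denominator instead."""
    return grads_here / (starters - grads_elsewhere)

# Institution X: 100 starters, 40 graduate there, 10 graduate at Institution Y
x = (100, 40, 10)
print(f"Federal: {federal_rate(*x):.0%}")                      # 40%
print(f"Clearinghouse: {clearinghouse_rate(*x):.0%}")          # 50%
print(f"Transfer-adjusted: {transfer_adjusted_rate(*x):.0%}")  # 44%
```

The transfer-adjusted version still rewards an institution whose students transfer and finish elsewhere (its denominator shrinks), but unlike the Clearinghouse method it never credits the institution with a degree it did not confer.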
- Should we really be giving credit to schools where students are still enrolled after six years? Again, it’s fine if we’re interested in this from a national perspective, but from an institutional standpoint, we’re creating some bad incentives. Going back to CSU Bakersfield, its “student success and progress rate” plummets from 76.7 to 41.8 percent if we hold the line at six years and only give credit to CSU Bakersfield students who complete a degree at CSU Bakersfield. Research on extended graduation rates suggests the gains from extending the timeframe are minimal.
The Clearinghouse should be applauded for its willingness to put its data to use for public purposes. With a data warehouse covering nearly every college student in the country, it has a rich dataset to mine. However, we should be careful to scrutinize how the Clearinghouse analyzes that data and make sure it’s telling an honest story.
Chad Aldeman is a senior policy analyst at Bellwether Education Partners, a nonprofit organization working to improve educational outcomes for low-income students. Previously, Aldeman was a policy advisor in the Office of Planning, Evaluation, and Policy Development at the U.S. Department of Education, where he worked on teacher policy and secondary schools. Prior to joining the Department, Aldeman was a policy analyst with Education Sector, where he focused on K–12 and higher education accountability.