Evaluating Jock Majors and College Quality

Blog Post
Nov. 27, 2007

Yesterday, Higher Ed Watch unveiled its first "Academic Bowl Championship Series" poll, which ranked the current top teams in college football using academic instead of athletic indicators. In developing the Academic BCS poll, we took advantage of all of the data that is publicly available on college athletes' academic performance: graduation rates and the NCAA's "Academic Progress Rates."

That's right, there are only two academic data points available for college athletes (with graduation rates disaggregated by race). To see all of the data we used for our Academic BCS formula, click here.
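To give a concrete sense of how little there is to work with, here is a minimal sketch of how two such data points might be blended into a single academic score. To be clear, this is an illustration only: the 50/50 weights, the scaling, and the APR values below are assumptions made up for this example, not Higher Ed Watch's actual Academic BCS formula (the graduation rates are the ones quoted later in this post).

```python
# Hypothetical sketch: blending the only two public academic data points
# (six-year graduation rate, 0-100; NCAA APR, 0-1000) into one score.
# Weights and APR figures are illustrative assumptions, not the real formula.

def academic_score(grad_rate, apr, grad_weight=0.5, apr_weight=0.5):
    """Return a 0-100 composite of graduation rate and APR."""
    apr_scaled = apr / 10.0  # put the 0-1000 APR on the same 0-100 scale
    return grad_weight * grad_rate + apr_weight * apr_scaled

teams = {
    "West Virginia": {"grad_rate": 56, "apr": 940},  # grad rates from this post;
    "Oklahoma":      {"grad_rate": 37, "apr": 925},  # APR values are placeholders
}

# Rank teams by the composite, highest first
ranked = sorted(teams, key=lambda t: academic_score(**teams[t]), reverse=True)
for team in ranked:
    print(team, round(academic_score(**teams[team]), 1))
```

Even a toy ranking like this makes the contrast with the athletic side obvious: two inputs, one weighted average, and that is the whole universe of available data.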

The actual BCS poll, part of which is determined by complicated computer formulas, has a myriad of athletic statistics to choose from: strength of schedule, conference strength, home record, road record, opponents' rank, points scored, points allowed, and the list goes on.

The scarcity of academic performance data extends to all students in higher education, not just college football players. While we can count the number of football players and students overall who leave campus with a degree, at present there are few effective ways to assess the quality of that degree.

Take, for example, West Virginia University, currently at the top of the actual BCS poll and a contender for the National Championship. Fifty-six percent of its football players who entered the school between 1997 and 2000 graduated within six years of initial enrollment. Does that mean that West Virginia is doing a better job educating its football team than Oklahoma, which has a much lower graduation rate of 37 percent?

Not necessarily. It could mean that West Virginia is lowering its standards for football players and encouraging them to enter "jock majors," which provide the easiest route toward a degree, with the least time in the classroom and library, whereas Oklahoma isn't. While this likely isn't the case (given that jock majors are common on almost all big-time sports campuses), it raises the multibillion-dollar question that no one in higher education wants to address, a question with no simple answers: what can we do to compare the quality of degrees across majors and schools?

People often joke about certain majors being less rigorous than others. When you look at the list of majors of the top quarterbacks in the country, it's easy to think that Matt Flynn of LSU with his "General Studies" major and Taylor Tharp of Boise State with his "Communication" major are not doing as much work as, say, Chase Daniel of Missouri with his "Business Finance" major. But we have no way to judge that.

Of course, this extends to the other students enrolled in these majors as well. This is a problem for all consumers of higher education. When you decide where to spend your tuition money, there are few signals of quality available.

This isn't to say that graduation rates aren't a helpful barometer. But they can't be the only barometer. There are several surveys out there that are trying to quantify the learning that takes place on college campuses, such as the National Survey of Student Engagement (NSSE). But most schools don't want to make the results of these surveys publicly available (case in point: even after USA Today offered to put the NSSE results in an easy data tool on its website, most schools, including every school ranked in the current BCS poll, declined).

There's no question that it's incredibly hard to evaluate student learning outcomes in higher education. But there is information out there, and more helpful surveys and collegiate learning tests are being produced. Unfortunately, colleges often shield the results. (There are some efforts to make more higher education data publicly available, which we'll discuss in more depth in a future blog post.)

The first step toward evaluating and improving college quality is transparency. We need more data on college athletes besides graduation rates and the NCAA's APR (with its very low standards for achieving "academic progress"). We could use, for example, an accounting of the courses athletes take, along with statistics such as GPAs and course requirements. Or a better measure of academic progress that includes a minimum GPA and puts more emphasis on the actual completion of courses rather than simply having enrolled (currently, the APR formula puts equal weight on each); a simplified illustration of the difference follows.
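To make the weighting point concrete, here is a simplified sketch of an APR-style calculation, contrasting the equal-weight approach described above with a variant that counts completion double and enforces a GPA floor. The point values, the two-to-one weighting, and the 2.0 cutoff are illustrative assumptions for this post, not the NCAA's actual formula.

```python
# Simplified, hypothetical APR-style scoring. Not the NCAA's real formula:
# point values, weights, and the 2.0 GPA floor are assumptions for illustration.

def apr_equal_weight(athletes):
    """One point for enrolling/returning, one for completing coursework,
    weighted equally -- roughly the scheme criticized above."""
    earned = sum(a["enrolled"] + a["completed"] for a in athletes)
    possible = 2 * len(athletes)
    return 1000 * earned / possible  # APR is reported on a 0-1000 scale

def apr_completion_weighted(athletes, min_gpa=2.0):
    """A variant: completion counts double, and the completion points
    require meeting a minimum GPA."""
    earned = 0
    for a in athletes:
        earned += a["enrolled"]            # 1 point for enrolling/returning
        if a["completed"] and a["gpa"] >= min_gpa:
            earned += 2                    # 2 points for completion above the GPA floor
    possible = 3 * len(athletes)
    return 1000 * earned / possible

roster = [  # made-up athletes
    {"enrolled": 1, "completed": 1, "gpa": 3.1},
    {"enrolled": 1, "completed": 0, "gpa": 1.4},
    {"enrolled": 1, "completed": 1, "gpa": 1.8},  # completed, but below the floor
]
print(round(apr_equal_weight(roster)))         # 833
print(round(apr_completion_weighted(roster)))  # 556
```

The same roster scores 833 under equal weighting but 556 once completion is weighted more heavily and tied to a GPA minimum, which is exactly why the choice of formula matters.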

Without better data and more transparency, athletes and colleges can continue to game the system, and we'll never really know whether athletes who graduate are better educated than those who don't.