At the Republican primary debate on November 10, 2015, presidential candidate Marco Rubio called for “more welders and less philosophers.” Fact-checkers quickly pointed out that this claim rested on comparing philosophers’ starting salaries with welders’ mid-career salaries, but the sentiment is far from unique to Rubio. Last year, President Barack Obama was pressured to apologize after he “promise[d] you folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree.”
These statements have brought a flood of attention to the question of whether attending college makes sense financially. At least for some students, it appears not: a recent report from investment bank Goldman Sachs showed that the time required to break even on a college degree is steadily rising. Goldman Sachs estimated that students enrolling in 2050 will need to work 15 years on average to break even on their tuition investment. Though the report does not appear to be publicly available, CNN writes that Goldman Sachs found students often lose money by attending schools “that rank in the bottom 25 percent of all universities.”
The bad news continues. Students are paying much more for education: net tuition, fees, and room and board nearly doubled for in-state students at public four-year universities (see chart) and increased 40 percent at private non-profit four-year universities—and that’s just in the last decade. With students paying so much more even after financial aid, we should expect universities in the United States to graduate higher-skilled workers. But these rising costs, often attributable to universities spending ever-growing sums on facilities, dormitories, and other perks to attract students, have not translated into better-prepared graduates, at least by some measures.
For example, the U.S. Department of Education (DOE) discovered in 2005 that college graduates’ scores on various measures of literacy declined substantially from 1992 to 2003. Further, only 25 percent of college graduates were “document literate.” That means that three in four college graduates did not have “the knowledge and skills needed to search, comprehend, and use information from noncontinuous texts in various formats.” Most of the 1,000+ university presidents that the Pew Research Center polled agreed that quality is falling; only a minority said that the U.S. higher education system is the best in the world, and even fewer said so when asked about the situation in 10 years.
Surprisingly, given rising costs and falling quality, the so-called “wage premium” between college graduates and high school graduates has continued to widen. That widening is not because college graduates’ wages have risen—they have fallen over the last few decades. Instead, workers with only high school diplomas have watched their wages plummet at an even greater rate. In 2013, the National Bureau of Economic Research found that “high-skilled workers have moved down the occupational ladder and have begun to perform jobs traditionally performed by lower-skilled workers…pushing low-skilled workers even further down the occupational ladder and, to some degree, out of the labor force altogether.”
Essentially, workers without college degrees must increasingly compete with college graduates for low-paying, non-technical jobs. As Nobel Prize-winning economist Paul Krugman pointed out, a “college graduate serving lattes at Starbucks is a cliché, but he reflects a very real situation.” The numbers back this up: the Federal Reserve found that underemployment among recent graduates—a measure of how many graduates hold jobs that do not utilize their skills—rose from 31 percent in 2001 to 44 percent in 2012.
Turning to economic theory provides an explanation. In education economics, there are two positions on why college pays off. The “human capital theory” posits that students are in fact gaining skills that allow them to produce more. By contrast, the “screening theory” argues that the returns to education are not due to increased productivity or skills—rather, the bachelor’s degree helps employers identify good applicants, since college graduates have demonstrated the discipline and drive to finish college.
While both human capital and screening theories are empirically supported, screening may be playing a bigger role. In what Burning Glass Technologies, a job analytics company, calls “upcredentialing,” companies have increasingly required bachelor’s degrees for a swath of jobs. For instance, “65 percent of postings for Executive Secretaries and Executive Assistants now call for a bachelor’s degree. Only 19 percent of those currently employed in these roles have a B.A.” Other such jobs, like entry-level IT help desk positions, require skills rarely taught in bachelor’s programs, Burning Glass notes. These jobs are also the ones that “serve as the first step on the career ladder to better jobs” for the two-thirds of the labor force who do not have bachelor’s degrees. Burning Glass argues for expanding industry certifications as a way to better match skills to jobs, but the striking point stands: as companies use college degrees more and more as a screening device, the value of a degree increasingly lies in avoiding the shrinking market for workers with less education.
It is important to realize, however, that a college education carries myriad benefits beyond the purely economic. As Victor Hugo put it, “he who opens a school door closes a prison,” since education decreases the propensity for crime: UC Berkeley economist Enrico Moretti estimates that each year of schooling reduces murder and assault rates by nearly 30 percent. Further, students make connections and friendships that can last a lifetime. Facebook found that 28 percent of marriages listed on the social network were between members who attended the same college. These are likely stronger marriages as well: the Centers for Disease Control and Prevention showed that far more first marriages survived to 20 years among those with bachelor’s degrees than among those with less education.
In addition, learning is valuable in itself, beyond the more tangible benefits. Washington University in St. Louis touts its goal of “discover[ing] and disseminat[ing] knowledge, and protect[ing] the freedom of inquiry through research, teaching, and learning.” There is little doubt that higher education for everyone is a worthy goal. It would produce citizens able and willing to engage in our democracy, who have the skills and drive to launch businesses and develop new ideas in an age of falling entrepreneurship where business closings outnumber business startups.
The problem is the high cost and time required—and not just the ticket price and rising debt graduates face. The opportunity cost of lost earnings is staggering, growing larger and larger each year that a student remains in school. The time required to graduate is rarely the four years that many expect. Even after 6 years, only 58 percent of “first-time full-time bachelor’s degree-seeking students at 4-year institutions” receive their degrees, according to the DOE. This graduation rate is even lower for many minorities and men, meaning that the cost of attending college is much higher on average for groups who are underrepresented in higher education.
What is the solution to this problem of rapidly rising costs and stagnating benefits of higher education? Some, such as Rubio and President Obama, argue that at the very least, the United States should encourage students to pursue less time-consuming and more in-demand careers by choosing higher-paying majors or attending vocational schools. Perhaps this is true. Certainly, if more college-bound high school seniors chose to pursue careers in information technology and security, engineering, and healthcare, college would be a good deal for them—but much of that benefit exists because those careers are currently in high demand, and thus graduates draw higher salaries. And if more high school graduates attended vocational school, they might fare better, both because vocational programs train for high-demand careers and because they are often shorter.
Unfortunately, these are only stopgap measures. Even if we presume that these options would work, they fail to address the underlying problems: technology pushing many (especially vocational) careers toward automation, and costs becoming unsustainable. Politicians, of course, favor easy, predictable solutions over more unpredictable but groundbreaking ones. Yet the same groundbreaking technology that led to falling wages for high school and college graduates alike has created new possibilities for changing the future of education and solving these problems.
As online education moves away from massive open online courses (“MOOCs”) toward courses and schools that lead to degrees at low costs, many more people will pursue this option as an alternative to four-year brick-and-mortar universities. Although physical colleges are not going anywhere, online education will provide an avenue for low-income and non-traditional students, as well as people who do not want to spend tens or hundreds of thousands of dollars on higher education, to pay less and learn more.
In a sign of what is to come, the Massachusetts Institute of Technology announced in October that its Supply Chain Management (SCM) program—often ranked #1 in the country—would begin allowing students to take a semester of classes online and earn a “MicroMaster’s.” Students who do well on an exam will be admitted to the full on-campus SCM program and receive credit for their online courses, allowing them to spend a single semester on campus to complete a master’s degree. MIT’s president, L. Rafael Reif, pledged that the inverted admission model would “democratize access.” For universities’ sake, let’s hope so: with student debt above $1.2 trillion and graduates struggling to find good jobs, students will not keep paying so much for college forever.