75% of New Jobs Require a Degree While Only 40% of Potential Applicants Have One
LAX and SoFi Stadium hold a hiring fair to fill more than 5,000 positions in airlines, concessions, retail, administration and more, in Inglewood, California, on September 9, 2021.
CHRISTINA HOUSE / LOS ANGELES TIMES VIA GETTY IMAGES
BY David Trend, Truthout
PUBLISHED August 7, 2022
In recent years, amid college admissions scams and mounting student debt, a new debate has emerged around higher education. An increasing number of people are questioning the “paper ceiling” — the barrier facing skilled job seekers who lack a bachelor’s degree. The education press is calling this an ontological threat, in that it questions the existence and value of college itself while accusing the system of perpetuating multiple forms of inequity. Of course, higher education has often found itself a political football in the past. What makes this moment different is the critique of functions universities have typically seen as their strengths: providing skills for employment and meaning for life.
Everyone knows it’s been a tough few years for higher education. With enrollments dropping during the pandemic at a pace not seen for half a century, concurrent changes in the U.S. workplace have rendered college degrees unnecessary for a growing number of high-wage jobs. Yet many employers require four-year credentials anyway, in what some observers see as an antiquated habit and a cover for discrimination.
The numbers are deceptively simple: 75 percent of new jobs insist on a bachelor’s degree, while only 40 percent of potential applicants have one. According to the advocacy group Opportunity@Work, employers mistakenly equate college completion with work aptitude, while disregarding self-acquired knowledge or nonacademic experience. The group asserts that the nation’s undervalued workforce “has developed valuable skills through community college, certificate programs, military service, or on-the-job learning, rather than through a bachelor’s degree. Workers with experience, skills, and diverse perspectives are held back by a silent barrier.” As a consequence, more than 50 percent of the U.S.’s skilled workforce has been underemployed and underpaid.
More concerning still is that such discrimination is unevenly distributed. Within a 70-million-worker cohort of what are termed STAR (Skilled Through Alternative Routes) employees — those who don’t have a four-year degree — one finds 61 percent of Black workers, 55 percent of Latinos and 61 percent of veterans.
Academia has not ignored these issues. Schools know full well that students want jobs. Industry partnerships for job preparation, not to mention research and “innovation” programs, are common, especially in science, engineering, technology and business. Equity and diversity programs likewise have become more robust, particularly in recent years. But the quality and quantity of these efforts vary from school to school. Unsurprisingly, the educational establishment uses its shortcomings to argue for more money and capacity. Pushing student loans and tuition discounts to boost enrollments, universities often cite statistics showing that their graduates earn up to $1 million more over a lifetime than those without a degree. Many also assert the role of college in providing intellectual development, critical awareness and socialization.
But public opinion isn’t so sure, with Pew Research finding that only 16 percent of Americans believe college does a good job of preparing students for well-paying careers in today’s economy. Certain segments of the population have always harbored anti-intellectualism and resentment toward academic elites — but now even graduates themselves express doubts, with only half saying their degrees helped them find work or do their jobs. The general public is divided on what higher education should do, according to a recent survey from the Association of American Colleges and Universities, which found that 75 percent of wealthy and college-educated Americans believe a college degree is “definitely” or “probably” worth it, while only half of adults without a college degree or making less than $50,000 a year hold the same opinion. Pushback from inside the university enters this debate as well, with some faculty resenting the anti-intellectualism and vocationalism of the “corporate university.” Meanwhile, right-wing groups like Turning Point USA advocate defunding universities and prosecuting them for fraud.
Amid these disagreements, a growing bipartisan movement now is recognizing that the U.S.’s fixation with bachelor’s degrees ignores the many well-paid skills that can be acquired without going to college, not to mention the pace at which technology is creating more such jobs. Meanwhile, self-doubt is cropping up in publications like the respected Chronicle of Higher Education, which recently ran a piece asking, “If you don’t need a bachelor’s degree to get a good job, what does that do to the value of college?”
But it isn’t only job applicants who miss out. The entire economy suffers from narrow approaches to career preparation. The rising role of technology has meant that 37 percent of skills in highest demand have changed since 2016, according to data collected by the Burning Glass Institute (BGI) on over 15 million jobs. One in five jobs (22 percent) required at least one skill that was totally new, with positions changing rapidly in areas like finance, design, media, management, human relations and IT. Meanwhile, press accounts abound about jobs going unfilled in critical fields because employers can’t find “qualified” applicants. Labor reporter Eleanor Mueller asserts that “the U.S. spends far less on worker development than most other wealthy nations, which has made it difficult for its workforce and supply chain to meet current challenges.” According to Andy Van Kleunen of the nonprofit National Skills Coalition, the nation’s workforce strategy has declined because it relied too much on degree holders, when it “should be investing in all layers of our workforce.”
Little wonder that students are opting out of college at record rates. In a tight economy, only the wealthy can afford an education that promises no job at the end. The economic class division of higher education certainly isn’t lost on the 68 percent of college students who must borrow to pay for school, the majority of whom will spend decades of their lives paying off $1.7 trillion in tuition debt — while deferring things like buying first houses (and often being denied home loans when they do try to buy because their student debt is too high) or starting families. Conditioned to see college as a requirement for the “American Dream,” many find themselves stuck with a flawed education that is increasingly overpriced, loaded with unnecessary frills, and punitive to anyone unfamiliar with its rules and culture.
Compounding this problem is the inequity running rampant inside colleges and universities in ways only recently coming to light. To make up for lagging enrollment numbers, schools increasingly base admissions decisions on willingness to pay (or borrow) over metrics that predict academic success and graduation. Once students get to college, they often find themselves struggling in poorly taught courses staffed by underpaid and overworked part-time faculty. A growing literature now documents the once-untold story of campus cost-cutting, especially as it shortchanges students with learning differences, special needs or limited experience with college life. All of this contributes to the rapid rise of student stress, academic failure and drop-out rates.
This new crisis in higher education is hardly a secret, yet it mostly gets viewed through such symptoms as rising tuitions, budget cuts, student anxiety and unemployment. Because of this, it’s not the type of problem resolvable with a single fix. Doing away with college certainly isn’t the answer at a time when all young people need the critical skills to make smart decisions for themselves and each other as workers, consumers and citizens. Pretending the problem will just go away won’t work either. At this point, nearly everyone involved agrees that much needs changing in the way higher education is conceived and how it operates. Groups like the Campaign for Free College Tuition find overwhelming public support for cost-free community colleges and state universities, as pressure continues to build in Washington for a broad-based federal program of student loan forgiveness.
Nascent movements like Critical University Studies — which examines higher education in a social context — seem one place to start. Writing an early article on the subject, Jeffrey J. Williams argued that any true reform of higher education must involve the many stakeholders inside and outside of institutions: students, professors, union organizers, business leaders. A growing body of helpful research is now becoming available; for instance, the online “Critical University Studies Resources” from Northwestern University’s Program in Critical Theory lists current articles on the topic. Duke University Press offers a “Critical University Studies Syllabus” with links to online readings, largely free of charge. Both Palgrave Macmillan and Johns Hopkins University Press now have book series in university studies as well. Key in all of this is the need to begin a conversation, especially within an academic culture that all too often has seen itself as exempt from the practicalities of the world around it. Unless something changes soon, higher education’s existential crisis may become very real indeed.
David Trend is a professor at the University of California, Irvine, and the former editor of the journals Socialist Review and Afterimage. A Getty Scholar, he is the author of books including Anxious Creativity (2020), Elsewhere in America (2016), Worlding (2012), The End of Reading (2010), A Culture Divided (2009), Everyday Culture (2007), The Myth of Media Violence (Blackwell, 2007), Reading Digital Culture (Blackwell, 2001), Cultural Democracy (1997), Radical Democracy (1996), and Cultural Pedagogy (1992).