The brand of higher education seems to be caught between two extremes.
On one side you’ve got the seemingly non-stop bad press about increasing student loan debt, the uninspiring employment rate for recent graduates, and ever-increasing tuition costs. On the other side are an equal number of experts telling us that going to college is still worth it and that college graduates still, on average, make more money than non-graduates. In fact, according to some, not enough people are going to college: it should be universal!
The idea that everyone should go to college is a strong and tenacious meme in our culture and is firmly embedded as part of the mythical “American Dream.” A 2012 poll by the Pew Research Center found that a staggering 94% of parents of children 17 years old or younger expect their kids to go to college. Considering the polarization of opinion in the US, it’s probably safe to say it’s one of the few things Americans agree on.
But while parents might agree that their kids should go to college, they seem to be unsure of why. In a June 2015 poll by the Robert Morris University Polling Institute, fewer than half of the parents surveyed were satisfied that schools were paying attention to current job market trends in order to appropriately prepare students for life after graduation. Even worse, the Pew Research Center found that 57% of Americans felt that higher education “fails to provide students with good value for the money they and their families spend.” And 75% felt that college has become too expensive for most Americans to afford.
So if parents are so unhappy, why do they want to send their kids to school? Pew found that Americans are split fairly evenly on the topic, with 46% believing that college is about job preparation and 39% believing it’s about personal and intellectual growth. Interestingly enough, those who had been to college were more likely to emphasize personal growth, while those who hadn’t were more likely to see college as something akin to vocational training.
But it’s not just parents who are confused about the reason for higher education…apparently college presidents are, too. As part of the same study, Pew surveyed over 1,000 of them and found that they were just as evenly split as the public: 50% of the presidents surveyed were committed to the idea that college is about personal and intellectual growth, while 47% held the belief that the mission of higher education is to prepare students for the working world. If the people who lead institutions of higher learning are confused about why college should exist, is it any surprise that parents would be, too?
Whether you’re on the inside or the outside, it’s pretty clear that “college” as a brand isn’t what it used to be. Parents don’t think it’s worth the money (and the inevitable crushing debt), and college presidents can’t agree on why parents should be sending their kids in the first place. A Google News search for “Is college worth it?” conducted while writing this essay turned up over 27 million results. If any other industry were eliciting this much confusion and this many questions, nobody would deny that the brand is having problems.
For higher ed marketers, this kind of confusion isn’t a trivial problem. Differentiating one’s institution from the ever-growing host of competitors in the age of the Internet is hard enough without also fighting an uphill battle against a brand falling out of favor with prospective students (and their parents) who aren’t even sure why they should be buying what you’re selling in the first place.
Making headway against these forces is tough, but it can be done. Before we get to the “how,” though, we believe it’s important to understand the “why” of where we are right now.
The Rise of Higher Education in America: A Brief History
Formalized education has existed in the Americas for centuries, with the first Latin Grammar School (Boston Latin School) founded in 1635 as a place to train the sons of the wealthy for careers in leadership positions in the church, the government, or the judiciary. The first institution of higher learning, Harvard College, was founded a year later in 1636, though it was a pretty tiny place at the time with one building and one teacher.
Over the years, Massachusetts continued to lead the way in education, passing the Massachusetts Bay School Law in 1642 as a way of ensuring that the children of the colony knew the laws of the colony and the basic principles of religion. To make sure the law was followed, the colony then passed the Massachusetts Law of 1647, which mandated that every town with 50 or more families hire a schoolmaster to teach their children, and that every town of over 100 families hire a Latin grammar school master who could prepare the children for eventual entrance into Harvard College.
It wasn’t until a mercantile class began to develop in the later part of the 17th century (in the North) and slavery became firmly established enough to allow for the development of a North American “landed gentry” that there were enough wealthy sons to warrant the opening of the College of William and Mary in 1693, over 60 years after the founding of Harvard College in the North.
The 18th century brought the Enlightenment to Europe and, by way of Benjamin Franklin and other thinkers of the mid-1700s, to the growing British colonies. Besides providing the philosophical backing for the American Revolution, the humanistic ideals of the Enlightenment also began to spur a revolution of sorts in public education, particularly in Philadelphia, where the first English Academy was founded in 1751 by Benjamin Franklin. The Academy later went on to become the University of Pennsylvania.
After the Revolution, public education began to spread across the new United States in earnest, with numerous public elementary schools, schools for women, schools for African Americans, and even schools for the mentally and physically disabled opening in the first half of the 19th century. While many of these early schools were open to the public, attendance didn’t become compulsory until 1852, when Massachusetts, again taking the lead in education, passed the first mandatory attendance law. By 1885, mandatory public primary school laws had been passed by 16 of 38 states; within another 33 years, all 48 states had passed mandatory public education laws.
Not surprisingly, as public primary education spread across the country, so did interest in higher education. While the Civil War greatly slowed the spread of higher education, it didn’t stop it entirely: in the early days of the War (1862), Congress passed the First Morrill Act, donating public lands to the states for the:
endowment, support, and maintenance of at least one college where the leading object shall be, without excluding other scientific and classical studies and including military tactics, to teach such branches of learning as are related to agriculture and the mechanic arts, in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.
This act, also known as the “Land Grant Act,” essentially provided the foundation for the establishment of the state university system, though, for the most part, not until after the conclusion of the War.
In the early part of the 20th century, education in the US really began to take off, spurred on by a solid commitment at the Federal level and by public-private partnerships such as John D. Rockefeller’s General Education Board, founded in 1902 to promote education throughout the South by encouraging “practical farming,” establishing public high schools, promoting higher education across the country, and developing schools for African Americans, primarily to train new teachers. By 1910, 72% of all American children attended school of some kind, and 9% of young adults had earned high school diplomas. By 1935 (not coincidentally 17 years after universal mandatory schooling laws were passed across the nation), the share of high school graduates had jumped to 40%.
As public education grew in the United States, so did the number of people going to college. In 1870, just 5 years after the end of the Civil War, only about 9,400 college degrees were awarded in the US. By 1930, that number had jumped to 139,800. By 1950, the total number of degrees awarded had grown by more than 350% to 496,800, primarily driven by returning GIs taking advantage of the 1944 GI Bill, which effectively paid for their college education. The National Defense Education Act of 1958, passed after the Soviets spooked the US by launching Sputnik, drove numbers higher as federal funding became available for what we now call “STEM” (Science, Technology, Engineering, and Math) programs. The National Defense Education Act, combined with draft deferments for those attending college (ending in 1971), more than doubled the number of bachelor’s degrees awarded and greatly swelled the ranks of graduate schools as students decided to continue their educations rather than risk being caught up in the draft.
[Table: BA, MA, and PhD degrees awarded in the US by year (source: US Census)]
As the chart below illustrates, the growth of higher education in the US has tracked pretty closely with educational policy decisions that increased the pool of potential undergraduates, including the 1992 amendments to the Higher Education Act (the origin of the “90-10” rule governing how much of a for-profit school’s revenue can come from federal student aid), a change in policy that eventually resulted in the Federal student loan-fueled explosion of for-profit institutions over the next decade and a half.
While it’s easy to see the impact of policy changes on higher education in the US, what’s a little more difficult to see is why these changes occurred. The answer, it turns out, has a lot to do with how technology and culture collide.
The Educational/Industrial Revolution
The rise of public education—and the rise in higher education that followed—tracks closely with the rise of the Industrial Revolutions in the United States. The First Industrial Revolution, commonly dated from the invention of the first practical steam engine in 1712 to somewhere between 1820 and 1840, was characterized by the transition from hand production and naturally powered (wind, water, human, and animal) industry to steam power and mechanized methods of manufacturing, along with the transition from wood to fossil fuels. The Second Industrial Revolution, characterized by increasingly large-scale manufacturing, growing adoption of powered transport, and the development of the chemical industry, began during the final years of the First Industrial Revolution and, depending on where you are in the world, is still going on today.
These “Revolutions” weren’t just technological revolutions; they were cultural revolutions as well. The changes brought about by new technological developments moved the US away from being a mostly agrarian society of craftspeople and farmers toward a more urban society of workers, managers, traders, and capitalists. The skills that had sustained people for millennia – farming, animal husbandry, blacksmithing, woodworking, spinning, weaving, and the like – became more and more irrelevant as production moved from the home and the village workshop to the factory. Even the very idea of “work,” which had been tied closely to the sun, the seasons, and the weather, had to change, along with how people thought of everything from compensation to community.
The new society of the Industrial Revolution needed people with new knowledge, new skills, and new ways of thinking about living. It needed workers who were compliant, understood the importance of time, and knew their place in the great machine of the Industrial Age. These people needed to be led by managers who had the skills and knowledge necessary to run huge operations involving a perfect, ongoing, intricate dance between humans and machines. The Industrial Revolution needed not only people who knew how to make the trains and make the trains run, but also those who knew how to make them run on time.
Compulsory public education and the expansion of the system of higher education were vital parts of the Industrial Revolution. Public education prepared students to be good workers (and, perhaps for a lucky few, good managers), while higher education prepared others to be good managers, bankers, lawyers, doctors, and members of the other “professional classes” that kept the wheels of industry turning. In addition, funding provided by the government in the form of land grants (the Morrill Acts of 1862 and 1890) and direct funding for certain types of education (the Smith-Hughes Act of 1917, which provided direct federal funding for vocational education) helped build the foundations of state systems of higher education, while funding for scientific research and teaching from industrialists such as Rockefeller and Carnegie helped move the machine forward.
But even with the expansion of education, most Americans still couldn’t afford higher education. This wasn’t as much of an issue during a time when most Americans worked in factories or on farms, but as industry exploded from the demands for material during World War II and technological advancements drove more and more of the economy, it became clear that expanding access to higher education was going to be a national necessity in the Atomic Age.
The democratization of higher education, and with it the educational system of today, was born with the GI Bill of 1944 and the National Defense Education Act of 1958. Millions of returning GIs flocked to colleges and universities after the war, graduating into managerial, technical, and scientific professions that fueled the great boom years of the 1950s and early 1960s.
But as the American Dream became more accessible for all, one problem remained: government funding for college was limited mainly to those who had served in the military. As those veterans gave birth to sons and daughters (sons and daughters who, they hoped, wouldn’t have to experience the horrors of war), they needed a way to pay for their children’s college educations…and the colleges needed new students to fill the classrooms previously occupied by those on the GI Bill.
Enter the Higher Education Act of 1965. Signed into law by President Lyndon B. Johnson at Southwest Texas State College in November of that year, it provided low-interest loans to students, provided more money for universities, created a number of scholarships, launched the National Teacher Corps, and generally made higher education available to a huge number of Americans.
This all made a lot of sense at the time. An increasingly technologically driven society needed an ever-increasing number of college-educated people to keep it running…especially if it had to keep running ahead of the Soviets and the emerging threat from China. At the same time, an expanding population was creating an expanding need for social scientists, providers of social services, and teachers. Jobs in manufacturing were still important pathways to the middle class for most Americans, but it was clear that the future was going to belong to the college educated.
Education and Post-Industrial Confusion
Education changed during the Industrial Revolution because the needs of industry, government, and society changed. Today, we find ourselves facing the same kind of transition as we enter the Digital Age…and transitions aren’t easy.
Over the past century or so, higher education evolved to meet the needs of an industrial society: it focused on creating people who could work, lead, and create in a somewhat hierarchical, centralized world driven by linear ideas of time, productivity measured by output, and profit tied to scale. The Digital Age, by contrast, demands new skills and new ways of thinking:
Whereas the scaling of production was often enough to maintain profitability in the Industrial Age, now both cost advantages and growth is almost always based on continuous innovation. In such a world there are no sustainable competitive advantages. Time and novelty are the sources of profit, and such profits are temporary, at best. This development characterizes the innovation-based economy. At this stage, workers need to become innovators. The balance shifts from the integrating function of education towards the diversifying function.
Ilkka Tuomi and Riel Miller, Learning and Education After the Industrial Age (Oy Meaning Processing, 2011), p. 6
The reason the higher education market is so confused today, and the reason the brand is taking such a beating, isn’t, at its heart, student debt, rising tuition costs, MOOCs, badges, or any of the other buzzwords that get tossed around so much in the press. The real reason for the problems we face today in higher education is that we’re in the middle of a massive transition, one just as large as the transition from an agrarian society to an industrial society, even if we don’t want to admit it. We’re trying to force a system based on the needs of one era to meet the needs of a new one. The brand is suffering because we haven’t figured out what education for the Digital Age is going to look like—though the quote from Tuomi and Miller above does a pretty good job of giving us a glimpse.
For more idfive marketing ideas, sign up for our monthly whitepaper.