
Higher Education in the Age of Access

Posted by Sean Carton | October 28, 2015 | 4:02pm

On October 14, 2015, the US Department of Education announced that it would begin to offer federal student loans to students attending a few select “coding bootcamps,” intensive short-term (around six weeks) courses of study that promise to turn just about anyone into a programmer. These bootcamps have become increasingly popular—the Department of Education estimates 16,000 people will graduate from them in 2015—but they have been out of reach for many, with tuition bills weighing in at over $10,000 in most cases.

Leery of the for-profit college debacles of the last decade, which gave unscrupulous institutions a virtually unlimited license to siphon money directly out of the student loan program (while leaving graduates with virtually worthless “degrees”), the Department of Education plans to roll this program out on a limited basis to bootcamps that have partnered with colleges, and it promises strict oversight of the student outcomes those programs deliver.

While this program is being run on a trial basis, some see it as the first step in a longer-term effort to develop more highly paid career pathways that don’t require a 4-year college education. With college debt at an all-time high (and rising), “alternative credentials” are seen as a way to prepare people for the workforce without forcing them into crippling debt. As the Washington Post explained in the article announcing the program:

“The pilot comes at a time when the education and philanthropic communities are buzzing about the need to deliver credentials that employers recognize and value. Many businesses simply use a bachelor’s degree as a screening mechanism to identify people they think are likely to have the skills they need, even if the four-year degree wasn’t really necessary — which bars many competent people from jobs for which they might otherwise qualify.”

This is a big deal. For the first time, the Federal Government is showing signs of moving away from its long-time mantra that “everyone should go to college” because, as President Obama declared in August of 2013, “some form of higher education is the surest path into the middle class, and the surest path that you’ll stay there.”

These bootcamps aren’t “higher education.” While the ones initially slated to receive federal student loan money must have an affiliation with an institution of higher learning, most of the programs don’t award college credit. There are few prerequisites (other than the ability to pay) and no general education requirements. Basically, everything we’ve traditionally used to define a college education is absent.

But their graduates get jobs…at least according to the statistics reported by the programs. They do learn marketable skills. They help fulfill a (perhaps perceived) need for workers in STEM fields. Even if they aren’t embedded in all the trappings of what we think of as “college,” they seem to accomplish many of the same outcomes, at least as far as those who see “college” as “career prep” are concerned.

This transition poses a communications and rebranding challenge for educational institutions finally catering to the technologically empowered individual. In order to prepare, we’ll need to understand what these incremental shifts mean for the future of higher education in the US. To answer that question, we need to start in what may seem to be a very unlikely place: shopping malls.

 

Remember Malls?

Malls were places where people used to go to hang out, eat, and shop. The mall was usually “anchored” by two or more department stores of varying prestige, ranging, perhaps, from Macy’s to Neiman Marcus. And while the rest of the stores in the mall may not have been the highest of the high end, most were considered fairly respectable retailers. Centrally located, with vast expanses of parking, offerings ranging from 20-screen multiplexes to a veritable cornucopia of chain restaurants, and areas for kids, teens, and adults to congregate and play, malls were the town squares of suburbia, a common destination to steer you through the Sprawl. In a world without a center, malls provided a social and commercial core for the middle class, a not-so-strange attractor with an inexorable pull.

[Image: malls]

When e-commerce first appeared in the late 1990’s, pundits everywhere were quick to announce the death of the mall. “Why,” they asked with a knowing wink, “would anyone go out when they could shop at home in their underwear?”

All potential semi-clothed home shopping fetishes aside, it seemed pretty clear initially that the “shop in your skivvies” folks missed the boat. Malls weren’t on the decline: in fact, if anything, “brick and mortar” retail in shopping malls seemed to be growing, even as the nascent e-commerce industry itself grew. But if people were buying online, why weren’t malls initially feeling the pinch?

The answer, it seems, is that the pundits were looking at the wrong thing. They thought that “shopping” was about procuring goods. If one needed to buy something, they argued, wouldn’t one take the easiest, cheapest, and most efficient route and buy it online so that it could be delivered straight to their doorsteps, often by the next day? Only a Luddite would want to actually go out in order to buy things!

However, what they missed is that for most middle-class suburban consumers, “shopping” at the mall wasn’t just about buying things. The mall served a more important purpose: it was a place to meet other people. For the stay-at-home mom, going to the mall wasn’t a chore, especially when she could sip a latte with her girlfriends while their toddlers played in a fenced-in, cushioned-floored, semi-supervised play area.

For teens, malls were a place to socialize, to see and be seen in an environment their parents wouldn’t hassle them about and that was large enough to offer some degree of anonymity. While few suburban parents would have dreamed of dropping their 15-year-old at what remained of the “shopping district” in a nearby city, virtually none worried that their teen would come to harm in an enclosed mall with its own security force and relatively wholesome selection of retail outlets. Malls provided an almost perfect compromise, offering parents peace of mind while offering teens a sense of freedom and anonymity (within limits, of course).

 

The Decline and Fall of the American Mall

But the mall’s days were numbered. The turn of the 21st century brought with it a host of changes that began to nibble away at the mall’s role as the center of the American suburban landscape.

Economics played a big role. Many retailers were knocked onto the ropes by the double whammy of 9/11 and the bursting of the dot-com bubble. There was, of course, a period of recovery and growth as the housing market picked up, home values rose, and much of the middle class began to feel exuberant about their new-found wealth…until that exuberance became irrational and the bubble burst around 2008. The Great Recession followed, and the middle class began to wither under the forces of unemployment, underwater mortgages, rising debt, and slow economic growth. For many, shopping as entertainment began to feel like a long-ago and barely remembered dream.

“If you look at what’s happening in America, where the country is, where the people are, the middle class has been decimated. The top group has never been richer by a tremendous multiple, and the middle class and low-end are getting destroyed,” observed Howard Davidowitz, founder and chairman of the retail consulting and investment banking firm Davidowitz & Associates, in a recent US News & World Report article. “This is very complex, and tremendous numbers of malls are closing and will close because they simply won’t be viable, because the middle class in our country is dramatically less financially viable than they used to be.”

[Image: dead mall]

Demographics began to shift as well. Disillusioned with suburban life and no longer wanting (or being able to afford) long commutes and the upkeep of their McMansions, many began to move back into the cities, unconcerned (at least initially) about the schools they’d find there because they’d decided to delay having children. The kids in the suburbs (the original “mall rats”) began to grow up and move away and fewer kids were being born to take their place as the “demographic bubble” popped.

These factors alone caused a precipitous drop in retail traffic, with retail visits in November and December (the two biggest shopping months of the year) falling from 35 billion in 2010 to 17.3 billion in 2013, according to Cushman & Wakefield (PDF). And according to retail analysts RetailNext, retail traffic declined 11.4% year over year in November 2014 and another 7.1% in December of the same year. As a result, many of the traditional “anchor” stores are closing their doors, with onetime stalwarts such as JCPenney, Sears, and Kmart shedding retail locations. And when the anchors go, the rest of the mall usually follows.

But economics and demographics aren’t the only forces remaking the retail landscape in the US. Technology has played a huge role, especially mobile and social media tech. Sure, e-commerce has, too, but not in the way many thought it would.

The reason that mobile and social technologies have had such a large impact on consumer shopping behavior can be summed up in one word: access. In particular, access to information and access to people.

 

Access Changes Everything

First, let’s look at access to information. According to Nielsen’s 2014 State of the Shopping Center report (PDF), 87% of consumers with smartphones or tablets use these mobile devices when shopping, though not necessarily for buying. Instead, consumers are using these devices to educate themselves about products before purchase, browsing reviews, comparing prices, and turning to their peers for advice and ideas about what to buy via social media. Unlike consumers a decade ago, today’s consumer heads out the door armed with enormous amounts of information about what they want to buy and where they want to buy it, allowing them to home in on their retail “targets” (pun intended!) with ease, especially when they can trust their GPS-equipped smartphones to guide them to their destinations.

But it’s not just access to information about stuff that’s changing the consumer landscape: access to people may be the additional element that’s putting the final nails into traditional retail’s coffin. Using social media and text messaging, today’s consumers are never truly “alone” in the sense that they were in the days before the mass acceptance of this technology. While moms may have headed to the mall in order to socialize in days gone by, today they can connect with their friends on Facebook, swap purchase ideas on Pinterest, and even ask for advice about what to buy while shopping using Snapchat, Periscope, or other more-or-less real-time visual social media. And because they’re constantly linked to their social networks in Cyberspace, physical space ceases to matter like it used to: there’s no need for a shopping center when a consumer can always be at the center of their own network while shopping.

The changes, as you can see from the chart below, have been pretty drastic, and the decline of the traditional retail sector (illustrated in the chart by declining department store sales) tracks pretty closely with developments in social and mobile technologies. Of course, correlation doesn’t imply causation (particularly important to note in this case), but it’s hard to ignore the combined results of all the forces at work, draining the last of the allure malls had over online shopping.

[Chart: social and malls]

While the death of the middle-class mall is one highly visible symptom of these market forces at work, the bigger picture is a little more complex. These changes in demographics, economics, technology, and behavior seem to be producing the following results that strongly relate to changes in higher education.

  1. A bifurcation in shopping experiences that mirrors the increasing economic separation in American society. While malls targeting the middle class are dying, luxury malls (dubbed “A++ malls” in the retail biz) are thriving. “The luxury malls are golden. If you take the top 500 malls, they’re terrific, and the stores in them are doing well. Wealth in America has never been better for the top group of people,” said Davidowitz in US News & World Report. “The middle class is getting killed. The upper class is doing great. And that reflects itself in what is going on in the malls. The two biggest middle-class stores in America are Sears and Penney’s, and they’re getting crushed.”
  2. Consolidation and specialization in retail. While “one-stop-shop,” “aspirational” department stores and malls are in decline, a quick glance at the list of 100 top US retailers makes it obvious who’s in the lead: big-box discount stores (Wal-Mart, Target, Costco), specialized mega-retailers (The Home Depot, Lowe’s), and grocery/health/beauty chains (Walgreens, CVS, Kroger, Safeway). Each is its own “destination,” and most compete on price while differentiating via branding (e.g., Wal-Mart vs. Target). Only one e-commerce retailer cracks the top 10: Amazon.com, at position 9.
  3. “Boutiquing” of retail, with small stores surviving through intense differentiation, focus, and a connection to the community. Nielsen’s State of the Shopping Center report predicts that “young, diverse, urban consumers are the future of retail,” and these consumers want authenticity and novelty and seek personal connections with the stores where they shop. Even Wal-Mart and Target are experimenting with smaller “community” stores.
  4. E-commerce and physical retail beginning to feel out a symbiotic relationship (of sorts). Retailers are matching online pricing (though not without a few hiccups) and encouraging “hybrid” shopping options that include same-day pickup of online orders. While the final mix may shake out slightly differently, it appears (p. 9) that consumers are settling into a pattern of looking online for things they don’t need right away and turning to local stores for timely items (food, gas, health/beauty products) or items that require a more personalized “touch” (e.g., clothing) for a satisfactory experience. The balance will probably net out in a solution that uses the best characteristics of each channel.
  5. Increasing consumer responsiveness and accountability. When 65% of American adults use social media, there are a lot fewer places for bad retailers to hide.

While each of these changes is significant by itself, taken as a whole they represent a much larger shift, a shift that’s happening not only in retail but in nearly every part of our connected culture.

 

Mass Media & The Metanarratives

Mass media was probably the single most important factor that defined “America” to Americans and the rest of the world. Make no mistake, this definition was one that seriously marginalized many who didn’t want to “get with the program” by excluding them from the discourse. But for those who wholeheartedly consumed the vision first articulated in the papers of Pulitzer and Hearst, recited by FDR during his “fireside chats,” and brought to life on the screen by the idyllic suburban lifestyle fantasies of 50’s and 60’s television, the vision of America communicated through the mass media defined and supported our culture and our vision for our institutions for decades. Like the archetypal American hero, America refused to be defined by others. America defined itself, dang nab it!

The mass culture defined by the mass media of the 20th century had the (perhaps unintended) consequence of creating a sort of hazy populism in our institutions and, at least for a while in the middle part of the century, an idealistic progressivism. We were both “free to be you and me” and all in it together. The post-war prosperity of the 1950’s and early 1960’s broke down socio-economic barriers that had been in place for centuries. Millions of returning GIs took advantage of the GI Bill to fill colleges and universities that at one time had been reserved mostly for the wealthy. Suburbs filled quickly with those with the cash to pursue the dream of home ownership. Cars flowed off of assembly lines and into the suburbs. Malls soon followed, malls filled with a homogenized and idealized plethora of products we’d been told we needed by the explosion of advertising on TV, radio, and in print. The “American Dream” crystallized into a form that most of us still recognize (and dream about?) today: a house, 2 cars, 2.5 kids, a dog, parents with good jobs (or, even better, only one parent working outside the home), and the promise that those 2.5 kids would go to college and do better than their parents before them. Yes, many were excluded—mostly people of color or people whose living choices didn’t jibe with the majority—but it’s safe to say that, if nothing else, what Jean-François Lyotard called the “metanarratives” of our culture (which were, to quote Superman, the greatest American of all, “truth, justice, and the American Way!”) were believed, or at least given lip service, by nearly all.

 

Your One-Stop-Shop For the American Dream

Central to these great metanarratives was the ideal that education was the key to a better world. After all, wealth and higher education had always gone together, and it seemed obvious to many that the post-war world of middle-class prosperity was driven by the explosion in college graduates in the early 1950’s. We were in the Space Age, and the Space Age required rocket scientists, engineers, accountants, managers, and other highly trained white-collar workers. And it wasn’t all about prosperity: Sputnik’s steady pinging mocked us with every orbit after it was launched in 1957. Increasing the number of educated workers was now a matter of national security.

And increase they did.

As the chart below shows, in 1947, approximately 5% of Americans had earned a college degree (or higher). By 2011, that number had increased to 30%.

[Chart: rise of college]

And while the number of colleges and universities in the US had been growing steadily since 1900, the post-War boom in enrollment sparked massive growth in both 4-year and 2-year colleges as the supply of education increased to meet the demand.

[Charts: rise of college 2, rise of college 3]

With the influx of students, colleges began to expand both physically and academically. Campuses expanded their footprints and/or opened satellite campuses. Schools and departments (especially in high-growth areas like business and the health professions) were born and grew. Government programs such as student loans and Pell Grants continued to make college more accessible for the middle class until “college” and “middle class” became inseparable. Institutions began to understand their role in the socialization of the new middle class and amped up efforts to provide a self-contained experience for their students. When “Animal House” was released in 1978, its popularity could be attributed in no small part to its familiarity. When John Belushi’s character Bluto walked around sporting his “College” sweatshirt, no one in the audience failed to get the joke or the symbolism: what you were watching was College with a capital “C,” though obviously a parody painted with broad strokes. None of the tropes, none of the characters were unfamiliar to someone in the middle class, because college (and “the college experience”) had become so familiar, even if it was something that many still aspired to.

The Ghost of Higher Education Past, Present, and Future

Higher education continued to grow, fueled by aspiration and easy money. Colleges and universities expanded their physical plants and their administrative units, adding more and better facilities in order to compete for an increasingly “consumerist” base of prospective students who’d been raised to expect certain amenities. Many colleges whose mission had been undergraduate education added graduate schools and became universities, expanding even more as research dollars poured in. Eager to fulfill their role as engines of workforce development, colleges and universities revamped their curricula to become more “relevant” to the job market and began to focus increasingly on job placement as a key outcome, often developing curricula difficult to differentiate from their competitors’. Looking ahead to the inevitable bursting of the post-Baby Boom “demographic bubble” and identifying the growing need for continuing and professional education, institutions began to offer non-credit courses, certificates, and certification programs. Feeling the twin pressures of competition for a declining base of “traditional” undergraduates and the need to serve an ever-expanding base of “non-traditional” undergraduates and professional graduate students, institutions transformed themselves to be more welcoming to commuters, adding conveniences such as extended hours, accelerated programs, and fully online classes. Institutions of higher learning became, in varying degrees, “one-stop-education-shops” where aspirational, middle-class consumers could go to get the education they needed to achieve the American Dream.

In other words, they became malls.

 

The Age of Access

Institutions of higher learning were, by definition, designed to be repositories of information, both in their libraries and in the heads of the faculty. They served not only as conduits for transmitting this information to new generations, but as gatekeepers to it as well. Mechanisms existed for transmitting information between institutions in the form of scholarly texts and gatherings, but these circulated almost exclusively within The Academy, thereby maintaining the storage/gatekeeping role. The grades students received and the degrees they were eventually granted were signifiers of the success of this information-transmission process: if you received the information and retained it, you graduated and were given a piece of paper signifying that you now possessed a particular body of information yourself.

The people who made up an institution, both faculty and students, were vital to its operation as well. The faculty served as the information transmitters and mentors to the students, even more so when it came to graduate students destined for a life in academia who needed to be socialized in the ways of The Academy. Access was often controlled in a rigid hierarchy, with only a select few gaining direct access to those with the most information and power.

Student life and the interaction between students were important as well, especially at institutions that arose in the latter half of the 20th century. Social interactions on campus prepared students for the kinds of interpersonal relationships—both business and personal—that they’d have to navigate later in their lives. At some of the more rarefied institutions (especially those nurturing only one gender), social interaction and access to people took the form of ritualized exchanges such as “socials,” “balls,” or even sporting events. Fraternities and sororities took this kind of social stratification even further, allowing students to form bonds that extended beyond graduation. And even if one wasn’t involved in Greek life, access to networks of alumni (especially at the more elite institutions) was often promoted as a perk of attendance.

But the technologies now available to us in this Age of Access turn many of these traditions on their heads. Information is now more-or-less freely available to anyone with a device to view it and an internet connection. The desktop Web was the first step in information liberation, but though information may have wanted to be free, it really wasn’t free until it was unbound from the wires that used to hold it.

For education, nearly universal and ubiquitous access to information has had and will continue to have profound effects. While “knowledge” was often (somewhat erroneously) defined as “knowing lots of stuff,” today we don’t need to use our grey matter as storage: we can just look up what we want to know when we want it, no matter where we are.

Digital technology and the Internet have also served to free information from its containers. The whole idea of “going to the library” to look something up seems absurd when we live inside the library. Open online courses have released the information that was once relegated to the classroom and set it loose for the world to experience…though the jury’s still out about how well that actually works in practice.

Remember how the Age of Access has changed commercial retail? Well, the same changes are occurring in higher education.

Radically-expanded access to people via technology is also having profound effects on higher education. Social networks now stretch far beyond campus boundaries. A student today can access an old friend from high school as easily as a distinguished professor they met at a conference or an author they’ve never met before. While faculty have always maintained networks (both formal and informal) with peers at other institutions, now those networks can be accessed at will. Access to people is a force that eliminates both physical and hierarchical barriers.

The Age of Access allows us to transcend both time and space.

 

The Great Fracturing

Probably the biggest impact that market and technological forces have had on the consumer retail industry is the fracturing of what was once a somewhat monolithic industry created to serve a somewhat homogeneous middle class. Born of a time when we were united by the great metanarratives, the department store and its host, the mall, thrived during a time when mass media drove mass culture. Probably nothing sums up the era better than this description of the once-ubiquitous department store Sears’ audience, written as part of an internal merchandising plan in the late 1970’s and reported in Salon:

“Sears is a family store for middle-class, home-owning America. We are not a fashion store. We are not a store for the whimsical, nor the affluent. We are not a discounter, nor an avant-garde department store…We reflect the world of Middle America, and all of its desires and concerns and problems and faults.”

The same forces at work transforming the consumer retail landscape are hard at work transforming the world of higher education. While the past several decades have been marked by the fracturing of a wide range of institutions across American society—the breakup of the Bell System in 1982, the steady erosion of government programs implemented during the more progressive decades earlier in the 20th century, the fragmentation of mass media brought about by technology, etc.—institutions of higher learning have been remarkably resistant to change.

But those days are over. The Great Fracturing is beginning.

Our system of higher education has resisted change for a number of reasons. First, contrary to the popular belief held by those outside The Academy, colleges and universities are incredibly conservative places where change—any change—comes slowly and only after much study and deliberation. Institutions of higher learning are also inherently insular, tightly knit communities often shielded, by design or by default, from the outside world and its influence. And while smaller, tuition-driven institutions were faster to catch on to how the world was changing, especially when their meager endowments were pummeled by the Great Recession beginning in 2008, the elite institutions, the institutional thought leaders, were shielded from change by great walls of endowment. The economy may have dinged those walls a bit, but they still held.

But, in the end, the real reason that higher education has been slow to change is because it really didn’t have to. Things seemed to be working. The system seemed solid. If it ain’t broke, don’t fix it.

Unfortunately, like the apocryphal slow-boiled frog, higher education as an institution had been in decline for a long time, the heat turned up ever-so-slowly by the gradual demise of the middle class, the downward spiral of the public education system, increasing economic disparity, the explosion of information brought about by rapidly advancing technologies, and a world moving faster and faster in a billion different directions. Everything was changing, but few noticed. Now the water’s boiling.

The Federal Government’s recent decision to offer student loans for coding bootcamps is a clear indication that the inevitable fracturing of higher education as we knew it is underway. It’s an admission that the systems that have been in place for decades (if not centuries) no longer work when it comes to preparing people for the workplace. If you can learn what you need to land a 6-figure job in 6 weeks for under $20,000, why bother spending 4 (more likely 5) years and potentially hundreds of thousands of dollars to learn skills that will be out of date years before you graduate?

If the point of college is to prepare people to enter the workforce, “college” as we know it doesn’t make much sense. It’s too slow, too expensive, and, for the most part, hasn’t adapted to the realities of the Age of Access. Colleges and universities are lumbering dinosaurs, and alternative learning channels such as coding bootcamps are the first mammals scurrying about under the feet of the great lizards. They may not look like much now, but they’re the future. And the dinosaurs don’t even notice they’re there.

Higher Education’s Complicated Future

While the influence of a nearly infinite number of cultural, economic, and philosophical forces makes it impossible to predict exactly how higher education will evolve, we can look back and extrapolate from what happened to so many other institutions that defined the 20th century. In almost all cases, the forces of the Age of Access have changed these institutions by flattening hierarchies, radically altering the influence of time and space, enabling instantaneous communication between people, facilitating the formation of ad-hoc virtual networks, and providing access to virtually unlimited amounts of information any time, anywhere. The end result has been, for the most part, the fragmentation of monolithic institutions into specialized elements designed to meet the needs of increasingly fragmented audiences, while economic forces drive the creation of new institutions at widely separated ends of the economic spectrum. The “high end”—characterized by an emphasis on human service, high quality, and high cost—caters to the elite who can afford to pay to be tended to by expensive human beings, while the low end—characterized by self-service or automated service, lower quality, and low cost—targets the majority further down the economic scale, though the “masses” are fragmented into their own tribes. Target and Wal-Mart thrive because they cater to tribes. Sears and Kmart are dying because they don’t see the distinctions.

The future of higher education will likely be characterized by similar fragmentation. Instead of a once-in-a-lifetime learning experience with a beginning and an end, “college” may become something we experience in fragments, as we need it, throughout our lives and careers. Students graduating from high school may opt to jumpstart their careers through intense, focused vocational training and then move on to developing the “softer” skills of a traditional liberal education as they progress in their careers and need more sophisticated critical thinking, communication, and collaboration skills. These may be provided by institutions that we used to call “colleges,” but they’ll be a lot smaller and geared towards intensive instruction over a much shorter period of time. When they need to learn additional skills in order to stay current in their careers, people may turn to other educational providers geared towards that kind of instruction, either in person or online. Enterprising students may simply opt to learn skills on their own from online sources and then take a standardized test to earn a “badge” or other microcredential recognized by employers.

Given the forces shaping the world of the 21st century, “higher education” in 2065 probably won’t resemble what higher ed looks like today any more than a modern land-grant institution of 60,000 students resembles a medieval university. Workers may contribute monthly to their “EMO,” or Educational Maintenance Organization, an organization structured much like today’s health insurance companies that allows its members to continue their educations at pre-negotiated group rates. Adult learners may join the educational equivalent of a health club, where they pay one monthly fee to take all the classes their “brain gym” offers. Diplomas may be replaced by “badges” or other microcredentials indicating mastery of particular skills.

While most people may receive higher education from a wide variety of providers, the elite who can afford face-to-face, personalized instruction on a bucolic campus may do so in a more traditional “college” environment, but one with amped-up luxury amenities and higher levels of service. College concierge, anyone?

Of course, higher education in 2065 may not resemble any of this. After all, nobody looking ahead in 1985 could have predicted the impact of the Internet. But regardless of the final form it takes, the trend vectors are there: higher education as we know it doesn’t have long to live in its present form—and its communication and marketing needs will change along with it.

 

View Higher Education in the Age of Access Slideshare

 

For more mission-based marketing insight, dig into our blog page, or sign up for our newsletter and like/follow our Facebook, Twitter, or LinkedIn pages to let the news come to you!