Figuring out how to draw more students into the “STEM education pipeline” has been a major concern of those arguing that there is an acute shortage of STEM workers, be it in the U.S., the U.K., Brazil, Australia, or almost any country you choose. Typically, the arguments made to encourage students to enter the STEM pipeline center on how interesting STEM careers are and especially how much more money you can earn than in non-STEM careers.
However, others point out that many students aren’t interested in STEM careers because they see that the academic work needed at both the high school and university level to pursue a STEM degree is just too hard in comparison to non-STEM degrees. Until this changes (for example, by increasing the readiness of prospective STEM students by “redshirting” them), the argument goes, don’t expect a full STEM pipeline anytime soon.
Another factor, little talked about but one I personally witnessed, has been the changing social compact between STEM workers and employers over the past several decades, and the impact it has had on convincing today’s students to pursue a STEM career. When my father, an electro-optical engineer, was laid off from his company late in the recession of 1957-1958, he assumed the company would rehire him a few months later when the economy got better. His wasn’t an unreasonable assumption, since that was the general practice in the 1950s. When he wasn’t soon rehired, and with a new house mortgage to pay and three children under age 5 to feed, my father left his temporary job of selling Electrolux vacuums door-to-door and found another electro-optical engineering job. He stayed with that company for the next 25 years, until he retired with the usual gold desk pen-set, which now sits on my desk.
When I graduated with my undergraduate computer systems engineering degree in 1977 after a stint in the military, my expectation of a generally comfortable career working for at most a handful of companies was similar to my father’s. I remember when my friends and I were looking for jobs, our discussions often centered on whether a company we were planning to interview with was one worth spending a career with. IBM, for example, had a commitment to lifetime employment, while Digital Equipment Corporation had a no-layoff policy. Other companies were offering similar types of inducements as a reason to work for them. Even so, part of our employment equation now included the possibility of layoffs, given that between 1968 and 1970, aerospace employment (which up to that point had been promoted as the most exciting industry to be a part of) had dropped from 1 418 000 to 1 177 000 workers as the Vietnam War wound down and the Apollo space program's end was coming into view.
A New York Times article of the time titled, “Aerospace: Tale of a Sick Industry,” reported that while mass layoffs in the aerospace industry were not uncommon, “what is unfamiliar and more painful about the current recession is that so many laid off are scientists and engineers whose skills the nation can ill afford to lose permanently.” A related Times article reported that among those aerospace employees laid off were the majority of the remaining members of NASA's von Braun group, the “nucleus of the engineering team that launched the first American satellite and built the Saturn 5 moon rocket.” The fact that famous rocket engineers could get laid off signaled to us the need to be a bit more cautious about which companies we decided to work for.
While the engineering job market was improving by the time I graduated, the engineering unemployment crisis of a few years earlier had definitely discouraged many students from getting into engineering. It even motivated many to seek careers in business instead, a field decidedly shunned during the 1960s. Business was in fact the most popular field of study for entering college students in 1972, according to the American Council on Education; many laid-off engineers also decided to pursue MBAs or other business degrees since the opportunities in non-tech fields looked brighter.
But for the most part, throughout the late 1970s and into the mid-to-late 1980s, engineers (and the ever-growing number of IT professionals) identified themselves with their company, and as members of the upper middle class. Compensation for fully employed electrical engineers, believe it or not, was higher than that of the average salaried doctor. The 1979-82 period of recession and high inflation (inflation hit 14.76 percent in 1980) was no fun, but for the most part engineers thought the company they were working for—and were loyal to—would most likely be the one they retired with, or if not, then surely the next one. Companies still offered decent pensions (which also helped keep down job hopping, which was highly discouraged) as well as good educational benefits to keep you current; but you put in a lot of hours in return.
Change was in the air, however, as more engineers outside of the aerospace and defense industry started to work on projects (that had long been a way of life in A&D). When the projects ended, engineers began to be laid off instead of being moved to other projects in their company, as had been the more typical case. Your employment prospects (and personal loyalty) subtly began shifting from the fate of your employer to the fate of your project. The 1980s also saw the rise of the “body-shoppers” and large system integrators, like Electronic Data Systems and Computer Sciences Corporation, which promised to take over all of a company's IT operations (often as part of a contract to deliver a modernized IT infrastructure). Now one's loyalty became divided between the company that signed the paycheck and the one for which the work was actually being performed.
I don’t remember exactly when engineers as a group started to feel that their corporate loyalty was no longer being reciprocated, but it had to be sometime by the mid-1980s, when corporate downsizing started to gain momentum. Downsizing was unlike the “traditional” layoffs that happened during previous recessions: these were permanent reductions in the workforce, made even in good economic times, in the name of cutting costs and improving profitability. Partly, downsizing was a response to the rise of global competition. Even more, it was sparked by the rise of the corporate raiders who said they were looking to unlock corporate value by getting rid of under-performing or underutilized business units. As a result, executives were motivated by large incentives to hit the cost and profit targets that the financial markets desired. It was not uncommon for a company’s stock price to rise in the wake of a layoff announcement, much as Microsoft’s stock did when Steve Ballmer recently announced his “retirement.”
Between 1985 and 1990, over 1 000 U.S. firms embarked on downsizing (pdf), with most indicating they would be doing so again in the future. Predictably, employee morale was destroyed, even as the company's stock price was boosted by such confident announcements of planned future layoffs. According to Labor Department statistics, 36 million jobs were eliminated between 1979 and 1993.
And these weren’t just low-skilled or blue-collar jobs from the declining U.S. smokestack industries being eliminated, but just the opposite. As a 1996 New York Times article titled, “On the Battlefield of Business, Millions of Casualties,” stated, “In a reversal from the early 80's, workers with at least some college education make up the majority of people whose jobs were eliminated, outnumbering those with no more than high school educations. And better-paid workers—those earning at least $50,000—account for twice the share of the lost jobs than they did in the 1980's.”
For the engineering community, downsizing hurt, but the defense lay-offs from the end of the Cold War hurt more. According to the NSF, in 1987 some 16 percent of the engineers and 11 percent of natural scientists, computer scientists, and mathematicians working in the United States were involved in defense work. But between 1987 and 1992, 700 000 defense-related civilian jobs were eliminated, and another million would go away over the next five years. You can see the drop-off in aerospace employment in this Aerospace Industries Association chart (pdf). Aerospace and defense companies’ mergers began in earnest, with some engineers working for three different companies within five years, hoping that with each merger they wouldn’t be the target of the inevitable layoff that occurred as redundant jobs were eliminated.
This period of defense layoffs coincided with the rapid growth of IT outsourcing (with some now moving overseas) and the reengineering craze started by Michael Hammer’s 1990 Harvard Business Review article (pdf), “Reengineering Work: Don’t Automate, Obliterate.” Hammer argued that IT was being used to automate business processes that were out of date; what organizations needed to do was to redesign (or eliminate) these processes to make them more efficient, and then automate them. While generally true, the side effect was to give companies a new excuse to aggressively eliminate jobs, something that Hammer claimed was not reengineering’s intent, but became its mantra (and legacy) nevertheless.
Also in 1990, the H-1B visa program was started (pdf) as a way for U.S. employers to hire temporary, foreign workers in specialty occupations, mainly because of a perceived (but, as soon became apparent, vastly exaggerated) engineering and IT “skills gap.” Ever since, there have been arguments over whether the program has helped (pdf) or hurt native U.S. engineers and IT professionals’ employment prospects and salaries. Regardless, a “temporary” guestworker program for hard-to-fill jobs has become a permanent fixture affecting STEM career discussions.
If I were to guess, the watershed year for engineers and IT professionals in their realization that they had become independent, expendable employees was probably 1993. That was the year that IBM announced, in the wake of a previously unimaginable US$15 billion in losses racked up in a mere two years, that it was no longer going to honor its 70-year commitment of lifetime employment to its employees. IBM had 406 000 employees in 1986; it had 207 000 by 1994. The late 1980s and early 1990s also saw several major computer companies struggle, like Control Data Corporation, Wang Laboratories, and Digital Equipment Corporation (IBM's biggest competitor), which had to end its own no-layoff policy in 1990. Each fell to young, aggressive entrepreneurial software and hardware companies—led by Apple, Microsoft, Oracle, Intel, Dell, and Compaq, among others—that radically changed the face of corporate and personal computing.
While there was a great displacement of engineers and IT professionals in the early 1990s, at least there was expanding demand in the mid and late 1990s, spurred in part by rapid employment growth in the telecommunications industry (pdf) and the entrepreneurial hyper-activity that created the Dot-com bubble. Alas, the boom was short-lived, with over 900 000 tech workers losing their jobs between 2001 and 2002 alone. Even the period of the mid-2000s, which once again saw increased engineering and IT employment, didn’t last long. The past five years have seen tens of thousands of engineers and IT professionals laid off and having trouble finding work, even as technology executives complain they can’t find the right high-tech skills and therefore need a major increase in the number of H-1B guestworkers. Those engineers and IT workers who still have jobs have seen their salaries remain essentially flat since 2000.
The social contract between STEM workers and employers that used to exist has long been broken, and is never coming back. Instead of being mainly enticed into a comfortable, secure career in engineering by what civil engineer Samuel Florman calls the existential pleasures of engineering, salary has now become the primary carrot because there is no such thing as a comfortable STEM career.
Students wanting to pursue a STEM career today face future prospects much different than when I got my undergraduate degree. Some 92 percent of the engineering class of 1978 found jobs and were working in the field of their bachelor’s degree (or one closely aligned to it) two years later; only 68 percent of the engineering class of 2006 could claim the same. There are likely even fewer opportunities available for the engineering class of 2013.
An engineering undergraduate’s starting salary today is, relatively speaking, on par with mine when I started, but few engineering graduates from my class had anything like today’s average of around US$26 000 in student loans to pay off after graduation. The federal government used to fund universities and colleges at a much higher level than it does today.
Nor did I face a job market where tech layoffs had turned from an occasional, accepted occupational hazard into a way of life, or where human resource managers frankly consider job skills to be nothing more than commodities to be valued as such. For STEM graduates today, especially in the IT profession, this “skill as a mere commodity” perception means they are increasingly considered technologically obsolete—and therefore ripe for replacement—by age 40 or younger, often regardless of their experience.
Not surprisingly, the change in the engineering world is influencing many electrical and electronic engineers to advise their children not to pursue an engineering career. An EDN survey from 2007 (before the current recession) showed that 1 out of 3 felt this way. Well, at least that is better than in the medical profession, where a recent survey found that 59 percent of doctors indicated they would be unlikely to encourage a young person to go into medicine.
I still firmly believe a STEM education (liberally sprinkled with the humanities and arts) is the best one to pursue, since it gives one a plethora of career options. However, after 35 years in the field as a working engineer and long-time consultant and entrepreneur, I believe it is vital that any prospective STEM student seeking such a career does so with eyes wide open. A 1983 New York Times article titled, “Engineers: A Dropout Problem for the U.S.,” said it best: “There’s nothing wrong with a young person’s wanting to be an engineer as long as he [or she] is aware of the myths.”
There is no benevolent employer anywhere who has (or can afford to have) a no-layoff or lifelong employment policy. Globalization means ever-increasing technical and wage competition. Pursuing a career in STEM has become little different from pursuing a career in professional sports. It can be exciting and highly rewarding if you have the talent, are willing to work hard, and have a bit of luck in which major you choose and which company you start off with. Graduating at the start of an economic boom also helps.
However, it is also a career where the employer, like a sports coach, is always critically evaluating whether you've “lost a step.” If you're perceived to have fallen behind technically, or can’t adequately answer the question, “What have you done for me lately,” you're not likely to be around for very long, especially if the economy turns even a little bit sour.
In fact, it wouldn’t hurt for STEM graduates to view themselves akin to professional sports players; after all, that is the model that high-tech companies like to cite when they say they ought to be able to hire the best engineering and IT talent from anywhere in the world, just as professional sports teams do in hiring athletes. And yes, some STEM graduates may have very long and fruitful STEM careers, like NFL kicker Gary Anderson, or MLB pitcher Nolan Ryan, or Manchester United’s Ryan Giggs, who I think will still be playing for United when he turns 60 in 2033. However, most STEM graduates will not be so fortunate in their first choice of field of study.
Plan on 15 or so years of hopefully exciting and mentally and financially fulfilling work, and then an increasingly tough fight for every year after that. And always, always have an escape plan in hand from day one in case you need to shift onto a better career path. That, for better or worse, is today's reality.
Photo: Milena Boniek/Getty Images
Contributing Editor Robert N. Charette is an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Along with being editor for IEEE Spectrum’s Risk Factor blog, Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.