Designing the Future

May 2015

Do PEs have an ethical responsibility to think ahead to help prevent harmful effects of technology?


In an age of uncertainty, one thing is clear: technology advances. As the saying goes, “the only constant is change.” What is common today was science fiction to our parents. The tools designed to improve our daily lives will be ancient history to our children. In this steady progression, is there a need to pause, to ask deeper questions about the world we are creating? Many say yes, that it’s imperative to examine the long-term implications. Some believe engineers should be the ones to lead this effort. And PEs may be especially qualified to do so.

A Growing Chorus
Recently, several prominent scientists and technologists have spoken up about one developing area: artificial intelligence. Tesla and SpaceX CEO Elon Musk, Microsoft cofounder Bill Gates, and theoretical physicist Stephen Hawking have all stressed their concerns about AI and humans’ abilities to control it. Hawking told the BBC, “The development of full artificial intelligence could spell the end of the human race.”

Musk, who had tweeted that AI was “potentially more dangerous than nukes,” took action. In January, he donated $10 million to the Future of Life Institute for a research program aimed at ensuring AI benefits humans.

But Musk, Gates, and Hawking are following in the 15-year-old footsteps of Sun Microsystems cofounder Bill Joy, who in April 2000 published a nearly 12,000-word treatise in Wired: “Why the Future Doesn’t Need Us.” In it, he argued that powerful 21st-century technologies such as robotics, genetic engineering, and nanotechnology could threaten human existence. And unlike previous risks such as nuclear technology, they carried the added dangers of self-replication and accessibility to a far larger population.

“Failing to understand the consequences of our inventions while we are in the rapture of discovery and innovation seems to be a common fault of scientists and technologists,” wrote Joy. The overwhelming desire to know, he added, can cause us to overlook the fact that “the progress to newer and more powerful technologies can take on a life of its own.”

Joy emphasized caution, even “relinquishment: to limit development of the technologies that are too dangerous.” But it’s hard to say whether, in the 15 years since, anyone has heeded his advice.

The Engineer’s Role
Who should be the ones to give pause? According to a November Chronicle of Higher Education article, it should be engineers.

In “Fools for Tools,” Sujata Bhatia, P.E., assistant director of undergraduate studies in biomedical engineering at Harvard University, and John Kaag, associate professor of philosophy at the University of Massachusetts, write that engineers “need to consider and debate the far-reaching outcomes of their inventions.” That includes positing second- and third-order effects.

Further: “If engineers fail to carefully weigh the long-term impact of their innovations and neglect to provide appropriate guidance for novel devices, then engineers share the culpability if their machines are used in ways that harm the public good.”

Bhatia, elaborating later, explains why she puts this responsibility on the engineer’s shoulders: “No one has a better appreciation for the capabilities of the technology and the ways that technology might be used or misused than the engineer who designs [it],” she notes.

While engineers could just innovate and leave it to society to determine how their creations are used, she thinks they must take more leadership. “If we want to be seen as professionals who care about people and about saving lives, we have to think about the long-term implications of our technologies,” she says.

Engineers are already adept at considering failure modes, she explains. They could add a societal point of view: How could a technology, at the societal level, hurt the public’s best interests? Or, stated differently, engineers could release technologies with not just mechanical but also societal operating instructions.

Beyond the Technical
In an article for the fall 2014 Issues in Science and Technology, Carl Mitcham, a professor at the Colorado School of Mines who specializes in the philosophy and ethics of science, technology, and engineering, examines the engineer’s broader responsibility.

He notes that among the National Academy of Engineering’s Grand Challenges, “only the most cursory mention was made of the greatest challenge of all: cultivating deeper and more critical thinking, among engineers and nonengineers alike, about the ways engineering is transforming how and why we live.”

Engineers are the “unacknowledged legislators” of the world, he writes. By designing and constructing new structures, processes, and products, they influence how society lives as much as politicians. “Would we ever think it appropriate for legislators to pass laws that could transform our lives without critically reflecting on and assessing those laws?”

Yet, engineers have been told they should just be concerned about the technical, says Deborah Johnson, professor of applied ethics in the University of Virginia’s Science, Technology, and Society Program, part of the School of Engineering and Applied Science. “It’s sort of like saying, ‘You’re just a cog…. We don’t want you to think about the wheel that you’re designing.’”

But that process is critical, she continues. Whenever you build something physical, you also build something social. “[Engineers] are making society,” she says. “They are making technology, but technology is society.”

And she believes engineers have signed a social contract. “In exchange for the privileges and rights of being a member of the profession, then you owe something back…to do engineering in a way that either protects the public health, welfare, and safety or [even] benefits society.”

PEs as Ethicists
As a professional engineer herself, Bhatia says PEs are especially suited to take a leadership role in these types of discussions, because licensure demonstrates an understanding of and competence with ethics. “The public has an extra level of both trust and expectation [in] licensed professionals,” she says.

John Hall, P.E., F.NSPE, agrees. His article “I, Robot, P.E.” (March PE) called for professional engineers to lead the conversation about unintended consequences of advanced technologies such as artificial intelligence.

Professional engineers are already perceived as ethicists, he elaborates. And their education teaches them to imagine possible outcomes and failure modes, as well as to consider competing interests and find the best solution using good judgment. “We are, by education and training, problem solvers,” he says. “We should be applying our problem-solving skills to more than technical problems.”

NSPE President-Elect Tim Austin, P.E., F.NSPE, believes professional engineers could fill a leadership void, standing with the public to ask, “Just because we can do something, should we?”

He asks: “Who is in the best position to do it [if not] us?”

A Larger Trend
The responsibility to think more generally about technologies’ effects fits into an overall trend for engineering ethics, explains Gerald Galloway Jr., P.E., chair of the National Academy of Engineering’s Center for Engineering Ethics and Society advisory group. Over the last several decades, engineers have been asked to focus more on second- and third-order effects, he says—for example, environmental and social sustainability.

Rachelle Hollander, who directs the CEES and is a member of the governing board of the Association for Practical and Professional Ethics, notes that a fairly recent development in the sciences and engineering has been an “acknowledgement of the early warning function that science and engineering can play in society,” in terms of risk identification, evaluation, communication, and management.

Along with that comes an emphasis on “anticipatory ethics”—as Johnson defines it, “the attempt to get engineers and developers to think about ethical issues of the technology while they’re still developing it.”

Of course, engineers are already doing some of this. As Dan O’Brien, P.E., F.NSPE, chair of NSPE’s Board of Ethical Review, explains, engineering projects frequently use the triple-bottom-line assessment, which examines not only financial impacts but also effects on the environment and society.

And ABET’s accreditation criteria for engineering programs require students to demonstrate (in part):

  • “the ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability”;
  • “an understanding of professional and ethical responsibility”;
  • “the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context”; and
  • “knowledge of contemporary issues.”

Explicit Versus Implicit
But O’Brien and others still believe this process of long-term thinking could be made more prominent and included earlier in the process.

For instance, the Board of Ethical Review chair says that consideration of second-generation impacts by engineers could become a regular part of the project development process.

Mark Frankel, director of the Scientific Responsibility, Human Rights, and Law Program for the American Association for the Advancement of Science, emphasizes the need for engineers to consult with others in this process. “You can’t just give up and say you can’t think of any [long-term] effects,” he says.

Engineers should reach out to the communities that their products will serve, Frankel says. For instance, firms could convene outside advisory groups made up of nonengineers and community leaders.

While this adds more work and perhaps cost to development, he explains, it’s best to have a routine in place. And such efforts can reflect positively in the event of a failure; they are a demonstrated record of making a “good faith effort.”

In addition to taking initiative on their own technologies, engineers can also serve more generally as spokespeople, Bhatia believes. She says popular culture, such as the movies Iron Man and Big Hero 6, demonstrates the public’s constant tension between a fascination with technology and fear of its misuse. But engineers can stand in the forefront to help resolve that, she says, providing trusted guidance, as doctors do. Examples could include position statements on controversial technologies or reassurances that certain innovations don’t have the capabilities to be misused in ways that the public fears.

Education and Training
How should engineers prepare for these conversations? David Guston, codirector of the Consortium for Science, Policy, and Outcomes at Arizona State University, is leading the creation of a new Virtual Institute for Responsible Innovation and editing the Journal of Responsible Innovation. He says ethics curricula have previously focused on microethics, the responsibilities of engineers and other researchers to each other and to the profession, rather than macroethics, the responsibility to society at large. But that’s slowly shifting, he says.

Bhatia points out that ethics cases in engineering curricula primarily examine whether a technology worked, asking whether it was ready to be used rather than how it could be misused. But she says it’s important to train students to think about why a technology should be developed and how it should or shouldn’t be used.

Such ethical discussions should be woven into everything programs teach, says Galloway.

The University of Virginia offers proof that engineers with a broader mindset are in demand. Every engineering student at the school must take four courses in the Science, Technology, and Society program, replacing some of the usual humanities classes. Johnson believes graduates are both better employees and better world citizens. And she says recruiters seek out these students, who make excellent managers and leaders.

A Call to Action
In “The True Grand Challenge for Engineering: Self-Knowledge,” Carl Mitcham asserts that “[a]mid the [NAE] Grand Challenges must thus be another: The challenge of thinking about what we are doing as we turn the world into an artifact and the appropriate limitations of this engineering power.”

It is not enough to do things right, he says later. “We, engineers and nonengineers alike, need to consider what are the right things to do.”

Hollander stresses that engineers are critical participants in such discussions, because their expertise can help society make informed choices. In particular, she says, the public needs to hear from engineers they can trust and have confidence in, knowing that their training, experience, and priorities will be directed at enhancing the public good.

That sounds very much like professional engineers.

How can PEs contribute? Send your ideas to

Questions Engineers Might Raise

Experts weigh in on the discussions engineers and PEs could initiate about the long-term implications of technology.

Sujata Bhatia, P.E.

  • What is the societal need that this technology is addressing?
  • Can existing technologies address this need? Why is this new technology advantageous over existing ones?
  • Will this technology benefit a few or benefit everyone? If it benefits only a few, does it do so at the expense of others?
  • Are there ways in which the technology compromises human rights or the public health and safety?
  • What is the risk/benefit ratio?

Mark Frankel

  • Where is the technology most likely to have an impact? What kind of an impact?
  • Who will benefit?
  • Will the benefits change course over time?
  • Which people will be more disadvantaged by the technology than others?
  • How might the technology be used beyond the way it’s designed to be used? Will it have users not originally foreseen?

Gerald Galloway, P.E.

Ask the following of developers, proponents, and researchers who are moving technologies from R&D to implementation:

  • What don’t you know about what’s going to happen?
  • What are the uncertainties you’re worried about?

Rachelle Hollander

  • What is the likely impact on future generations?
  • What are the questions that might arise for different social groups?
  • What would happen if this new development fell into the hands of children? Of people who might want to use it to do harm?
  • What kinds of controls can be put into place to minimize the likelihood of harm? In what social circumstances are they feasible?
  • How likely is it that current and future populations will be able to manage this technology?
  • Does it satisfy the ethical maxim for “due care” (the way a reasonable person would behave in looking out for others’ safety)?

