Assuming that an engineer is aware of the eventual applications of his or her work and is ethically mature enough to recognize that some of those applications are evil, destructive, or violate the rights of others, it doesn't necessarily follow that this engineer should refuse to do the work. For one, recognizing that a technology has harmful applications--even only harmful applications--and refusing to build it doesn't mean the technology won't come about anyway. If you find the applications harmful and refuse to work on the technology, others may--and likely will--step in and build it instead. So there is a game theory issue that the article ignores. If many people can profit (financially, professionally, or in skills learned) from building technology that eventually becomes harmful, then a sufficiently large number of them must refuse to work on it for it not to be developed. But shrinking the supply of willing labor raises its value, so the rewards for developing the harmful technology go up, which creates a more powerful incentive for engineers to "defect" (in the prisoner's dilemma sense) and work on the harmful tech anyway.
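To make the incentive problem concrete, here's a toy sketch (my own illustration, not anything from the article): a simple scarcity-premium model in which the wage offered to a willing engineer rises as the pool of willing labor shrinks, so the payoff for defecting grows with the very refusal that is supposed to stop the work.

    # Toy scarcity-premium model (illustrative only): the more engineers
    # refuse to build the harmful tech, the smaller the willing labor pool,
    # and the higher the wage employers must offer a "defector".

    def defection_payoff(refusal_rate: float, base_wage: float = 100.0) -> float:
        """Wage offered to a willing engineer when a fraction of peers refuse."""
        willing = max(1.0 - refusal_rate, 0.01)  # fraction of labor still willing
        return base_wage / willing               # scarcity drives the premium

    for rate in (0.0, 0.5, 0.8, 0.95):
        print(f"{rate:.0%} refuse -> defector's wage is {defection_payoff(rate):.0f}")

The numbers are arbitrary; the point is only that a partial boycott raises the price paid to whoever breaks it, which is why awareness without coordination tends not to work.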
There are also objective goods that come from doing work that eventually proves harmful. If you have a family, you can feed and clothe them. You can learn skills that will help with non-harmful work in the future, and you will meet people who may help you do non-harmful work later. Being intimately familiar with the technology, you will be able to warn others of its dangers. Whether these objective goods outweigh the potential harms of a technology is difficult to determine during its development. And where the balance is hard to judge, engineers, being practical and conservative by nature, will tend to side with the tangible benefits they can get today over the intangible costs that others may incur in the future.
I agree with the article's point that engineers need to develop a broader societal focus[1] and be mindful of the potential uses, especially the unintended ones, of their work. The problem is that awareness alone isn't going to accomplish much because a world full of aware engineers doesn't change the existing incentives.
[1] A good book on this topic is "The Civilized Engineer" by Samuel Florman: http://www.amazon.com/Civilized-Engineer-Samuel-C-Florman/dp...