Posted by Sadie from ? (126.96.36.199) on Monday, June 23, 2003 at 1:33PM :
Nature 423, 787 (19 June 2003)
EDITORIAL: Silence of the neuroengineers
Researchers funded by a defence agency should stop skirting the ethical issues involved.
Neuroengineering is in its infancy, but already provides plenty of food for thought. Ethicists can cite a slew of projects that they have studied while trying to come to grips with the new technology. How will it affect human identity, for example? And could it one day be used to control thought processes?
Not so long ago, the idea of connecting a tiny machine to the nerves of the heart had ethicists worried — would it make us in some way less human? Today, over half a million pacemakers are implanted annually without any soul searching. As with many new technologies, the reality turned out to be less disturbing than some had speculated.
Integrating machines with the brain will be more ethically complex. One research group has shown how to control the movement of a rat by sending signals to electrodes implanted in its brain. Others have taught a monkey to control a robot arm using signals taken directly from its brain (see page 796). The scientists involved are happy to speculate about their work; some say that the robot-arm experiments, for instance, could lead to a new generation of prosthetic limbs.
But the researchers should perhaps spend more time pondering the intentions of the people who fund their work. A significant amount of US neuroengineering research is funded by the military through the US Defense Advanced Research Projects Agency (DARPA). The agency wants to create systems that could relay messages, such as images and sounds, between human brains and machines, or even from human to human.
In the long term, military personnel could receive commands via electrodes implanted in their brains. They could also be wired directly into the equipment they control. Do neuroengineers support these goals? Their research could make it happen, so they have a duty to discuss their opinions, and to answer questions from those who object to the development of such technologies. Yet when Nature talked to DARPA-funded neuroscientists, many were reluctant to debate the potential military uses of the technology, saying that the agency's goal of brain–machine interfaces was still many years off.
The agency's goal may indeed be a distant one. But as with all emerging technologies, the consequences are worth discussing now. Simply taking DARPA's money, and citing possible medical benefits, is not enough. The discussions may never achieve a consensus, but their quality and balance will improve with researchers' engagement.
© 2003 Nature Publishing Group