Scholars from around the world participated in an ASU conference on Dec. 4-5, debating whether law and ethics are capable of keeping pace with science and technology and seeking potential solutions for the challenges created by the growing gap.
The conference, sponsored by ASU's Lincoln Center for Applied Ethics, was organized by Gary Marchant, Executive Director of the College of Law's Center for the Study of Law, Science, & Technology, and Lincoln's Professor of Emerging Technologies, Law and Ethics. Other organizers included Joe Herkert, Lincoln's Associate Professor of Ethics and Technology, and Brad Allenby, Lincoln Professor of Engineering and Ethics.
As developments in science and technology accelerate (the number of important scientific discoveries doubles every 20 years, and the number of patent applications filed increases 5 percent each year), the laws that regulate them are bogging down, Marchant said.
For example, the Clean Air Act, which early on was amended every two to three years, has not been updated since 1990, despite the advent of global warming and other problems, he said. The same is true of the Clean Water Act, which doesn't address the majority of today's water pollution problems, which are caused by run-off sources, he said.
"There's a sense that `we don't want to even open it up because the statute is such a mess," said Marchant, a former environmental attorney in Washington, D.C. "Congress doesn't have time."
At the same time, government regulators are nearly paralyzed due to extra mandates from Congress on rulemaking, and their agencies frequently are mired in legal challenges from those seeking to impose their own agenda on regulation, Marchant said.
Meanwhile, emerging technologies, among them nanotechnology, genetic testing and computer privacy, are largely unregulated, and as a result, public health and the environment are at risk, Marchant said.
In talking about the ethics of emerging technologies, Herkert asked if sociologists and ethicists should be involved at the research and development stage to help identify ethical issues and establish procedures for dealing with them.
"Before we turn over the ethics to the people developing the technology, we ought to see what other processes might be available," he said. "There's a long tradition of ethics in our society and a developing trend of ethics being applied to science, technology and engineering."
Herkert said humanoid robots, which will look, think and act like people, are moving from science fiction to reality. Along with developments in robot technology, scientists, technologists and ethicists are beginning to develop an ethics of robots, he said. South Korea and Japan already are working on codes of ethics for the development of robots, for the protection of both humans and robots.
Humanoid robots pose a number of ethical dilemmas relating to concepts such as moral agency, free will, human identity, social roles, and potential marginalization of humans, according to Herkert. Issues include consumer safety, product liability, and whether robots should or will ultimately have rights, as in the current debate over animal rights.
Allenby used the example of the birth of the railroads to demonstrate a technology's profound impact on society. The railroad, he said, created a modern sense of time and a new division of labor, and shifted the economic structure from local to national.
Today, scientists are working on extending human life to 150 years or more, which would affect population levels and the Kyoto Protocol and create major generational and other serious issues, Allenby said.
"How do we stop this technology, control this out-of-control juggernaut?" he asked. "Is it reasonable we are going to stop it? It's a subtle question that we haven't begun to think about yet."
The conference was the first step in a multi-year project funded by the Lincoln Center to produce innovative solutions for bringing law and ethics into pace with science and technology.