I am a scientist, and I teach scientists research ethics.
Are scientists ethical?
Easy answer: we’re just like everyone else; the great majority of us are ethical the great majority of the time, but some of us aren’t.
More complicated answer:
From 1985 to 1990, when I was getting my PhD in biochemistry at one of the nation’s top research universities, one flush with National Academy of Sciences members and future Nobel Laureates, no one mentioned the word ethics, much less the concepts and ideas behind it, even once.
Lord, have things changed. And two stories in today’s headlines shine a bright light on these changes, on the new, big science of the 21st century.
A nutshell background on what changed in the last 20 years:

1. Money: Congress changed the rules in the late 1980s so that scientists who made research discoveries funded by the federal government (which at the time accounted for many, if not most, discoveries in the US), and their universities, could profit financially from their innovations.
2. Gene cloning: molecular biology transformed biomedical science and drew much public attention.
3. Scandal: complex scandals involving apparent misconduct that ruined careers, most notably the one involving Nobel Laureate David Baltimore, led to the development of federal oversight agencies and new education programs (one of which I lead) in research ethics.
4. Who does science: in barely a generation, basic science went from an enterprise carried out by white males with more or less the same background and cultural assumptions to a vastly diverse international and cultural smorgasbord.
Famous Harvard researcher Marc Hauser, who ironically studies the evolution of morality, has had to retract one published paper and is caught in a gnarly he-said/she-said web of university and federal investigation that I would wish upon no one. From the eventual investigation reports we will one day learn the final word on the allegations against Hauser, but those results will be buried in the back pages. The damage — to Hauser and, because of the collaborative nature of science, to his many colleagues — has already been done. What I call in my classes “the ethical cracks” — the tiny fissures in the vast but fragile ethical scaffold that allows science to function — are already there.
Skeptics of research ethics courses (of which there are far fewer now than when I started teaching them over a decade ago) scoff and say “you can’t teach ethics.” True, we can’t teach the few bad apples to be ethical, but we do make young scientists aware of what constitutes ethical behavior, how to identify and deal with ethically gray situations, and how to report unethical ones. Perhaps it was this established educational infrastructure that allowed the Hauser situation to be addressed now, rather than letting its ethical cracks spread even further (a circumstance that happens much more often in science than we like to admit).
A much more positive and exciting side of 21st-century science blares out in the lead article in today’s New York Times: “Rare Sharing of Data Led to Results in Alzheimer’s: Collaboration Between Science and Industry Seen as Model for Parkinson’s Studies.”
Here is science at its biggest and collaborative best. Here we have the vast private biotechnology and pharmaceutical enterprise—much of it spawned by the legislative changes I mentioned—collaborating with university researchers, federal funders, and even private foundations in a nearly unprecedented show of research strength.
In the traditional science model, scientists around the world compete with one another for the one Holy Grail that will cure a disease and earn them prizes and a fat, juicy patent or two. Realizing this model wasn’t working for staggeringly complex but common diseases like Alzheimer’s and Parkinson’s, a couple of honchos who control a lot of the purse strings had a different idea: let’s all collaborate on a few huge projects, share trial participants, share data, and, in perhaps the biggest ethical leap for biomedical scientists, make all the data public as soon as it’s available. Already, the approach has led to several recent breakthroughs in identifying potential pieces of the Alzheimer’s puzzle.
What if this approach, made possible only by recent stunning advances in computer communication and analysis, becomes the way science is done, or at least a model that competes with the old dog-eat-dog one? Perhaps we’d not only have fewer Hauser-like scenarios but also a model for any type of learning and innovation.