ETHICS

Raising Scientific Experts

Competing interests threaten the scientific record, but courage and sound judgment can help

Nancy L. Jones

Transforming the Culture

Although my colleagues and I are confident that students benefited from our curriculum, it takes more than a professionalism course to really nurture a scientific expert. Opportunities to improve and test one’s understanding of scientific culture and epistemology should be pervasive throughout the training experience. To refine one’s judgment requires extensive practice, a supportive climate and constructive feedback. This means that mentors, graduate programs, societies and funders must value time away from producing data. Fortunately, scientific knowledge is not the data; it is how we use the data to form and refine conceptual models of how the world works. So activities that improve scientific reasoning and judgment are worth the investment of time.

More attention must be paid to the epistemology of science and the underlying assumptions of the tools of the trade. As methods and experimental approaches become entrenched in a field, rarely do students return to the rich debates that established the current methodology. This lack of understanding comes to light when prepackaged test kits, fancy electronic dashboard controls and computer-generated data tables fail to deliver the expected results. To interpret their own research and critique that of their peers, scientists need to understand the basis of the key conceptual models in their discipline.

The best way to develop sound scientific judgment is to engage with the scientific community—friend and foe alike—to articulate, explain and defend one’s positions and to be challenged by one’s peers. This learning process can take place in laboratory discussions, journal clubs, department seminars and courses, as well as during professional-society functions and peer-review activities. As my colleagues and I learned from the evaluations of our problem-based learning course, students (and faculty) need explicit instruction on the goals, expectations and skills of these non-didactic activities. After a laboratory discussion, for example, time could be spent reviewing what was learned, giving feedback on how to develop soft skills, and providing an opportunity to collectively improve the group process. Students should be evaluated on their meaningful participation in these community activities.

Professional societies also have an important role to play in fostering professionalism among young scientists. As Michael Zigmond argued in “Making Ethical Guidelines Matter” (July–August), societies are uniquely positioned to develop effective, discipline-specific codes of conduct to guide the standards of their professions. Criteria and practical guides to authorship and peer review are important—but they’re not enough. We must pull back the veil and show how seasoned reviewers apply those criteria. Discussions among reviewers and editors about how they put the guidelines into practice are the best way to move forward ethically. Indeed, such rich exchanges should be modeled in front of the entire research community, especially for students, showing how different individuals apply the criteria to critique a paper or proposal and then respectfully challenge each other’s conclusions. Societies and funders could also provide sample reviews, with commentary on their strengths and weaknesses and on how they would be used to make decisions about publication or funding.

As graduate programs and societies put such initiatives in place, they also need to ask themselves whether their activities are actually conducive to open scientific dialogue. Sometimes the cultural climate stifles true engagement by tolerating uncollegial exchange or by allowing participants to drift in unprepared for substantive discussion. There must be a spirit of collective learning that allows students to examine the assumptions and conceptual models that lie beneath the surface of every method and technique. No question should be too elementary. Arrogant, denigrating attitudes should not be tolerated.

Finally, scientists should foster commitment to their profession and its aspirations and norms. This goal is best accomplished through frank discussions about how science really works and about the various competing interests that pull on a scientist’s obligations as author, peer reviewer and, sometimes, editor. Many students enter the community invested in scientism—living above the ice-cream parlor, if you will. Viewing life through those idealistic, optimistic lenses causes them to stumble into an epistemological nightmare the first time they try to make black-and-white truth out of their confusing data. Or, worse, they become sorely pessimistic after naively smacking headfirst into the wall of disillusionment the first time they try to publish in a prestigious journal or compete for an independent research grant. We must provide opportunities for socialization around the principles, virtues and obligations of science. Our faculty and trainees should freely discuss how they have dealt with their own competing interests and managed conflicts within peer review and authorship. All participants need to enter these conversations with a willingness to learn from others and to address how to improve the culture.

I’ll end by positing a new definition of professionalism: A scientist, in the face of intense competing interests, aspires to apply the principles of his or her discipline to support the higher goal of science—to ethically advance knowledge for the good of humankind. Professionalism takes courage, but when leaders display that courage, the journey is easier for those who follow.

Bibliography

  • Bebeau, M. J., et al. 1995. Moral Reasoning in Scientific Research: Cases for Teaching and Assessment. Bloomington: Poynter Center for the Study of Ethics and American Institutions.
  • Bush, V. 1945. Science: The Endless Frontier. Washington: United States Government Printing Office. Available online at http://www.nsf.gov/od/lpa/nsf50/vbush1945.htm.
  • Iserson, K. V. 1999. Principles of biomedical ethics. Emergency Medicine Clinics of North America 17(2):283–306, ix.
  • Jones, N. L. 2007. A code of ethics for the life sciences. Science and Engineering Ethics 13(1):25–43.
  • Jones, N. L., et al. 2010. Developing a problem-based learning (PBL) curriculum for professionalism and scientific integrity training for biomedical graduate students. Journal of Medical Ethics 36(10):614–619.



