Ethics in Laboratory Research


Do not harm yourself or other people

Help yourself and other people

Allow rational individuals to make free, informed choices

Treat people fairly; treat equals equally, unequals unequally

Maximize the ratio of benefits to harms for all people

Keep your promises and agreements

Do not lie, defraud, deceive, or mislead

Respect personal privacy and confidentiality
Adapted from Fox and DeMarco [8]


Honesty

Honesty is science’s fundamental rule and the foundation of the scientific method and of progress in science. In the research process, scientists should be objective, unbiased, and truthful; this is what makes discovery and invention possible. Honesty applies to many aspects of research, not only data collection and analysis but also the writing of research proposals, where the truth may be stretched to increase the chances of gaining funding [10]. In other areas, scientists, engineers, or officials may exaggerate the scientific merit or economic gain of a project in order to receive support and recognition. If scientists are dishonest and fabricate, falsify, or misrepresent data, it becomes impossible to move forward and achieve the goals of science.

Dishonesty: Fabrication, Falsification, and Plagiarism

Often it is hard to tell whether deviations from good scientific practice are due to dishonesty or to error. Both can produce similar consequences, but dishonesty carries the intent to deceive. Harsh judgments and consequences are therefore reserved for dishonesty, which is a serious violation of scientific ethics, whereas honest errors are often forgiven. Definitions of dishonesty are not rigid, but organizations including the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine have definitions that involve deception, misrepresentation, fabrication, falsification, or plagiarism [11]. According to these organizations, other forms of misconduct include harassment, misuse of funds, violation of regulations, and vandalism [11].

Dishonesty in the production and analysis of data may take the form of fabrication or falsification. Fabrication is the unfounded creation of data: making up data in the absence of experimental results [1, 7]. A notorious example of fabrication involved William Summerlin, who claimed to have successfully transplanted skin from one unrelated mouse to another after using a solution that suppressed the immune response and thereby prevented rejection. In 1974, however, it was discovered that Summerlin had used a black marker on white mice to fabricate the transplantation of skin from a black mouse to a white mouse [7].

Falsification involves the manipulation or misrepresentation of experimental data and is more difficult to define and prove than fabrication [11]. It often occurs during the analysis and statistical handling of data. The most common forms of falsification involve the misrepresentation of results through trimming, fudging, and cooking of the data [12]. Trimming, or “tightening up,” of data involves the failure to report results that do not support the scientist’s hypothesis. Fudging of data occurs when scientists try to make results appear better than they are; this type of falsification includes the omission of ambiguous, outlying, or unexplainable data points. “Cooking” the data refers to designing experiments to obtain results the scientist already expects to be positive, thereby avoiding tests that are likely to yield negative results. As such, it can be difficult to draw the boundary among misrepresentation of data, poor methodology, and honest attempts to make sense of ambiguous results [13]. Since rules for analyzing data are not strictly defined, judgment must be exercised on how to trim data in order to convey clear and objective results [1].
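The effect of undisclosed trimming can be illustrated with a toy calculation (the measurements below are hypothetical, not drawn from any study): silently dropping a single inconvenient point materially changes the reported result, which is why any exclusion must be disclosed and justified.

```python
# Hypothetical measurements; 14.2 is an unexplained outlier.
measurements = [9.8, 10.1, 10.0, 9.9, 14.2]

full_mean = sum(measurements) / len(measurements)

# "Trimming": the outlier is silently removed before reporting.
trimmed = [m for m in measurements if m < 12]
trimmed_mean = sum(trimmed) / len(trimmed)

print(f"mean with all points:    {full_mean:.2f}")    # 10.80
print(f"mean with point dropped: {trimmed_mean:.2f}")  # 9.95
```

The two means differ by nearly one unit; a reader shown only the trimmed value has no way to know a data point was excluded.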

Plagiarism is considered a form of dishonesty; whether intentional or unintentional, it is regarded as intellectual theft [7]. Plagiarism occurs in a variety of situations in which one takes undeserved credit for another person’s ideas, techniques, methods, or written work. It can result from improper citation or attribution, self-plagiarism, or careless paraphrasing. It is not uncommon in science to base research on previous methods or techniques, for which credit must be given. Plagiarism may also occur during the peer review process, in which reviewers may be tempted to steal unpublished ideas. Although unintentional plagiarism, whether due to ignorance or carelessness, may be viewed as an error, steps should be taken to correct it and avoid it in the future. Many documented cases are investigated each year, and many more go undocumented; much plagiarism is thought to be unintentional [11].


Carefulness

Carefulness and caution, like honesty, promote the goals of science by avoiding errors and allowing science to move forward, innovate, and advance knowledge. Resnik cautions that errors can hinder science as much as outright lies, and errors are more prevalent than lies [7]. Errors waste resources and erode both the public’s trust and the trust of other scientists. The occasional error may be treated as an honest mistake or an episode of incompetence; however, serious and repeated errors may be viewed as negligence [14]. According to the Committee on the Conduct of Science, once errors are discovered in submitted or published work, the author should admit the mistake and publish a correction, erratum, or retraction [15].

To avoid unnecessary errors, Resnik recommends informal rules, rooted in the principle of carefulness, to guide scientific methodology (Table 16.2) [7]. First, a scientist should use controlled experiments that are repeated to confirm the results; reported and presented results should not be based on a single experiment. Second, investigators are responsible for using instruments correctly and choosing the most reliable instrument to gather data. Next, to avoid errors, all data should be recorded promptly and carefully in a laboratory notebook, a copy of which should remain in the laboratory at all times. Researchers should also cultivate a general skepticism regarding their own findings and be rigorous in confirming results. To remain skeptical, researchers must seek dialogue with colleagues to avoid self-deception and bias. Though all reported work should undergo formal review, researchers should also seek informal peer review, particularly with regard to the design of an experiment or the interpretation of data. When uncertain, a scientist should seek the expertise of a mentor or trusted colleague. Incorporating these informal rules avoids many potential experimental and methodological errors and helps ensure objectivity and accuracy in science.

Table 16.2
Informal rules for scientific methodology

Use controlled experiments

Repeat experiments to confirm findings

Use reliable instruments

Use instruments correctly and reliably

Carefully record and duplicate laboratory data records

Regularly engage in informal peer review of experimental design and data interpretation

Adapted from Pawlik and Colletti [1]

Types of Error

It is important to distinguish between types of errors in research. Practical errors include human error and experimental error. Examples of human error include using an instrument inappropriately, making mistakes in calculations, and making mistakes while recording data; human error is most commonly due to inattention, disorderliness, or hastiness in research. Experimental error is inherent in the use of an instrument and should be taken into account when reporting data and results. Theoretical errors arise from the interpretation and analysis of data, and they include self-deception and bias in analysis. These errors can occur at the start of a project, during its construction and design, or while interpreting the data. Self-deception refers to the occasion when a scientist is convinced that the results of an experiment have significance or validity even when the data do not support it. For many scientists, an entire life’s work is devoted to a specific field. Between the pressure to produce results and the loss of perspective and self-criticism, scientists may be deceived into seeing what they want to see when evaluating their data.

Biases/Conflicts of Interest

Biases are systematic flaws in research and can lead to errors. What counts as a bias is controversial; Resnik suggests that one person’s bias may be another scientist’s valid assumption or methodology [7]. Biases often originate from the political, social, and economic aspects of science. Longino asserts that biases can be minimized by freedom and openness in research and by openness to criticism [16]. Conflicts of interest can be considered a special form of bias and result in the loss of objectivity in scientific research. They originate from personal or financial interests that conflict with professional obligations. In the academic setting, potential conflicts of interest include power, tenure, funding, and publications, whereas in business or medicine they may involve self-referrals, gifts, or recruitment bonuses [17]. In themselves, conflicts of interest are not always avoidable, nor are they inherently unethical. Rather, they present situations in which a scientist must have the insight to be especially diligent when making decisions.

Academic researchers’ increasing desire to collaborate with business and industry has brought concerns over conflicts of interest to the forefront. As a result, the Association of American Medical Colleges (AAMC) proposed guidelines for dealing with conflicts of interest. The AAMC supports full disclosure, aggressive monitoring, and misconduct management [18]. Full disclosure requires revealing all information that could restrict investigators’ activities, including details of both individual and family financial and professional interests related to the research at hand. These disclosures should then be reviewed by supervisors, chairpersons, or institutional review boards [18]. The reviewers should first confirm whether a conflict of interest exists and to what degree it might affect the research. If the conflict of interest is prohibitive, the reviewers have a responsibility to remove the researcher from either the relationship with industry or the research being pursued. In academic research, it is often the institutional review committee that is responsible for developing a conflict of interest policy, determining when a conflict of interest exists, and deciding what actions need to be taken to maintain the integrity of the research environment [1].


Openness

Another guiding principle of science is openness, which creates an atmosphere of free exchange, cooperation, and collaboration. Scientists should treat one another with mutual respect and honesty to encourage the sharing of data, results, methods, ideas, techniques, and tools [7]. Peer review of another scientist’s work depends on openness and on authors being open to criticism and new ideas [19]. Openness in the scientific community promotes the advancement of knowledge, develops an atmosphere of cooperation and trust, and enables the efficient use of resources and research sites. The free exchange of dialogue between scientists prevents science from becoming dogmatic, uncritical, and biased [7].

Without openness, science becomes a society of secrecy, and consequently the public may suspect scientists of being dishonest. Although scientists strive for openness, it can conflict with the instinct to protect ongoing research, guarantee proper credit, or keep trade secrets [7]. In reality, some environments do not foster openness, such as when researchers compete for academic positions or economic resources, work for a private company, or work with classified government information. Nevertheless, as a general rule, secrecy should be the exception.


Freedom

Freedom to conduct research in all areas has been hard fought by figures such as Galileo, who championed heliocentrism; Bruno, who advocated an infinite universe; and Darwin, who conceived natural selection and evolution. These once-heretical theories are now universally accepted in the scientific world. Areas of research that have recently been the focus of debates over freedom and ethics in science include genetic engineering and cloning. Scientists should be able to conduct research on a wide range of problems and hypotheses, enabling new ideas to be pursued and old ones to be criticized [7]. Protecting scientific freedom allows scientists to validate each other’s work and therefore to validate scientific knowledge. It also allows old assumptions to be challenged, either verifying them as fact or refuting them. Overall, freedom in science allows the scientific goal of expanding knowledge to be accomplished while also promoting intellectual freedom and creativity.

Governments should be cautious in placing restrictions on actions, funding, publication, thought, and discussion. Resnik writes, “censorship, moratoriums, and other more severe limitations on the discussion of scientific ideas can have a detrimental effect on science and violate basic rights and liberties; we have good reasons for avoiding these kinds of restrictions on research” [7]. As mentioned, funding is an important restriction on science, in that research that does not get funded does not get done. Governments and institutions should therefore seek to provide equal opportunities for funding to researchers. In some situations, restrictions may be sanctioned in order to prevent violations of the rights of others or harm to people and animals.


Credit

Again, as noted by Resnik: “Credit should be given where credit is due, but not where it is not due” [7]. Credit motivates scientists to conduct research, collaborate, and cooperate with one another, and it ensures that rewards for original research are fair [20]. It also plays a role in assigning responsibility for errors and punishments for dishonesty. On the one hand, plagiarism gives no credit when credit is due; on the other, honorary authorship gives credit when credit is not due. Honorary authorship may be granted to add prestige or to help a colleague, even though that person has not made a significant contribution to the work.


Education

Educating prospective and young scientists in how to conduct good research helps ensure the future of science [7]. At the same time, scientists must also educate the general public through popular books and magazines or, more recently, through multimedia including television and the Internet. Scientists are also obligated to teach the educators who will teach science in secondary schools, which can aid in the recruitment of prospective student scientists. Forms of education include formal and informal instruction, apprenticeships, and mentoring. Mentoring is an important part of graduate education and of building ethical character.

Social Responsibility

Although some scientists reject the notion of social responsibility, holding that knowledge should be pursued for its own sake, most agree that they have an obligation to conduct socially valuable research. Within this obligation, scientists should avoid causing harm to society and attempt to produce social benefits [7]. As professionals, scientists are given authority, responsibility, and trust by the public; in return they are expected to produce socially valuable goods and services. They should also help construct scientific policy and debunk junk science [7]. Socially responsible conduct counteracts the damage done by irresponsible scientists and increases public support for the scientific community; socially irresponsible research undermines that support [21]. An additional social responsibility of scientists is education. Scientists are highly specialized and have significant technical knowledge that should be shared with the public. An educated public benefits the research community in several ways: it is more likely to support future research, and it encourages young people to study science and become prospective scientists.


Legality

In general, scientists should obey society’s laws and institutional regulations concerning their work. This includes the appropriate handling and disposal of hazardous and controlled substances, following protocols for the use of animal and human subjects, and seeking approval from internal review committees. Breaking these laws and regulations can result in arrest, confiscation of equipment, denial of funding, and the erosion of public support [7]. Yet the history of science offers multiple examples of science being hindered by sanctions and laws; Fox and DeMarco therefore argue that scientific civil disobedience can be justified [8]. There must be a balance with society so that science has the freedom to pursue knowledge in all areas while not infringing on the rights of others.


Opportunity

Opportunities to use scientific resources and to advance within the scientific profession should be provided fairly to all scientists [7]. A great deal of funding goes to large scientific projects and prestigious labs, but this should not come at the expense of deserving smaller, less prestigious groups. In addition, discrimination based on race, sex, national origin, age, or any other characteristic unrelated to scientific competence should not be condoned within the scientific community [22]. Recently, efforts have been made to recruit, employ, and reward underrepresented groups, including women and minorities. These efforts help redress centuries in which women and minorities were barred from science and from studying at universities. Scientists should promote an environment of diversity, which helps spark innovation and dialogue.

Apr 3, 2017 | Posted in GENERAL & FAMILY MEDICINE