Studying people as well as microbes: Ethical challenges for iGEM teams
by Jessica Scherer, Shrestha Rath, and Tessa Alexanian
on behalf of the Human Practices Committee
If you want to solve a real-world problem using synthetic biology, you can’t just study microbes. At some point, you’ll need to study people―their diverse values, opinions, and priorities―too.
You can study people by finding insights from social science research or by looking for your own answers using surveys, interviews, focus groups, and other forms of human subjects research. This post will focus on the ethical challenges your team might face if you choose the second option.
Individual people need to be treated with more respect and care than a beaker of microbes (under most ethical frameworks). In addition to the basic requirement of “don’t break the rules or the law” set out in our human subjects research policy, it’s important for your team to take some time to think about how to treat every person participating in your research with respect.
An iGEM story: catching a risky survey just in time
You may be thinking: come on, human subjects research in iGEM usually involves asking people a handful of survey questions. Is that really an ethical challenge? How could that possibly harm them? We have a story for you.
Jessica, one of the authors of this post, saw first-hand how a few survey questions could endanger the people who answered them. Her iGEM team’s project aimed to address a controversial subject: global access to birth control. To better understand the problem, they planned to survey women in Uganda, collecting basic personal information (name, age, occupation) and asking about their experiences with contraception.
What if, in the spirit of transparent and open research, Jessica’s team had uploaded their raw data to their wiki? Someone could have searched the survey participants’ names and discovered whether they used birth control. Because some people in their community oppose contraception, this would have put them at risk.
Jessica’s team didn’t notice these risks until it was almost too late. Fortunately, the team was invited to present their iGEM project to a bioethics class at their university. The students in the class brought up the risks that Jessica’s team had missed when drafting the survey, and the professor asked whether the survey had been approved by the university’s Institutional Review Board (IRB). Jessica’s team had never heard of the IRB, and didn’t know they were supposed to seek approval for surveys. Thanks to the discussion, the survey was not distributed as drafted, and Jessica’s team established contact with their university’s IRB.
The IRB helped the team rewrite their survey. They removed personally identifying questions, following the six key areas of anonymization, and added options that let participants skip controversial questions. The revised survey was approved by the IRB and sent to their contacts in Uganda. The guidance from the students in the bioethics class, the professor, and the IRB was crucial in helping Jessica’s iGEM team think through ethical data collection.
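To make the kind of change the IRB suggested more concrete, here is a minimal sketch of cleaning survey responses before sharing them: direct identifiers (like name and occupation) are dropped, a quasi-identifier (exact age) is coarsened into a range, and skipped answers are respected. The column names and binning choices here are hypothetical illustrations, not an official anonymization standard.

```python
# Minimal sketch: anonymizing survey responses before sharing.
# Column names and the ten-year age bins are hypothetical choices.

def bin_age(age):
    """Coarsen an exact age into a ten-year range (e.g. 27 -> "20-29")."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def anonymize(rows):
    """Strip direct identifiers and coarsen quasi-identifiers."""
    cleaned = []
    for row in rows:
        cleaned.append({
            # Name and occupation are omitted entirely.
            "age_range": bin_age(int(row["age"])) if row.get("age") else "declined",
            # Respect participants who skipped the question.
            "uses_contraception": row.get("uses_contraception") or "declined",
        })
    return cleaned

raw = [
    {"name": "Amina", "age": "27", "occupation": "teacher", "uses_contraception": "yes"},
    {"name": "Betty", "age": "34", "occupation": "nurse", "uses_contraception": ""},
]
print(anonymize(raw))
```

Even a simple pass like this makes it much harder to search a published dataset for a specific person, which was exactly the risk in the story above.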
Empowering people to make their own decisions
Carefully handling and anonymizing data is one important way to protect participants. What about other, non-privacy risks from your study? How will you give people the information they need to make an informed choice about whether to participate?
Seeking the free, ongoing, and informed consent of participants is key for showing both concern for their welfare and respect for their autonomy. The history of biomedical research includes many ethically suspect studies where such concern and respect were absent.
One infamous case is the Tuskegee Syphilis Study, a 40-year study of the progression of deliberately untreated syphilis in African-American men, during which researchers concealed diagnoses from participants and prevented them from seeking treatment. That study is infamous in part because it seems, from our vantage fifty years after it ended, so blatantly unethical. Surely lying to participants so you can study non-treatment of a severe disease is obviously unacceptable? There was no way for participants to make an informed choice. The Tuskegee Syphilis Study is only one case of many; in the past century, medical researchers denied malnourished Indigenous children adequate food, fed radioactive iron to uninformed pregnant women, and deliberately exposed intellectually disabled children to hepatitis.
Biomedical human experimentation is strictly prohibited within iGEM, and so teams have not had to address ethical issues surrounding human medical research. However, even non-medical research can bring up ethical issues around informed consent. For example, in 2012, researchers at Facebook and Cornell University studied emotional contagion by manipulating Facebook news feeds to show more positive or negative posts. Users did not opt into this manipulation, which affected their emotional state―users who saw fewer positive posts wrote more negative status updates. The lack of prior consent in this study was controversial, though the publisher argued that Facebook constantly experiments on its users, and running such experiments rigorously and transparently is better than the default. Other examples of ethical issues in non-medical research include ethics dumping, where research is intentionally moved into countries with few ethical restrictions, and questions like whether to seek consent when researching public social media posts or when it's okay to deceive participants about the true purpose of a study.
As these case studies illustrate, it is important to consider possible harms beyond physical harms and privacy concerns when conducting research. Are you asking people about sensitive emotional, political, or medical topics? Can you allow participants to opt out if they find certain parts of your study upsetting?
When you start thinking through these questions, you may find yourself running into issues that go beyond direct impacts on participants. For example, could the information produced by your study create harm or reinforce inequities? How does your team’s particular context (cultural, national, educational, etc.) influence your ethical intuitions, and will your study participants share those intuitions? We encourage you to think holistically about the potential impact of your work, and share your thinking with participants so they can make informed decisions.
Check yourself before you wreck yourself
Your human subjects research may need to be pre-approved by your institution, as with the IRB in Jessica’s story. This review process can take a long time, and there’s no consistently applied ethical standard. Facebook, a public company, didn’t need to have its emotional contagion study reviewed ahead of time, but in some countries a PhD student running exactly the same study would need IRB approval. These gaps in oversight seem unfair and unprincipled, especially when formal reviews slow down useful research.
We’re not here to tell you that the system is perfect! It’s just that everyone gets excited about their own projects, and in that rush of optimism, it’s easy to miss crucial ethical considerations. Many historical biomedical studies now seem terribly unethical, but the scientists who conducted them must have either been unaware of the ethical problems or believed (wrongly, to modern eyes) that the benefits of their research justified placing participants at risk.
We don’t want to pile on so much ethical complexity that you give up on human subjects research. It’s just that... humans are, in fact, complex; treating people well is important, and it’s easy to overlook something.
To guard against this, you should ask someone outside your team to check over your research plans before you start recruiting participants. A formal ethical review process is one way to get an outside view ― be sure to check whether your institution requires this, and follow the rules that apply to you!
That’s not the only way to get an outside perspective, though. Maybe your team will talk through ethical issues with your advisors. Maybe you’ll send an email to the Human Practices Committee. Maybe you’ll reach out to someone in the community of people you want to study; after all, potential study participants are probably the foremost experts in whether your plans would help or harm the humans subjected to them.
That’s our challenge to you: check yourself! Get an ethical critique before you start studying people. It might make things a bit slower, but, well, we often say that iGEM isn’t easy; it’s worth it. We know you can rise to the challenge.