Generation National Security Leaders Fellowship
Fellowship for Ending Bioweapons Program
Environmental Health, Health Security Track, PhD
Emerging Leaders in Biosecurity Fellowship
Biotechnology Innovation & International Security Fellowship
Emergency Management and Homeland Security (Biosecurity and Threat Management), Masters
Biological Defense and Health Security, MS/PhD
Biohazardous Threat Agents & Emerging Infectious Diseases, Masters
Biodefense, MS/PhD
Belfer Fellowship in Science and International Affairs – International Security Program
ACHS Fellowship Program
AAAS Science & Technology Policy Fellowships
Scoville Peace Fellowship
The Christine Mirzayan Science & Technology Policy Graduate Fellowship Program
Youth for Biosecurity Fellowship

Callie R. Chappell
Dr. Callie Chappell (they/them) is an NSF Postdoctoral Fellow in Biology at Stanford University. Callie’s work focuses on the environmental and social implications of bioengineered organisms that live outside the lab, such as genetically modified microbes. Callie was a fellow with the Center for International Security and Cooperation (CISAC) at Stanford University, leading a project promoting community biology labs (“LABraries”) as sites for community-led biodesign innovation. Callie is also a professional artist and led an arts and bioengineering summer camp, BioJam.
Curriculum Module: Biosecurity & Bioethics Education Resource (B-BER)
Biosecurity and bioethics training can help the life sciences research community secure and safeguard biotechnology as it delivers powerful new tools, technologies, and products. However, these topics are not always well integrated into researcher training. B-BER provides tools to introduce trainees to biosecurity and bioethics concepts, along with suggestions for incorporating them into existing curricula.
Response to NITRD NCO RFI on the Development of an Artificial Intelligence (AI) Action Plan
Publication Date: March 2025
EBRC’s response to OSTP’s Request for Information regarding the development of an Artificial Intelligence (AI) Action Plan to sustain and enhance the US competitive edge in this strategic technology. AI holds a great deal of promise for engineering biology, but it also potentially introduces new risks. Without guardrails and common-sense oversight, AI could be misused to inform the creation of hazards that endanger public health and national security. Further, the US government must ensure that developers and researchers have access to the resources required to develop and deploy the technology. Our response recommends policy actions that promote a competitive, vibrant AI ecosystem that is also safe and secure.
Response to RFI on NIST AISI’s Draft Document: Managing Misuse Risk for Dual-Use Foundation (AI) Models
Publication Date: March 2025
EBRC’s response to NIST’s Request for Comment regarding its draft document, “Managing Misuse Risk for Dual-Use Foundation Models.” Protecting technologies against potential misuse is critical for both scientific advancement and public safety. We support the updated document and recommend that NIST’s AI Safety Institute continue this work by collaborating with the scientific community to: 1) better characterize risks specific to specialized chemical and biological design tools; and 2) develop targeted mitigation strategies that protect innovation while preventing misuse.