Authors:
- K. L. Akerlof
(George Mason University)
- Todd Schenk
(Virginia Tech)
- Kelsey Mitchell
(George Mason University)
- Adriana Bankston
(Bankston Policy Consulting LLC)
- Aniyah Syl
(George Mason University)
- Lisa Eddy
(Virginia Sea Grant)
- Sarah L. Hall
(Virginia Tech)
- Nikita Lad
(George Mason University)
- Samuel J. Lake
(University of Virginia)
- Robert B. J. Ostrom
(Virginia Tech)
- Jessica L. Rosenberg
(George Mason University)
- Abigail R. Sisti
(Virginia Institute of Marine Science)
- Christopher T. Smith
(Virginia Tech)
- Lee Solomon
(George Mason University)
- Anne-Lise K. Velez
(Virginia Tech)
Abstract
Making research evidence accessible and relevant to policymakers is one way that the scientific enterprise confers direct societal benefits. With global norms increasingly promoting these types of broader impacts, new initiatives to do so, including training researchers to engage in policy, have flourished. But what should this training entail, and how would we know whether it has been effective? A review of academic and professional literature in fields such as science communication and public affairs suggests that curricula aiming to enhance the capacity of scientists and engineers to engage in policy should broadly cover effective communication skills and knowledge of public policy processes. This finding largely aligns with the learning outcomes sought by leaders of science policy training programs in the Commonwealth of Virginia, a state with among the highest number and diversity of these types of initiatives in the U.S. Training efforts could benefit from evaluation models and measures from academic literature that speak to the same types of educational outcomes. However, the lack of consistent theoretical foundations and constructs across this highly multidisciplinary scholarship reduces their utility. A common framework describing shared conceptual terms and relationships is needed to further establish the study and practice of these interventions at the science-policy interface.
Suggested Citation
K. L. Akerlof & Todd Schenk & Kelsey Mitchell & Adriana Bankston & Aniyah Syl & Lisa Eddy & Sarah L. Hall & Nikita Lad & Samuel J. Lake & Robert B. J. Ostrom & Jessica L. Rosenberg & Abigail R. Sisti, 2025.
"Learning outcomes and evaluation metrics for training researchers to engage in science policy,"
Palgrave Communications, Palgrave Macmillan, vol. 12(1), pages 1-16, December.
Handle:
RePEc:pal:palcom:v:12:y:2025:i:1:d:10.1057_s41599-025-05434-2
DOI: 10.1057/s41599-025-05434-2
Download full text from publisher
As access to this document is restricted, you may want to search for a different version of it.