Digital Science

Digital Science Webinar: Articulating Research Impact – Strategies from Around the Globe

posted on 2016-08-12, 12:40 authored by Digital Science, Jonathan Adams, Stacy Konkiel, Daniel S. Katz

Last year, Jonathan Adams and his Consultancy Group at Digital Science created a searchable database of impact case studies, with descriptive metadata, from the United Kingdom's 2014 Research Excellence Framework (REF).

Jonathan shares some interesting and diverse examples of impact from the case studies database, and speaks to the expectations administrators have when selecting tools to help them collect key indicators of impact, pinpoint that impact, and then report on it accurately.

Building upon Jonathan's insights, Stacy Konkiel examines specific examples of impact that weren't submitted in the REF2014 impact case studies but that can showcase important types of impact, such as the influence of research on public policy and technology commercialization. She discusses how tools like Altmetric can help researchers easily discover where their work is making a difference in academia and beyond.

Daniel S. Katz discusses how reviewers at the National Science Foundation (USA) weigh the "intellectual merit" and "broader impacts" criteria for funding, and in particular how metrics might help applicants understand their impacts in these areas. Dan also talks about how reviewers might use qualitative and quantitative altmetrics data to inform their peer reviews of grant applications. He addresses many of the salient questions around this use of metrics, for example: do reviewers take metrics seriously, and what types of metrics are of most value to them?
