Updated April 2019 for Throwback Thursday series.

The deadline for proposals to the National Science Foundation’s Advanced Technological Education program for this year just passed, so a blog about what to include in the Results from Prior NSF Support section in an ATE proposal may seem untimely. But if you’re involved in a project that will seek additional funding from NSF in the next year or two, there’s no better time than NOW to assess the quality and quantity of the evidence of your current project’s intellectual merit and broader impacts (NSF’s review criteria for proposed and completed work).

Understand the fundamentals of intellectual merit and broader impacts: In a nutshell, intellectual merit is about advancing knowledge and understanding. Broader impacts are benefits to society. If your project is mainly research, it’s likely that most of your achievements are related to intellectual merit. If you’re mostly doing development and implementation, your achievements are probably more in the area of broader impacts. But you should have evidence of both aspects of your work.

Identify your project’s intellectual merit and broader impacts: To home in on your project’s intellectual merit and broader impacts, it helps to break down these big ideas into smaller chunks. To identify your project’s intellectual merit, ask yourself, what are we doing that is generating new knowledge or improved understanding? Are we using novel research methods or investigating a novel topic to better understand an aspect of STEM education? Is our project transformative, bringing about extraordinary or revolutionary change? In terms of broader impacts, what are we doing to serve groups that have been historically underrepresented in STEM; develop a diverse workforce; create partnerships between academia and industry; enhance education infrastructure; increase economic competitiveness; or improve STEM education in general?

Identify gaps in evidence: It’s not enough to profess your achievements—you need evidence. Evidence is not the method you used to collect data (tests, surveys, observations, etc.); it’s what those data indicate (a genetic test is not evidence that someone committed a crime; the result of that test is the evidence). If you don’t have good evidence of important achievements, revise your evaluation plan and start collecting data as soon as possible. Make sure you have evidence of more than just the completion of activities. For example, if your achievement is that you developed a new certification program, then to demonstrate broader impacts you need evidence that it is a high-quality program and that students are enrolling, graduating, and getting jobs (or at least evidence as far down the outcomes chain as is reasonable at this point). Plotting your evidence on a logic model is a good way to determine whether you have sufficient evidence of outcomes as well as activities and outputs.

If you find gaps that will impair your ability to make a compelling case about what you’ve achieved with your current grant, update your evaluation plan accordingly. When you write your next proposal, you will be required to present evidence of your achievements under the specific headings of “Intellectual Merit” and “Broader Impacts” – if you don’t, your proposal is at risk of being returned without review.


To learn more, check out these resources:

NSF Grant Proposal Guide (this link goes directly to the section on Results from Prior NSF Support): http://bit.ly/nsf-results

NSF Merit Review Website: http://www.nsf.gov/bfa/dias/policy/merit_review/

Changes to NSF Merit Review Criteria: https://www.nsf.gov/about/transformative_research/merit_review_criteria.jsp

NSF Examples of Broader Impacts: http://www.nsf.gov/pubs/2002/nsf022/bicexamples.pdf

Perspectives on Broader Impacts: http://www.nsf.gov/od/oia/publications/Broader_Impacts.pdf

National Alliance for Broader Impacts: http://broaderimpacts.net/about/

Materials from our ATE PI conference workshop on this topic, including presentation slides, worksheets, and EvaluATE’s Results from Prior NSF Support Checklist: http://www.evalu-ate.org/library/conference/pi-conference/

About the Authors

Lori Wingate

Executive Director, The Evaluation Center, Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She is co-principal investigator of EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.