Evaluation, much as we love it, has a reputation among non-evaluators for being overly technical and academic, lost in the details, hard work to wade through, and in the end, not particularly useful. Why is this?

Many evaluators were originally trained in the social sciences. There we added numerous useful frameworks and methodologies to our toolkits. But along the way, we were inculcated with several approaches, habits, and ways of communicating that are absolutely killing our ability to deliver the value we could be adding. Here are the worst of them:

  1. Writing question laundry lists. Asking long lists of evaluation questions that are far too narrow and detailed (often at the indicator level).
  2. Leaping to measurement. Diving into identifying intended outcomes and designing data collection instruments without a clear sense of who or what the evaluation is for.
  3. Going SMART but unintelligent. Focusing on what’s most easily measurable rather than making intelligent choices to go after what’s most important (SMART = specific, measurable, achievable, relevant, and time-based).
  4. Rorschach inkblotting. Assuming that measures, metrics, indicators, and stories are the answers. They are not!
  5. Shirking valuing. Treating evaluation as an opinion-gathering exercise rather than actually taking responsibility for drawing evaluative conclusions based on needs, aspirations, and other relevant values.
  6. Getting lost in the details. Leaving readers wading through data instead of clearly and succinctly delivering the answers they need.
  7. Burying the lede. Losing the most important messages by loading way too many “key points” into the executive summary (not to mention the report itself), or using truly awful data visualization techniques.
  8. Speaking in tongues. Using academic and technical language that just makes no sense to normal people.

Thankfully, hope is at hand! Breakthrough thinking and approaches are all around us. Some have been around for decades. But many evaluators just aren’t aware of them.

Here’s a challenge: seek out and get really serious about infusing the following into your evaluation work:

  • Evaluation-Specific Methodology (ESM). These are the methodologies that are distinctive to evaluation, i.e., the ones that go directly after values. Examples include needs and values assessments, merit determination methodologies, importance weighting methodologies, evaluative synthesis methodologies, and value-for-money analyses.
  • Actionable Evaluation. A pragmatic, utilization-focused framework for evaluation that asks high-level, explicitly evaluative questions and delivers direct answers to them using ESM.
  • Data Visualization & Effective Reporting. The best of the best of DataViz, reporting, and communication to deliver insights that are not just understandable but unforgettable.

A previous version of this blog post appeared in the Spring 2013 EvaluATE newsletter.

About the Authors

Jane Davidson

Jane Davidson is the author of Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (Sage, 2005) and the e-book Actionable Evaluation Basics: Getting Succinct Answers to the Most Important Questions (Real Evaluation, 2012). She is the owner/director of Real Evaluation (realevaluation.com). You can learn more about evaluation-specific methodologies in the blog she writes with Patricia Rogers at genuineevaluation.com.
