The May 13 Group

May 17 2020

Evidence for Engagement


It’s funny how things work out sometimes. 

Tamara Hamai and I have been sowing the seeds for our new program, Evidence for Engagement, for months. Our partnership happened so organically – a meeting of the minds between two evaluators who share experience with, and a passion for, organizations that serve youth and families. We’d been exploring the best way to support these organizations and help them use evaluation to improve both their access to funding and their service to the children and families they reach.

Then COVID hit. 

The pandemic has caused all of us to pause and re-evaluate how our work fits into a very new, very different reality. Tamara and I know that small organizations, especially those who work in schools, are struggling right now. Their access to the people they serve has been essentially cut off. We realized that organizations may need our help even more than before. 

Our solution: We’re running a totally free, three-week email series that will help small youth- and family-serving organizations build their evidence base (which is required under the Every Student Succeeds Act for any organization receiving federal education funds). Through videos, worksheets, frameworks, and success stories, Tamara and I will walk participants through the process of becoming evidence-based organizations and help them see this as an opportunity, not a burden. 

The goal: We want to help vital, community-based organizations plan for the future, open themselves up to new opportunities, and become more sustainably funded. We’re hoping that this opportunity will help them better serve youth and families, not only during this difficult period of time, but also for a long time afterward. 

For us, this is also about equity. We know that for many community-based, minority-owned organizations, budgeting for evaluation is out of the question. We also know that these grass-roots organizations are having a profound impact on their communities — and that their communities need all the support we can give. We’re hoping that we can get more small, local organizations approved as evidence-based programs in their districts and begin to level the playing field. 

If you think this program will benefit you and your organization, sign up! If you know of someone else who could use this support, encourage them to join. Feel free to share this link widely: bit.ly/evidence4engagement

Written by cplysy · Categorized: engagewithdata

May 15 2020

Applying the JCSEE Program Evaluation Standards to Real World Practice

 

To skip right to the free guide, check out our New Products page.

Many evaluators will already be familiar with the Program Evaluation Standards developed by the Joint Committee on Standards for Educational Evaluation (JCSEE). For those newer to the field, take comfort in knowing that evaluation has this set of Standards to guide your way forward. The Standards provide guidance both for evaluators in planning and implementing their program evaluation projects, and for evaluation users in knowing what to expect from the evaluation process and products.


Evaluation practitioners come to their work through diverse academic and practice backgrounds. We may identify primarily as evaluators, or as a myriad of other job titles: program manager, executive director, assessment coordinator, quality improvement director, research assistant or data scientist, to name a few. The Program Evaluation Standards help to bring all of that professional diversity together into a field that has a common language and common expectations.

Developed and revised by experts, and with sponsoring organizations including the American Evaluation Association and the Canadian Evaluation Society, the Standards are published in a comprehensive guide that we recommend all evaluators keep in their library.

Through developing and delivering evaluation training, we know the value of short guides for translating concepts to practice. That’s why we developed this free resource that helps evaluators reflect on whether and how they are applying the Standards to their practice.

Our 6-page resource provides evaluators with reflective questions for each of the Standards. We suggest reading through these questions as part of your evaluation planning process, or using them to guide a self-reflective exercise after your evaluation project has concluded. For anyone working toward their Credentialed Evaluator designation through the Canadian Evaluation Society, this guide will support you in considering the Reflective Practice competencies.




Reference: Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2010). The Program Evaluation Standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Corwin Press.

 

Written by cplysy · Categorized: evalacademy

May 13 2020

National Evaluation “Capacity and System” in the 2030 Agenda

Part of the 2030 Agenda and the Evaluation Agenda 2020 is focused on developing National Evaluation Capacity (NEC) and National Evaluation Systems (NES). Here we try to clarify these two concepts:

A. National Evaluation Capacity (NEC) is the collective capacity of the national evaluation system (1) to achieve, align, and maintain its objectives, structure, processes, culture, human capital, and technology (2) in order to produce evaluative knowledge that informs practice and decision-making, so as to (3) improve effectiveness and accountability.

B. A National Evaluation System (NES) is (1) the set of (a) institutions, (b) people, and (c) activities, together with (d) the policies, procedures, and relationships that link and guide them, (2) involved in the demand for, supply of, and use of evaluation, (3) to support (a) accountability, (b) learning, and (c) decision-making.

The Evaluation Agenda 2020 identified four dimensions of a National Evaluation System: (1) an enabling environment, (2) institutional capacities, and (3) individual capacities for evaluation, as well as (4) the integration of these three dimensions.

The steps to strengthen the enabling environment for a National Evaluation System are: (1) assess the need for evaluation, (2) map stakeholder interest, (3) identify pathways for growth, and (4) create partnerships, associations, and/or interest groups.

 

 

Written by cplysy · Categorized: TripleAD

May 13 2020

Evaluation and COVID-19

Social media is not a monologue. It’s a collection of voices seen (or unseen) based almost entirely on who you do or do not follow.

But the world we see is based, as it always has been, on the people who surround us. Historically this has been determined primarily by where we live and the communities where we interact. But increasingly, what we see is based at least somewhat on who we choose to follow in the digital realm – and who we choose to invite into our own conversations.

I am an on-and-off lurker* on the American Evaluation Association’s EvalTalk listserv and recently stumbled onto this comment by Doug Fraser.

It raises the same question Bill Fear has tried to raise a couple of times, namely: why, at this of all times, when the world faces the biggest and most vital evaluation challenge in its recent history, is the evaluation profession missing in action?

I don’t see it that way.

I see evaluators across the globe stepping up. They are using their approaches, methods, and other tools in their toolkits to make contributions. And they are using their voices to share their perspectives and guidance across the globe.

In this post I wanted to create a compilation of some of those viewpoints. And as I collected, I found that I could have kept going and going.

If you have read (or written) something recently that inspired you, please share in the comments.

*EvalTalk has a long history, but listserv technology has a way of elevating the perspectives of those members who prefer shouting to discourse. Consequently, I find many of the conversations lacking sufficient diversity in both participation and perspective. But I digress.

When Was There Certainty?

The pandemic has laid bare that which many of us (meaning people of the global majority that live in the United States) have always known: this country is designed to maintain a power dynamic that privileges white male power and wealth at the sacrifice of most everything else.

Jara Dean-Coffey in A note from EEI’s Director: “I have feelings.”

On May 13, at 12PM Eastern/3PM Pacific, Jara will be joining me for an Eval Central UnWebinar to talk about being of service when you are not essential. Join us!

The Lines Between Work and Life and Life and Work

The resources designed to help us adapt to Covid-19 don’t match up with our lives right now. Every day, I sit down at my kitchen table — the same table where I (used to) host dinners and put together puzzles — in front of a makeshift workstation where I do my job. My living space is also my workspace. We’re managing a whole new definition of work-life balance right now, and it turns out work is part of our lives.

Alissa Marchant in 18 Resources helping me in work and life with Covid-19

Don’t Overthink, Just Do

Bio: @timbidey/@traversepeople – Tim is an experienced qualitative researcher with a passion for helping charities explore what works (and what doesn’t) and why, to inform their project design and practice.

It’s clear that many of the voluntary and community sector organisations that Traverse works with are struggling at the moment in the UK. Charities have had to adapt to continue delivering frontline services or develop new ones to meet emerging needs in a world of new public health restrictions – all amid a catastrophic loss of funding and, in some cases, lack of staff where people have been furloughed.

In some cases, evaluation has fallen fast down the list of priorities – but it’s important to remember that ‘evaluation’ in itself is not a homogenous practice. Sure, now is not the right moment for continued impact evaluation of multi-year programmes, but evaluation has many faces and need not be so traditional or comprehensive.

My advice to organisations has been to keep it simple. Evaluation is so often seen as a ‘mystifying practice’, but now, more than ever, it’s better to collect something than not collect anything through fear of being seen as unsystematic.

After all, evaluation’s value lies in its utility – it needs to serve the information needs of its users. Right now, the needs of voluntary and community sector organisations demand real-time data to inform weekly decisions about delivery, rather than demonstrating the differences that they’ve made for funders or members of the public. 

Evaluation for this purpose does not need to be theory-based with perfectly rounded edges; it just needs to capture ‘good enough’ data about the essentials on a regular basis. For new or adapted services, these might include: What is the problem? Who is affected, and how? Who are we reaching or not reaching – and why? What do people think? What differences, if any, are we making? What do we need to do to improve?

A sense of intended outcomes, basic monitoring data, simple service user feedback or even reflective, anecdotal data from staff can all provide ‘good enough’ insights into these questions for the situation at hand – so long as people remain honest with themselves (and others) about how insights were generated and what limitations sit behind them. 

So don’t overthink, just do.

Tim Bidey of Traverse is part of the FreshSpectrum Panel of Experts. These were his words.

Becoming Developmental Evaluators

All evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems, preparing for the unknown, for uncertainties, turbulence, lack of control, nonlinearities, and for emergence of the unexpected. This is the current context around the world in general and this is the world in which evaluation will exist for the foreseeable future.

Michael Quinn Patton in the Evaluation Implications of the Coronavirus Global Health Pandemic Emergency

Because Our Decisions Have Consequences

 Some of us want to do everything we can to stop the spread of the pandemic and minimize the overall harm it will cause. Others of us are more concerned with managing the indirect effects of the crisis on communities or causes we care about. Still others of us are just trying to figure out the role we can and ought to play. All of us, though, can benefit from approaching these challenges with thoughtfulness and rigor.

Ian David Moss in Deciding Well in Tumultuous Times

Changing Definitions

But now, with the uncertainty of what visitation will look like over the coming months and potentially years as museums phase into reopening with limitations on visitor capacity and new social distancing measures, I wonder what does a “representative sample” mean now?  

Katie Chandler in Sampling: What does “representative” mean during and after coronavirus?

Who is Afraid of Rigor?

Bio: @b3consults/www.b3consults.com  Betsy brings all the tools of data & program evaluation in harmony with the heart and intuition-led world of coaching, to increase the impact of results-driven organizations.

My gut reaction, though I’m scared to put it out there, is twofold. One, what about the organizations that were ready for the pandemic? Two, why shouldn’t we see a call for increased rigor?

To the first: I believe some organizations planned for something along the lines of a global pandemic.  As early as 2012, some organizations were making the case that a COVID-19 like event was coming.  The crisis was foreseen, and some organizations were ready for it. 

To the second: I’ve seen it implied that we should be more adaptive and iterative – in a sense, softening the rigor as organizations lean into change and adapt.

But what if COVID-19 isn’t a one-time event (and most experts will tell you it isn’t)? And what if our ability to measure, with a high degree of rigor, how organizations served critical populations during this time is crucial to informing what we need to do next time?

I’m an unlikely architect of highly rigorous evaluation design; I tend to focus in the PrEvaluation space. However, when I think of my clients – the ones who know that their ability to rise up and serve critical populations well NOW will prevent further exacerbating equity issues in the future – I want to hear about the kind of evaluation for which they would strive. And I would hesitate to assume they want less rigor, or that they are afraid of rigor in a crisis.

Ask yourself: what if they crave rigor right now?

Betsy Block of B3 Consults is part of the FreshSpectrum Panel of Experts. These were her words.

Responsible COVID-19 Data Visualization

Amanda said, “There are different points in which we make decisions about how and what we visualize, and then how we publish and share. What are we creating and doing more for our own exploration and understanding? And what are we doing so that we can share it with the public to help others make sense of information?”

From Ann K Emery’s interview with Amanda Makulec in Visualizing COVID-19 Data Responsibly: An Interview with Amanda Makulec

Evaluation Contingency Plan

We often provide funders with an evaluation plan that assumes a best-case scenario and prioritizes in-person interactions. Rarely do plans require the evaluation team to offer contingency options. As many of us tailor methods to respond to social distancing and travel recommendations, we’re switching to virtual interviews and other web-based data collection methods. Building contingencies into future evaluation plans will leave us better prepared to pivot and could save the time and resources spent creating post-hoc plans.

Martena Reed in Reflex or Reflection: Three Lessons for Evaluators Amid COVID-19

Changing Your Data Strategy

Consider which data collection activities will help you gather data that is useful – and that you will actually use.

Michelle Molina in her Video on Nonprofit Data Adjustments to COVID-19

Forget Returning to Normal

So no, I don’t want us to return to normal. I want us to use this as an opportunity to change, to create systems and social structures that create deep and lasting equity and a world where we work together for the common good. One can dream, right? If anything, this crisis should teach us that we are all connected.

Ann Price in There are words I really hate right now.

The Importance of Our Work

We work to support better evaluation globally. Good evaluation helps people identify the information they need and make sense of it.  It helps inform decisions about what to do and how to improve results. Good evaluation is essential to guide the best use of resources and to ensure accountability and learning. During this pandemic and in the post-pandemic world our work is more important than ever.

Patricia Rogers in BetterEvaluation COVID-19 Statement

Compensating for a Lack of Monitoring

A real-time evaluation (RTE) is designed to provide immediate (real-time) feedback to those who plan or execute a project or program, so they can make improvements. This feedback is generally provided during the evaluation fieldwork, rather than afterwards.

Carlos Rodriguez-Ariza in La evaluación en tiempo real en emergencias (Real-time evaluation in emergencies)

Access for On-Site Data Collection

On 1 April 2020, USAID and IDEAL hosted a webinar on ‘Challenges and Strategies for Monitoring and Evaluation (M&E) in the Time of COVID-19’. The virtual meeting was attended by 500 M&E professionals.

The participants of the webinar completed a poll that yielded the following results.

Ann-Murray Brown in A New Dawn: Monitoring and Evaluation during COVID-19

Before Expanding Boundaries

This is the smallest visible system (SVS) in which you can make a difference. Once you can act wisely on this system, you can expand the boundaries and scope to work larger.

Cameron D. Norman in Acting in Complex Times

Looking to the Desired Present

I’m looking to the desired present instead of a desired future. Not because I have no hopes or aspirations for the future, but because I don’t find it helpful right now to aim for something I can’t see. I don’t know what the future will hold. I don’t know where this moment goes. I’m hoping there’s a future out there so different from this one that I can’t even imagine it fully much less trace a path to it by design. All I want to do is find the best part of whatever moment I am in, and work with that.

Carolyn Camman in Entering the Clearing

Written by cplysy · Categorized: freshspectrum

May 13 2020

Strategic Learning and Evaluation – What Boards Need to Know

 

Recently I was asked by a client about an evaluation literacy course for its board. The client’s board members had just attended a strategic planning day and, through that discussion, felt they needed education on evaluation and metrics. On one hand I thought, “bravo, they want to know more about evaluation!”; on the other hand I thought, “shit… I’ve totally failed them as their evaluator – what have I been missing?”

Boards need quality information to make strategy and leadership decisions; the reality, however, is that this board wasn’t getting the information it needed to inform its decisions. As their evaluator, it is my responsibility (and also my opportunity) to show them the way forward, so they are no longer left with answers that are “a definite ‘maybe,’” but instead have data and insights they can use to inform their decision making. This means a more systematic, coordinated, and intentional approach to evaluation and learning – a strategic learning and evaluation system (SLES), as described in FSG’s Building a Strategic Learning and Evaluation System for your Organization. The course I developed therefore focused not only on evaluation literacy, but also on how evaluation can support a SLES. The course has three overall objectives for board members:

  1. Understand the necessity of and advocate for strategic learning;

  2. Understand the basics of evaluation and how it can support strategic learning; and

  3. Begin developing the building blocks for a strategic learning and evaluation system.

Here are some of the key learnings and action imperatives for the board: 

Understand that evaluation is one piece of the learning pie

While evaluation is important for learning and improvement, it is only one of many information-gathering approaches that can be used to inform decision making about strategy. Organizations also collect information through performance measurement, audits, research, case studies, discussions at the water cooler and a number of other ways. So, if the ultimate goal is for organizations to learn and use that information to improve, boards need to shift thinking from “leading with evaluation to leading with learning” (Centre for Evaluation Innovation).

Evaluation is one of many ways to gather information to inform decision-making and learning. Others include performance measures, monitoring, audit, research, cost effectiveness analysis, and case studies.


Understand and advocate for strategic learning

I am lucky that my client’s board has a desire to learn more about evaluation and how it can support strategy development. The Centre for Evaluation Innovation recently conducted a survey on evaluation and learning practices in foundations and found that senior managers often communicate support for evaluation, but their behaviours do not demonstrate it. So what board behaviours would demonstrate support for evaluation – and, more specifically, for evaluation for strategic learning?

The Centre for Evaluation Innovation produced a report titled Evaluation to Support Strategic Learning: Principles and Practices. In it, they explain that “designing data collection and evaluation specifically to support strategy decisions requires shifts in thinking about what questions get asked, the role the evaluator plays, how data collection is timed, and the framing of the findings” (pg. 3). They go on to articulate nine principles of evaluation for strategic learning that boards can advocate for within their organization:

  1. Is a support for strategy

  2. Is integrated and conducted in partnership

  3. Emphasizes context

  4. Is client focused

  5. Places high value on use, and helps to support it

  6. Draws data to inform strategy from a wide variety of sources and methods

  7. Takes place within a culture that encourages risk taking, learning and adaptation

  8. Is flexible, timely and ready for the unexpected

  9. Is constructivist

Understand the basic evaluation terms and steps

Part of advocating for strategic learning is understanding basic evaluation concepts and terms. If learning and evaluation efforts are to inform an organization’s decision-making practices, then boards need a clear vision for evaluation – what it is and is not.

I’m not going to lie – I don’t believe in reinventing the wheel, and Chris Lovato and Kylie Hutchinson put together an Evaluation for Leaders course. Much of what I covered in the evaluation basics module of my course follows what they outline for evaluation terms, types and steps. However, as Chris and Kylie say in their course:

 

“Leaders and decision makers don’t need to be evaluation experts, just expert supporters and users.”

 

So, in my course there is an emphasis on the first evaluation step – focus. For boards, this step is important to understand. An evaluation can’t be all things to all people – focusing it clarifies who needs what information and how they will use it.

Understand what is credible evaluation evidence

Thinking has changed on what constitutes “credible” evaluation evidence. Many board members of this organization come from science backgrounds in which RCTs (randomized controlled trials) are considered the gold standard for evaluation evidence. This is a common misconception, but one that is particularly important to address with board members. As the Evaluation Yoda, Michael Quinn Patton, states:

 

“Despite the recognition more than 35 years ago that the reductionist approach to complex problems is likely to fail, many still persist in believing that we must rigorously apply the scientific method to problems in medicine and public health.”

 

If a board is trying to impact systems and to shift the conditions that hold problems in place (i.e. systems change), then it is important that the board shift its thinking from measuring and proving against some sort of fixed model to understanding and improving – in other words, that it adopt a systems thinking lens. The choice of methods to evaluate that change comes back to how appropriate the methods are given the purpose of the evaluation, the questions it needs to answer, and how technically adequate the findings are given the time and cost constraints. As the United States General Accounting Office Program Evaluation and Methodology Division (1991, pg. 17) puts it:

 

“A strong study is technically adequate and useful – in short, it is high quality”

 

Know what you want/need and communicate it to your organization

If a board is not clear on what information it needs to inform its decisions, you can be sure the rest of the organization won’t be either. A board gets a lot of information, but that information may not:

  • Contain the right content,

  • Be presented in a useful and usable format,

  • Arrive on time (i.e., it may be received only after a decision has been made), or

  • Be connected to organizational strategy – which means the findings aren’t getting used (or at least not fully).

An easy first step to enhance use of evaluation findings is for boards to make their timelines and reporting preferences known. A more difficult next step a board should consider when trying to enhance usability of findings is to implement a SLES.

According to FSG’s Building a Strategic Learning and Evaluation System for your Organization, a SLES contains:

  1. A clear vision for evaluation,

  2. A culture that fosters individual, group and organizational learning,

  3. A compelling and strong strategy,

  4. Coordinated evaluation and learning activities, and

  5. A supportive environment.

Bottom line – a SLES will provide guidance and align organizations on who, what, when, where, why and how to measure and report.

I left my client a lot to chew on. As I mentioned above, implementing a SLES will be difficult, but ultimately should provide the board with an evaluation strategy that increases the value of evaluation for its organization.

Interested in learning more? Sign up for our newsletter and we’ll notify you when the online version of this course is available.




 

Written by cplysy · Categorized: evalacademy



Copyright © 2026 · The May 13 Group · Log in
