
The May 13 Group



May 18 2020

Utilization-Focused Evaluation: Direct Uses and Users Defined

At times we may be tempted to think that every evaluation is focused on use, but that is not the case. By "Utilization-Focused Evaluation" we mean, specifically, an approach with systematic steps, in which the uses and users of the evaluation are well defined:

Utilization-Focused Evaluation (UFE), developed by Michael Quinn Patton, is an approach based on the principle that an evaluation should be judged by its usefulness to its intended users. Evaluations should therefore be planned and conducted in ways that enhance the likely use of both the findings and the process itself to inform decisions and improve performance.

UFE has two essential elements.

First, the primary intended users of the evaluation must be clearly identified and personally engaged at the start of the evaluation process to ensure that their primary intended uses can be identified.

Second, evaluators must ensure that these intended uses of the evaluation, by the primary intended users, guide all other decisions made about the evaluation process.

Rather than focusing on general and abstract users and uses, UFE focuses on real and specific users and uses. The evaluator's job is not to make decisions independently of the intended users, but to facilitate decision making among the people who will use the evaluation's findings.

Patton argues that research on evaluation demonstrates that: "Intended users are more likely to use evaluations if they understand and feel ownership of the evaluation process and findings [and that] they are more likely to understand and feel ownership if they have been actively involved. By actively involving primary intended users, the evaluator is laying the groundwork for use."

Utilization-Focused Evaluation (Patton, 2008, Chapter 3).

Written by cplysy · Categorized: TripleAD

May 18 2020

Collective Impact Forum (Lessons Learned)

Written by cplysy · Categorized: connectingevidence

May 18 2020

Practical Evaluation Tips in a Time of Crisis

Hi everyone-

Today I am joined by Jenn Ballentine of Highland Nonprofit Consulting to talk about, what else, evaluation in the time of COVID-19. Granted, my last blog was a bit of a rant, so today I would like to strike a more positive and helpful tone.

To tell you the truth, some of the conversation around data collection during the pandemic has me a little squirmy because it has felt kind of opportunistic. I don’t think rushing out to survey people when they are really worried and anxious feels helpful, or frankly ethical.

But we are evaluators, so we do believe evaluation is important and we just can't stop doing what we do. I am a community psychologist and Jenn is a public health professional. We believe in a public health approach to prevention and in systems-level change. If this pandemic should teach us anything, it is that we are all connected. Systems-level change is needed now more than ever to correct the inequities in our society, so evident in the disproportionate impact of COVID-19 on communities of color.

Adaptions in the Time of Crisis

Sanjeev Sridharan recently wrote a thoughtful and poignant piece called Adaptions and Nimbleness in the Time of Crisis: Some Questions for Evaluators. In it, he observes that both program implementers and evaluators must now think about how to adapt. He raises a set of questions for evaluators to consider, and I urge you to read the article for yourself.

Today we would like to address the nonprofit and program implementers and provide some practical and feasible tips, inspired by some of the issues he raises.

Jenn and I are evaluating a federally funded teen pregnancy prevention program that, for the last year, has been implemented at 5 community-based centers for teenage boys and girls. I am also the evaluator for several Drug-Free Coalitions and Alcohol and Substance Abuse Prevention Programs, all of which have a school component. Jenn serves as the evaluator for school-based sexual health education programs facilitated by a statewide training and advocacy organization.

As was to be expected, nearly all programming, and thus data collection, stopped in mid-March. This left Jenn and me wondering what the heck we were going to evaluate beyond the data we had already collected this year.

The technical assistance from funders, for the most part, centered on four specific questions:

  1. What were your intended enrollment numbers, what are your actual numbers, and what are the reasons for these differences?
  2. What is the status of your programming and how has that changed?
  3. How has data collection changed (number of pretests/number of posttests), and how were participants affected (e.g., missed content, sessions provided out of order, etc.)?
  4. How will the program use Continuous Quality Improvement (CQI) strategies to document and learn from the events?
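The enrollment and data-collection questions above come down to simple planned-versus-actual arithmetic. As an illustrative sketch only (the function names and all figures below are hypothetical, not drawn from any funder's template), a program could tabulate its answers like this:

```python
# Hypothetical planned-vs-actual summary for a funder report.
# All numbers here are invented for illustration.

def enrollment_summary(intended, actual):
    """Return the shortfall and the percent of the enrollment target reached."""
    shortfall = intended - actual
    pct_of_target = round(100 * actual / intended, 1) if intended else 0.0
    return {"intended": intended, "actual": actual,
            "shortfall": shortfall, "pct_of_target": pct_of_target}

def retention_rate(pretests, posttests):
    """Share of pretest participants who also completed a posttest, as a percent."""
    return round(100 * posttests / pretests, 1) if pretests else 0.0

print(enrollment_summary(intended=120, actual=87))  # shortfall of 33, 72.5% of target
print(retention_rate(pretests=100, posttests=64))   # 64.0
```

Numbers like these answer questions 1 and 3, but as the next paragraph argues, they are only the skeleton of the report; the narrative behind the gaps still has to be documented.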

What is missing here is the story, as Sridharan points out in his first question: "What are exemplars of good evaluation stories related to the adaptiveness/nimbleness of specific interventions." Yes, we need to understand changes regarding what was planned versus what was done, but we need the why in order to tell the entire story. When did they have to close their doors, and why did they make that decision? What happened to staff, and why? And as a result of the situation, were program staff able to pivot, and if so, in what way? For example, did program staff decide to shift from in-person to online delivery?

One of my clients has shifted, rather nimbly I might say, to online meetings with its youth advisory committee. Staff are taking notes about their discussions and developing interventions they can deliver online via social media. Similarly, another client that trains health and physical education teachers to implement comprehensive sex education offered to facilitate virtual lessons for one new district in an effort to ensure that students received this valuable information.

Another of Sridharan's questions is "Are there examples of evaluations that have taken a developmental approach to enhance the coordination at this time of the crisis?" He observes that the "pandemic has highlighted the need to better understand the connections between the intervention and its underlying systemic contexts/supportive structures." During a time of crisis, coordination can be improved, perhaps even accelerated, or it can break down altogether.

Some school systems, for example, have enlisted bus drivers, community volunteers, and even local law enforcement to deliver food to students eligible through the National School Lunch Program. Some have expanded food distribution beyond those eligible through these federal programs. Other school systems have maintained the status quo, requiring families and guardians to drive to school, with the eligible children present, to collect the food. Those without transportation, or without a car large enough to transport the whole family, were out of luck.

There are a lot more gems to unpack (like the dynamics of vulnerability), but we will end with this question posed by Sridharan: “Can a focus on a minimal set of components needed to produce change help enhance a focus on meeting the needs of the disadvantaged given limited resources?”

For our teen pregnancy prevention program, we can't even imagine where to start on a minimal set of components for this implementation fidelity evaluation. How do you deliver an evidence-based, comprehensive teen pregnancy prevention program virtually? Is it even ethical to do so with parents or siblings in the next room, or even the same room? What about students without internet access, laptops, or other devices, or when access to devices must be shared by multiple youth?

Looking Forward

We are pretty sure that six months from now, funders will be asking nonprofits what happened. What is the program implementer to do? We think it's critically important to document the changes programs made and the various ways in which the disruption affected their organization and the people they serve. But program staff are busy people, especially during times of crisis. Evaluators can help the nonprofits they serve by helping staff document the changes they made and why they made them. Evaluators need to stress how this information will be useful when reporting to funders, partners, board members, and others. The learning that comes from this process can help the organization plan for future disruptions.

We developed a guide to help with this process. Just let me know you want the guide and I will send it to you. Depending on the needs of your client and their situation, these questions can be adapted in a variety of ways. You might want to change the order, eliminate some questions, and add others. Do let us know what you think and whether you find it useful. Stay safe and well!

Written by cplysy · Categorized: communityevaluationsolutions

May 17 2020

Evidence for Engagement


It’s funny how things work out sometimes. 

Tamara Hamai and I have been sowing the seeds for our new program, Evidence for Engagement, for months. Our partnership happened so organically – a meeting of the minds for two evaluators who have experience with, and a passion for, organizations that serve youth and families. We'd been toying with the best way to support the organizations we serve and help them use evaluation to improve both their access to funding and their service to the children and families they reach.

Then COVID hit. 

The pandemic has caused all of us to pause and re-evaluate how our work fits into a very new, very different reality. Tamara and I know that small organizations, especially those who work in schools, are struggling right now. Their access to the people they serve has been essentially cut off. We realized that organizations may need our help even more than before. 

Our solution: We’re running a totally free, three-week email series that will help small youth- and family-serving organizations build their evidence base (which is required under the Every Student Succeeds Act for any organization receiving federal education funds). Through videos, worksheets, frameworks, and success stories, Tamara and I will walk participants through the process of becoming evidence-based organizations and help them see this as an opportunity, not a burden. 

The goal: We want to help vital, community-based organizations plan for the future, open themselves up to new opportunities, and become more sustainably funded. We’re hoping that this opportunity will help them better serve youth and families, not only during this difficult period of time, but also for a long time afterward. 

For us, this is also about equity. We know that for many community-based, minority-owned organizations, budgeting for evaluation is out of the question. We also know that these grass-roots organizations are having a profound impact on their communities — and that their communities need all the support we can give. We’re hoping that we can get more small, local organizations approved as evidence-based programs in their districts and begin to level the playing field. 

If you think this program will benefit you and your organization, sign up! If you know of someone else who could use this support, encourage them to join. Feel free to share this link widely: bit.ly/evidence4engagement ​

Written by cplysy · Categorized: engagewithdata

May 15 2020

Applying the JCSEE Program Evaluation Standards to Real World Practice

 

To skip right to the free guide, check out our New Products page.

Many evaluators will already be familiar with the Program Evaluation Standards developed by the Joint Committee on Standards for Educational Evaluation (JCSEE). For those newer to the field, take comfort in knowing that evaluation has this set of Standards to guide your way forward. The Standards provide guidance both for evaluators in planning and implementing their program evaluation projects, and for evaluation users in knowing what to expect from the evaluation process and products.


Evaluation practitioners come to their work through diverse academic and practice backgrounds. We may identify primarily as evaluators, or as a myriad of other job titles: program manager, executive director, assessment coordinator, quality improvement director, research assistant or data scientist, to name a few. The Program Evaluation Standards help to bring all of that professional diversity together into a field that has a common language and common expectations.

Developed and revised by experts, and with sponsoring organizations including the American Evaluation Association and the Canadian Evaluation Society, the Standards are published in a comprehensive guide that we recommend all evaluators keep in their library.

Through developing and delivering evaluation training, we know the value of short guides for translating concepts to practice. That’s why we developed this free resource that helps evaluators reflect on whether and how they are applying the Standards to their practice.

Our 6-page resource provides evaluators with reflective questions for each of the Standards. We suggest that you read through these questions as part of your evaluation planning process or use them to guide a self-reflective exercise after your evaluation project has concluded. For anyone working toward their Credentialed Evaluator designation through the Canadian Evaluation Society, this guide will support you to consider the Reflective Practice competencies.




Reference: Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2010). The Program Evaluation Standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Corwin Press.

 

Written by cplysy · Categorized: evalacademy



Copyright © 2026 · The May 13 Group
