
The May 13 Group



cplysy

Nov 18 2020

Launch of the Global Evaluation Initiative (GEI)

Source: https://www.globalevaluationinitiative.org/

Today marks the launch of the Global Evaluation Initiative (GEI): an inclusive global partnership for developing (a) sustainable and (b) country-owned monitoring and evaluation (M&E) frameworks and capacities, in order to (i) promote the use of evidence in public decision-making, (ii) improve accountability and (iii) achieve better results.

It starts from the hypothesis that better evidence contributes to better policies and, ultimately, to a better quality of life.

Globally, 91% of countries' national development strategies approved since 2015 refer to the 2030 Agenda and the Sustainable Development Goals (SDGs); however, according to the Global Partnership for Effective Development Co-operation, only 35% of them have reliable data and systems to track the progress of their policies and programs.

In response to this global gap in M&E systems and capacities, GEI is a partnership that aims to (a) be innovative and (b) bring together a broad and diverse coalition of governments, international, national and local development organizations, and M&E experts to pool financial and technical resources and to coordinate and expand M&E efforts worldwide.

“No single institution has the resources to address today's global gaps in monitoring and evaluation capacity. Addressing the scale of the need will require coordinated efforts by broad coalitions, and GEI is a step in that direction, providing a foundation that can be built upon.” – Alison Evans, World Bank Vice President and Director-General of the Independent Evaluation Group

The coronavirus pandemic has increased global demand for stronger M&E frameworks, especially in developing countries where M&E systems and capacities remain weak.

“Building back better” from crises requires strengthening governments' capacity to collect, analyze and use data in decision-making. GEI aims to address this growing and urgent need for timely, reliable use of evidence in decision-making through the following:

  • Coordinating and expanding evaluation capacity development (ECD) programs
  • Supporting countries in creating and strengthening M&E frameworks and capacities that foster better-informed decision-making
  • Catalyzing strategic alliances by convening M&E actors and experts
  • Leveraging local, regional and global knowledge to provide effective, contextually relevant M&E solutions

Through these combined efforts, GEI aims to contribute to (a) more relevant and effective policy interventions and better, more timely responses to shocks, and (b) ultimately, improved progress toward countries' national development goals and the SDGs.

The GEI approach

As a global partnership, GEI aims to support countries in building sustainable and effective M&E frameworks and capacities, drawing on local, regional and global knowledge and experience. Its work aspires to make significant contributions to ECD through the following:

  • Creating national M&E frameworks
  • Building sustainable capacity
  • Focusing on maximizing impact
  • Reducing fragmentation in ECD services
  • Leveraging the digital world
  • Generating and sharing M&E knowledge

GEI aims to be an enabler:

  • GEI as a catalyst
  • GEI as a generator of public goods
  • GEI focuses on the topics and sectors most likely to generate a public good.
  • GEI as a trusted partner
  • GEI focuses on developing countries around the world.
  • GEI as a platform for GLOCAL knowledge exchange
  • GEI fosters the generation of M&E knowledge and shares it globally and locally (“gLocally”), so that M&E knowledge generated in one country is available to others where relevant, for further learning and effectiveness.

In a future post we will discuss the challenges this initiative faces in achieving its intended collective impact. As on other occasions, some of the differences will lie in the distribution of responsibilities and in the details of putting the strategy into (operational) practice…

Written by cplysy · Categorized: TripleAD

Nov 17 2020

In Search of Lost Leadership

Source: https://siembralabuenasemilla.files.wordpress.com/

We return to two reports on leadership:

I. The report Leadership in the Time of Covid-19, on “essential qualities, skills and practices,” by Doug Reeler and Desiree Paulsen: when we face a complex, unpredictable and invisible crisis like Covid-19, leadership of an entirely different quality is required. Leaders must be (a) more artists than scientists and (b) more facilitators than controllers, activating self-management in the people we lead:

1. Contextualize and analyze with perspective

2. Facilitate rather than control

3. Be aware of power and privilege

4. Listen to learn and ask questions that stimulate

5. Be authentic, sharing struggles, fears and doubts

6. Show empathy and connection

7. When overstretched and losing perspective, reconnect with purpose and priorities

8. Take a situational approach to decision-making

9. Revisit purpose and priorities/values

10. Take a learning approach to bad (and good) decisions

II. The article by Chris Roche and Lisa Denney: How can international development agencies support leadership in/of development? With four main conclusions:

1. There is generally more of a focus on (1) individual leadership, based on (a) managerial and (b) Western-centric notions of leadership, than on (2) collective leadership;

2. Agencies tend more toward (a) backing efforts to support individual leaders directly than toward (b) more indirect or oblique engagement in shaping the processes, policies or environments that enable leadership to emerge;

3. The better ways of working in this more indirect or oblique direction matter for improved, sustainable, locally led development;

4. There is a series of systemic barriers, disincentives and challenges that agencies and the sector face in working this way, including challenges related to organizational policies and practices (human resources systems, hiring processes, reporting, etc.).

Written by cplysy · Categorized: TripleAD

Nov 12 2020

Knowledge management: simple, but not easy

Source: https://warriorsway.com/

In KM: Simple but not easy, Nick Milton notes that although knowledge management is really quite simple, simple does not equal easy.

Although tools, techniques, approaches and strategies can complicate knowledge management, at its core it is a very simple concept.

It is about making sure that:

(1) the organization's decision-makers have access to the crucial knowledge they need to make decisions;

(2) the right things happen to transfer that knowledge, whether through conversation or through content.

Then it becomes a matter of change management: shifting to a world where knowledge is considered important.

The change management aspect is the hard part.

There is plenty of technology, there are well-defined processes that work extremely well, there is an understanding of the roles and skill sets required, and nowadays there is also a fairly good understanding of the governance elements. All of that is fairly easy. It is the change itself that is hard.

That is where people often go wrong with their knowledge management (KM) programs. They do the easy things, not the hard ones. (1) They buy the technologies. (2) They print the brochures. (3) They work with the enthusiasts and (4) preach to the choir.

What they do not do as often is:

(1) Have an intense discussion with management and the senior leadership team about the value KM can offer the organization and the few focus areas they should address.

(2) Gain understanding, ownership, support and high-level sponsors.

(3) Talk with struggling team leaders to decide what can be done to help them and what they can do to help the KM team.

(4) Go out and work in detail with pilot projects to deliver the spectacular successes that act as a beacon for the rest of the organization.

Knowledge management is not complicated. It really is not, despite the complicated models people sometimes build. But it takes courage and dedication, it takes perseverance and a thick skin, and it requires you to work through some very difficult conversations.

Do the hard work described, and the simple will become possible.

Written by cplysy · Categorized: TripleAD

Nov 12 2020

Is your site suffering from a content deficiency?

Have you ever visited a park that lacked any type of attraction? So no playground, trails, pond, shelter, picnic table, or anything else that makes a visit worthwhile?

Boring right?

Unless it’s right across the street from your house with just enough room to toss a ball back and forth, you’re probably not going back to that park anytime soon. Don’t get me wrong, there are plenty of reasons to have open spaces, but if your community’s goal was to create something people cherish and love, it’s probably not enough.

The same is true for the web. There are countless attraction-less websites filling up the interweb. Websites with just enough information to tell you who works at an organization and how you might be able to get in touch.

Essentially, digital brochures.

How to assess content deficiency

Usually you can tell pretty quickly if a website is just a digital brochure. But sometimes a website is structured in a way that looks simple and streamlined yet actually packs in a lot of content.

So the easiest way I’ve found to assess whether a website is content deficient is through a quick Google search. All you have to do is open up Google and type “site:” before the web address. This will give you the pages Google has indexed for the particular domain you put behind “site:”.
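As a sketch, the query typed into Google's search box looks like this (the domain here is just an illustration):

```
site:betterevaluation.org
```

Google then returns only the pages it has indexed from that domain, and the approximate result count gives a rough sense of how much content the site has.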

Giving the numbers context

More is not always better. But just like you wouldn’t frequent a library without books, you’ll likely not frequent a website without pages.

Understanding how much content your website should have is always an it depends kind of thing. But one thing I like to do is to look at a website in context to other websites that serve similar audiences.

For example, let’s look at EvalYouth. Their homepage currently sits off of the main EvalPartners page.

With only 3 results showing in Google for the EvalYouth homepage, I would most certainly consider it a brochure-style site.

EvalYouth in partnership with the UNFPA Evaluation Office and Global Parliamentarians Forum for Evaluation recently spun off a new website that appears to aspire for more growth. There is an advocacy goal, and the blog and resource library show the intention for growth.

This site at 60 results is still a fledgling from a digital perspective. There is only so much for search engines like Google to find.

So let’s look at some more established resource sites. What would it look like to really build out a stable library of content?

  • The European Evaluation Society, which includes membership information, event information, and evaluation resources.
  • The American Evaluation Association, which includes membership information, event information, and evaluation resources.
  • Better Evaluation, which is a nonprofit evaluation resource site with event information and evaluation resources.

What does it mean?

Page counts are not everything. But I will say this.

  • Better Evaluation receives far more web traffic than the American Evaluation Association.
  • The American Evaluation Association receives far more web traffic than the European Evaluation Society.
  • The European Evaluation Society receives far more web traffic than the Eval4Action website.

Anything can change, and we all start from somewhere. But building out a content library takes time and energy.

Developing a Content Strategy

Content doesn’t just develop itself, as much as we would like it to.

A good curator and a team of creators, following a well-formed content strategy, can build out a significant content library. Ask yourself why you have the site in the first place, and figure out how to build the content necessary to reach that goal.

And remember, just because your website looks the part, doesn’t mean it is actually achieving your goals.

Written by cplysy · Categorized: freshspectrum

Nov 11 2020

How We Evaluated: A Collaborative of Non-Profits Serving Immigrant and Refugee Youth

 

Defining evaluation purpose. Writing evaluation questions. Deploying data collection tools. These topics can all seem abstract on their own. To put the pieces in context, we’re offering this series on how we evaluated to show you what real-world evaluation looks like in practice.

This post explores how we at Three Hive Consulting worked with REACH Edmonton Council and other agencies to evaluate a unique initiative called Bridging Together. You’ll see how they developed and carried out an evaluation plan that yielded actionable information.

 

The initiative: Bridging Together

“When I’m focused on the day-to-day details, it’s easy to forget how many lives the collaborative is reaching. For me, evaluation helps to keep things in perspective.”

With funding from Immigration, Refugees and Citizenship Canada, REACH Edmonton Council acted as the backbone organization for this collective of youth-serving non-profits. Each of the partner organizations already offered programming for immigrant and refugee youth outside of school hours. Their after-school and summer programs varied in focus, but common elements included academics, sports, life skills, culture and recreation. These partner organizations met regularly to share resources, discuss common problems and share solutions, and REACH arranged for relevant training opportunities.

 

Intended outcomes

Bridging Together aimed to enhance outcomes for immigrant and refugee children and youth, their families, and the partner organizations.

Immigrants and refugees face many well-documented challenges when arriving in Canada, including, but not limited to, linguistic, cultural and environmental differences, physical and mental health, socialization, education and justice. While many reveal resilience and integrate well into Canadian society, a significant number do not fare so well. Through out-of-school time programming, partner organizations intended to help children and youth develop healthy relationships, improve self-efficacy, become involved in community, improve academic performance, and perhaps most importantly, have fun.

 

Developing the evaluation plan

Convening thirteen organizations to work toward a common goal is no small task. Having them agree on intended outcomes and evaluation processes was a smoother process than expected. We held a large group session to begin defining evaluation purpose, use and focus areas. From this meeting, we drafted four focus areas and posed several questions to attendees:

  • What would you like to know about your program?

  • What has worked before with evaluations you have been involved in?

  • What is your one piece of advice for how to make this a successful evaluation?

  • What difference should we see in a child or youth after participating in your program?

 

This stakeholder engagement process showed a need for a data collection approach that acknowledged commonalities while accommodating the uniqueness of different programs. We confirmed four common focus areas:

  1. Program description and participation

  2. Child, youth and family outcomes

  3. Collaboration

  4. Social return on investment

The social return on investment (SROI) was a non-negotiable requirement. It is not a method we would have suggested, but as evaluators we know that sometimes we just have to do what we’re told. We’ll reflect on that SROI below.

Partners reviewed and made suggestions on draft versions of the evaluation plan until we arrived at a final version to guide the next two years.

 

Adapting data collection approaches

We mentioned above that it was important to partners that the evaluation reflected their individual programs. There was quite a bit of variation to address; one organization delivered their programming entirely in French, one provided free sports leagues for children in grades four through six, while others delivered more of a “homework club” program. Some organizations offered multiple programs through Bridging Together. Participant ages ranged from six to 24. In the first year, 390 children and youth participated.

Our methods, obviously, needed to accommodate different program activities, different languages, different reading levels, and very different logistics. So here’s what we did:

  1. Interactive, arts-based feedback sessions with youth in summer programs

  2. Program experience surveys for older children and youth

  3. Self-efficacy surveys for older children and youth

  4. Video-recorded, small group interviews with children at sports leagues

  5. Parent/caregiver program experience surveys

  6. Interviews with organization staff

  7. Social network analysis

  8. Administrative data analysis

  9. Social return on investments, requiring detailed funding and spending information from all organizations

 

Project ethics

We’re big fans of ARECCI, a project ethics review process we can access in Alberta. We made sure to include an ARECCI project ethics review in our proposal to REACH, and incorporated their suggestions into our processes.

 

Collecting data

We expected challenges in implementing the nine approaches above. In our monthly status updates, we tracked what we had done, what we planned to do next, what risks emerged and how we were mitigating them. 

Completing the summer feedback sessions required some support from sub-contractors. Our plan was to schedule these sessions, where we would also support the survey administration for older children and youth, as close to the end of their summer program as possible. Not surprisingly, many programs ended in the same week, so deploying evaluation assistants to all sites was tricky but we were able to accommodate those programs that agreed to participate.

Collecting data from this many sites also required support from program staff and volunteers. Contacting some organizations was easy; others’ capacity was so stretched that returning phone calls and emails did not always happen. Most were quite willing to support survey administration, with guidance provided. We did find, though, that sometimes younger children were completing surveys intended for older children and youth.

Getting parents and caregivers to complete surveys was challenging for some programs, and smooth for others. To make it easier for parents and caregivers to complete surveys, we provided both an online option and paper surveys, and kept the survey as short as possible while collecting the meaningful data we needed. Overall, our sample size for parents and caregivers was lower than we had hoped for—that’s a challenge many working in non-profit evaluation will be familiar with.

The SROI calculation required detailed information about program inputs and spending. Most partner organizations were running multiple programs, some of which had funding from the same sources. Many programs also relied on funding from other sources, volunteers, and subsidized facility rentals. We were fortunate to have support from REACH to create a spreadsheet for organizations to identify all financial and in-kind resources needed to run their Bridging Together program and all associated spending. Completing that spreadsheet represented a great deal of time for partner organizations.

 

Sharing findings

REACH Edmonton Bridging Together Report

We produced a few different reports throughout this contract. The major products were comprehensive written reports for Year 1 and Year 2. Each yearly report addressed the first two focus areas, program reach and outcomes, and one additional focus area. Following the preparation of the draft reports, we attended meetings with partners to review findings and gather their perspectives and suggestions.

These comprehensive reports addressed Bridging Together as a whole, but we also wanted to provide individual organizations with results that they could use to inform program changes, organizational reporting and further advocacy. We therefore provided short summaries of results for each partner organization.

 

Informing our practice

As evaluators, we learn from every project we undertake. The Bridging Together project spanned two years and showed us the importance of strong working relationships with clients and stakeholders. This project showed us how valuable a convener or coordinator is in collective impact projects—we would have needed to invest more resources in project management if REACH had not so capably undertaken that role.

This project also demonstrated how vital data management practices are when working with multiple sites across multiple timepoints. A good spreadsheet or other tool to track which data has been received from which site supports sound project management.

We’ve always been pretty flexible, but this project reinforced how important it is to be able to adapt processes to fit different contexts. For example, our youth feedback sessions looked different across sites. In some, we used classrooms with structured space; in others, we set up in a hallway and had children and youth move through a sort of drawing and writing gauntlet. One method, the mini-interview, was used for just one program because it was simply the only feasible way to collect data from busy kids running on and off the field. Seeing how this variation in methods led to a richer knowledge product has reinforced for us that adaptability is key in real-world evaluation.

And finally, the SROI. The calculation showed that for every dollar invested, Bridging Together created at least $3.30 in returned social value. This figure is powerful in reporting and future funding applications. Obtaining the data to inform this calculation was A LOT of work for partner organizations. Many organizations’ accounting systems were not set up to track costs for individual programs; the work required to set up overhead calculations and other bookkeeping details for many different programs cannot often be accommodated through non-profit administrative allocations. We have always viewed this method with skepticism, and questioned the need for it at all. The value of improving outcomes for children and youth has been well documented. We already know that investing in children saves money later. We approached this project with the view that requiring resource-limited programs to undertake this complex and imprecise calculation is an undue burden and does not yield new findings; that view hasn’t changed.

 

Client perspective

“Having an unbiased third-party report to show our success is so important to be able to justify the worth of this collaborative to the funder we had, as well as potential future funders.”

How has this evaluation been applied at REACH? Evaluation use is a topic many contracted evaluators wonder about. Is the report just living on a server somewhere, never to be consulted again? Or have the findings and recommendations been used to drive program changes, to advocate for funding, to share a story of impact?  

Overall, REACH is dedicated to evaluating its work. “We know that nothing is perfect and evaluation results help to inform the project as it unfolds and influences decisions,” notes Project Manager Lisa Kardosh. “When I’m focused on the day-to-day details, it’s easy to forget how many lives the collaborative is reaching. For me, evaluation helps to keep things in perspective.”

For REACH, early results were useful in ongoing planning. “The interim report gave the collaborative a chance to assess if we were on the right track or not, and thankfully for the most part we were,” says Kardosh. “One benefit of the report was that it helped to shed light on gaps that were popping up, like more training being needed, so we could address it.”

Interim reporting also yielded an early opportunity to demonstrate the value of the program to the funder. “It was useful to share the Year 1 results with our funder so that they could see that their investment was making a difference.”

REACH and the Bridging Together partners have used the final evaluation report for advocacy and communication. “We’ve shared the Year 2 results quite broadly among our networks,” says Kardosh. “Having an unbiased third-party report to show our success is so important to be able to justify the worth of this collaborative to the funder we had, as well as potential future funders.”

 

Watch for more in our How We Evaluated series.




 

Written by cplysy · Categorized: evalacademy


Footer

Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.


Copyright © 2026 · The May 13 Group
