
The May 13 Group




Nov 11 2020

How We Evaluated: A Collaborative of Non-Profits Serving Immigrant and Refugee Youth

 

Defining evaluation purpose. Writing evaluation questions. Deploying data collection tools. These topics can all seem abstract on their own. To put the pieces in context, we’re offering this series on how we evaluated to show you what real-world evaluation looks like in practice.

This post explores how we at Three Hive Consulting worked with REACH Edmonton Council and other agencies to evaluate a unique initiative called Bridging Together. You’ll see how the group developed and carried out an evaluation plan that yielded actionable information.

 

The initiative: Bridging Together

“When I’m focused on the day-to-day details, it’s easy to forget how many lives the collaborative is reaching. For me, evaluation helps to keep things in perspective.”

With funding from Immigration, Refugees and Citizenship Canada, REACH Edmonton Council acted as the backbone organization for this collective of youth-serving non-profits. Each of the partner organizations already offered programming for immigrant and refugee youth outside of school hours. Their after-school and summer programs varied in focus, but common elements included academics, sports, life skills, culture and recreation. These partner organizations met regularly to share resources, discuss common problems and share solutions, and REACH arranged for relevant training opportunities.

 

Intended outcomes

Bridging Together aimed to enhance outcomes for immigrant and refugee children and youth, their families, and the partner organizations.

Immigrants and refugees face many well-documented challenges when arriving in Canada, including linguistic, cultural and environmental differences, physical and mental health, socialization, education and justice. While many show resilience and integrate well into Canadian society, a significant number do not fare so well. Through out-of-school time programming, partner organizations intended to help children and youth develop healthy relationships, improve self-efficacy, become involved in community, improve academic performance, and perhaps most importantly, have fun.

 

Developing the evaluation plan

Convening thirteen organizations to work toward a common goal is no small task, yet having them agree on intended outcomes and evaluation processes was smoother than expected. We held a large group session to begin defining evaluation purpose, use and focus areas, and from that meeting drafted four focus areas. At the session we posed several questions to attendees:

  • What would you like to know about your program?

  • What has worked before with evaluations you have been involved in?

  • What is your one piece of advice for how to make this a successful evaluation?

  • What difference should we see in a child or youth after participating in your program?

 

This stakeholder engagement process showed a need for a data collection approach that acknowledged commonalities while accommodating the uniqueness of different programs. We confirmed four common focus areas:

  1. Program description and participation

  2. Child, youth and family outcomes

  3. Collaboration

  4. Social return on investment

The social return on investment (SROI) was a non-negotiable requirement. It is not a method we would have suggested, but as evaluators we know that sometimes we just have to do what we’re told. We reflect on that SROI below.

Partners reviewed and made suggestions on draft versions of the evaluation plan until we arrived at a final version to guide the next two years.

 

Adapting data collection approaches

We mentioned above that it was important to partners that the evaluation reflect their individual programs. There was quite a bit of variation to address: one organization delivered its programming entirely in French, one provided free sports leagues for children in grades four through six, and others delivered more of a “homework club” program. Some organizations offered multiple programs through Bridging Together. Participant ages ranged from six to 24. In the first year, 390 children and youth participated.

Our methods, obviously, needed to accommodate different program activities, different languages, different reading levels, and very different logistics. So here’s what we did:

  1. Interactive, arts-based feedback sessions with youth in summer programs

  2. Program experience surveys for older children and youth

  3. Self-efficacy surveys for older children and youth

  4. Video-recorded, small group interviews with children at sports leagues

  5. Parent/caregiver program experience surveys

  6. Interviews with organization staff

  7. Social network analysis

  8. Administrative data analysis

  9. Social return on investment, requiring detailed funding and spending information from all organizations

 

Project ethics

We’re big fans of ARECCI, a project ethics review process we can access in Alberta. We made sure to include an ARECCI project ethics review in our proposal to REACH, and incorporated their suggestions into our processes.

 

Collecting data

We expected challenges in implementing the nine approaches above. In our monthly status updates, we tracked what we had done, what we planned to do next, what risks emerged and how we were mitigating them. 

Completing the summer feedback sessions required some support from sub-contractors. Our plan was to schedule these sessions, where we would also support survey administration for older children and youth, as close to the end of each summer program as possible. Not surprisingly, many programs ended in the same week, so deploying evaluation assistants to all sites was tricky, but we were able to accommodate the programs that agreed to participate.

Collecting data from this many sites also required support from program staff and volunteers. Contacting some organizations was easy; others’ capacity was so stretched that returning phone calls and emails did not always happen. Most were quite willing to support survey administration, with guidance provided. We did find, though, that sometimes younger children were completing surveys intended for older children and youth.

Getting parents and caregivers to complete surveys was challenging for some programs, and smooth for others. To make it easier for parents and caregivers to complete surveys, we provided both an online option and paper surveys, and kept the survey as short as possible while collecting the meaningful data we needed. Overall, our sample size for parents and caregivers was lower than we had hoped for—that’s a challenge many working in non-profit evaluation will be familiar with.

The SROI calculation required detailed information about program inputs and spending. Most partner organizations were running multiple programs, some of which had funding from the same sources. Many programs also relied on funding from other sources, volunteers, and subsidized facility rentals. We were fortunate to have support from REACH to create a spreadsheet for organizations to identify all financial and in-kind resources needed to run their Bridging Together program and all associated spending. Completing that spreadsheet represented a great deal of time for partner organizations.
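To illustrate the kind of roll-up that spreadsheet enabled (a minimal sketch; the program names and dollar amounts below are invented for illustration, not the project’s actual figures):

```python
# Hypothetical sketch of the resource roll-up behind the SROI inputs.
# Program names and dollar amounts are invented for illustration only.

programs = {
    "Homework Club": {"funding": 40_000, "in_kind": 5_000, "spending": 43_000},
    "Sports League": {"funding": 25_000, "in_kind": 8_000, "spending": 30_000},
}

def total_inputs(program):
    """Total resources for a program: cash funding plus valued in-kind support."""
    return program["funding"] + program["in_kind"]

for name, p in programs.items():
    print(f"{name}: inputs ${total_inputs(p):,}, spending ${p['spending']:,}")
```

The point of the exercise is simply to capture, per program, every resource (cash or in-kind) alongside every associated expense, so nothing is missed when the SROI is calculated.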

 

Sharing findings

REACH Edmonton Bridging Together Report

We produced a few different reports throughout this contract. The major products were comprehensive written reports for Year 1 and Year 2. Each yearly report addressed the first two focus areas, program reach and outcomes, and one additional focus area. Following the preparation of the draft reports, we attended meetings with partners to review findings and gather their perspectives and suggestions.

These comprehensive reports addressed Bridging Together as a whole, but we also wanted to provide individual organizations with results that they could use to inform program changes, organizational reporting and further advocacy. We therefore provided short summaries of results for each partner organization.

 

Informing our practice

As evaluators, we learn from every project we undertake. The Bridging Together project spanned two years and showed us the importance of strong working relationships with clients and stakeholders. This project showed us how valuable a convener or coordinator is in collective impact projects—we would have needed to invest more resources in project management if REACH had not so capably undertaken that role.

This project also demonstrated how vital data management practices are when working with multiple sites across multiple timepoints. A good spreadsheet or other tool to track which data has been received from which site supports sound project management.
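Such a tracker can be as simple as a table of sites against instruments. A minimal sketch (the site and instrument names here are hypothetical, not the project’s actual partners or tools):

```python
# Minimal sketch of a data-receipt tracker for a multi-site evaluation.
# Site and instrument names are illustrative, not from the actual project.

sites = ["Site A", "Site B", "Site C"]
instruments = ["youth survey", "parent survey", "staff interview"]

# received[site] holds the set of instruments returned by that site
received = {site: set() for site in sites}

def log_receipt(site, instrument):
    """Record that an instrument has come back from a site."""
    received[site].add(instrument)

def outstanding():
    """List (site, instrument) pairs still missing."""
    return [(s, i) for s in sites for i in instruments
            if i not in received[s]]

log_receipt("Site A", "youth survey")
log_receipt("Site B", "youth survey")
log_receipt("Site B", "parent survey")

for site, instrument in outstanding():
    print(f"Missing: {instrument} from {site}")
```

Whether it lives in a spreadsheet or a script, the essential feature is the same: being able to answer, at any point, which data is still outstanding from which site.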

We’ve always been pretty flexible, but this project reinforced how important it is to be able to adapt processes to fit different contexts. For example, our youth feedback sessions looked different across sites. In some, we used classrooms with structured space; in others, we set up in a hallway and had children and youth move through a sort of drawing and writing gauntlet. One method, the mini-interview, was used for just one program because it was simply the only feasible way to collect data from busy kids running on and off the field. Seeing how this variation in methods led to a richer knowledge product has reinforced for us that adaptability is key in real-world evaluation.

And finally, the SROI. The calculation showed that for every dollar invested, Bridging Together created at least $3.30 in returned social value. This figure is powerful in reporting and future funding applications. Obtaining the data to inform this calculation was A LOT of work for partner organizations. Many organizations’ accounting systems were not set up to track costs for individual programs; the work required to set up overhead calculations and other bookkeeping details for many different programs often cannot be accommodated through non-profit administrative allocations. We have always viewed this method with skepticism and questioned the need for it at all. The value of improving outcomes for children and youth has been well documented; we already know that investing in children saves money later. We approached this project with the view that requiring resource-limited programs to undertake this complex and imprecise calculation is an undue burden that does not yield new findings. That view hasn’t changed.
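The arithmetic behind that headline figure is simple, even if gathering the inputs was not. A minimal sketch (the dollar amounts are invented for illustration; only the ratio mirrors the reported result):

```python
# Illustrative SROI ratio. All figures are hypothetical; only the ratio
# reflects the reported result of at least $3.30 returned per $1 invested.

total_inputs = 100_000.0      # cash funding plus valued in-kind resources
valued_outcomes = 330_000.0   # social value assigned to measured outcomes

sroi_ratio = valued_outcomes / total_inputs
print(f"Every $1 invested returns ${sroi_ratio:.2f} in social value")
```

The hard part is not this division but the valuation step: assigning credible dollar values to outcomes and capturing every input, which is exactly where the burden on partner organizations fell.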

 

Client perspective

“Having an unbiased third-party report to show our success is so important to be able to justify the worth of this collaborative to the funder we had, as well as potential future funders.”

How has this evaluation been applied at REACH? Evaluation use is a topic many contracted evaluators wonder about. Is the report just living on a server somewhere, never to be consulted again? Or have the findings and recommendations been used to drive program changes, to advocate for funding, to share a story of impact?  

Overall, REACH is dedicated to evaluating its work. “We know that nothing is perfect and evaluation results help to inform the project as it unfolds and influences decisions,” notes Project Manager Lisa Kardosh. “When I’m focused on the day-to-day details, it’s easy to forget how many lives the collaborative is reaching. For me, evaluation helps to keep things in perspective.”

For REACH, early results were useful in ongoing planning. “The interim report gave the collaborative a chance to assess if we were on the right track or not, and thankfully for the most part we were,” says Kardosh. “One benefit of the report was that it helped to shed light on gaps that were popping up, like more training being needed, so we could address it.”

Interim reporting also yielded an early opportunity to demonstrate the value of the program to the funder. “It was useful to share the Year 1 results with our funder so that they could see that their investment was making a difference.”

REACH and the Bridging Together partners have used the final evaluation report for advocacy and communication. “We’ve shared the Year 2 results quite broadly among our networks,” says Kardosh. “Having an unbiased third-party report to show our success is so important to be able to justify the worth of this collaborative to the funder we had, as well as potential future funders.”

 

Watch for more in our How We Evaluated series.




 

Written by cplysy · Categorized: evalacademy

Nov 11 2020

Finish the Cartoon – Engaging Stakeholders

When I lead workshops, one of my favorite types of activities involves using half-finished cartoons.

Sometimes I print out the cartoons as handouts, or I put them up on a slide and ask the participants to pull out a pen and paper. Then I ask them to try their hand at finishing the cartoons.

After that, we share out the results.

It’s a super fun way to launch a discussion into all sorts of topics. I’ve been in experimental mode lately and thought I would create a few sets that might be useful for evaluators. Let me know what you think, and if you would like me to create more of these.

Who is at the table?

The easiest way to form a consensus is to not invite anyone else to the table when it’s time to discuss the issue. Especially anyone who might not agree with you.

This is also the worst way to form a consensus.

Theories of change are negotiations.

You might all agree to a set of words that you put at the end of a logic model or theory of change. But that doesn’t mean that the goal written is the goal held in everyone’s minds. This cartoon is to help you start to find some of those unwritten goals.

All the important stakeholders.

You might want to create an exhaustive list of everyone who has a stake in your program or project. Especially the people who are not a part of your team.

Keep going with it.

I’d suggest going further and further down the rabbit hole. Keep asking the question in different ways.

Gut check time.

Many evaluators view themselves as speakers for a broad range of people.

But we can’t escape our own biases and privileges. Is your evaluation team composition representative of the community you are working within? If not, why not?

What happens when the funding stops?

Projects have lifespans. Some live longer than others. Recognizing that mortality is important.

And who is left behind?

People get caught up in projects. But after the project ends, people remain. Who are those people that will remain?

Download the 7-cartoon set as a PowerPoint deck.

You can now download the PowerPoint deck via Gumroad. For a free download, just put $0 in the box.

My Independent Consulting Jumpstart course is now Pay What You Want.

I decided I wanted to make the course way more accessible. So now you can grab it from Gumroad for whatever price you want to pay. This includes $0.

And there is absolutely no judgement for you putting $0 in the box.

https://gumroad.com/l/consultingjumpstart

Written by cplysy · Categorized: freshspectrum

Nov 11 2020

Comment on I’m back to blogging by Beth

Thanks, Lais! It does help with motivation to know that my postings are useful to others!

Written by cplysy · Categorized: drbethsnow

Nov 11 2020

Comment on I’m back to blogging by Lais

I can relate to that feeling: I have literally embroidered this phrase on a hoop and hung it on my wall. If it helps with motivation, as an emerging evaluator your posts on evaluation competencies have helped me a lot in planning the next steps of my professional development. I’d love to see what comes next! Thanks for the hard work.

Written by cplysy · Categorized: drbethsnow

Nov 10 2020

Evaluators as Change Agents: Evaluating Community Coalitions

Hi everyone!

A few weeks ago, I was privileged to speak with the West Michigan Evaluators Network (WMEN) about evaluating coalitions and collaboratives. I started from the framework and opinion that evaluators can and should be social change agents. After all, the tagline of my company, Community Evaluation Solutions, is Partnering for Social Change. I believe that evaluators are not (just) objective observers of the programs and communities we serve. Rather, I believe that coalitions can be a powerful catalyst for change, and that evaluators can help them achieve their goals.

As evaluators we have the great opportunity to work for social change by partnering with community coalitions.

What are coalitions and collaboratives? (Note: I use these terms interchangeably.)

Coalitions are a formal arrangement for cooperation and collaboration between sectors of the community, in which each group retains its identity but agrees to work together toward a common goal.

– Fran Butterfoss

So just why are coalitions so important? Coalitions at their best:

  • Engage community members from all sectors, but most especially those most affected by the public health and social problem of interest (people with low incomes, people who are marginalized in some way, and people of color);
  • Bring together fragmented systems and help maximize resources;
  • Build community capacity for solving community problems;
  • Increase civic engagement; and
  • Organize community members, helping them leverage their collective voice and maximize political power to create long-term, systemic change.

Coalitions are MOST effective when they address community-wide problems using a public health approach to address systems level change.

Coalitions, for all their strengths and benefits, are fraught with challenges. So, a word of warning: coalitions involve people, and those people represent both their organizations and themselves. They may have a hard time setting aside their own perspectives for the good of the coalition or community. Structure (by-laws, committees, effective meetings) is important, but many coalition members want to skip this work. Conflict is inevitable and, dare I say, necessary. Perhaps the most important challenge is coalition leadership: a good leader is a must for every effective coalition. An effective coalition leader can inspire the group, bring members together, and engage them in the work.

Coalition Development

Just like any other group, coalitions cycle in and out of stages and evaluation questions should change to reflect these stages.

During the formation stage, be mindful of the community’s context and history. Ask who is engaged and maybe more importantly, who is not engaged? Are by-laws and committees established? Are meetings effective? Does the coalition know what collaboration even is? I once worked with a new coalition and every time something needed to be done, they whipped their heads around to look at the coalition director. In that situation, we had to do some training on what it really means to collaborate before we could move on to the work.

During the maintenance stage, everyone has settled into their roles (hopefully) and fences have been mended (again, hopefully). Evaluation questions at this stage may include: Are members satisfied with how the coalition is functioning? Are meetings effective? Is membership (still) representative of the community? Is implementation effective? Is there evidence of short-term outcomes?

Finally, after some years, the coalition reaches some level of stability. Evaluation questions at the institutionalization stage should include a focus on long-term outcomes and sustainability. Has the coalition grown in its organizational, leadership and evaluation capacity? As the coalition cycles through these stages, the evaluation plan should as well.

At times you may be called on to be a trainer, a strategy or program developer, a conflict manager and sometimes an evaluator. I promise, evaluating coalitions is often rewarding and most certainly, never boring.

To connect with me and talk all things evaluation and community coalitions, send me an email at aprice@communityevaluationsolutions.com and let me know if you want me to add you to my contact list. I am on Twitter at @annwprice and on Facebook at @CommunityEvaluationSolutions.

Written by cplysy · Categorized: communityevaluationsolutions


Copyright © 2026 · The May 13 Group
