
The May 13 Group

the next day for evaluation


allblogs

May 07 2024

Events

On this page I’ll be listing all upcoming live events, along with the relevant registration or RSVP pages.

May 2024

5/14 & 5/16 – Workshop: Everyday Visual Reports [Register]

The next live session of my Everyday Visual Reports workshop is scheduled for Tuesday, May 14 & Thursday, May 16, starting at 3:00PM Eastern. The workshop is taught in two 90-minute sessions.

Registration is OPEN NOW.

5/23 – Webinar: Creativity Toolkit Mini-Session [Register]

The first mini-session on creating Icon Arrays & Pictograms will be held on Thursday, May 23 at 10:00AM Eastern. This 60-minute session is part of a series included with the purchase of any of my premium workshops. A separate mini-session-only registration is also available.

Registration is OPEN NOW.

5/28 & 5/30 – Workshop: Effective Data Storytelling [Waitlist]

The first live session of my Effective Data Storytelling workshop is scheduled for Tuesday, May 28 & Thursday, May 30, starting at 10:00AM Eastern. The workshop is taught in two 90-minute sessions.

Registration will be opening soon. You can join the waitlist here.

Ongoing

5/8, 5/15, 5/22, 5/29 – Office Hours [RSVP]

Office Hours are held twice on Wednesdays, at 10AM Eastern and 3PM Eastern. As long as at least one person RSVPs, I will hold the session. One year of office hours is included with every premium workshop enrollment (they stack if you enroll in more than one workshop).

You can also register separately. RSVP Here

Written by cplysy · Categorized: freshspectrum

May 01 2024

Ask Nicole: What Would It Take for Our Organization to No Longer Exist?

Have a question you’d like to be featured? Let me know. In evaluation, we assess a program or service’s impact to understand whether anticipated outcomes were achieved based on the program’s activities and resources, and whether changes in a participant’s behavior, attitude, or actions can be attributed to the program or to something else. But what […]

The post Ask Nicole: What Would It Take for Our Organization to No Longer Exist? appeared first on Nicole Clark Consulting.

Written by cplysy · Categorized: nicoleclark

Apr 30 2024

complexity, the weather, and evaluation

The view from Cape Roger Curtis over the Salish Sea on Nex̱wlélex̱wm/Bowen Island.

My friend Chris Corrigan recently wrote a great blog post on weather and complexity, riffing off a statement from a retiring weather forecaster to talk about how to navigate complexity. One of my favourite COVID-era hobbies was tracking weather patterns with Chris and our friend Amanda. As systems swept in and out over the coast, we would announce in our group text the moment when rain reached our respective locations, from Nex̱wlélex̱wm/Bowen Island to East Van to New Westminster. Chris always has a fascinating app or person he follows on Twitter with cool maps and data about what is actually happening, and the three of us got quite nerdy about it. (I’ll never forget on the first night of the heat dome, when he showed me a heat map visualizing that column of hot, red air going straight up to the highest levels of the atmosphere, sitting on top of us with nowhere to go. Terrifying.)

So when I need a simple way to illustrate how data alone is not the answer to our evaluation challenges, I often find myself using weather forecasting. It’s something that we’re all familiar with in a general sense, but often don’t have a full appreciation of what’s actually going on behind the scenes. It also speaks to the very practical circumstances of our day-to-day lives, including the fact that we live in a world of increasingly dangerous weather events and climate change.

Weather is also something we can collect fairly concrete data about. These are physical phenomena that we can measure directly with reasonable precision and reliability. We can then subject this data to some pretty sophisticated mathematical modelling and make decently accurate predictions about what will happen at least briefly into the future. The meteorologist that Chris’s post quotes foresees a future in which our statistical clairvoyance can still improve by leaps and bounds with new technological breakthroughs.

Yet data collection and even data analysis are only one part of what we need. A sentence that stood out to me from the quote at the start of that post, “The atmosphere is a nonlinear system, meaning our ability to forecast it is extremely sensitive to knowing the exact condition of every breath of air”, really gets at the scope of the challenge. Highly connected, interdependent complex systems are limitless and irreproducible in models. We cannot capture every single small element that might affect what happens next. And because these systems are non-linear, where inputs and outputs are not proportionate (i.e., ‘the butterfly effect’), even the tiniest unmeasured element, like a breath of air, might end up being an integral part of a complex emergence. Add to that the logistical cost and complexity of collecting, managing, and analyzing high volumes of data, which is limited not just by the human capacity to do so but by the physical computer processing power available. And this is still relatively concrete data that lends itself to this kind of statistical modelling (versus the rigamarole of developing abstract proxy indicators of non-quantitative concepts, like “wellbeing” or “motivation” or “knowledge”, which do not have universally agreed-upon definitions, much less direct modes of measurement).
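That sensitivity to “every breath of air” can be made concrete with a toy model. The sketch below (my own illustration in Python, not anything from Chris’s post or the forecaster quoted in it) iterates the logistic map, a textbook one-line nonlinear system, from two starting values that differ by one part in a billion. Within a few dozen steps the two trajectories bear no resemblance to each other, which is the butterfly effect in miniature.

```python
# Toy illustration of sensitive dependence on initial conditions
# using the logistic map: x_{n+1} = r * x_n * (1 - x_n).
# At r = 4.0 the map is chaotic, so tiny measurement errors
# compound at every step instead of averaging out.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion --
# the unmeasured "breath of air".
a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)

for n in (0, 10, 30, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.6f}")
```

The gap between the two runs roughly doubles each step, so a billionth-sized discrepancy becomes order-one well before step 50 — no amount of extra modelling fixes it, only better-measured initial conditions, and even those only buy a few more steps.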

On top of all of that, even with all of the data we can get, the physical ability to analyze and make sense of it, and the historical and theoretical knowledge to build reasonably accurate predictive models, any prediction is still going to be inherently limited and fallible (which does not detract from the work of the forecasters). And having all of that in hand still, at best, only offers us somewhat more information with which we must ultimately make a decision about what to do. Even that information will be highly context-specific and have a fairly short shelf-life of relevance. Alongside getting a heads-up that my area is at risk of an extreme weather event within the next six hours, I need to have my own contingencies in place and hope I live in a place with a well-resourced and well-designed emergency response plan and the capacity and political wherewithal to carry it out.

No shade to the meteorologists, of course! Their job is to make sure the information is timely, accessible, and reliable, and that’s important. But data will never tell us what to do or how to do it or make sure that it’s acted on well—that’s on us. 

And you may think that evaluation isn’t about forecasting; it’s about accountability and learning and looking to the past to describe and report on what happened and what has been accomplished. But in practical terms, many of us approach evaluation with the idea (rightly or wrongly) that what has happened in the past will happen again the same way in the future. When we ask, “Does the program work?” (a question situated in a generic present tense), the logic of evaluation is to look at how it has (and hasn’t) worked already and extrapolate into a present (and presumably future) tense based on that. When program sponsors decide to fund or support a particular initiative, they are doing so with an eye to the future and what they hope or believe or want to see happen, usually with a lot less concrete data to go on than what the weather forecasters are working with. We look to the past the same way we use a mirror to look at the back of our heads—to see what we can’t see.

If you think the solution to the uncertainty and guesswork of this process is “data-driven decision-making”, I refer you again to the sentence, “our ability to forecast it is extremely sensitive to knowing the exact condition of every breath of air”, and ask: who will build and fund this data infrastructure and make sure it is available to and appropriate for everyone (so as not to design inequity into the system from the start)? Even the meteorologist notes, “we crudely sample the atmosphere directly with instruments that aren’t precise and numerous enough, and make even more approximations with remote sensing like satellites”. These are multi-billion-dollar investments over decades that are still underfunded and less developed than they could be.

As Chris points out in his post, these models do not give us much insight into hyper-local conditions, which are greatly impacted by the geological specificities of our exact contexts. Saying, “This program works. It’s a great intervention.” does not account for the particularities of how a program might play out at another time or in another place. And the answer to that is not ‘more data’ and ‘better models’, but better situational awareness and attention to the context of the present moment, more acknowledgement of our agency and responsibility in the decisions we make about our social interventions, and looking to the data available to us with a critical eye for useful insights rather than definitive answers.


Want to learn more about evaluation with me? Check out my upcoming offerings, including the WEAVING IT IN evaluation workshop.

Written by cplysy · Categorized: carolyncamman

Apr 30 2024

New Infographic: Steps for Developing an Outcome Assessment Survey


Eval Academy just released a new infographic: “Steps for Developing an Outcome Assessment Survey”


Who’s it for?

This infographic is for anyone looking to learn more about developing a survey for an outcome assessment.


What’s the purpose?

The Steps for Developing an Outcome Assessment Survey infographic will help you to:

  • Clarify the purpose of your survey.

  • Define your participants.

  • Draft your survey questions and responses.

  • Test your survey.


What’s included?

A one-page, downloadable infographic as a PDF file.

 

 


Download the Steps for Developing an Outcome Assessment Survey infographic now!


Learn more: related articles and links

  • Survey Questions Infographic

  • Survey Question Types

Written by cplysy · Categorized: evalacademy

Apr 30 2024

Unlocking Impact: The Importance of Evaluation for Non-Profits



Evaluation isn’t just a buzzword; for non-profits focused on making a genuine difference in their communities, evaluation is indispensable. It’s the process that lets you measure how effective, efficient, and impactful your programs and operations really are. It’s about making sure your efforts are hitting the mark.

This article will offer you evidence about why evaluation is so critical for non-profits. We’ll share practical advice and an example from the field courtesy of Three Hive Consulting.

 

 



So, if you’re part of a non-profit and keen on understanding how evaluation can elevate your work, you’re in the right place.


Why Does Evaluation Matter for Non-Profits?

Program evaluation supports non-profits in several ways:

[Infographic omitted]

Three Hive’s Tips for Effective Evaluation of Non-Profit Programs:

  1. Set Clear Goals: Before diving into evaluation, you should establish clear objectives and outcomes you aim to achieve, otherwise known as beginning with the end in mind. These objectives serve as benchmarks for measuring success and guide the selection of appropriate evaluation metrics.

  2. Encourage Collaboration: Involving partners from the beginning and throughout the evaluation process fosters buy-in, collaboration, and a deeper understanding of your organization’s impact. Whether it’s donors, beneficiaries, staff, or community members, their insights are invaluable for gaining diverse perspectives.

  3. Customize Your Approach: There’s no one-size-fits-all in evaluation. As a non-profit, you should employ a variety of quantitative and qualitative methods tailored to your unique goals and context. This may include surveys, interviews, focus groups, observations, arts-based methods, or even participatory approaches involving community members. A tailored approach ensures more relevant and actionable insights.

  4. Foster Learning Culture: Shift the perspective on evaluation from a checkbox exercise to a valuable learning tool. Encourage your team to see evaluation as a chance to grow. Welcoming feedback and being willing to adapt based on findings can transform your programs and operations.

  5. Build Evaluation Capacity: Whether you’re managing evaluations internally or seeking external expertise, having the right skills in place is key. Train your staff on evaluation principles and techniques (don’t forget to follow Eval Academy on LinkedIn for free education and resources!), or hire an evaluation consultant to complement your team’s capabilities (find out more about how Three Hive can support you). You can also learn more about evaluating your own program through our new, online course: Program Evaluation for Program Managers. This ensures you’re well-prepared to translate evaluation results into effective action.

  6. Communicate Results Effectively: Sharing your findings is as important as the evaluation itself. Develop a clear, engaging way to present your results to stakeholders, using visuals and stories to highlight your impact and learnings. Effective communication can increase support, drive action, and demonstrate accountability.

  7. Plan for Sustainability: Consider how the insights from your evaluation will be used in the long term. Embed evaluation findings into your strategic planning to ensure that improvements are sustained, and that evaluation becomes an integral part of your organization’s rhythm. Check out our article on Evaluation Sustainability Plans.


Elevating Non-Profits: Real-World Experiences from Three Hive Consulting

At Three Hive Consulting, we’ve had the privilege of partnering with numerous non-profits on their evaluation journeys. Our collaboration with the REACH Edmonton Council on the Bridging Together initiative serves as a testament to the transformative power of thoughtful evaluation. Three Hive conducted an evaluation of the Bridging Together initiative, aimed at enhancing outcomes for immigrant and refugee children and youth. This initiative involved a collective of youth-serving non-profits, each offering programming focused on academics, sports, life skills, culture, and recreation.

Three Hive Consulting developed an evaluation plan, engaging stakeholders to define focus areas and questions. Data collection methods included interactive sessions, surveys, micro-interviews, social network analysis, and administrative data analysis. Despite challenges in data collection, the evaluation produced actionable information. The final report demonstrated the initiative’s success, aiding in ongoing planning, informing funders, and advocating for future support. This evaluation exemplifies how rigorous assessment can drive program improvements and justify investments in non-profit collaborations.

 

 

Some of our other non-profit clients include: United Way Alberta Capital Region, Canadian Mental Health Association BC, BGC Canada, Calgary Homeless Foundation, and Catholic Social Services.


Embark on Your Evaluation Journey

Evaluation isn’t just a checkbox; it’s a catalyst for growth, learning, and impact within the non-profit sector. By embracing evaluation practices rooted in transparency, accountability, and continuous improvement, your non-profit organization can amplify its effectiveness and create lasting change in the communities you serve.

If your non-profit organization is seeking guidance on evaluation strategies tailored to your unique needs, consider reaching out to Three Hive Consulting. With our expertise in evaluation for mission-driven organizations and commitment to fostering positive change, we’re here to support you every step of the way!


Learn more

Written by cplysy · Categorized: evalacademy


Footer

Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.

Get Updates

Want to take further action or join a pod? Click here to learn more.

Copyright © 2026 · The May 13 Group · Log in
