
The May 13 Group



Jul 06 2023

Stop trying to create Goldilocks reports.

You know the story of Goldilocks and the three bears?

Little girl breaks into the house of three bears while they’re out for a walk. Then proceeds to eat their food, break one of their chairs, and eventually falls asleep in one of their beds.

The story came to mind when I was thinking about modern reporting challenges. Which I know makes absolutely no sense at all, but I stopped trying to understand my mind years ago. But I digress.

I think there is a lot we can get from the story that we can apply to how we report. Just not in an obvious way.

Goldilocks and the Three Reports

One day Goldilocks was talking to her organization’s evaluator about a few of the reports set to go out to their stakeholders.

According to Goldilocks, the first report, the one designed for the large bear, was too long. It also had too few pictures and charts.

The second report, the one designed for the medium bear, was too short. This one had too many pictures and charts given the length.

Now the third report, the one designed for the little bear, was just right. Goldilocks loved everything about this report.

And because Goldilocks was the evaluator’s direct supervisor, she instructed the evaluator to trash the too-long report and the too-short report. After all, the “just right” report was the best of the bunch, and why should the organization share anything that’s not the best?

So what’s the problem?

The “just right” report is only “just right” for Goldilocks (who is not the target audience) and for one of the three bears (who is part of the target audience).

By picking just the one report, she excluded 67% of the target audience. Not because the other reports didn’t work, but because the other reports didn’t match her vision of a good report.

Unfortunately this happens all the time.

We often design reports for just a small portion of our audience. And the reports that get the green light are the ones preferred by those with authority.

What to do instead.

The simple answer: create and share all three reports. Actually, create more reports than that if you can.

Stop assuming that one report can do it all.

Want to learn how to approach reporting in a modern kind of way?

Join me for a free webinar on July 18 at 3PM Eastern.

Learn more and register here:

https://www.eventbrite.com/e/designing-with-chris-tickets-672609191197

Written by cplysy · Categorized: freshspectrum

Jul 05 2023

Ask Nicole: Our Programs Are Outdated

Have a question you’d like to be featured? Let me know. Nonprofit organizations play a vital role in addressing social issues and making a positive impact in our communities. To effectively serve their communities and achieve their mission, nonprofits must constantly adapt and evolve. One crucial aspect of this evolution is updating program design and […]

The post Ask Nicole: Our Programs Are Outdated appeared first on Nicole Clark Consulting.

Written by cplysy · Categorized: nicoleclark

Jul 04 2023

New Template: Style Guide Template!

Eval Academy just released a new template, the “Style Guide Template”.


 Who’s it for?

Whether you’re new to evaluation or evaluation is your main role, this Style Guide Template is for anyone looking to create consistency among project documents.


What’s the purpose?

A style guide is a time-saving tool that helps you be consistent in your formatting when creating client and public-facing products, from evaluation plans to reports and presentations. When working with a team, style guides ensure that team members are working efficiently as they create evaluation products. 


What’s included?

This Word template provides each of the main elements needed in a style guide (e.g., fonts, heading styles, colours, imagery, and charts) and space for you to enter your own project details. This template also includes a bonus section about writing styles.
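Although the template itself is a Word document, the same elements can also be captured in a small machine-readable spec, so that scripts (a report build or chart-generation step, for example) stay consistent with the guide. The sketch below is hypothetical and not part of the Eval Academy template; every name and value in it is invented:

```python
# Hypothetical sketch: a project style guide (fonts, colours, chart palette)
# expressed as data, so automated outputs can be checked against it.
# All names and values here are invented examples.
STYLE_GUIDE = {
    "fonts": {"body": "Calibri 11pt", "heading": "Calibri 14pt bold"},
    "colours": {"primary": "#1B6CA8", "accent": "#F4A261"},
    "charts": {"palette": ["#1B6CA8", "#F4A261", "#6C757D"]},
}

def is_approved_colour(hex_code: str) -> bool:
    """Return True if the colour appears anywhere in the style guide."""
    approved = set(STYLE_GUIDE["colours"].values())
    approved |= set(STYLE_GUIDE["charts"]["palette"])
    return hex_code.upper() in {c.upper() for c in approved}
```

A check like `is_approved_colour("#1B6CA8")` could run before a report goes out, catching off-brand chart colours automatically.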



Learn more: related articles and links

You can learn more about designing an evaluation report on Eval Academy through the following links:

  • Five elements to include in your reporting style guide

  • Take Them on a Journey

  • Consistency is Cool

  • Practice Proximity

  • Make it Pop

  • Photo Love

  • Dial Down Your Data

  • Chart Templates: The Time Saver You Should Be Using


You can also find many other templates in our list of resources to support you in designing evaluation reports. Some of our most popular templates include:

  • 1-3-25 Reporting Method Infographic

  • 10 Tips for Designing Quality Reports Infographic


What do you think of our new Style Guide template? Let us know in the comments below!

Written by cplysy · Categorized: evalacademy

Jul 04 2023

The Art of Writing Evaluation Questions


It seems so simple – just ask a question! But many new evaluators or side-of-desk evaluators struggle with confidence in creating good evaluation questions. Here are a few tips to accompany some content we’ve already shared (How To Write Good Evaluation Questions; Evaluation Question Examples by Type of Evaluation; Evaluation Question Examples). Then, I’ll share an example of how I go from a client meeting to drafting evaluation questions.


Structure Tips

Evaluation questions often have similar lead-ins, that is, the starting part of the question. Evaluation questions are intended to elicit a narrative response, not a simple yes/no. Therefore, the question (usually) shouldn’t be “Are clients satisfied?” or “Did the program reach the target?” These can be answered in one word: yes or no. Usually, evaluation questions are open-ended questions that leave room for context, exploration, or explanation.

Try starting your evaluation questions with:

  • “To what extent….”

This one is a favourite. Starting your question with “To what extent” leaves room for a range of responses. Often it addresses program effectiveness. The end of that question could be an outcomes statement, e.g., To what extent did the program provide equitable access to housing services?

  •  “Why….”

Why questions can help a program to understand the results they are getting. They can explore processes that usually aren’t documented, e.g., Why are clients choosing this program over that program?

  • “How….”

How questions are excellent for process or formative evaluation. How questions help a program to understand what works in what context and can identify enablers or barriers, e.g., How do clients learn about our services?

  •  “In what ways.…”

In what ways questions can be used when there is a specific feature that you want to explore, e.g., In what ways did self-referral impact program outcomes?

  •  “What….” (e.g., “What contribution…,” “What impact…,” or “What factors…”)

What questions can also explore specific features, e.g., What impact did the email campaign have on client access?

Or, what questions can help to identify barriers or enablers, e.g., What factors contribute to client success rates?


What about Who?

In my opinion, questions that start with “Who” are rarely evaluation questions. Having said that, I often include them in my evaluation plan. I do this because it clearly and transparently shows clients that in addition to answering their key evaluation questions, I’ll also provide descriptions or profiles of who is accessing their service. Where ethical and possible, I’ll use the “who” information to further explore answers to the evaluation questions, e.g., How did satisfaction vary by demographics?

Sometimes I’ve seen these referred to as “Descriptive Evaluation Questions”, and I think they’re important.


Client Requests

Despite my argument earlier that yes/no questions are generally not great evaluation questions, sometimes I do include them in my evaluation plan, often as sub-questions under a key evaluation question. I do this to show clients that I intend to answer their burning questions, which may be yes/no.

For example:

A key evaluation question may be:

“To what extent were clients satisfied with the service?”

And then underneath that, I’ll include:

“Did the program reach its target of at least 80% of clients satisfied?”

You’ll see what I mean about a key evaluation question and sub-questions in my example below.

However, I have found that with a little wordsmithing magic, many yes/no questions can be made stronger using the lead-ins described above.

For example:

Did the program stay within budget? -> How well did the program align with the budget?

Are clients satisfied with the program? -> To what extent are clients satisfied with the program?

Did changing the intake process impact outcomes? -> In what ways did changing the intake process impact outcomes?
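If you keep your evaluation plan in a script or spreadsheet, a crude version of this check can even be automated. The helper below is hypothetical (not an Eval Academy tool): it only looks at a question’s first word to guess whether it invites a one-word yes/no answer and should be reworded.

```python
# Hypothetical helper: flag questions that likely invite a yes/no answer,
# so they can be reworded with an open-ended lead-in ("To what extent",
# "Why", "How", "In what ways", "What").
CLOSED_STARTERS = {
    "did", "do", "does", "is", "are", "was", "were", "can", "will", "has", "have",
}

def is_likely_closed(question: str) -> bool:
    """Return True if the question's first word suggests a yes/no answer."""
    words = question.strip().lower().split()
    return bool(words) and words[0] in CLOSED_STARTERS

plan = [
    "Did the program stay within budget?",
    "To what extent are clients satisfied with the program?",
    "In what ways did changing the intake process impact outcomes?",
]
flagged = [q for q in plan if is_likely_closed(q)]
# flagged -> ["Did the program stay within budget?"]
```

A first-word check like this is deliberately simplistic; it can only prompt a human rewrite, not perform one.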


From client meetings to evaluation questions

Now that we have some structure around evaluation questions, let me share a real example. At the start of each new client contract, I hold a kick-off meeting (here are some great resources for that: Evaluation Kick-Off Meeting Agenda (Template); How to Kick Off Your Evaluation Kick-Off Meeting). A primary goal of that meeting is to get me started on drafting evaluation questions.

You can try asking your clients “what are your evaluation questions” directly, but in my experience, you’ll be left with blank stares. Clients (usually) aren’t evaluators. So instead of the direct pathway, I facilitate discussions about what questions they have about their program, what they want to learn, and what they (might) do with those learnings. Evaluation questions will often flow from this understanding.

In this example, I was working with a new client to evaluate outcomes of a mental health program for youth. After a quick orientation to evaluation that we call “The Fastest Evaluation 101 Ever”, where I frame evaluation questions as providing the roadmap for where the evaluation goes, I move into a discussion where I transition from capacity builder to listener.

Before the meeting I figure out what I need to know, and therefore what questions I’ll ask to generate discussion; determining this is tailored to each client and depends on what I already know, which is usually from documents they’ve already shared with me.  Here are some guiding questions I use that enable me to hear from the client:

1.     Why am I here? What do you want evaluated and why? What are your expectations?

This drives at purpose and scope. Sometimes I link this discussion to content in my evaluation orientation. This is why that orientation (The Fastest Evaluation 101 Ever) is so useful; now they know that evaluations can be used to make judgments, to learn, to grow and expand, or to monitor (to name a few), and they can use that knowledge to describe the current evaluation. As a group, we spend a fair bit of time understanding why they’re evaluating, and why now.

 In this youth mental health program example, the notes I took said:

  • They want to explore their flow of service and better understand effectiveness and quality

  • They want to know what’s going well, and what gaps and challenges they have

  • They want to learn and act on those learnings

  • They want to know about access, quality, and impact

  • They want to understand the client trajectory or journey

2.     What questions do you have about this program? What decisions do you make and what informs those decisions?

This section is pretty clearly about evaluation questions but also looks at gaps/opportunities and how they may act on them. Sometimes you’ll find it’s hard for your client to think of questions per se, so asking what decisions they make day-to-day, or monthly, or quarterly and then following up with “what evidence or data do you use to inform those decisions”, will help move things along.

 In this example, the notes I took said:

  • What gaps do we have in terms of who accesses the service?

  • Are there gaps by certain populations (e.g., socioeconomic status) or certain regions of the province?

  • Who are we serving? And who are we not serving?

  • What are the barriers to access?

3.     How do you define success for this program?

This question gets at outcomes and program effectiveness, which are usually closely related to evaluation questions. This also starts to give you an idea of how you may measure (and answer) those evaluation questions.

 In this example, the notes I took said:

  • Self-assessment scores improve pre to post

  • Evidence of fidelity to the model

  • Families report success

  • Improved mental health

  • Demand for services is high

 If a client is particularly interested in outcomes, I’ll ask: What does your program make better? What changes do you expect to see? I didn’t need to do that in this example, though.


From those notes, I drafted only three key evaluation questions. I really liked how one of the staff talked about three domains: Access, Quality, and Impact, so I mirrored that thinking back to them in the development of the questions.

1.     To what extent are we serving the families who need our services? (ACCESS)

Under this key question, I had 9 more specific questions. Here are some highlights:

a.     Who was referred to our program? (Notice the inclusion of a descriptive evaluation question!)

b.     How convenient were services to access?

c.      To what extent do program participants use what they learn in other scenarios?

2.     What is the experience of participating in the program? (QUALITY)

Under this question, I had 5 more specific questions. Here are some highlights:

a.     To what extent do families report a positive experience with the program?

b.     What supports enable program implementation?    

3.     What impact does the program have on participating families? (IMPACT)

Under this, I had 6 more specific questions. Here are some highlights:

a.     What impact does this program have on the mental health of youth?

b.     What impact does this program have on families?

c.      To what extent are outcomes sustained?

In this case, categorizing the evaluation questions added a lot of clarity, and actually helped me to structure the final report as well.


Sometimes, clients will say they’re interested in a logic model, or theory of change, or, as mentioned above, in developing new outcome statements. In those cases, the development of those products will inform and guide your evaluation questions.

Evaluation questions can also be tied to a framework. I’ve used RE-AIM or Proctor and built evaluation questions mapped onto specific domains (which is pretty similar to the categorization of evaluation questions in the example shared above).

Sometimes the clients have shared enough documentation with me before the kick-off meeting that I could show them some example evaluation questions and assess their reaction/reception. I really like this approach and use it whenever I can. I find that oftentimes reacting to something is easier and more efficient than starting from scratch – you’ll know right away if you’re on- or off-track.


I hope this helps you in your process to create evaluation questions. Use our evaluation question checklist for some additional considerations.

Written by cplysy · Categorized: evalacademy

Jul 04 2023

SWOT Analysis: What is it and how do I use it?


Have you ever participated in a strategic planning session or helped develop or start a new program? A SWOT is a tool commonly used at the beginning of a new venture or as part of a new strategic planning process. The real value in a SWOT analysis is in the gathering of perspectives across and outside of the organization to better reveal potential obstacles and find the less obvious solutions. It is frequently used because it is simple to understand. A SWOT in evaluation could be used as a data collection tool or to present evaluation findings and create recommendations.


What is a SWOT analysis?

SWOT stands for Strengths, Weaknesses, Opportunities, and Threats. SWOT started as a business and marketing tool that is now commonly used in strategic planning sessions. It is often used at the organizational level but can be used to assess a program or project. A SWOT is normally depicted as a 2 x 2 grid or matrix, with the internal factors (Strengths, Weaknesses) in one row and the external factors (Opportunities, Threats) in the other.

Strengths and weaknesses examine what is going on internally. These are the areas that the organization can control and change. Strengths look at what the project excels at. Weaknesses are, of course, the other side of that coin; they look at what the project is not good at and could improve. Evaluation findings can uncover unexpected strengths and weaknesses that those internal to the project just might not be able to see.

Opportunities and threats are external facing. Opportunities look at the current environment that the organization exists in and where it could grow to fill service gaps in the community or ways the organization can expand. In evaluation findings, opportunities might include external factors that could support identified program improvements or external context that has led to positive program findings. Threats look at the current environment to see areas where the growth and sustainability of the organization might experience barriers to change or be negatively impacted by external context.


How to do a SWOT analysis

I have most often taken part in a SWOT analysis in a workshop format, where someone facilitates the conversation, gathering feedback and insight from people with a vested interest in the project. However, a SWOT could also be a useful way to design interview questions that look at internal and external factors impacting a project or to structure qualitative analysis from interview findings.

When gathering people to complete a SWOT analysis, consider who you are inviting. Is this an exercise with just the evaluation advisory committee or is this an opportunity to expand beyond that group to gain outside perspectives? Should leadership, frontline staff, or other groups with vested interests be included?

When doing an in-person SWOT analysis, I like to use flipcharts. I would normally have Strengths, Weaknesses, Opportunities, and Threats placed around the room. If doing a SWOT analysis virtually, some tools that could be useful include Canva charts, Microsoft OneNote, or maybe PowerPoint with a slide for each factor. Use the tools that you already have access to in order to gather the insights of your group.

There are a few things to keep in mind when facilitating a SWOT analysis. This is meant to be participatory, and everyone in the group is encouraged to give their insights. This can be done through a full group discussion of each of the factors, or depending on group literacy, sticky notes could be given to each participant to write their own ideas onto and stick onto the sheet with the corresponding factor.

When planning for your group, create guiding questions that are relevant to your project and use them to help stimulate conversation. Some questions to help facilitate the discussion:

  • What types of activities does our organization do best?

  • Where is our biggest success?

  • What type of skills do we have?

  • Where are we most effective?

  • Who are our strongest allies?

  • What processes do we need to improve?

  • What assets do we need to build or get?

  • What gaps do we have in skills?

  • How do clients perceive our organization?

  • How is technology changing how we deliver our activities?

  • Are there any funding changes that may affect our ability to continue running the organization’s activities?

  • What are current donor trends and how might those affect the organization?

These are just a few ideas of questions that can help to get some ideas flowing, but in no way is this an exhaustive list.

Once the group has completed listing their SWOT analysis, it is time to set priorities. Each group may do this a bit differently; for example, each member might individually rank their priorities. Again, this could be done with sticky notes numbered one to five, with each group member placing these numbers beside their priority areas. The group would then discuss and come to a consensus to define which items are priorities. It is possible to skip the individual prioritization, but I find that quieter group members tend not to speak up in the group prioritization without having done the individual priorities first. These priorities would go into the completed SWOT matrix and are commonly listed in order of priority.
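For a virtual session, the individual-ranking step can be tallied with a few lines of code. This sketch is illustrative only, with invented item names; it assumes each participant ranks up to five items, with their top choice worth five points and their fifth worth one:

```python
# Illustrative sketch: combine individual priority rankings from a SWOT
# session into one group ordering. Each participant's top item earns 5
# points, their second 4, and so on. Item names are invented examples.
from collections import Counter

def tally_priorities(rankings):
    """rankings: one list per participant, highest priority first."""
    scores = Counter()
    for ranking in rankings:
        for points, item in zip(range(5, 0, -1), ranking[:5]):
            scores[item] += points
    return [item for item, _ in scores.most_common()]

rankings = [
    ["volunteer burnout", "funding cuts"],
    ["funding cuts", "volunteer burnout"],
    ["volunteer burnout", "new grant program"],
]
print(tally_priorities(rankings))
# -> ['volunteer burnout', 'funding cuts', 'new grant program']
```

The resulting ordered list is meant to seed the group’s consensus discussion, not replace it.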


The following is an example of what a SWOT might look like for a small soup kitchen.

After priorities are set, an action plan or actionable recommendations are created to follow up on and pursue these priorities. The SWOT matrix itself is not the goal but a tool to help set priorities for a project, program, or organization. A SWOT can also help foster group ownership through involvement in the analysis, and it can be a great way to share information because it is easy to understand.


Now you can create your own SWOT. We have a SWOT analysis template to help get you started.

Written by cplysy · Categorized: evalacademy



Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.

Get Updates

Want to take further action or join a pod? Click here to learn more.

Copyright © 2026 · The May 13 Group · Log in
