
The May 13 Group


Sep 25 2023

Building Capacity in Evaluation


Most evaluators are hired to evaluate something: a program, grant-funded activities, a new approach. But what do you do if you get asked to build capacity in evaluation? That is, to help another team or organization do their own evaluation?

There are a few courses available (hint: check out our Program Evaluation for Program Managers course!), but maybe the organization wants an evaluation consultant’s help to design the process and tools in a way that can be sustained in-house. That was exactly the ask on a recent contract of mine.

The Ask:
Help us evaluate our impact in a way that we can monitor, review and sustain on our own!

The start of this work wasn’t all that different from a program I would evaluate. I sought to find out: “What do you want to know and why?” and “How will you use it?” I held the same kick-off meeting as I normally do (check out: The Art of Writing Evaluation Questions; Evaluation Kick-Off Meeting Agenda (Template); How to Kick Off Your Evaluation Kick-Off Meeting). I figured, at the end of the day, this was still about designing a program evaluation. The key difference was that instead of our team implementing the evaluation, we had to train their staff to do it: the data collection, the analysis, and the resulting action.

After outlining key evaluation questions to understand what they wanted to be able to monitor, we started working on a toolkit. Our vision was that a toolkit could be a one-stop shop for all things related to this evaluation – a place any member of the organization could reference to understand the process and learn how to do it. At the start of this journey, our proposed table of contents was quite vague and high-level, with sections like “What is the process?” and “Where do I find the data collection tools?”. But the more we field-tested (more on this later), the more the toolkit grew.

I learned a lot from this process. Here are some of my top takeaways:

Consent. As evaluators, we can’t take for granted that others know about the informed consent process. Most staff at an organization don’t routinely collect personal information and experiences, and likely haven’t thought about informed consent in a meaningful way. A big part of our toolkit therefore focused on defining consent: why it’s important and how to obtain it. We even shared some Eval Academy content: Consent Part 1: What is Informed Consent and Consent Part 2: Do I need to get consent? How do I do that?

Confidentiality and anonymity. Part of consent covers whether the information obtained will be kept confidential or anonymous. This raised another key learning: most staff don’t think about what this actually means or how it’s done. Staff often assume (correctly or incorrectly) that their organization has policies in place and that they wouldn’t be allowed to do things that were unethical. This isn’t always true. We included in the toolkit some key information on what confidentiality and anonymity mean and how they applied in this specific context. For more on this, check out our article Your information will be kept confidential: Confidentiality and Anonymity in Evaluation!

Interviewing skills. For their evaluation, the organization wanted volunteers with a range of backgrounds to conduct some client interviews. This prompted our team to figure out how to build capacity in interviewing. We came up with Tip Sheets for Interviewing, then created and recorded some mock interviews for training purposes. Because we wouldn’t be there to run the training, we wanted these volunteers to have some direction, so we included a worksheet for trainees to reflect on the recorded interviews as part of their training: Why was the interviewer asking that? Why was that wording used? What did the interviewer do when the interviewee said this…? We provided materials on the role clarity of an interviewer – not a therapist, but an empathetic listener – and ensured that the interviewer would have access to a list of community resources if needed. We also raised awareness about vulnerable populations and offered some preparation for scenarios that might occur with individuals who are feeling distress.

Analysis. Completing data collection is just part of an evaluation. We knew this organization didn’t have much capacity or expertise for diving into Excel spreadsheets, so we built them a dashboard. They could gather survey data in Excel and, at any time, auto-populate a dashboard that would visualize key learnings for them. We included a step-by-step instruction guide to help them out. We also wanted to make sure the organization understood what it means to be the keeper of data, so we shared our Eval Academy data stewardship infographic.
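The dashboard itself lived in Excel, so there is no code to show from the project. But as a rough illustration of the auto-populate pattern for teams with a little scripting capacity, here is a minimal Python sketch of the same idea: point a script at the latest survey export and regenerate a key chart. The workbook name, sheet name, and “satisfaction” column below are hypothetical examples, not the organization’s actual data.

    # Minimal sketch of an auto-populating dashboard chart.
    # File, sheet, and column names are hypothetical examples.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Load the latest survey export (re-run this whenever new data arrive)
    responses = pd.read_excel("survey_responses.xlsx", sheet_name="Responses")

    # Count responses to a hypothetical 1-5 satisfaction question
    summary = responses["satisfaction"].value_counts().sort_index()

    # Redraw the chart from whatever data are currently in the file
    fig, ax = plt.subplots()
    summary.plot(kind="bar", ax=ax)
    ax.set_xlabel("Satisfaction rating (1-5)")
    ax.set_ylabel("Number of respondents")
    ax.set_title("Client satisfaction at a glance")
    fig.savefig("dashboard_satisfaction.png", bbox_inches="tight")

The point is the workflow rather than the tool: staff re-run one step, and the visuals refresh from whatever data they have collected so far.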

Reporting and reflection. The dashboard was a good start, but we really wanted to support this organization in using the information they were gathering. Also, some data were qualitative and not well represented in the dashboard. We built a report template with headings that signalled where to find information that might answer their key questions. We also built a list of reflective questions to help them think about what their data showed and what actions might be possible, which you can access here: Questions to Get You Thinking about Your Data.

This all sounds kind of straightforward, right? We thought about what a team needs to know about evaluation and built them those things. Not so! This entire process was iterative – more of a two-steps-forward-one-step-back kind of journey. With each new idea (“Ah, they need to know about consent”), we’d learn of something else to add (“Oh, they also need to know more about confidentiality”). To help with this process, we did a lot of field testing.

We loosely followed a Plan-Do-Study-Act quality improvement format. We’d have a staff member test the process on 3–5 clients; we’d huddle to talk about what worked well, what didn’t, and what unexpected things we encountered; then we’d tweak and repeat. Eventually we landed in a spot that seemed to work well.

At the end of it all, the Toolkit (now with a capital T!) was pretty large, and we ended up breaking it up into three core sections.

  1. Describing the process. Who does what and when, what requirements exist for various roles, where to find the data collection tools, and links to resources. We also included some email invitation templates, scripting for consent, and a tracking log.

  2. Training. The second section focused on the niche skills that may come as second nature to a seasoned evaluator – this is where we included the mock interview recordings, tip sheets, confidentiality and consent primers, guidance on when and if to disclose information, and how to be a good data steward.

  3. Reporting. The final section described what to do with the information – the dashboard, the report, the reflective questions and a recommended timeline.  We created step-by-step instructions for how to get data from an online survey platform into the dashboard and from the dashboard into the report.

This was a really different experience for me, and I learned a lot about slowing down, explaining process, and not making assumptions. It’s strange not to follow up to see how the process is working. We left them with the final recommendation that all evaluation processes should be reviewed – there is risk in going on auto-pilot. Evaluation processes are only worthwhile if they are answering key questions and providing actionable insights. I think it showed real insight and good future planning for this organization to understand the value of evaluation and to want to learn enough about it to do it on their own.

Written by cplysy · Categorized: evalacademy

Sep 25 2023

Redesigning a Thesis Chapter

I’m an epidemiologist and public health researcher who studies health policies on infectious disease.

I got the opportunity to work with a public health agency, which I was really excited about.

Until I had to present my research to a group of policy makers…

Before: The Dusty Shelf Report

Condensing two full chapters (73 pages of my thesis) into a short report for the policy-making group seemed like an impossible task.

That’s when Report Redesign came to the rescue!

As this research was being conducted in an academic setting, I couldn’t entirely do away with all the technical details (or what Ann would call the Dusty Shelf Report 😊).

But I did manage to apply the 30-3-1 principles to summarise the two chapters into:

  1. a shorter 23-page report (with appendices) and
  2. 11 slides for a 10-minute presentation to the policy-making group.

Choosing the Final Outputs: A Short Report and a Slideshow

Working with the public health agency, I realised that although they expected the technical details of the study methods and results to be included, the overall format they expected was different from what I was used to in academia.

They wanted slides to present to the policy-making group, plus an accompanying report with more details on the study in case some of the members wanted more detailed information.

I thought this would be a good opportunity to apply some of what I had learnt during the Report Redesign course.

Choosing Which Findings to Include

We had a few meetings with the research group to identify the most important findings to include in the presentation and report.

Given the audience was technical, we agreed to include:

  • An overview of the study
  • A sentence on what the goals/aims of the study were
  • Survey respondent characteristics
  • A results section highlighting responses to the main survey questions
  • Limitations

Focusing on just these areas, I was able to whittle the two thesis chapters down to 23 pages, with some additional information in the appendices.

The Shorter Report

In the original version of the write-up, I did have some tables, but they were too technical (too many decimal places; statistical terms like p-values).

I also had some graphs that used my software’s default settings, without any editing.

For the report, I aimed to have one or more visuals on every single page (a goal covered in Report Redesign).

This included flow charts, graphs, tables, text boxes, and icon arrays. Whatever was needed to best communicate the takeaway finding from the research.

The agency was going to use their own design team for the final branding and layout, so I didn’t have to bother with that.  

The Presentation Slides

I then had to whittle the report down further, into 11 slides for the presentation.

I decided to limit the background information and focus on the key results.

Written by cplysy · Categorized: depictdatastudio

Sep 25 2023

The Concept of an Organizational Readiness Assessment

An organizational readiness assessment is a formal measure of our organization’s readiness to undergo a major change or take on an important new intervention. We don’t want to launch into a major change, intervention, strategy, program, or project without knowing whether our organization has the capacity and resources to carry it out effectively.

Conducting an organizational readiness assessment gives us the knowledge and confidence that our organization’s proposed effort will succeed if it decides to go ahead with it. It can also protect our organization’s reputation by helping it avoid a potentially significant failure from taking on an intervention it was not prepared to complete.

A readiness assessment generally evaluates the following:

  • Goals and objectives of the intervention
  • Expectations and concerns
  • Leadership support for the intervention
  • Capacity to adapt to change
  • Ways to minimize the intervention’s potential for failure
  • Intervention governance and decision-making
  • Other critical needs of the intervention

Written by cplysy · Categorized: TripleAD

Sep 20 2023

Try This: In Person vs Online Workshop Prep

Try this out and let me know how it goes for you. As an introverted workshop facilitator, I used to believe that online workshops were easier to facilitate than in-person ones. Now I see that online workshops aren’t easier or harder; they just have their own set of quirks. There are benefits and drawbacks to […]

The post Try This: In Person vs Online Workshop Prep appeared first on Nicole Clark Consulting.

Written by cplysy · Categorized: nicoleclark

Sep 19 2023

How to create video explanations

Today’s blog post will walk you through a method you can use to create explanation videos using Canva and Zoom.

The concept was designed as a simple way to walk an audience through some type of model (e.g., a logic model or theory of change).

The Concept Video

This video was recorded using Zoom and edited with Canva. It is embedded here using Canva’s Embed feature.

The Basic Steps.

There are eight basic steps in this process.  Let’s go ahead and walk through each one.

  1. Write the script.
  2. Record the video.
  3. Drop it into Canva.
  4. Set up your initial frame.
  5. Split the scenes.
  6. De-emphasize the model.
  7. Add transitions.
  8. Export your video.

Step 1. Write the script. 

Pretend you are explaining your model to a colleague.  Break the explanation down into small pieces and walk through it piece by piece.

Step 2. Record the video. 

Now that you have the script, record yourself or someone else reading the script.  I suggest using Zoom, because you probably already know how to use it.  I also suggest downloading the recording to your computer in high definition.  As you read the script, pause slightly between segments to leave room for video transitions.

Step 3. Drop it into Canva. 

Now that you have a video, click the button to create a 1920 by 1080 video in Canva.  Once you have the file started, drop in the video.

Step 4. Set up your initial frame. 

I like going full screen for the introduction and closeout, but for the majority of the video I want it to be me alongside the actual model.  You can set this up by adding in a frame and dropping the video inside.

Step 5.  Split the scenes.

Once you have the general look, go through and split your video by scene.

Step 6. De-emphasize the model. 

To focus audience attention, you can take a simple model and emphasize what you want the audience to see.  Or, you can take a bold model and de-emphasize what you don’t want the audience to see.  That’s what I do scene by scene, using Canva’s transparency slider.

Step 7. Add transitions. 

I have Canva Pro, and with that some extra transition options.  My favorite to use is the match and move transition.

Step 8. Export your video. 

After you are through, make sure to watch it a time or two just in case you need to edit something.  Then when you are happy, download the video.

Bonus. Embed your video.

Did you know that you can embed video right from Canva? One caution from personal experience: some organizations block Canva. But if it works for your audience, embedding straight from Canva certainly saves extra steps.

How are you using video in your reporting?

Doing anything interesting with video? Have you ever used Canva to edit video? Leave a comment and let me know.

Written by cplysy · Categorized: freshspectrum

