
The May 13 Group



Mar 21 2022

Evaluative thinking to transform evaluation and organizations

One of the roles of the evaluation function is to develop evaluative thinking at every possible level.

I.) Evaluative thinking (ET)

(1) is a disciplined approach to inquiry and reflective practice that helps us make sound, evidence-based judgments as a habit,

(2) applies not only to evaluation (or evaluation units) but across the whole organization and all phases of management,

(3) has a capacity-development dimension at the individual, organizational, and structural levels.

Thomas Archibald and his team have defined "evaluative thinking" as follows:

(1) it applies critical thinking in the context of evaluation,

(2) motivated by (a) an attitude of inquisitiveness and (b) a belief in the value of evidence,

(3) it involves (a) identifying assumptions, (b) posing thoughtful questions, and (c) pursuing deeper understanding

(4) through (a) reflection and (b) perspective taking, and (c) informing decisions in preparation for action.

II.) An ET approach aims to change attitudes (motivation, ownership, and understanding), aptitudes (capacities, means, and skills), and incentives (opportunities) with respect to (1) evaluative thinking in general and (2) the usefulness of evaluation to the organization, as a tool that could be integrated into the work of every member of the organization.

Some ways evaluative thinking relates to transforming evaluation in order to evaluate transformation: (1) decentralized leadership, (2) knowledge management, (3) systems thinking, (4) collective construction.

  • It democratizes and decentralizes evaluative inquiry.
  • It draws on practical wisdom and a plurality of ways of knowing and reasoning.
  • It applies systems thinking and equity.
  • It balances intuition and rationality.

III.) How can organizations sustain continuous evaluative thinking? Because ET is a discipline, it requires a dedicated capacity-strengthening approach across the whole organization (not only the evaluation units) covering the required disciplines:

1. Strengthening leadership and motivation for ET: ensure effective leadership (centralized and decentralized) and develop a learning culture (learning that is legitimized and incentivized).

2. Strengthening capacities and means for ET:

2.1. Capacity development that creates opportunities to develop:

  1. How the organization "knows, understands, formulates, and prioritizes" its information, knowledge, and learning needs and the questions it must answer (evaluative questions),
  2. How it seeks answers and how it uses existing evidence,
  3. Strengthening the quality of the process, the questions, and the existing evidence,
  4. Creating space and time for ET: (1) formal and informal, (2) individual and collective.

2.2. Some drivers could be: (1) making ET an organizational objective, (2) explicitly integrating ET into the planning, monitoring, and evaluation cycle, (3) including demand for ET as a requirement in hiring processes, Terms of Reference, and performance measurement, (4) including the strengthening of ET as an objective of evaluations.

3. Strengthening incentives: (1) create individual and organizational accountability frameworks for ET, (2) foster relationships of trust, transparency, and sharing, (3) invest in infrastructure for knowledge use and management.

Written by cplysy · Categorized: TripleAD

Mar 19 2022

Comment on Teaching like Evaluation by Ungrading and the Logic of Evaluation

[…] not knowing how to grade themselves, particularly when the criteria may be more amorphous. Students have previously only really been involved in step 3 of the logic of evaluation process: receiving their individual grades on assignments. They are rarely involved in developing criteria […]

Written by cplysy · Categorized: danawanzer

Mar 16 2022

How to create logic models and theory of change using Canva.

So by now, you might have guessed that I really like Canva.

Usually I consider myself to be software agnostic. I’m always on the lookout for the best tool for the job. But the thing is, nowadays I find that I’m almost always in Canva. It’s made my designer life that much easier, so I find myself always using the tool.

  • Infographics > Canva
  • Social Media Images > Canva
  • Reports > Canva
  • Videos > Canva
  • What about Logic Models and Theory of Change? I’m pretty sure you can guess.
Canva Logic Models & Theory of Change

In today’s post:

  • A rundown of creating (and adapting) a basic results chain logic model using Canva.
  • A rundown of creating (and adapting) a basic theory of change using Canva.
  • A rundown of creating (and publishing) a web-based interactive logic model using Canva.
  • A couple of Canva templates so that you can start where I finished.
Zombie evaluators cartoon by chris lysy of freshspectrum. 

"Yes, I get it, you all want to eat brains.  But why do you want to eat brains? What is our intended outcome here?"

Guides for developing logic models and theory of change.

Just in case it wasn’t obvious, this is not a guide on how to fully develop a logic model or theory of change for your program or organization. There is a lot of hard thought work behind the scenes that goes into developing models beyond the finished diagram.

  • Here is a UNICEF Methodological Brief written by the amazing Patricia Rogers – Theory of Change
  • Here is a well-loved guide (created in 2004) from the W.K. Kellogg Foundation – Logic Model Development Guide
  • This one is for those who want the CDC Approach to Creating Logic Models
  • If you want an overly simplified approach, I created a guide – How to create a basic logic model [activity book]
  • Or perhaps you would like to create a Theory of Change using guidance from NCVO – How to Build a Theory of Change

Now that that’s out of the way, let’s jump to putting the metaphorical pen to paper.

A Simple Results Chain Logic Model – Example from the W.K. Kellogg Foundation

Kellogg Foundation Logic Model Example

Okay, so as a starting point I’m going to use this example logic model found in the Kellogg Foundation logic model guide.

Recreating the Logic Model using Canva

When it comes down to it, most basic logic models are really simple to design. It’s just a bunch of shapes, arrows, and lines. That’s why so many people just end up creating these things in Word or PowerPoint: it’s easy enough to do.

Kellogg Logic Model Created with Canva

So that’s where I started. I’ll just replicate a logic model from the guide using shapes and text boxes.

Logic Model created in Canva

It didn’t take too long to create something that looked really similar to the original. And because it’s now in Canva, I have a lot of download options. If I just want an image I’ll usually download as PNG. If I want something printable, then I’ll download as a PDF Standard.

Exporting the logic model to PDF

Adapting your Canva Logic Model

The nice thing about Canva is that it’s pretty easy to duplicate and adapt your logic model, trying out different styles. I find the outline boxes to be a little visually jarring. So instead I replaced the outline boxes with some solid color light gray boxes.

Alternative Canva Logic Model Style

The shapes exist separately from the text, so you just insert the rectangle and send it to the back to set behind the text blocks. If you spend a lot of time moving the boxes around it’s a good idea to group the individual text/shape pairs.

Grouping items in Canva

Color coding the logic model.

Once you have the base shape and text there are all sorts of alterations you can do to the design. I know there are a lot of evaluators who like to color code different elements. That’s certainly simple enough to do.

Canva Logic Model Alternative

Occasionally you’ll want a softer color, especially for background elements. One way to do that is just make the boxes slightly transparent.

Changing colors in Canva

Photo annotating the logic model.

If you find yourself creating a lot of different logic models, annotating them with some photographs and background elements can really help you differentiate. Especially if the photographs pair well with the actual project. I created this corny version with just some Canva stock photos.

Canva logic model with photo annotations

I use Canva a lot, so I’ve made the investment in a pro account (~$10/month at the time of this post). Given the amount of stock content I use and the bonus features, it is well worth the subscription cost. One of the features I like to use is the background remover. It’s a nice way to make certain stock images fit almost anywhere.

Background Remover in Canva

Coding Logic Model Elements

If your logic model is still being developed and fine tuned it can be a good idea to code the individual blocks. This makes it a lot easier to talk about individual elements.

Logic Model element map

For this I just shrunk the text and offset it to the right of the gray boxes. Then I darkened the side to create a space for codes.

Changing font size in Canva

Icon Illustrating your Logic Model

My favorite way to adapt a logic model is to simply icon illustrate the individual elements.

Icon Illustrated Logic Model

Just move from element to element looking for icons that somewhat illustrate each block of text. There are all sorts of icon styles available within Canva.

Showing how to find icons in canva

A Simple Bottom Up Theory of Change – Example from NCVO

NCVO Theory of Change Example

Okay, so maybe you are less of a logic model person and more of a theory of change person. Let’s do the same thing we did for the logic model with this NCVO example theory of change.

Recreating the Theory of Change using Canva

This starts off just like the logic model. Just recreate (or create) your theory of change using Canva shapes and lines.

NCVO Theory of Change Example created with Canva

Adapting your Canva Theory of Change

For this one I am only going to icon illustrate. But I’m also only going to icon illustrate the outputs (not the outcomes). I find collections of different shapes to be just a bit dull, so I replaced the ovals with icons and little circles to anchor the arrows. It’s a really simple tweak but the whole theory of change feels more open to me now.

Illustrated Theory of Change

Turning our Results Chain Logic Model into an interactive web page using Canva

A few years ago I created a prototype of an interactive logic model in a prototyping tool called InVision. I then recreated the same prototype as a PDF. But recently Canva has rolled out a Beta version of a website builder. So since I already had a logic model created I thought I would try adapting it into an interactive.

So that’s what I did, and you can check it out by clicking this link.

Interactive Theory of Change created in Canva

A little bit of copy and paste and poof, now I have an interactive logic model. The goal of this tool is to create a way to walk a reader through the model, piece by piece. This creates ample space for additional context and conversation that just won’t fit in a traditional model.

Example from the Canva created web based logic model

Not only can you publish your design to a Canva site domain for free, you can also purchase a fresh domain or publish to an existing domain you own.

Example of how to publish Canva web pages to the web.

Want to start where I finished? [Templates]

No need to start from scratch. Now that I’ve created some basic templates, you can start where I finished.

Here is a link for the Logic Model Starters template.

Canva Logic Model Template

Here is a link for the Theory of Change template.

Canva Theory of Change Template

And if you end up creating something using these templates, please do share it with me in the comments!

Written by cplysy · Categorized: freshspectrum

Mar 16 2022

Data Collection and Participation For Busy People

Among the greatest challenges of doing research and evaluation is ensuring you get participation from enough (and the right) people. Surveys are everywhere. It feels like everyone wants our feedback on just about everything. Yet the more surveys there are, the more we should be concerned about data quality, too.

On top of that, we are surrounded by media messages, distractions, and ‘noise’ that can make attention a very precious (and rare) commodity.

It can be daunting.

How do we get people to participate and get good quality data from that participation amongst the noise? We’re going to outline an approach to data collection that goes from methods to conversations.

From Methods to Conversations

There is a lot of research on improving existing methods like surveys or asking better questions. Tips like these can be useful, but they may also distract us from addressing other issues. For starters, consider the use case for doing research and evaluation.

Who benefits from the research? Is it those who are participating? If not, why would someone want to take time and spend energy answering your questions? We can no longer assume people will participate out of a sense of wanting to help. The deluge of research requests has made data gathering an imposition more than an opportunity for most people.

One of the ways that we deal with this is to shift the focus to creating conversations and learning opportunities. By thinking of data collection as part of a conversation we can change the way we gather data. This works for evaluation, design research, or any applied research context.

A great conversation is about creating exchange. That means some back and forth between the parties. What if you could do this with your data?

Data-Based Conversations

The concept of data-based conversations is all about using what you gather as the foundation for the exchange between people. This means gathering relevant information from people and then sharing what it is that you find. Individuals provide their thoughts, opinions, attitudes, and reflections and as researchers we provide the synthesis and opportunity to share what we’ve learned from others. It works because it creates exchange and value.

People choose to participate because they can both contribute and receive insights about their peers. Please keep in mind that this approach only works when people are interested in your topic.

We recommend that you design multiple, short engagements to reduce response burden. Rather than using a single, large survey, we suggest breaking it into smaller batches of questions. In between each batch, we provide rapid synthesis learning reports to share what we’ve learned from others.
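As a minimal sketch of the batching idea (the question bank and batch size here are hypothetical, not from the post):

```python
# Instead of fielding one long survey, split the question bank into short
# batches; a rapid learning report is shared with participants between batches.
questions = [f"Q{i}" for i in range(1, 13)]  # 12 questions in the full bank
batch_size = 4  # each short survey carries only 4 questions

batches = [questions[i:i + batch_size]
           for i in range(0, len(questions), batch_size)]

print(len(batches))  # number of short surveys to field
print(batches[0])    # the first short survey
```

Each batch then becomes its own short engagement, which keeps any single request small.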

Our rapid learning reports might be short summaries, infographics, or distilled tips gleaned from the data. Using visual media is particularly helpful because it’s simple and accessible. It says to our participants: “we heard you and here’s what others have said.”

Data collection can also include short interviews, social media exchanges, or panel feedback. The methods matter less than the way that we structure the engagement. This approach builds trust and familiarity and increases data quality.

Time to Talk?

People are busy and less invested in your product or service than you think. This approach of creating a conversation, rather than just asking for (and taking) things from people, changes the relationship. By enlisting people as partners and focusing on sharing what you learn in ways they can benefit, you serve others rather than use them.

As always, this must be done with transparency, ethics, respect, and commitment to delivering on your promises. We’ve found this approach works and it adds value to participants. People like to learn and know what’s going on with their peers. Gathering data this way is less intrusive, more natural, and less burdensome.

If you’ve got a big research question to ask, consider ways you can transform your data collection into a conversation. You might find that you get more participation, greater engagement, and better quality data.

Want to build this approach into your evaluations or research? We can help and share our experience using this approach to reach busy people. Contact us and let’s grab a coffee.

Image credits: Karen Lau on Unsplash, Jon Tyson on Unsplash, and Firmbee.com on Unsplash

The post Data Collection and Participation For Busy People appeared first on Cense Ltd. .

Written by cplysy · Categorized: cameronnorman

Mar 13 2022

The “mixing” in mixed methods

In evaluation, we use multiple types and sources of data, diverse methods of collection, or multiple evaluators to answer evaluation questions. Data integration is a way of merging these data from different sources through mixed methods. Data integration can enhance reliability in evaluation findings (e.g., by increasing the ability of findings to be replicated). It can also help to discover contradictions and inconsistencies that otherwise might not have been revealed between different sources and can clarify the results of an evaluation.  

The ability to synthesize large amounts of data to identify important information is an essential skill for evaluators. Depending on the scope of the evaluation, we often collect large amounts and different types of data, and we must triangulate them to get to the main evaluation findings. “Mixed methods” is intentionally using one data source with another, with the purpose of triangulating the results, whereas “multiple methods” is simply using different data collection strategies in the same program, but with no intention to “mix” or integrate them.  

To give you a simple analogy, “mixed methods” is like mixing coffee and milk together (e.g., latte), while “multiple methods” is having coffee and milk separately. They are both great but very different beverages.  

In this article, we discuss how qualitative and quantitative data can be integrated at the study design level, methods, or analysis level. 


Data integration at the design level

At the design level, data can be collected concurrently, or one approach can be used to inform the other.  

  • In exploratory sequential design, we can collect and analyze qualitative data and use the findings to inform upcoming quantitative data collection. A good example would be using interview or focus group results to design survey questions. This exploratory approach improves the survey as it helps to focus the questions on topics that are relevant or important to participants.  

  • Explanatory sequential design uses the findings of quantitative data to plan qualitative data collection. For example, a survey finding can be further explored using interviews to understand what, how, and why. This approach often leads to a much richer discussion as the evaluator already understands the underlying issues and can further explore those specific themes in interviews and/or focus groups.  

  • If we conduct the qualitative and quantitative data collection simultaneously in convergent design, the findings from one approach can still inform and drive change in an interactive approach. For example, using interviews and survey findings in multiple phases such that the data interact to inform subsequent versions and the final result. This approach is resource-intensive and requires many cycles of participation from respondents.  


Data integration at the methods level

Data integration at the methods level occurs when the qualitative data collection is linked to quantitative in the data collection or analysis.  

  • Data collection can be linked through the sampling frame (connecting) whereby participants for one method can be recruited/invited to participate in another method (e.g., recruiting focus group participants from survey respondents).  

I often use this approach in my evaluation practice, recruiting interview or focus group participants through surveys. It is often difficult to reach program participants; thus, using one data collection effort to recruit for other methods reduces the burden on participants and minimizes evaluation cost. I use the app Calendly, which works like a dream for scheduling interviews because the link to the app can be inserted at the end of the survey. Calendly automatically shows interested participants potential interview times and lets them schedule a time that works for them.
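A minimal sketch of this connecting step (all field names and records are hypothetical): the sampling frame for the follow-up interviews is simply the subset of survey respondents who opted in to be contacted again.

```python
# Hypothetical survey responses, each with an opt-in flag for follow-up.
survey_responses = [
    {"id": 1, "email": "a@example.org", "willing_to_interview": True},
    {"id": 2, "email": "b@example.org", "willing_to_interview": False},
    {"id": 3, "email": "c@example.org", "willing_to_interview": True},
]

# The interview sampling frame is connected to the survey sampling frame:
# only opted-in respondents are invited.
interview_invitees = [r["email"] for r in survey_responses
                      if r["willing_to_interview"]]
print(interview_invitees)
```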

  • Other ways of integrating at the methods level include embedding, which links data at multiple points (e.g., using a first round of qualitative data to understand and control for potential bias in an initial survey, then using a second round of qualitative data to further explore the survey results; here there are two rounds of qualitative data collection with a survey between them, and the evaluator uses the findings from each data collection effort to inform the next one), or bringing the data together for analysis (merging). 


Data integration at the Interpretation and Reporting level

Data integration at the Interpretation and Reporting level often occurs in one of the following approaches:  

  • Narrative – describing the qualitative and quantitative findings in a report. The evaluator weaves the qualitative and quantitative findings together on a topic-by-topic basis or presents the findings in different sections.  

In my evaluation reports, I often do a combination of weaving and presenting in different sections. In the results section, I present the results of administrative data collection, surveys and interviews separately and bring them all together by general themes/topics in the discussion or key takeaway sections.  

  • Data transformation – one type of data is converted into the other type, then the transformed data is integrated with the other data and analyzed simultaneously. An example would be transforming qualitative data into numeric counts and variables using content analysis so that it can be integrated with a quantitative database.  
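As a rough sketch of that "quantitizing" transformation (the codes, participants, and scores below are invented for illustration): coded qualitative data is converted to per-participant counts so it can sit alongside quantitative survey variables in one dataset.

```python
from collections import Counter

# Hypothetical qualitative codes applied to each participant's transcript.
coded_transcripts = {
    "p01": ["access_barrier", "cost", "access_barrier"],
    "p02": ["cost"],
}
# Hypothetical quantitative variable, e.g., satisfaction on a 1-5 scale.
survey_scores = {"p01": 4, "p02": 2}

# Merge the code counts with the survey variable into one record per person,
# ready for simultaneous analysis.
merged = {
    pid: {"satisfaction": survey_scores[pid], **Counter(codes)}
    for pid, codes in coded_transcripts.items()
}
print(merged)
```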

  • Lastly, we can integrate data using joint displays, which incorporates the qualitative and quantitative data through visual means to present new considerations. The example below presents survey and focus group results side-by-side to provide comprehensive information. If you would like more information on joint displays, check out this article.  
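A bare-bones sketch of a joint display (the themes, statistics, and quotes here are made up): each row pairs a survey finding with an illustrative focus group quote for the same theme, so both strands of evidence are read side by side.

```python
# Hypothetical paired findings per theme: (theme, survey stat, quote).
rows = [
    ("Program access", "62% reported enrollment difficulty",
     '"The form took me three tries."'),
    ("Staff support", "88% rated staff as helpful",
     '"Someone always called me back."'),
]

# Render the joint display as an aligned plain-text table.
header = f"{'Theme':<16}| {'Survey finding':<36}| Focus group quote"
lines = [header] + [f"{theme:<16}| {stat:<36}| {quote}"
                    for theme, stat, quote in rows]
joint_display = "\n".join(lines)
print(joint_display)
```

In practice, the same structure is usually built as a formatted table in the report rather than in code; the point is the pairing, not the tooling.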


Whichever approach you choose, integrating qualitative and quantitative data and triangulating the results often helps generate new insights and reliable evaluation results. Evaluators should consider which evaluations would benefit from mixed methods and carefully choose their data integration approach.  

Which data integration approach do you use often? Let us know in the comments.  

 

The data integration approaches listed here are a summary of the article “Achieving Integration in Mixed Methods Designs—Principles and Practices” by Michael D. Fetters, Leslie A. Curry, and John W. Creswell.  



Written by cplysy · Categorized: evalacademy


Copyright © 2026 · The May 13 Group
