
The May 13 Group

the next day for evaluation


Mar 25 2024

(Mostly Free) Resources for Learning How to Code Qualitative Data
What is coding for qualitative data?

If you’ve found your way to this article, you probably have an idea of what coding for qualitative data looks like. Hint: It doesn’t require knowing Python, C++ or any other programming language.

Qualitative coding is a systematic process of labelling and organizing qualitative data. It is a way to analyze non-numerical data like interview and focus group transcripts, photographs, and field notes. 

In my current role as an evaluator, I usually use coding as a way to identify common and interesting themes from interviews I’ve conducted. These themes are then examined as a whole, to see what kind of narrative insights they can provide about the program that is being evaluated. If you want to know more about how to analyze qualitative data thematically, check out our Eval Academy article: Interpreting themes from qualitative data: thematic analysis.


How do you do it?

There are a lot of different ways to code qualitative data. In the past, I’ve used paper and a pen, sticky notes, MS Word, Excel, and software programs like NVivo and Dedoose. I usually code my data thematically and inductively because it helps me to uncover unexpected insights. Inductive coding involves creating codes as you go through the data, which forces me to keep an open mind about what the data could be saying – even if it differs from my preconceptions. I’ve also done deductive coding, starting with a set of agreed-upon codes. I find that this method helps me look for the answers to my evaluation questions more efficiently and can be useful when I don’t have a lot of time for analysis.

When I first started learning how to code qualitatively, it was a bit overwhelming because there are so many ways to do it. I really struggled to accept that there isn’t necessarily one “right way” of coding in evaluation. There are some general rules for rigour, but beyond that, everyone seems to have their own preferred style. This differs from the approach in academic institutions, where you are often required to pick an established qualitative method with a specific underpinning theory, stick with it, and document your steps for review.


In my struggle to find answers about how to code qualitative data, I came across some resources that helped me learn a bit more about the theories behind qualitative coding and how others do it. These resources continue to help me to refine my coding processes. I hope you find them useful as well!

And if you still have questions after exploring my resource list, I recommend asking other evaluators and researchers about their methods or taking a course with a practical component.

You’re also welcome to leave a question or a comment on our Eval Academy LinkedIn!


(Mostly Free) Resources List for Learning Qualitative Coding

Most of these resources include some kind of step-by-step process for coding qualitative data. Some of them also include information on the different types of qualitative coding and when to use them.


Courses:

Delve’s Free Qualitative Data Analysis Course (mostly free)

Delve has created a free course on qualitative coding to promote their paid coding platform. This is a short, self-paced course suitable for beginners. I like that it guides you through coding for the first time with short, practical assignments. You don’t need to use their software to complete the course, but you can trial it for free if you want to use it for your learning.  

Qualitative Research Methods: Data coding and Analysis (mostly free)

MITx Online offers free access to this self-paced course, a shortened version of a semester-long course taught by Professor Susan Silbey of MIT. The paid version allows you to participate in the assignments and receive a certificate upon completion. The free version still gives you access to all the content, as long as you sign in with an MITx Online account. I found this course to be a really good study of step-by-step qualitative coding within an academic setting. Professor Silbey does a great job of explaining and demonstrating things like how to do line-by-line coding, create a codebook, and refine your codes.


Videos:

Qualitative Data Analysis 101 Tutorial: 6 Analysis Methods + Examples (free)

This is a 25-minute informational video by Grad Coach on YouTube about the different types of qualitative analysis. This is NOT a step-by-step guide to coding, but it does explain 6 different types of qualitative methods and when to use them. This is a good video to watch to learn about what kinds of qualitative methods exist outside of the ever-popular thematic analysis.

Qualitative Coding Tutorial: How to Code Qualitative Data for Analysis (4 Steps + Examples) (free)

This 27-minute YouTube video by Grad Coach explains the minute details of how to code qualitatively. It goes over some steps for how to code, as well as discussing different methods that you can use at each stage of coding. It ends with some tips for how to code your data.

Qualitative data analysis – Coding Tutorial – Initial Codes | “From Codes to Themes” episode 1 (free)

This video is the first part of a YouTube series on how to code by Dr. Kriukow. He’s sort of a qualitative data analysis influencer – if that is a thing. In this 23-minute video, he explains his thought process while he demonstrates how to code a transcript. If you ever wanted to know how other people code, this one is a good demonstration to watch. If you like the way that he codes, I think he has a paid course on Udemy. I’ve never taken it before, so I didn’t include it in this list.

Qualitative coding and thematic analysis in Microsoft Word (free)

MS Word is probably one of the most accessible ways to code because it is an app that most people already have on their computers. Dr. Kriukow shows you how to use MS Word to code and thematically analyze your data in this 28-minute YouTube video.

Ten Top Tips in Qualitative Data Analysis for New Researchers – Jude Spiers (free)

The International Institute for Qualitative Methodology hosted a master class webinar series, and this 1-hour lecture was part of it. This video is less practical than the other resources listed here, but it does offer some useful tips and tricks for how to code qualitative data in an academic setting.


Articles:

Interpreting themes from qualitative data: thematic analysis (free)

This Eval Academy article is one of our most popular. It’s a thorough guide on how to do thematic analysis, including a useful illustration on interpreting themes. Most of the other resources in this list focus on coding, but this one focuses on what you do AFTER coding all your data.

Using thematic analysis in psychology (mostly free)

The authors of this academic article on thematic analysis are well-known researchers of qualitative data methods. If you’re looking for some peer-reviewed literature on how to conduct thematic analysis, you should definitely read this one. It includes step-by-step explanations on how to conduct thematic analysis. This article may be paywalled on some sites.

The Essential Guide to Coding Qualitative Data (free)

Alongside their free course, Delve also has a free guide to coding qualitative data. It discusses a range of useful topics such as how to transcribe interviews, tools for coding qualitative data, and a step-by-step process for coding.

Analyzing Qualitative Data (free)

Learning for Action wrote this step-by-step article with tips for how to analyze qualitative data. Their example uses Excel to code the qualitative data, so it is a useful guide for that specific type of coding method.


What are your favourite resources for learning how to code qualitative data? Let us know in the comments below!

Written by cplysy · Categorized: evalacademy

Mar 25 2024

The Frustration of Searching for Evaluation Content

If you are an evaluator, or someone interested in learning more about evaluation, you might have experienced the frustration of searching for evaluation-related content online. The word evaluation is used in so many different contexts and industries, that it can be hard to filter out the noise and find the information you’re looking for.

The term ‘evaluation’ is a common denominator in numerous fields. From education to healthcare, from technology to arts, ‘evaluation’ is a universal process of assessing, measuring, and judging. It’s a critical component of decision-making, improvement, and advancement in any industry. In education, we evaluate students’ performance. In healthcare, we evaluate patients’ health conditions. In technology, we evaluate software performance. In arts, we evaluate the aesthetic appeal of a piece. The list goes on. The omnipresence of ‘evaluation’ in our vocabulary is a testament to its importance, but it also creates a significant challenge when searching for specific ‘evaluation’ content online.

I recently started following #evaluation on some social media accounts. Much of the content I get is not at all about the professional development or learning opportunities I’d hoped for.

The problem gets even worse if you try to do some job searching. Lots of people have “evaluation” of something in their job description.

And if you happen to be an evaluation consultant looking for RFPs for evaluation contracts, it can be nearly impossible. Nearly every RFP in every field mentions the “RFP evaluation process”, thus wiping out your keyword search term in one fell swoop!
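The keyword-noise problem is easy to demonstrate: a naive substring search matches every RFP that mentions an “evaluation process”, even when the contract has nothing to do with evaluation, while a more specific phrase cuts out most of the noise. A toy filter (with hypothetical RFP titles) shows the difference:

```python
# Hypothetical RFP titles illustrating the keyword-noise problem.
rfps = [
    "Program Evaluation Services for a Youth Mentorship Initiative",
    "Road Resurfacing Project (see RFP evaluation process, Section 4)",
    "IT Support Contract - proposals scored via the RFP evaluation process",
]

# Naive search: any mention of "evaluation" matches -- lots of noise.
naive = [title for title in rfps if "evaluation" in title.lower()]

# More specific phrase search: far fewer false hits.
specific = [title for title in rfps if "program evaluation" in title.lower()]

print(len(naive))     # 3 -- every RFP mentions "evaluation" somewhere
print(len(specific))  # 1 -- only the actual evaluation contract
```

The same idea applies in any search engine that supports quoted phrases: searching for "program evaluation" rather than a bare "evaluation" behaves like the second filter.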


How can we improve the searchability of evaluation content?

As evaluators, we can do our part to improve the searchability of evaluation content online. Here are some suggestions:

  • Use specific and descriptive keywords when creating or sharing evaluation content, such as “program evaluation”, or “outcome evaluation”  

  • Use hashtags, tags, or categories to label and organize evaluation content on social media platforms, blogs, or websites

  • Join and follow online communities, networks, or groups that are dedicated to evaluation, such as your local evaluation association

  • Subscribe to newsletters, podcasts, or blogs that feature evaluation content (hint: have you signed up for our newsletter? Scroll to the bottom of this page to sign up!)

  • Attend webinars and workshops by evaluation experts

  • Share and recommend evaluation content that you find useful, interesting, or relevant with your colleagues, friends, or followers

  • Follow your favourite evaluators on social media!


I’ve spent a bit of time in my career explaining to clients that evaluation is not the same as research, and yet sometimes searching for “research” content may generate better results than “evaluation” content. It seems unfair!

As more evaluation associations offer credentialing and add to the professionalization of our field, it may become easier to find evaluation content. More universities and colleges are offering programs directly in evaluation, which further adds credibility to the role and the field. I heard somewhere that evaluation is the fastest-growing field that no one has heard of. Perhaps as evaluation moves more into the spotlight, searching for content will become easier.


What is your solution for finding quality evaluation content? Let us know in the comments below!

Written by cplysy · Categorized: evalacademy

Mar 25 2024

Playing the Fool: Why Asking a Few Silly Questions Makes You a Better Evaluator

As evaluators, it’s our job to ask questions. When I tell people what my company does, I say we help organizations that do good to do better, by asking the right questions and answering them. Through asking and answering questions, we frame our evaluation projects, unearth data, and share helpful insights and recommendations.

Many of us arrive in evaluation because of our penchant for asking questions. And even though we’ve all been told “there are no dumb questions,” sometimes it feels like we really should know the answer already. It can be intimidating to speak up and admit your lack of knowledge or understanding.

But evaluators need to get comfortable asking questions, silly or not. Ever heard the term “playing the fool?” The idea behind that phrase is that if someone behaves in a silly way, or repeatedly asks silly questions, they’ll be seen as, well, silly. They won’t be taken seriously. What evaluator, whether an internal employee or contracted consultant, wants to look silly? Well, there are some important reasons to consider asking some silly questions at your next evaluation meeting.


Michael Quinn Patton has likened the role of the evaluator to that of the court jester, also called a fool. The jester’s role in English courts was to entertain, and they had the special privilege of being immune from punishment for what they said. Their unique role enabled them to question, to bear bad news, or to present new or unwelcome perspectives without fear of reprisal. Personally, I’ve latched on to this metaphor and love the idea that evaluators can occupy a special position, speaking truth to power without (too much) worry for their position.


Here are a few situations where thinking of yourself as a court jester, playing the fool, can help you to be a great evaluator.

1. When understanding how an initiative operates

As an outsider, you probably don’t have all the details of the initiative you’re evaluating. Even if you are an internal evaluator, there may be aspects of a program that you’re not familiar with or that don’t make sense to you. By positioning yourself as a person in need of educating, you create an environment in which those important details can be shared. In a project kick-off meeting, I like to say very directly that I will probably ask some silly questions, and I ask the team to help me learn.

2. When certain questions aren’t being asked, but should be

In setting the context where I’m seen very clearly as a non-expert, I can also query the “why” behind puzzling aspects that other team members can’t safely ask about – but may very well also be questioning. After all, I, this silly outsider, shouldn’t understand why a process was set up in the way it was, or how a decision was made. But an internal team member who may also be wondering the same thing doesn’t necessarily have the same psychological safety to ask that very question. By playing the fool here, I can ask about that “elephant in the room” that nobody seems to be addressing.

3. When the group needs to know it’s safe to query

Not every workplace operates with psychological safety. In some settings, the organizational culture is such that people are afraid to fail, or they have a very realistic concern that their job would be at risk if they asked too many questions. An evaluation project requires trust and honesty; when those qualities are absent, the project risks being unable to fulfill its purpose. As an evaluator, you can be very intentional about modelling question-asking and encourage the team you’re working with to also speak up when something doesn’t make sense.

4. When important but unwelcome insights need to be shared

We always hope that evaluation projects are driven by a true desire to learn and improve. That’s not always the case, though. As much as we prepare clients for the possibility of both positive and negative outcomes, and potentially scary recommendations, they’re not always ready to hear those findings. I note that with great empathy – as a business owner myself, I wouldn’t be super excited to hear that things aren’t going well and major adjustments are needed, either. It’s tough to hear that you may have been misdirecting your efforts. But in the best interests of all, those insights do need to be addressed. You can lead with your inner fool by being curious, vulnerable, and creating that safe space for receiving unwelcome news. A bit of well-placed humour can help to reduce defensiveness and bring a bit of joy to an otherwise dismal event.


Now don’t take this imagery too far! You don’t need to intentionally appear less intelligent than you are, and you don’t need to wear a goofy hat with bells on it. I’m not suggesting that you ask outright stupid questions (not that you would anyway). Nobody said that the court jester was dumb – far from it, that royal fool was full of crafty insights and clever language (just like you!). You know that your “silly questions” are actually a very intentional way to get people talking about the important things. As a skilled court jester, you’re carefully navigating humility and professionalism to create a safe environment for constructive conversation. And your evaluation project will be better for it.

Written by cplysy · Categorized: evalacademy

Mar 18 2024

My Cartoon Illustration Process – Realist Evaluation Comics

Back in 2017 I was commissioned by the RAMESES II project (funded by *NIHR) to draw a series of cartoons on realist evaluation.  They have been made available for royalty-free use at ramesesproject.org, along with a collection of other realist evaluation resources.

In this blog post I want to take you through my cartoon illustration process using this project as an example. The cartoons were created through direct collaboration with the wonderful Joanne Greenhalgh and Ana Manzano of the University of Leeds. The full RAMESES II project team provided insights throughout the process.

*The RAMESES II project was funded by NIHR HS&DR 14/19/19. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Starting with the Illustration Challenge

A cartoon illustration project always starts with something to illustrate.  In this case it was a series of briefs on realist evaluation.

The challenge was to develop a series of cartoons inspired by the content within the briefs.  To get a sense of my process, let’s focus on one of those briefs.

A realist understanding of programme fidelity [PDF].

Finding cartoon inspiration and narrowing the scope.

In any substantive piece of literature, brief, or written report there is usually a ton that could inspire a cartoon.

The challenge is to figure out the important pieces and the ones that create the best cartoons.  Good illustration is about supporting the words, not replacing the words.

I find that the best cartoon illustrations usually come from the challenges, problems, and confusions.  My philosophy is to illustrate the problem, and the reader will read the text in search of the solution.

To find the key challenges I turn to the experts. Through conversation, I engage them in the process of finding where to focus. Ultimately, we narrow the scope.

Here is one of those areas of narrow focus.   This certainly helped to inspire the cartoons I have shared below.

Common confusion: the illusion of control (that you can standardise programmes and control the context into which they are implemented, either within an evaluation or in the real world). Proponents of RCTs are ardent believers in this illusion and think it’s possible to create a ‘closed system’ by controlling everything about context, so that you can compare ‘programme on’ with ‘programme off’ while everything else stays the same except the existence of the programme. This idea has been transferred from drug trials to complex social programmes.

Sketching out the concepts.

Before diving into drawing the final cartoon, I start with a pen and my notebook.

With this low-tech approach I can come up with more concepts than I could if I started with my iPad. And if a cartoon doesn’t work, or the client doesn’t like a certain concept, we can cut it here instead of putting more effort into the process.

Refining and completing.

After drawing the sketches and sharing with my client we have another conversation.  This time we talk about the concepts (what do they like? what don’t they like?).

At this point we tweak and cut.  And sometimes the sketches lead to more concepts.  Cartoon illustration is an iterative process, but hopefully with each revision we get a better product.

Here are a couple of the final fidelity cartoons.  What do you think?

A note from Joanne

We really enjoyed working with Chris. Not only was it fun, but it challenged us to think about what we really wanted to communicate about realist methods. The idea of illustrating ideas or issues people found confusing or difficult really resonated with us, as that had been one of the main reasons for embarking on the whole project in the first place. It felt like a genuine collaboration and I’m really pleased with what we have produced.

Here are a few more of my favorite comics from the collaboration.

What a realist reviewer looks like.

Realist recipe.

The protocol says we go this way.

Written by cplysy · Categorized: freshspectrum

Mar 13 2024


A Meta Reflection on Equitable Communications: Behind-the-Scenes of Creating the Equitable Communications Guide

After researching ways to share our findings and reports with equity in mind, we realized there wasn’t a go-to resource for equitable communications in the evaluation field. Together, we were inspired to develop the Equitable Communications Guide. This guide is a resource designed for evaluators in the social sector, which has relevant lessons for anyone looking to improve their communications! The guide explores how to communicate equitably, center the experiences of others, and convey the meaning behind key messages.

Equitable communications refers to using evaluation reports and messages to counter dominant narratives, embrace inclusivity, and center marginalized peoples.

We are not experts in equitable communications. But some of us know what it feels like for others to speak on our behalf and misrepresent our identities and experiences. Others of us have perpetrated the same violence against others and want to do better. We decided to draw on the work and guidance of thought leaders in other sectors, along with our own experience, to write the content found in this guide.

Creating this guide was a collaborative design process among our current and former team members. We didn’t just want a guide that showed people how to communicate equitably; we wanted a guide that could model what equitable communications can look like.

This blog post is an inside look at the inspiration behind developing the guide and everything that went into the process.

Early Stages

This project began the way many projects do: with a big vision, but uncertainty about how to realize it. Our efforts began when our amazing intern Aranzazu Jorquiera Johnson conducted research on equitable communications and brought in lessons from her own knowledge of diversity, equity, and inclusion. But we did not have time to come together to create a shared vision for what we wanted the final guide to look like and how it could be useful for an evaluation audience.

When Shelli Golson-Mickens added her leadership to the project, she began by centering the guide’s audience from the start, using human-centered design processes. She led us through a journey-mapping exercise that considered the many people who could read and benefit from the guide. Using this persona profile template, we created personas, identified their needs, and generated ideas for how to design a guide that met those needs.

On reflection, we would have liked to bring these personas into our process throughout the writing stage. While we struggled with time constraints, we found that the process of creating personas allowed us to envision and create a more inclusive guide and led us to a simplified and visual structure.

Process Stage

Using Aranzazu’s research and conducting some of our own, we started coding for themes we saw within resources related to equitable communications. These codes became the guideposts and strategies that are the heart of the guide.

Before we started writing, our teammate and evaluator/illustrator Kayla Boisvert helped us to envision an effective layout and design for the guide. Remembering our personas, we wanted something that would be digestible, where someone could open to the section relevant to them in the moment and get the information they need. We also knew this guide would be an aggregator of resources: not being experts ourselves, we were translating information for an evaluator audience and bringing lessons together in one place. There were wonderful deep dives into specific topics like equitable data visualization and language justice that we wanted people to find through our guide. Kayla mocked up some design options that captured the lessons we wanted to share and identified key resources for people to learn more, the design you see in our guide today.

Early sketch by Kayla Boisvert for the guide.

Once we decided on the layout, Shelli and co-author Alissa Marchant divided the writing by theme so that we could bring lessons together from across resources in several fields, including marketing, research, and advocacy. It was an iterative process because we found the strategies overlapped one another. We had many conversations about ways to share pithy themes without repeating information. It was also hard to stay narrow: ultimately, this guide is about communicating data findings, not about how to maintain open and transparent communications throughout an evaluation (which is also important!). There was too much to say, but at the same time, we felt limited by the bounds we placed on ourselves. Ultimately, having a deadline — the Evaluation 23 conference where we were presenting our findings — obliged us to narrow our scope and workshop the language with feedback from our team, fellow evaluators who are a primary audience for the guide.

Publication

Living our values from the guide, we knew that putting our pens (and digital markers!) down was just the beginning.

Our first step was making the guide accessible to people with disabilities. As sighted people, we quickly realized how our visual approach to the guide (perfect for a sighted audience) was challenging for audiences who have limited eyesight. Kayla spent hours writing detailed alternative text for each visual, and unfortunately we later learned that Canva (the design platform we used to design the guide) was poorly equipped to make the document ADA compliant. (Canva is improving. Shout out to Chris Lysy, who details its pitfalls and keeps us up to date with Canva’s capabilities in this helpful blog post.) We ultimately hired a freelancer who specializes in ADA compliance to help make a final PDF of the guide more accessible.

Since the guide was published, we have been sharing information from within the guide in various ways. We are not just relying on the written word! We also shared the guide at Evaluation 23 and a recent webinar (watch the recording here), with more workshops in the works. (If you’re reading this before March 15, please join us for Talking Data Equity!) We are grateful for the partnership of Elizabeth Grim, an independent consultant who writes about non-violent language, and Jonathan Schwabish, an author of the Do No Harm Guides, who co-presented with us. Sharing together has allowed us to continue to learn about equitable communications through our collaborations.

Takeaways

Writing this guide helped us to internalize some of the lessons within the guide in a new way. We felt a shift in our own perspective from what we should say to being curious about how other people perceive our communications and working to understand their cultural perspectives. Rather than a right or wrong answer, communications has become an opportunity to learn and better understand the people we are working with.

We understand that knowing how to communicate equitably is different from doing it well. We are just starting to practice the strategies we captured in the guide, and still learning how to communicate about the importance of equitable communications and advocate for more resources to communicate equitably in our client projects.

Although we sought to be as thorough as possible when writing this guide, we recognize that our use of language changes as society continues to evolve. And we know that what we created may miss something! We considered developing (and may still develop) a living-document version of the guide where others can add their own insights as the world evolves and guidance changes. Until then, please send us your thoughts and feedback on the guide in the comments here, or directly at info@innonet.org.

Thank you for learning alongside us! We look forward to your insights and continuing to learn together.


“A Meta Reflection on Equitable Communications: Behind-the-Scenes of Creating the Equitable Communications Guide” was originally published in InnovationNetwork on Medium.

Written by cplysy · Categorized: innovationnet


Footer

Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.


Want to take further action or join a pod? Click here to learn more.

Copyright © 2026 · The May 13 Group
