The May 13 Group

the next day for evaluation

Apr 15 2020

Evaluation Roundup – April 2020

 


New and Noteworthy — Reads


A framework for adapting evaluation designs in times of COVID-19

Last month’s roundup focused specifically on COVID-19 and its implications for evaluation. Evaluators continue to produce COVID-19 related resources as the global crisis continues.

The Independent Evaluation Group of the World Bank recently produced a framework organized around four questions for addressing the evaluation challenges of the COVID-19 pandemic:

  1. Should we adapt our evaluation questions and scope? In short, of course. As they state, “we must consider what evaluation scope or angle can bring the most value at this time.”

  2. Can we improve what remains feasible? Yep – lots of good ideas on how evaluators can improve how we review and synthesize existing knowledge, including experimenting with AI.

  3. Can we find ways around what is infeasible? Lockdown conditions mean finding creative ways to engage with stakeholders and collect data (see below under New and Noteworthy — Tools for a remote survey toolkit).

  4. Can we tap into alternative sources of evidence? There are opportunities to incorporate existing sources of data that evaluators do not typically tap into (e.g. geospatial, financial or social media).

Tapping into big data: Lessons from evaluators working in the food and agriculture sector

Speaking of opportunities to incorporate existing sources of data, Eval Forward recently posted a blog post titled Evaluation in the age of big data: Opportunities and challenges in agriculture and food security. In this post, they explore the challenges evaluators face (i.e. time and cost constraints and the trade-offs made because of them) and how big data and data analytics can strengthen evaluations. The post provides a list of some widely used big data techniques and their actual or potential applications in food and agriculture evaluation. These techniques include satellites and drones, remote sensors, GPS location data, social media, Internet search data, integrated data platforms, and biometric data. Interested in learning more about these techniques? Check out the UN Global Pulse website for over 100 case studies of how they have been applied.


New and Noteworthy — Tools


Remote Surveying – A toolkit on how to do it right

60 Decibels (@60_decibels) created a Remote Survey Toolkit to help you navigate phone survey best practices, survey providers, survey questions and more. Now that nearly everyone is practicing social distancing, face-to-face data collection has gone the way of the Dodo bird (well maybe not, but for the time being). The toolkit has lots of user-friendly tips, cheat sheets, decision trees, example questions, and other resources 60 Decibels has collected over the years that they have generously curated for all of us to use and benefit from!

Tools for capturing activities, outcomes, and learnings

Evaluation Support Scotland produces a number of free tools that can be downloaded from its site. On its COVID-19 page it lists not only tips for evaluating during COVID-19 but also practical tools. New tools include:

  • Using contact forms to gather evidence during a call

  • Taking stock in a time of change (method sheet)

  • Using social media to evaluate other activities (method sheet)


New and Noteworthy — Courses and Webinars


April 30, 2020

Krazy Glue Messaging: Making Your Evaluation and Research Findings Sticky – Webinar

  • Presenter: Kylie Hutchinson (@EvaluationMaven)

May 2020

Evaluation in a time of change – Webinar (May 12, 2020)

  • Presenter: Evaluation Support Scotland

Facilitating Evaluation – Online Course

  • Instructor: Michael Quinn Patton (@MQuinnP)

Dynamic Data – Mastering Pivot Tables for Engaging Data Viz – Online Course

  • Instructor: Carolyn Hoessler (@carolynhoessler)

June 2020

Feminist Evaluation: Not your standard gender-responsive approach! – Online Course

  • Instructor: Donna Podems (@DonnaPodems)

Transformative Mixed-Methods Evaluation – Online Course



Written by cplysy · Categorized: evalacademy

Apr 14 2020

The Evaluation Mindset: When Nobody Listens

If only we had known we were vastly underprepared for a global pandemic…

I say that tongue-in-cheek. Of course we knew. President after president was briefed with data and evidence showing the potential devastation a pandemic could have on our global society.

“In advance of a pandemic, anything you say sounds alarmist,” Leavitt explained. “After a pandemic starts, everything you’ve done is inadequate.”

Inside America’s 2-Decade Failure to Prepare for Coronavirus

I would also guess that across the world, epidemiologists, doctors, and health department staff are thinking, “I kept saying this would happen. But nobody would listen!”

As anyone working in the realm of evidence knows, this kind of thing happens all the time with all sorts of problems.

The reality behind informed decision making is that while having the knowledge is necessary, it is never sufficient.

The curse of Cassandra

Imagine an oracle from a book or movie, say Professor Trelawney from Harry Potter. Quite eccentric, but she offers important insight, by way of a prophecy, that is clouded in riddles.

Our focus is then on the hero who is tasked with disentangling the riddle. The oracle’s role in modern storytelling is usually just to deliver the riddle and then disappear.

The character type traces back to Cassandra of Greek mythology. In that story, Cassandra was given the gift of knowing the future, then cursed so that nobody would ever believe her prophecies.

She had the power to know, but not be believed.

It is a curse that unfortunately hits close to home for far too many evaluators. Knowing is only half the battle.

If they don’t get it, they won’t use it.

Evaluators love their methods. So much so that they often stick them at the front of the report.

How else is everyone going to know the precise steps we went through to come up with the answers and advice we plan to share?

The knee-jerk reaction to nobody listening or caring about your methods is to skip to the conclusions and recommendations. Give the audience what they want! Put them up front, then leave the methods in the back for anyone who wants to go deeper.

But here’s the thing. That’s not enough.

The problem isn’t that the methods section is boring (well it might be, but that’s not the biggest problem).

The biggest problem is that in sharing your work and the eventual solutions, you forgot to show your audience why they should care. Maybe you assumed they already knew the problem, or you assumed they didn’t care, so you skipped diving deeper.

But that could be a big mistake.

The best way to start anything, whether it’s an evaluation report, presentation, blog post, or cartoon set, is to start where you started. What is the essential problem or challenge? Why should anyone care?

Because if they don’t get why they should care, they’ll never listen.

When the HiPPO rules.

Do you know the HiPPO?

It’s an acronym for the Highest Paid Person’s Opinion. Unfortunately, so many design decisions are driven not by the evidence but by the HiPPO.

If you’re lucky, maybe the HiPPO is the most informed and should be driving the decisions. But I wouldn’t necessarily count on that, even when it seems true.

Here are some tips to change the situation.

  • It’s never a good idea to make it you versus them. Yes, speak up every once in a while, but remember there is a power imbalance.
  • Bring in a third party. UX and human-centered design feature lots of user testing. It’s so much more effective to have a user tester indirectly tell a HiPPO their idea stinks.
  • Fall back on data and evidence. If you are truly using evidence to inform your perspective, you shouldn’t need to cherry-pick. “While that could be true, the evidence does not support moving in that direction.”
  • Data parties and placemats. In other words, have stakeholders analyze the data together. Sure, you can do it yourself, but if you include stakeholders in the analysis they will be more likely to follow the recommendations.

Also, if you can’t identify the HiPPO on a project team, maybe it’s you.

Deliver a better presentation.

A better presentation isn’t just about making the data and evidence easier to understand. Remember, knowing is only half the battle.

A better presentation connects with the audience. And unless the audience is super hungry for the information, it doesn’t just show the data.

Show them why they should care.

And if they shouldn’t care…why are you presenting?

April 15, 3:30 PM Eastern/12:30 PM Pacific

Come join us for tomorrow’s unwebinar; our guest facilitator will be Dana Wanzer. The seed topic: The Integration of Research and Practice in Evaluation.

You can register here > https://www.crowdcast.io/e/evalcentral/

Topic inspiration: ROE TIG Week: Research on Evaluation – A Glance Towards Integrative Evaluation Science 

If you haven’t attended one, these sessions have been a blast.

The audience is growing and might push me to pay for the bigger Crowdcast account sooner rather than later.

Lil Help?

Between Eval Central and Fresh Spectrum, my tech expenses are increasing.

I’m just an indie consultant, so my technical overhead all comes out of pocket. If you appreciate my work, would you consider becoming a Patron?

Written by cplysy · Categorized: freshspectrum

Apr 14 2020

Goodbye Microsoft Defaults, Hello Data Viz Toolkit!

Hi there! My name is Courtney Sims and I have the privilege of being a monitoring and evaluation associate for Sharp Insight, LLC, based in the Washington, D.C. region.

About Sharp Insight

At Sharp Insight, we support our clients wherever they are on their program evaluation path, often creating reports and presentations for funders and other key stakeholders.

The majority of my clients are youth-serving nonprofit organizations, including those that run out-of-school-time programs and adolescent sexual and reproductive health initiatives. I find joy in producing data visualizations that tell their own story; one of my proudest moments was when a client shared, “I just love this, it’s like it reads itself to me.”

Producing Reports that “Write Themselves”

Of course, we all know that reports with easy-to-read visualizations and clear messaging don’t just “write themselves!” 

Our team works hard to produce meaningful deliverables with charts and graphs that honor best practices in data visualization. 

However, creating effective visuals is just one part of our jobs. We have to balance our time in chart development with not only report writing but also other key tasks, including research, tool development, data analysis, site visits, and workshops.

This led us to wonder, how can we work smarter by streamlining our internal data visualization development processes?

Developing a Data Visualization Toolkit with Examples and Templates

As a team, we knew we needed a go-to resource that would guide each of us in developing powerful visuals without (re-)creating them from scratch each time.  We also knew we needed someone to “own” this project for our team – me!  

And so, in October of 2018, I enrolled in Depict Data Studio’s Great Graphs online course, and what a meaningful, practical professional development opportunity it has been!  

Over the course of months and inspired by Depict Data Studio’s Chart Chooser, I was able to gain the skills needed to develop a comprehensive data visualization toolkit for our team, with:

  • visualizations to show big numbers,
  • approach toward a target,
  • change over time, and even
  • qualitative data visual examples and standards.   

Building on our existing visuals, I worked to generate these templates in our company’s branded colors, easily adapted to client colors with our handy saved color palettes.

The Impact of Developing a Data Visualization Toolkit

Across our team, we’ve gained even more than a set of templates.

Thanks to Great Graphs, gone are the days of each of us taking the time to adjust Microsoft’s default formats by removing vertical and horizontal bars, shrinking gap widths, increasing data label font sizes and alignment, and the list goes on and on.
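The de-defaulting steps listed above can also be scripted rather than clicked through each time. Here is a minimal, hypothetical sketch in Python with matplotlib (an assumption on my part; the post describes Microsoft Office charts, not code) showing the same moves: stripping the chart border and tick clutter, widening bars to shrink the gap, and labeling bars directly with a larger font.

```python
# A hypothetical sketch of "de-defaulting" a bar chart in matplotlib.
# This is NOT the Sharp Insight toolkit described in the post, just an
# illustration of the same clean-up steps applied programmatically.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

programs = ["Before", "After"]
scores = [62, 84]  # hypothetical survey scores

fig, ax = plt.subplots()
# A wide bar width shrinks the gap between bars (the "gap width" tweak)
bars = ax.bar(programs, scores, width=0.8, color="#2b6777")

# Remove chart junk: the border ("spines"), tick marks, and the y-axis labels
for spine in ax.spines.values():
    spine.set_visible(False)
ax.tick_params(left=False, bottom=False, labelleft=False)

# Label each bar directly with a large, readable font instead of a y-axis
for bar, value in zip(bars, scores):
    ax.text(bar.get_x() + bar.get_width() / 2, value, str(value),
            ha="center", va="bottom", fontsize=14)

fig.savefig("before_after.png")
```

Saved as a template, a script like this plays the role of the toolkit: the branded colors and formatting live in one place and get reused instead of rebuilt from Microsoft defaults each time.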

Our new data visualization toolkit allows us to maximize our ability to exceed our clients’ expectations for visual-heavy reports while still giving us the time and energy to fulfill our numerous other tasks associated with being productive, responsible evaluators.  And for that, we are genuinely grateful!

Written by cplysy · Categorized: depictdatastudio

Apr 14 2020

Better Data Collection

With so many people working from home, using their devices for tasks we once did in other ways (or now do far more often, or differently), it’s tempting to think: this is the perfect time to reach people for my research project.

That might be true, but it’s also fraught with problems. Before you set out on your ethnographic journey through the lives of your stakeholders, or prep SurveyMonkey for its journey through the jungles of the Internet, we suggest you pause and consider the following.

  1. Context counts. Every time we engage in social research we must account for context. In the current situation, with a global pandemic, we don’t know what the context is. The epidemiological, social policy, economic, and communications landscape is changing day to day and is influenced at a global level. With so many areas changing at once, the ability to gauge or even state the context becomes nearly impossible without resorting to over-generalized or vague statements like “complex” or “uncertain”.
  2. “Seeing is not the same as looking.” Physician and economist Anupam Jena provides a great example of how we can miss the forest for the trees by not examining things hidden in plain sight. In times of profound transformation, we might need to re-think what it is we see, as that will shape what questions we ask, what data we gather, and what answers we discover.
  3. User experience. What is the state of mind of those answering your survey or responding to your interview? You might be speaking to someone who hasn’t left their house in three weeks. They might have people nearby all the time. This will determine their willingness or ability to respond, the kind of answers they provide, and the openness of the response (for example, people might not want to share highly personal data on a shared computer, or where others might see them entering or speaking about it).
  4. Sensemaking. When we don’t understand the context, or it’s entirely new, we look for what we know. The challenge right now is that we don’t know what we’re looking at. Unless our research or evaluation work is focused on understanding how and what we are doing at this moment, about this moment, and for this moment, we risk developing data that is examined through the lens of history (what we’ve done before), which is another context altogether. We’ll be making sense of today through the lens of the past.
  5. Attention. Are we paying attention? When so much of what we are exposed to now comes through screens — big and small — there is a likelihood that we are reading things quickly. Electronic reading is not the same as reading paper-based text and tends to encourage skimming. When what we read is — save for the back of the cereal box at breakfast — almost entirely digital (for many of us), the likelihood of instructions being skimmed is higher. Proceed with caution.
  6. Health. Lastly, how well are we? When the effects of being inside, isolated, and perhaps exposed to a virus are real, present, and pervasive, your audience might not be in a state where their depth and quality of thought can produce the responses we need. Many of us are not our usual selves these days, and our responses will reflect that.

See differently and think differently; that goes for how we assess and conduct our social research.

Photo by Stanislav Kondratiev on Unsplash

Written by cplysy · Categorized: cameronnorman

Apr 14 2020

The Virus Makes the Timeline

A few weeks ago, we heard from Dr. Anthony Fauci: “You’ve got to understand that you don’t make the timeline, the virus makes the timeline.” Fauci’s words ring even truer to me today than when he first spoke them. Today marks the one-month anniversary of my daughter’s school closing, which has now been extended through the end of the year.

The lack of control over the timeline for a return to relative normalcy post-coronavirus is certainly frustrating for everyone. But I realized my experience as an evaluator has prepped me for being on someone else’s (or in this case, a virus’s) timeline. In particular, it recalls my experiences working with Institutional Review Boards and school district reviews for research protocols, which I wrote about several years ago. The review protocols are often tedious and time-consuming, and I am ultimately on the timeline of these reviewing bodies. There are some lessons I have learned in working with IRBs and school districts that apply to our current situation:

Preparation: I cannot control review schedules, but I can plan for them as best as possible. In the current working world, I can keep tabs on evolving situations with museum operations and have rough plans in mind for if, how, and when evaluation activities may resume.  The landscape is ever-changing, but having a pulse on events is necessary for adaptability.

Communication: It is tempting to stop communicating when things are out of your control, and you have no tangible updates, but it is at these times that communication is even more important.  This tip loops back into preparation.  Staying in touch with the changing landscape helps us stay prepared for a time when we can take back control of the timeline.

Empathy: Uncertainty is challenging for everyone.  As has been said many times of late, “we are all in this together.”  While this is true, it is also important to keep in mind that we may each be affected by COVID-19 in different ways (physical illness, financial insecurity, food insecurity, etc.) as well as cope with uncertainty in different ways.  Stephanie wrote a blog post a few weeks ago about vulnerability, which can help us facilitate empathy and understanding among individuals.

The post The Virus Makes the Timeline appeared first on RK&A.

Written by cplysy · Categorized: rka



Copyright © 2026 · The May 13 Group · Log in
