
The May 13 Group



cplysy

Aug 04 2021

Ask Nicole: Hire a Consultant vs. Do it Yourself

Have a question you’d like to be featured? Let me know. I had a discovery call with a prospective client recently, who wants to build an evaluation framework for their staff to implement. The executive director wanted an evaluation framework that was general enough to cover each program but could be tailored to each program’s […]

The post Ask Nicole: Hire a Consultant vs. Do it Yourself appeared first on Nicole Clark Consulting.

Written by cplysy · Categorized: nicoleclark

Aug 02 2021

Comment on Factors that promote use: A conceptual framework by Defining evidence use

[…] Factors that promote use: A conceptual framework […]

Written by cplysy · Categorized: danawanzer

Jul 30 2021

How to Deliver Bad Results

 

You’ve just designed, implemented, and analyzed a client satisfaction survey. Trouble is: clients are not satisfied. Uh oh. No one likes to deliver bad news.

However, some strategies can not only soften the blow but also make delivering the results a rewarding experience.


1. Don’t try to sugarcoat it.

In many cases, organizations hire evaluators so that they can improve, or learn whether what they are doing is hitting the mark. If there’s room for improvement, this is exactly what they’ve hired you to find. Hiding bad results is not helping your client. While they may not be thrilled to learn about low satisfaction scores, or that their outcomes are not being achieved, they need to know so that they can make changes to their services. It may be tempting to frame data in a positive light:

“27% of clients thought your services were the best!”

But think about what would be most helpful to your client; being more direct is likely the better strategy:

“73% of clients saw room for improvement.”

2. Don’t wait.

No one wants to be blindsided. Chances are, as you begin data analysis you will start to see some clues of negative results. Certainly by the time you are writing the report and working on data visualizations, you know what the results of the evaluation are. Don’t wait to share them!

Reading about poor outcomes for the first time in a PDF of the final report may make your client feel defensive. Take away the element of surprise by dropping some hints in your regular meetings; allow them to get past an emotional response.

This will also give you an opportunity to assess their early reactions and think about how to frame the findings when you deliver your report. Better yet, engage your client in the report writing process – the stakeholders likely have insights to share. Take a participatory approach to the data analysis and writing to maximize engagement and learning.

3. Offer some action-oriented steps to help your client improve. 

So, the clientele said they see some gaps in services, or perhaps the service delivery model is not achieving the desired outcomes. Hopefully, your evaluation has uncovered some of the “why,” which can help you to frame some recommendations or lessons learned. What has your evaluation uncovered that will help your client to address those gaps? This is where the value of your evaluation really comes through – the “so what?”. Better Evaluation offers some great tips for drafting recommendations.

4. Supplement with other results.

Perhaps clients weren’t loving the services and there is room for improvement, but you likely have some positive results too. Perhaps staff are really satisfied in their work, or part of the process is going really well. This approach offers a nice balance to the bad news, without sugar-coating. Just because you’ve uncovered some bad results doesn’t mean that’s all you need to focus on. Highlight the great outcomes as well.

5. Prepare for questions and allow time for discussion.

Poor evaluation results, while disappointing, are also interesting. Your client will likely want to know more. Their questions may lead to follow-up evaluations; at the very least, they will likely spark some rich discussion.

Be prepared to facilitate these discussions when you present the findings. Consider adding in some additional time to your presentation, and whether other individuals could/should be invited. Hearing from everyone may uncover new insights or point to some next steps for the organization.

Clients may ask a lot of questions. Be careful to answer according to the stories told by the data without bringing in your own biases. You may even want to plan for a follow-up discussion.


As evaluators, one of our primary roles is to help programs or organizations understand if they are effective. Are they achieving the results they intended to? If they already knew the answer with certainty, they likely wouldn’t have hired you.

So, chances are, from time to time you will come across findings that clients are not satisfied, or a program is not performing well. As an evaluator, you can view those findings as a real opportunity.

While delivering the not-so-great news may not be the best part of your job, helping that group to uncover new insights and make changes based on your findings is very rewarding.

To learn more about applying evaluation in practice, check out more of our articles, or connect with us over on Twitter (@EvalAcademy) or LinkedIn.



Written by cplysy · Categorized: evalacademy

Jul 30 2021

Preparing for Another Uncertain School Year

I think we had all hoped that after two school years disrupted by COVID-19, the 2021-2022 school year could be a return to “normal.” 

Yet with the Delta variant surging throughout the country and children under 12 still unvaccinated, it’s becoming clear that educators and families are in for another year of uncertainty.

The theme of this summer’s blog posts has been to figure out the meaning behind the data we collect and track – what’s the “so what?”

We’re all wishing that the taste of “normalcy” that we’ve gotten in early summer could stick around –  we’re exhausted from the constant worry and grief that the pandemic has wrought on us as individuals, families, and communities. 

But if there’s one good thing to come from how long we’ve been dealing with the pandemic, it’s this: it’s not an unknown anymore. We have data to help us navigate it.

This is true on the medical and public health sides of the issue, and it’s also true for education. 

Let’s think about all of the information we already have about how to handle whatever comes our way this school year:



  • We know how to keep our schools clean and help students follow good hygiene practices.
  • We know what challenges our families have been facing and how we can engage and support them, even remotely.
  • If we need to, we know how to provide high-quality remote and hybrid learning opportunities for students to keep them engaged.

Plus, we’ve got a ton of documentation to remind us of what we know if we feel overwhelmed. 

We’ve got logs of when and how the building was cleaned, we’ve got past family surveys and chat transcripts/recordings from family Zoom meetings to tell us what their needs and concerns were, and we’ve got lesson plans and other records of the teaching and learning strategies we tried and felt successful with. 

… and many other sources of data and information too!

As we turn our calendars from July to August and get ready to embrace the new year, we can look back on our data from the past two years and figure out what the “so what” was. 

Now is the time to dig deep with the information we do have – even as we await more information about what this year will hold. 

As we look back at the “what” from the past two years, think about the impact that each of those things had:

  • Which routines and practices helped students comply with safety measures and made the community feel safer? Which really didn’t work?
  • What were our families’ greatest needs, and how did we work to meet them? Did our supports and referrals help mitigate the challenges they faced?
  • What did our students really enjoy about remote or hybrid learning? How can we replicate those practices? Which strategies really did not work and should be avoided this year?
  • What did we do in a remote or hybrid environment that was really effective for family engagement (e.g., Zoom PTO meetings and conferences) and that we might want to keep when we’re in person full-time?

With some review of your data and reflection on what those data points actually meant for students, families, and staff, I hope that you’ll have a sense of reassurance.

Even though we are about to enter another uncertain year, our data can teach us a lot about how prepared we actually are.

Written by cplysy · Categorized: engagewithdata

Jul 29 2021

Comment on IRB 101: What are they? Why do they exist? by IRB 101: Risks to Research Participants » RK&A »

[…] my first post in this IRB 101 series, I described what IRBs are and why they exist.  IRBs exist to protect […]

Written by cplysy · Categorized: rka


