
The May 13 Group



Jun 11 2025

Benefits Realisation and Evaluation – Can outcomes and benefits be friends?

This week’s post is a guest post by Liz Richardson, a senior specialist in evaluation at Natural England. As usual when I have a guest post, Liz wrote the words and I just drew the comics 🙂

As an evaluator I talk a lot about outcomes: what they are, how to word them, what they are not! Terminology can be a daily hurdle, as the confusion around outputs, outcomes and impacts is very genuine. But then along come benefits and their need to be realised. I now feel I am on the other side of the terminology fence. How do benefits fit in with evaluation, and how are they different from outputs and outcomes? I am going to attempt to unravel this here.

Where I learned about the term Benefits Realisation.

Benefits realisation has been prominent in the UK since 2003, but I have become more aware of benefits through my job working on priority projects, where they are a requirement alongside evaluation as part of the project management process. My experience is working with teams who are trying to define project benefits alongside outputs and outcomes and, as we evaluators are collaborators, this has led me to explore the topic further and try to bring both processes closer together.

Why it’s important.

Before we get to what the difference is, let’s think about why this matters for understanding whether our project or programme is achieving its aims and making a positive change. Having two separate processes is unhelpful for project staff, so we need to bring them together in a meaningful way: one supports and informs the other, the workload for teams is reduced, and our learning and accountability needs are met. Evaluation and benefits realisation are both crucial aspects of project management, but they serve different purposes and are conducted at different stages of a project.

How do we define benefits realisation and evaluation?

Let’s look at the definitions for each of them.  Evaluation draws on a range of approaches and methodologies to assess a project’s outputs, outcomes and impacts and uses a range of methods to collect and analyse data.  As part of evaluation, we consider attribution and contribution and the extent to which the delivery meets these criteria. Outcomes focus on changes in knowledge, skills, behaviours and attitudes.  We want to know how and why a project is effective and when it isn’t. So, hopefully we all agree with that.

A benefit is a positive, measurable improvement resulting from an outcome that is perceived as an advantage by an organisation. Benefits management identifies, plans and tracks benefits through to realisation, which is the practice of ensuring that benefits are derived from outputs and outcomes, using KPIs, timelines and milestones.

What are some parallels between benefits and outcomes?

That seems reasonably clear, but it still feels like there is some overlap here. Could a benefit be an impact or long-term outcome? While I expect there is room for overlap, perhaps the difference lies in the focus. Benefits feel very specific and tangible, have a value and can be monitored, whereas a long-term outcome is a broader change that leads to impact, and impact is something a project contributes towards rather than achieves solely on its own.

What are some of the differences?

So now we can perhaps begin to see where some of the differences lie. Evaluation uses methods and approaches to ‘test’ theories and answer questions; it is interested in what worked well and less well, and why, as well as the extent to which the outcomes are a result of the intervention. Benefits management ensures that the planned benefits deliver value for the organisation and often focuses on what they are worth. This highlights different audiences: outcomes for beneficiaries and benefits for organisations or funders, although there will be overlaps here.

Here is an example of evaluation and benefits realisation in action.

Let’s explore this using an example of a Workplace Wellbeing programme aiming to improve mental and physical health of employees.

For the evaluation, outcomes would include reduced stress and improved mental wellbeing and physical health. We would want to know whether the programme has been effective and what has worked well and not so well. The evaluation would want to understand whether the changes are a result of the programme or whether anything else may have caused them, e.g. a change in work practices implemented at the same time or activities outside of work. The methods used would take this into account.

Benefits realisation would focus on ensuring that the benefits defined at the start of the programme are achieved and aligned with strategic goals. Benefits might include reduced absenteeism and improved employee retention, thus supporting the organisation’s strategic aims around employee satisfaction and higher productivity. These benefits could be monitored and tracked and would be a result of the outcomes.

The key here is that if these benefits are not realised, it is the evaluation that will tell us why. I now feel that benefits could have a place in a Theory of Change; I am certainly getting closer to bringing evaluation and benefits realisation together.

It would be good to get people’s feedback on whether this rings true.

Written by cplysy · Categorized: freshspectrum

Jun 11 2025

Ask Nicole: How to Fix Cross-Sector Collaboration Challenges

Learn how to fix cross-sector collaboration challenges by addressing power, pace, priorities, and communication across roles.

The post Ask Nicole: How to Fix Cross-Sector Collaboration Challenges appeared first on Nicole Clark Consulting.

Written by cplysy · Categorized: nicoleclark

Jun 06 2025

Making Sense of Complexity: A Strategic Design Approach

In today’s health and service sector environment, leaders are facing more complexity than ever. From overlapping user needs and environmental disruptions to rapid policy changes and emerging technologies, it’s a lot to navigate. But complexity doesn’t have to be a roadblock—it can be an invitation to design smarter, more adaptive strategies. That’s where strategic design comes in.

At Cense, we think about complexity not as a problem to solve, but as a condition to work with.

What is complexity?

Complexity shows up when things are connected in ways that make outcomes uncertain. It happens when multiple forces—people, policies, technologies, environments—are interacting at the same time, often unpredictably. This isn’t something we can fix with a checklist or a five-year plan. Instead, it calls for a different mindset: one that’s curious, flexible, and aware of how things influence one another over time.

Why traditional strategy doesn’t work

Many planning models assume we can predict and control what’s going to happen. But in complex systems, that’s rarely the case. When the path ahead is unclear and keeps shifting, we need tools that help us sense, learn, and adapt as we go. That’s where strategic design comes in.

How strategic design helps

Strategic design blends systems thinking with creative problem-solving. It gives leaders a way to work with uncertainty by focusing on:

  • Framing the problem: Taking the time to ask the right questions before jumping to solutions.
  • Engaging people: Bringing in diverse perspectives from those with lived and professional experience.
  • Making and testing: Using quick experiments or pilots to learn what works in real settings.
  • Reflecting and adapting: Making time to learn from what’s happening and adjust as needed.

This approach doesn’t ignore the messiness—it embraces it. Strategic design helps you design with complexity, not against it.

Getting started

If you’re working in a health or social system and feeling stuck or overwhelmed, try this:

  1. Zoom out: Map out who and what is involved. What forces are interacting?
  2. Zoom in: Where are people feeling the pressure most? What are the patterns or signals?
  3. Ask different questions: Instead of “What’s the solution?”, try “What’s really going on here?”
  4. Start small: Try a low-risk experiment. Learn from it. Build from there.

You don’t need to have all the answers—just a way to learn as you go. Strategic design gives you a structured, flexible way to move forward with purpose, even when the future is unclear. We’ve prepared a simple 2-page strategic design worksheet that can help you get started.

Want help bringing this approach into your organization? Let’s talk.


Written by cplysy · Categorized: cameronnorman

Jun 05 2025

Slice-able Gantt Charts in Excel

I spent a couple hours livestreaming, and created this masterpiece:

a slice-able Gantt chart that automatically updates and populates itself when you add more rows to your dataset (i.e., no tedious manual updates).

How to Make Slice-able Gantt Charts in Excel

You can watch the high-level tutorial here:

https://youtu.be/LXF2du3-W70

What’s Inside

  • 0:00 Intro
  • 1:08 The end product: Slice-able in Excel, or printed/PDF’d
  • 1:52 Gantt chart options in Excel: 1) Stacked bar chart or 2) Inside cells, like this
  • 3:50 Dataset
  • 5:51 Pivot table
  • 6:29 Slicer
  • 6:40 List of projects and their amounts
  • 8:30 Helper cells to the left and above
  • 9:55 AND formula to fill in the body of the table
  • 11:18 Conditional formatting
  • 12:50 Theme Colors
  • 13:40 Your Homework List
  • 41:17 Want more details? Watch the 2.5-hr livestream
  • 14:37 Download this Gantt chart
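The heart of the technique (the AND formula at 9:55) is simple: a body cell gets filled when the helper date above its column falls between that row’s start and end dates. As a rough sketch of the same logic outside Excel, here is a minimal Python version with hypothetical task data standing in for the dataset rows:

```python
from datetime import date, timedelta

# Hypothetical tasks: (name, start, end) -- stand-ins for the Excel dataset rows.
tasks = [
    ("Plan",  date(2025, 6, 1), date(2025, 6, 5)),
    ("Build", date(2025, 6, 4), date(2025, 6, 12)),
]

# Timeline columns, one per day (the "helper cells" above the table body).
start = min(t[1] for t in tasks)
end = max(t[2] for t in tasks)
days = [start + timedelta(d) for d in range((end - start).days + 1)]

# The AND logic: fill a cell when the column date is within the task's range.
# In Excel this is the formula that conditional formatting then colors in.
for name, t_start, t_end in tasks:
    row = "".join("#" if t_start <= d <= t_end else "." for d in days)
    print(f"{name:6} {row}")
```

Because each row is just a boolean test over the date columns, adding more rows to the dataset extends the chart automatically, which is what makes the Excel version self-updating.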

Download the Excel File

It’s here.

Related Resources

  • The full livestream where I created this from scratch
  • Helper cells

Written by cplysy · Categorized: depictdatastudio

Jun 04 2025

Is the high reading level appropriate? Data UX Review

In this series of Data UX reviews I take real published reports and look for areas of improvement. The ultimate goal for these reviews is shared learning. This week, we discuss readability.

Today’s Report is an Evaluation Synthesis Report (2022-2023) produced by the Evaluation Office of the UN Environment Programme (UNEP). You can peruse the report on your own by following this link.

UNEP Evaluation Synthesis Report 2022-2023

The Basics

This is a long but fairly standard type of report: 114 pages plus a 20-page appendix. It feels professionally designed, with a uniform structure throughout. My guess is that the evaluation team wrote the report and then handed it off to a graphic designer to organize and polish at the end.

There are charts throughout and a few icons, but it is not systematically illustrated. The report also has fairly lengthy sections without any illustrations at all. It is not designed to be easy to skim and does not highlight important or critical information. On a skim, your eyes will probably go from chart to chart and possibly stop to read the chapter introductions.

There are quite a few things we can learn from this report in terms of user experience, but let’s focus on one in particular: readability.

Most reports fail at basic readability.

This feels like a very standard kind of global program evaluation report. Many of the final reports and synthesis reports I read feel a lot like this one. And the easiest way to show one of the biggest challenges is to assess the reading level.

Let me repeat: this is not to pick on the report authors. MANY reports I see read a lot like this one.

Luckily, reading level assessments are super easy these days; just ask one of your handy-dandy generative AI tools for some support (I used Claude Sonnet 4 for this one).

Here are the basic findings.

Overall Reading Level: Graduate/Professional Level (17th+ grade)

Key Metrics:

  • Flesch-Kincaid Grade Level: 17.3 (Graduate school level)
  • Flesch Reading Ease Score: 18.0 (Very difficult – graduate level)
  • Average sentence length: 27.3 words (Very long – typical academic writing)
  • Complex vocabulary: 28.5% of words have 3+ syllables
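If you want to sanity-check numbers like these without an AI tool, the Flesch formulas are public and easy to compute yourself. Here is a minimal Python sketch; note that the syllable counter is a crude vowel-group heuristic (real tools use exception dictionaries), so scores will be approximate:

```python
import re

def flesch_metrics(text):
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # Count groups of consecutive vowels; drop a trailing silent 'e'.
        # Every word counts as at least one syllable.
        groups = re.findall(r"[aeiouy]+", word.lower())
        count = len(groups)
        if word.lower().endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    total_syllables = sum(syllables(w) for w in words)
    wps = len(words) / len(sentences)      # average words per sentence
    spw = total_syllables / len(words)     # average syllables per word

    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return round(reading_ease, 1), round(grade_level, 1)
```

Both formulas reward short sentences and short words, which is why the recommendations below keep coming back to sentence length and vocabulary.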

Specific Challenges:

1. Sentence Complexity

  • 60% of sentences exceed 20 words
  • Average sentence length of 27+ words is significantly above readable standards
  • Multiple embedded clauses and technical qualifications

2. Vocabulary Difficulty

  • High density of multi-syllabic technical terms
  • Specialized UN/environmental jargon throughout
  • Complex institutional terminology

3. Document Structure

  • Dense, academic formatting
  • Heavy use of abbreviations (UNEP, GEF, MTS, etc.)
  • Multiple cross-references and technical citations

Reading Level Interpretation:

Flesch Reading Ease Categories:

  • 0-30: Very Difficult (Graduate level) ← This report scores 18.0
  • 30-50: Difficult (College level)
  • 50-60: Fairly Difficult (High school level)

Target Audience: Environmental professionals, policy makers, academic researchers, and UN system stakeholders with specialized knowledge.

Recommendations for Improved Accessibility:

  1. Reduce sentence length to 15-20 words average
  2. Add executive summaries with simplified language
  3. Define technical terms in a glossary
  4. Use more active voice instead of passive constructions
  5. Break up dense paragraphs with bullet points and subheadings
  6. Provide visual summaries (charts, infographics) for key findings

This document is appropriately written for its intended professional audience but would require significant simplification for broader public consumption.

According to the AI, this is appropriately written, but that’s a trap.

Just because your audience is filled with people who have advanced and insider knowledge does not mean it’s a good idea to write at a 17th grade reading level.

This is especially true for any guidance meant for busy, overwhelmed policy decision makers, or for any guidance designed to reach a global audience, many of whom do not speak English as a first (or maybe even second) language.

Let’s look at some specific examples of complex language pulled from the AI analysis, along with potential revisions.

Top of Page 28

From the Evaluation Criteria section:

“This set of performance evaluation criteria have evolved over time to: a) be responsive to UNEP’s policies and areas of priority; b) make necessary areas of learning visible and c) create a comprehensive and transparent framework against which all performance aspects of UNEP’s work can be evaluated. In Table 1 below the major adjustments made in recent years are noted.”

Problems:

  • Technical terminology without definition
  • Complex list structure interrupting flow
  • Abstract concepts not explained
  • Run-on sentences

Better version:

“UNEP has updated its evaluation standards over time for three reasons: to match current priorities, to highlight key lessons, and to create clear measurement tools. Table 1 shows recent changes to these standards.”

Top of Page 44

From Chapter 3:

“Since project interventions are very diverse thematically, geographically, and in terms of their resource envelopes, conscious effort is needed to ensure that performance assessments are made in a consistent manner. In this regard, the Evaluation Office is continually aiming to improve the objectivity and comparability of its evaluation approach across UNEP projects. External evaluation consultants are routinely provided with standard Terms of Reference for project evaluations, as well as detailed guidance on evaluation processes and methods, report structure, content, and quality, to help maintain consistent approaches and performance assessments.”

Problems:

  • Three very long sentences (34, 25, and 44 words respectively)
  • Unclear pronoun references (“In this regard”)
  • Technical jargon not explained
  • Passive voice predominates
  • Poor paragraph unity – jumps between topics

Better version:

“UNEP projects vary widely in theme, location, and budget size. This diversity creates challenges for fair evaluation. To solve this problem, the Evaluation Office gives all external evaluators the same guidelines and standards. These tools help ensure consistent, objective assessments across all projects.”

Middle of Page 76

From the Strategic Evaluations section:

“The absence of a results-based framework with expected accomplishments, indicators (in the MTS) and also the lack a reporting system with gender results, was a major obstacle in terms of identifying gender results that can be attributed to the impact of the GPS.”

Problems:

  • Bureaucratic buzzwords (“results-based framework,” “expected accomplishments”)
  • Grammatical errors (“also the lack a reporting system”)
  • Unclear abbreviations (MTS, GPS)
  • Convoluted structure

Better version:

“UNEP lacked clear goals and indicators for gender work. Without proper tracking systems, it was impossible to measure the success of the Gender Policy Strategy (GPS).”

So what is an appropriate reading level for an evaluation report?

Reading level can be a design choice. Those terms-of-service agreements you have to accept before using a piece of software are often overly complicated by design. But I don’t think this applies to most evaluation and research reports.

The evaluation world is filled with super smart people who spent a lot of their time in school. That means they have pretty expansive vocabularies, often like precision, and have been rewarded for that precision in their academic studies. So when it comes to report writing, words and paragraphs will naturally get complicated.

One easy way to pick an aspirational reading level is to compare the reading levels of different publications. There are major differences between reading People magazine, The Economist, and the journal Science.

My suggestion for most reports is to aim for grades 10-12, though you can settle for writing that ends up in the 12-14 range.

What is this blog post’s reading level?

I thought it would be interesting to assess the readability of this particular blog post. It came out at a 12.2 grade level 🙂

Here is the full review from Claude.

Overall Reading Level: College/Professional Level (Grade 12.2)

Key Metrics:

  • Flesch-Kincaid Grade Level: 12.2 (College/Professional level)
  • Flesch Reading Ease Score: 41.6 (Moderately difficult)
  • Average sentence length: 19.2 words
  • Complex vocabulary: 17.8% of words have 3+ syllables
  • Total words: 1,212
  • Total sentences: 63

Reading Level Interpretation: This falls into the “Difficult (College level)” category on the Flesch Reading Ease scale.

Assessment:

This document is a great example of practicing what you preach. The author critiques the UNEP report for being written at a 17th+ grade level, and their own writing comes in at a much more accessible 12.2 grade level – exactly in the range they recommend (grades 10-14).

Strengths of this writing:

  • Reasonable sentence length (19.2 words average vs. the UNEP’s 27.3 words)
  • Lower complexity (17.8% complex words vs. UNEP’s 28.5%)
  • Clear structure with headings and examples
  • Conversational tone that makes technical concepts accessible
  • Concrete examples rather than abstract concepts

Why this works well: The author successfully demonstrates that you can discuss complex evaluation methodology and data analysis without resorting to overly academic language. They use technical terms when necessary but explain them clearly, and they break up dense concepts with examples and clear formatting.

This document achieves the sweet spot the author advocates for – professional enough for the target audience (evaluation professionals, data analysts) while remaining accessible to busy decision-makers who may not have deep technical expertise.

Brief Reading Level Guide

Want to go deeper on the differences between different reading levels? Here is a quick guide.

Grade Level 8-10 (Middle/High School)

Characteristics:

  • Sentence structure: Short to medium sentences (10-15 words average)
  • Vocabulary: Common, everyday words with occasional complex terms explained
  • Concepts: Straightforward topics, concrete rather than abstract
  • Organization: Clear, linear structure with obvious transitions
  • Tone: Conversational, direct communication
  • Background knowledge: Minimal assumptions about reader expertise
  • Writing style: Active voice, simple explanations

Example Publications:

  • Time Magazine
  • People Magazine
  • Reader’s Digest
  • USA Today
  • Entertainment Weekly
  • Sports Illustrated
  • Cosmopolitan
  • Popular Mechanics (basic articles)

Grade Level 10-12 (High School/Some College)

Characteristics:

  • Sentence structure: Medium-length sentences with some complexity (15-20 words)
  • Vocabulary: Mix of common and moderately advanced words
  • Concepts: Current events and issues explained with context
  • Organization: Clear structure with supporting details
  • Tone: Professional but accessible
  • Background knowledge: Basic cultural and educational literacy assumed
  • Writing style: Balance of explanation and analysis

Example Publications:

  • The New York Times (news sections)
  • The Wall Street Journal (general articles)
  • Newsweek
  • U.S. News & World Report
  • Rolling Stone
  • Wired (general tech articles)
  • National Geographic (accessible pieces)
  • Psychology Today

Grade Level 12-14 (College/University)

Characteristics:

  • Sentence structure: Complex sentences with multiple clauses (20-25 words)
  • Vocabulary: Advanced vocabulary, specialized terms used without definition
  • Concepts: Abstract ideas, nuanced analysis, multiple perspectives
  • Organization: Sophisticated structure with implied connections
  • Tone: Formal, analytical, assumes educated readership
  • Background knowledge: College-level cultural, historical, and subject knowledge
  • Writing style: Dense information, sustained argumentation

Example Publications:

  • The Economist
  • The Atlantic
  • The New Yorker
  • Harper’s Magazine
  • Scientific American
  • Harvard Business Review
  • Foreign Affairs
  • Smithsonian Magazine

Grade Level 14-16 (Graduate/Advanced Professional)

Characteristics:

  • Sentence structure: Very complex, multi-layered sentences (25+ words)
  • Vocabulary: Sophisticated terminology, discipline-specific language
  • Concepts: Highly abstract, theoretical frameworks, expert-level analysis
  • Organization: Complex argumentation with subtle transitions
  • Tone: Scholarly, assumes significant prior knowledge
  • Background knowledge: Graduate-level expertise in subject areas
  • Writing style: Dense prose, intricate reasoning, cross-disciplinary references

Example Publications:

  • The New York Review of Books
  • London Review of Books
  • Foreign Policy
  • The Wilson Quarterly
  • Daedalus
  • Commentary Magazine
  • The New Criterion
  • Advanced Harvard Business Review pieces

Grade Level 17+ (Expert/Academic Research)

Characteristics:

  • Sentence structure: Highly complex, technical precision over readability
  • Vocabulary: Field-specific jargon, technical terminology without explanation
  • Concepts: Original research, specialized methodologies, expert-only content
  • Organization: Academic format (abstract, methodology, results, discussion)
  • Tone: Objective, formal, peer-to-peer expert communication
  • Background knowledge: Deep specialization in specific academic/professional field
  • Writing style: Primary source material, data-heavy, assumes expert readership

Example Publications:

  • Nature
  • Science
  • The Lancet
  • JAMA (Journal of the American Medical Association)
  • American Economic Review
  • Harvard Law Review
  • American Political Science Review
  • Cell (biology journal)
  • Physical Review Letters (physics)
  • Journal of Finance

Want help making your reports or data products more readable?

Check out my Data UX Framework. Peruse my services. Download my quick assessment.

Or simply, schedule a free 30 minute consultation with me by following this link.

Written by cplysy · Categorized: freshspectrum


Footer

Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.

Get Updates

Want to take further action or join a pod? Click here to learn more.

Copyright © 2026 · The May 13 Group
