
The May 13 Group

the next day for evaluation


allblogs

Sep 07 2021

What you will learn if you join my workshop.

Just a note before I jump in. The pre-launch discount for my workshop expires tomorrow (9/8) at 12PM Eastern. The workshop will never be cheaper!

Over the years, when a colleague would take a workshop at a conference, I would ask them what they learned. And then I would ask, “was it worth the money?”

The answers would vary of course, depending on the subject, the workshop host, and who paid for it in the first place. It’s always easier to give a good review of a workshop if someone else is paying the bill.

But regardless, over and over again, the answer to the “was it worth it” question almost always came down to one thing…takeaways.

What is a takeaway?

A workshop takeaway is that thing that sticks in your head well after the workshop has ended. It’s the tool, the method, or the idea that you will take home from the workshop and then put into practice.

Most workshops are designed to deliver a few takeaways. Usually one or two big ones and then maybe a few secondary little ones. It’s incredibly hard to deliver anything more in just a few hours sitting with peers inside a stuffy hotel conference room.

The value of an online workshop.

The value of an online workshop is TIME.

Some of my peers use this time to deliver a comprehensive curriculum designed to bring about a specific transformation. If you join _____ you will become ____.

But that’s not the only way to use that extra time.

What if instead of trying to deliver a transformation, I used that extra time to deliver lots and lots of takeaways? So instead of this being just a workshop, I lead something equivalent to lots of workshops.

Lots of Workshops in one Workshop.

Ummmmm well I don’t know what I don’t know so I’m excited to learn some of that.

I love that quote from one of my new workshop registrants. After she registered, I asked what she was most looking forward to learning. It didn’t occur to me until that moment, but that’s exactly what I’m trying to teach.

Each week we’ll dive into something different. Each time a practical topic that I find useful in my own work but likely a topic that many of my data peers don’t know much about.

I want each week to feel like a self-contained mini-workshop, with its own takeaways.

These first 8 weeks are just a start.

  • Session 1: Building slide style infographics. (9/8 at 2PM Eastern)
  • Session 2: The web is filled with funnels, how to build yours. (9/15 at 2PM Eastern)
  • Session 3: Building one-filter dashboards with Tableau. (9/22, time TBD)
  • Session 4: External analytics as inspiration. (9/29, time TBD)
  • Session 5: Illustrating social media with data. (date & time TBD)
  • Session 6: The continued importance of building an email list. (date & time TBD)
  • Session 7: An intro to powerful (and free) UI design software. (date & time TBD)
  • Session 8: Social media insight tools and how to use them. (date & time TBD)

My starting plan is to meet weekly on Wednesdays at 2PM Eastern. But depending on the response (you are part of a global audience) we may shift/stagger the time/days to meet participant needs.

Join Us!

It’s not too late to get the starting pre-launch 30% off discount, but that will expire on Wednesday, September 8 at 12PM Eastern. So click this link and join us today.

Written by cplysy · Categorized: freshspectrum

Sep 07 2021

When Disaster Strikes: Assistance by Museums Nearby

By: Emlyn Koster

Emlyn recalls how Liberty Science Center, located across the lower Hudson from Manhattan and where he was President & CEO from 1996-2011, assisted the next-of-kin and surrounding community in the aftermath of the terrorist attacks on the World Trade Center on and after September 11, 2001.

Every year, it seems, the museum world is jolted by breaking news of damage or destruction of an institution due to a fire, natural disaster, or an invasion. Recent examples are Brazil’s National Museum in Rio de Janeiro, Haiti’s Art Museum in Port-au-Prince, the Museum of Chinese in America in Manhattan, and antiquities in Afghanistan. Museum associations, including ICOM and the US Committee of the Blue Shield, may step in to help salvage collections. However, not making headline news, but the focus of this post, are situations of museums in the vicinity of a disaster which, while not directly impacted, assist next-of-kin and surrounding communities.

Twenty Years Ago

My vivid memory of the terrifying event across Liberty State Park and the lower Hudson from Liberty Science Center (‘the Center’) in full view of the World Trade Center’s twin towers has interwoven parts. Foremost is distress over what transpired at Ground Zero; the second is the aftermath involving the Center. Below is my recollection of an early moment that began the preface in a book about fostering empathy in museums:

The Twin Towers hover over a swampy wilderness in Liberty State Park on a foggy, misty, ethereal morning. Jersey City. March 1991

“One afternoon in late September 2001, I joined a ferry taking several hundred next-of-kin of New Jersey victims from the World Trade Center across the Hudson River to Lower Manhattan’s North Cove Marina, the closest dock to Ground Zero. Grieving family members clutched teddy bears given to them as we boarded. Arranged by the New Jersey Family Assistance Center at Liberty State Park… this trip was their first opportunity to visit the remains of the fallen World Trade Center twin towers. Anxiously huddled together, we slowly walked past ash-laden trees and walls covered with frantic messages about missing loved ones, past emergency officials and site workers with bowed heads, across empty roads, and onto a makeshift platform overlooking the smoldering mountain of jagged debris. Weeping and whispering were the only sounds as we solemnly reflected on the riveting scene before us. Indivisible were the gravity and uncertainty of the entire Ground Zero situation and the overwhelming, deeply personal sadness of everyone there. Personal tension was exponentially compounded by the indescribable sorrow of the next-of-kin all around me” (Emlyn Koster, 2016. Foreword. In: Fostering Empathy in Museums, Elif Gokcigdem (Ed.), Rowman & Littlefield, vii-xi).

Below is a reprise of details in a requested article for a magazine one year after the tragedy (Emlyn Koster, 2002, A Tragedy Revisited, Muse, Canadian Museums Association, September/October, 26):

“Around 8:15 am on September 11, 2001, the car radio told me that my commute would be delayed by a major highway accident, so I detoured though local city streets. As the station interrupted normal programing to report that a plane had just collided with the World Trade Center’s north tower, the Manhattan skyline came into full view and I saw smoke billowing from its upper floors… I sped to the office and was met by several frantic staff... it had just become evident that the plane hitting the tower was terrorism, not an accident. I wasted no time in ordering a building evacuation… [we were] confronted with the biggest news story in recent memory and our faces and cries expressed our deepening shock and grief… We notified the region’s emergency authorities that our institution stood ready to help in whatever way it could. Soon, medical teams left with our first aid supplies, and commuters who had escaped the disaster site by ferry poured into our building. By late evening, several dozen mobile TV news units from across the eastern U.S. and Canada were set up around the same spot where the staff had gathered that morning.

… Liberty Science Center was closed to the public for two weeks. We continued to support the media, assisted with police communication needs, and were involved in a public vigil organized by the State. When New Jersey developed its plan for an assistance center for the families of victims, it was concluded that Liberty Science Center would be in a support mode to a full-service facility [at the Hudson River bank]… Working overnight with State government officials and aid workers, our staff helped to set up this Family Assistance Center. We processed all security credentials for its staff and volunteers, our caterer worked with the Red Cross to provide meals, and we had staff ready and trained for families of victims… We were also working with trauma counselors to help ourselves come to terms with what had happened and to guide us on how to interact with visitors when we reopened. We made changes to an exhibition about buildings and to advertising for our new giant screen film about the human body in sensitivity to the new issues on people’s minds. Over the following weeks, we hosted a wide variety of related events, including an international religious ceremony… Almost a year later, we continue to be a venue for follow-up events. We also participate in the Gift of New York Program which gives free museum visits to families of victims.

… September 11, 2002 will be a special day. The staff will gather for breakfast, as we did every morning during our closure last September, and we will share our memories. There will be a blood drive for staff and news crews will return to our site for live network feeds.

… I have thought a great deal about the broader learning for the museum field afforded by this experience. The lessons are many, particularly because September 11 occurred at a time of increasing external consciousness on the part of museums.

… Does your museum have contact information for all emergency authorities in its region? Would you have to check with your board before closing and switching to an emergency assistance role? Do you have arrangements for assembly of staff elsewhere in the event of evacuation? Are computer records regularly stored offsite at more than one location? Can you access your phone system for changed public messages from the outside? Does your staff know how to access update information during an emergency or closure? What is your museum’s inventory of facilities and skills that could be useful in an area emergency, and is this in the hands of those in charge of emergency planning? Have your museum’s learning environments ever served as a helping hand … at times of uncertainty and stress? … As the head of one of the most directly affected museums, I was proud of the way that Liberty Science Center did all it could to help. As an organization, we came closer together and we never blinked at the accumulating $700,000 direct cost of our participation... with a mission strongly rooted in social responsibility, we managed to approach an extreme situation with flexibility and fortitude.”

Trauma psychologists promptly documented their reflections about the Center as a place for comfort in a time of need and about its partnership with The Families of September 11 which nurtured a curriculum for how schools could boost their resilience in troubled times (Donna Gaffney and Emlyn Koster, 2016. Learning from the challenges of our time: The Families of September 11 and Liberty Science Center. In: Fostering Empathy in Museums, Elif Gokcigdem (Ed.), Rowman & Littlefield, 239-263). I later surmised that the Center’s track record of commitment to the welfare of its community helped to propel it through this extraordinary period.

As Liberty Science Center reopened, a public service announcement ran in The New York Times, The Star-Ledger, and Jersey Journal at no charge to the Center. It began: “The trustees, president, employees, and volunteers of Liberty Science Center express their heartfelt sympathy to the many families, friends and communities of those who suffered losses, were injured, or are still missing as a result of the terrorist attacks at the World Trade Center, The Pentagon and in Pennsylvania”. And it ended: “Liberty Science Center joins the countless other voices that call upon all of us to strive for greater global harmony… We also express our desire for the peaceful use of science and technology to create a better world”.

Wider Reflections

Thinking back, there were additional uplifting, and even some truly transformative, moments. These included a requiem in Liberty State Park featuring Andrea Bocelli, an out-of-the-blue six-figure donation from a pharmaceutical company that knew the Center was hurting financially, and the board’s adherence to a long-planned first meeting of a facilities task force, which would lead to a $109 million expansion and complete renewal of the Center.

“A spring 2002 survey showed that the staff foresaw significant advantages and opportunities in the Center’s expansion plans. These included renewed visibility, enhanced image, recharged spirit, a better working environment, new jobs for the community, and more and better offerings. But not surprisingly, the study also revealed questions about job loss, whether the organization had the capability to handle such a major project, and whether departments could work together well enough to deliver on the plans… Frequent communications about developments and reminders continued apace. Piece by piece, our planning became a reality… All in all, this has been a story of mission above self. The idea that Liberty Science Center could be a more useful resource to the diverse communities in its surrounding region always seemed to trump the anxiety of the moment” (Emlyn Koster, 2007, The Reinvented Liberty Science Center, LF Examiner, 10:7, 1-9).

Also in 2007, museum critic Edward Rothstein of The New York Times wrote: “The Center, which reopened yesterday after two years of construction, has been rethought and reshaped, with the goal of doing nothing less than reinventing the science museum”. He quoted me: “The science museum … should provide ‘resources for living, learning, working in and caring for its surrounding area’ … It should aim for ‘relevancy’ and have the ultimate goal of leading its visitor to a form of activism”. Today, my outlook is exactly the same. A few days before Rothstein’s review, The New York Times expressed this editorial opinion: “…The new center also tries to make science feel accessible, even local… To its credit, the center does not shy away from the things we wish were less real-world. The skyscraper exhibition holds two pieces of the World Trade Center; one is an I-beam mangled by the heat and pressure into a twisted U”.

As the world warms and violence increases, and with more of humanity living next to natural hazards, it is probable that museums will increasingly find themselves in helping-hand situations. I hope the foregoing reflection encourages institutions to be maximally useful if and when a disaster strikes nearby, and thereby to boost their resilience.

About the Author

Emlyn Koster, PhD is a geologist who also became a museologist and a humanist. The CEO of four major nature and science museums in Alberta and Ontario, Canada and then in New Jersey and North Carolina, he is an advocate for the museum sector’s alignment with ‘glocal’ societal and environmental needs. Following September 11, 2001, community recognition included an award from New Jersey’s Arab-American Anti-Discrimination Committee and the Humanitarian of the Year award from the American Conference on Diversity. He is a member of the Ambassadors Circle of the International Coalition of Sites of Conscience and involved in a new UNESCO-supported project about the global language of the Anthropocene. He welcomes comments and inquiries at koster.emlyn@gmail.com. You can read his previous blog posts for RK&A here.

The post When Disaster Strikes: Assistance by Museums Nearby appeared first on RK&A.

Written by cplysy · Categorized: rka

Sep 07 2021

Six lessons from practicing “true” developmental evaluation

 

Remember when you were a kid and you heard there was going to be a new kid at school? The news spread like wildfire. Students were excited and intrigued by this new person. When the new kid arrived at school, many were eager to get to know them and figure out if the new kid could be their new friend. The more reticent school kids might have hung back and instead heard rumours about who this new kid is and what they are like. Some of those rumours turned out to be true and some not so true.

Developmental Evaluation (DE) has been the evaluation world’s so-called new kid. Over the past ten years there has been excitement and intrigue around it. Some evaluators have explored DE as their new evaluation friend, while others remain unsure and intimidated by it. Just like the new kid at school, there remain misconceptions about what DE is and how it is actually practiced.

I have practiced evaluation for over ten years and throughout that time loosely thrown out the term “developmental” when describing an initiative. I have prescribed DE when it was not appropriate and I have not used DE when it was appropriate (I know I am not alone in this; I have heard many of you do it too!). DE has been overused and misconstrued so many times that now we are hearing, “but is it true developmental evaluation?”

I am now finally working on an evaluation that I believe to be true developmental evaluation (no really, I’m serious this time). The initiative and evaluation are in their early days. Yet, I have already had a very different experience than previous DE (and so-called DE) experiences and learned a number of lessons I’d like to share.

In this article, I will outline my six lessons from my DE experience, including why I think it is true DE, but before we jump into that let’s quickly review what DE is. 


What is Developmental Evaluation?

According to the originator, Michael Quinn Patton, DE “supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.” Unlike more traditional approaches, it is embedded rather than detached, continuous rather than episodic, and has a goal of learning, not judgment.

My favourite metaphor to clarify how DE differs from other traditional evaluation approaches is the following Bob Stake metaphor:   

“when the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative….DE begins before cooking, when the chef goes to the market to see what vegetables are the freshest, what fish has just arrived and meanders the market considering possibilities of who the guests will be, what they were served last time, what the weather is like and considers how adventurous and innovative to be with the meal.”  


Lesson 1 – Just because it is called DE, doesn’t make it so

In the past, I have worked on evaluations where stakeholders have said they wanted DE; however, as the work progressed it became clear it wasn’t. For example, they have stated the purpose for the evaluation was for learning, but then continued to focus on funder needs and accountability-type questions. Or, the initiative may have already developed its model and was really just looking to improve it, not necessarily to develop it. I’ve also worked with clients who say they want “rapid feedback” but then indicate that rolling up the findings into an interim and/or final report is how the data should be reported. Some clients insist on DE, but then want a traditional, static evaluation plan with retrospective evaluation questions. This is not DE. 

My current DE experience was different from the start. It started with an RFP that identified the need for a developmental evaluator. While this is not unique, what was unique was how the RFP articulated details that aligned with DE. The RFP described the vision for its work and outlined a high-level idea of what is needed to get there, including the need for data to inform the development and implementation of its work. It also outlined related DE concepts like complexity and systems change. It did not focus on us relaying our understanding of the initiative and spelling out a detailed evaluation plan. After being shortlisted, our interview focused on DE concepts and our firm’s experiences with DE, particularly my knowledge and evaluator skills. That conversation really solidified that, if selected, it would be true DE.

Lesson 2 – Drink Through a Fire Hose

The J.W. McConnell Family Foundation and the International Institute for Child Rights and Development have published A Practitioner’s Guide to Developmental Evaluation. In this guide, they state:

 “your effectiveness as a developmental evaluator is determined, in part, by how well you understand the initiative and the broader context in which it is situated.”  

My client and I both understand this to be true. As a result, our first meeting didn’t have an agenda. The only purpose was having an unstructured conversation that went over the history of the initiative, where it is at now, what data has been collected, what questions they are struggling with, and where they think they might be headed. Through that discussion, we talked about various stakeholder groups, past conflicts that have arisen, and potential future conflicts that may occur. We recorded that session and I took oodles of notes that I still refer back to. During that conversation, there was reference to a number of documents that I combed through afterward to try and understand. It was like drinking through a fire hose, but slowly, after attending subsequent meetings and having multiple conversations, the stakeholders, language, questions, and broader context are beginning to make more sense. 

Lesson 3 – Make Friends

A Practitioner’s Guide to Developmental Evaluation also says that the quality of relationships determines the degree to which a developmental evaluator can access information and influence change. 

As an external consultant working virtually with clients, I cannot emphasize enough the importance of making friends (aka building relationships). But making friends takes time. Remember that new kid at school? It took a while to get to know each other and trust each other. 

A couple of things I did to try to speed up that process. First, I set aside a good amount of time at the first meeting with our client’s entire team to get to know one another. And, you guessed it, we used good old-fashioned icebreaker activities! Part of one of the icebreakers was telling them a story about me that highlighted why I’m an evaluator and what evaluation is to me. We also went around and had them present an object, why it was special to them, and how it reminded them of evaluation.

I also did a short presentation on DE to try and outline expectations for the evaluation and how it differs from traditional evaluation they might have been involved in previously. In that presentation, I focused on what it might mean for them. Here’s a snapshot of some of those slides:  

Take note of the “Include me” one. I have found that because I am not on-site people can forget about including me in important conversations and meetings. One solution for this is having an insider – a key friend who keeps me in the loop on what is happening AND reminds others to include me.

Lesson 4 – Capture the gold nuggets

While drinking through the fire hose and making friends you will likely hear a number of questions that your clients are struggling with. You may hear phrases like: 

  • “how do we….?”  

  • “wouldn’t it be interesting to know….?”  

  • “I wonder if we….”  

  • “I don’t know how we’re going to….”  

  • “One of the things we need to figure out is….”  

These are your gold nuggets. Pay attention to these nuggets and start capturing them. I often jot the questions down on my meeting notes, star them and then transfer them over to a question inventory. An abbreviated version is included below:  

Sometimes you might need to dig for the nuggets. You can do that by posing questions like:  

  • What questions are you struggling with? 

  • What issue(s) need clarity? 

  • Where is the energy and focus? 

  • What activity are people most animated about? 

  • Where are the quick wins? 

Lesson 5 – Try to organize the chaos

One of the key roles of a developmental evaluator is to help our stakeholders find their way through complexity. One way I’ve tried to do that is to take the list of questions I captured and work with the group to prioritize the questions we need answers to.

For this group, I asked which two questions we needed to answer in the next quarter. To identify these questions I uploaded the list of learning questions to our survey platform, Qualtrics, and then got them to rate each question according to urgency and importance. The questions were then analyzed and the two priority questions emerged. The figure below shows the matrix and the two priority questions (the remaining questions are blocked out from the graph).

Now that we identified the two priority questions, we slotted them into a learning framework (see template below) and are beginning to answer those questions. Once we answer those questions the intent is to rinse and repeat (i.e. identify and prioritize the next questions to answer). The next questions might not even be on the original inventory – that’s the point of DE – to be responsive and adaptable as the initiative evolves and to uncover what you haven’t even considered before.
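The rating-and-prioritizing step described above can be sketched in a few lines. This is a hypothetical illustration, not the author's Qualtrics workflow: the question texts and the 1–5 ratings are invented, and the two highest-scoring questions (by combined mean urgency and importance) become the quarter's priorities.

```python
# Hypothetical sketch: rank learning questions by mean urgency + mean
# importance ratings (1-5, one rating per team member) and keep the top two.
from statistics import mean

ratings = {
    "How do we reach rural participants?": {"urgency": [5, 4, 5], "importance": [5, 5, 4]},
    "Is our intake survey too long?":      {"urgency": [3, 2, 3], "importance": [3, 4, 3]},
    "What partnerships should we pursue?": {"urgency": [4, 5, 4], "importance": [5, 4, 5]},
    "Should we rebrand the newsletter?":   {"urgency": [1, 2, 1], "importance": [2, 2, 1]},
}

def prioritize(ratings, top_n=2):
    """Score each question by mean urgency + mean importance; return the top N."""
    scored = {
        q: mean(r["urgency"]) + mean(r["importance"])
        for q, r in ratings.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

priorities = prioritize(ratings)
print(priorities)
```

In practice a survey tool does the collecting, and plotting each question's mean urgency against mean importance gives the matrix view; the scoring logic is the same.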

Lesson 6 – Watch what you spend

Budgeting for DE and monitoring the costs is important. Prior to starting with our clients, we had an agreed budget and some loosely defined deliverables that would be included in that budget. However, if you do a good job of making friends and organizing the chaos you will find that your work will quickly snowball – pretty soon you will be included in every meeting!

My approach with this is to keep a close watch on what is being spent, update the client often on the budget, and discuss with them if/how to adjust their evaluation priorities and expectations. Another option would be to estimate the number of hours or FTE (if an employee) when scoping the project.
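The "keep a close watch" habit amounts to simple burn-rate bookkeeping. Here is a minimal sketch; the hourly rate, budget figure, and task entries are all invented for illustration.

```python
# Hypothetical sketch: log hours as DE work snowballs and report how much
# of the agreed budget has been consumed, so priorities can be adjusted early.
HOURLY_RATE = 150.0      # assumed consulting rate
AGREED_BUDGET = 30_000.0 # assumed contract budget

time_log = [
    {"task": "kickoff meeting",      "hours": 3.0},
    {"task": "document review",      "hours": 8.5},
    {"task": "weekly team meetings", "hours": 6.0},
    {"task": "question inventory",   "hours": 4.0},
]

def budget_status(time_log, rate=HOURLY_RATE, budget=AGREED_BUDGET):
    """Return dollars spent so far and the share of the budget consumed."""
    spent = sum(entry["hours"] for entry in time_log) * rate
    return spent, spent / budget

spent, used = budget_status(time_log)
print(f"Spent ${spent:,.0f} ({used:.0%} of budget)")
```

Sharing this running total with the client at each check-in makes the "every meeting" snowball visible before it melts the budget.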


Those are my six lessons so far. I am sure lessons seven through one hundred will emerge soon. Make sure to sign up for our newsletter so you don’t miss out!


 

Written by cplysy · Categorized: evalacademy

Sep 06 2021

Does my program need a dashboard?

 

One of the deliverables my clients often request is a dashboard. Those clients are looking for easy-to-understand, powerful insights at a glance. They want to be able to know what they need to know when they need to know it, then make evidence-informed decisions. And that’s great! That’s exactly the mindset needed to build a culture of evaluation. When I hear “we need a dashboard,” what I hear is “we need relatively current information that we can quickly understand and trust, and we want it on one page.”  

But a dashboard may or may not be the best way to fulfill that need. Here, I’ll clarify what a dashboard is, and what it isn’t, then provide a checklist you can use to decide if your program or organization needs one. 


What is a dashboard?

The term “dashboard” is common in business but the understanding of that term varies. A dashboard isn’t just a short report, and it’s not just an infographic. It’s:

  • A visualization of important metrics – and ONLY the important metrics 

  • Easy to understand 

  • Live – it isn’t old data, it’s what happened yesterday, or even what has happened so far today 

  • Updated through automated processes 

  • Often interactive – you can filter and segment the dashboard to find exactly what you need 

You can expect to see column charts, line charts, tables and minimal text in a dashboard.

A dashboard is NOT: 

  • Static 

  • Lengthy 

  • Text-heavy 

  • An infographic 

  • The only reporting product you’ll ever need 

United Way of Greater Toronto prepared the above infographic to demonstrate how it works. It’s a great infographic, and it’s not a dashboard.

When are dashboards helpful? 

Dashboards can be a great tool to help you keep up with the latest trends in your program. I would argue that a dashboard may be more helpful for program operations than it is for program evaluation. 

 For example, if your social media engagement is important for your program, seeing how many comments are generated every day may help you plan your next day’s posts. If it’s registration time, a quick glance at yesterday’s signups could help you decide whether to open up a waiting list. If your participants complete surveys after each visit, seeing last week’s results can inform how many staff you have at the front desk next week.  

 In a monitoring and evaluation project, it may be helpful to have a dashboard displaying the proportion of sites that have reported data, or the number of households approached to complete surveys. If you need the ability to sort and filter by site or by demographic, a dashboard might just be the best solution. 

 Remember that a dashboard needs an automated process to load and refresh data. That means that you need a data source and a method to continuously collect data. A true dashboard is most helpful when your program needs to know what’s happening urgently and regularly.  
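The "automated process to load and refresh data" point can be made concrete with a small sketch. This is purely illustrative: the registration records, metric name, and dates are invented, and in practice a tool such as Power BI or Tableau handles the refresh for you.

```python
# Hypothetical sketch: a scheduled job pulls the latest records from a
# continuously fed data source and recomputes a dashboard tile, here
# "yesterday's signups" from a stand-in registration table.
from datetime import date, timedelta

registrations = [
    {"signup_date": date(2021, 9, 5), "site": "North"},
    {"signup_date": date(2021, 9, 6), "site": "North"},
    {"signup_date": date(2021, 9, 6), "site": "South"},
]

def refresh_metrics(records, today=date(2021, 9, 7)):
    """Recompute the dashboard's 'signups yesterday' metric from the source."""
    yesterday = today - timedelta(days=1)
    return {"signups_yesterday": sum(1 for r in records if r["signup_date"] == yesterday)}

metrics = refresh_metrics(registrations)
print(metrics)
```

Run on a daily schedule (cron, Windows Task Scheduler, or the BI tool's built-in refresh), this is what keeps a dashboard "live" rather than a static report.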

What do you need to build a dashboard? 

The most important thing you need before building a dashboard is a very clear understanding of which metrics matter. Your team will need to come together to clarify what kinds of decisions need to be made, and which indicators support those decisions.

Each indicator will need a definition and a data source (our Performance Measures Definitions Template might be helpful). Indicators that require explanation are likely not a good fit for your dashboard; you’ll need to make sure you know your audience and how familiar they are with the content. This indicator selection process may not be easy, but it is foundational to creating a dashboard that works. Building the dashboard will likely be a significant investment of time and resources, so you’ll want to be sure it will meet your needs. 

Dashboards require a continuous source of data. Data may be gathered through registration processes, surveys, financial systems, client records or other existing administrative processes. For an effective dashboard, your program will need a clear plan with responsibilities assigned to specific team members to ensure that the right data is collected at the right time. 
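One way to operationalize the indicator-definition step above is to capture each metric as a record with a definition, data source, and responsible owner, then check that nothing is missing before any dashboard work begins. The field names and indicators below are hypothetical, loosely in the spirit of a performance-measures definitions template.

```python
# Hypothetical sketch: each dashboard indicator gets a plain-language
# definition, a data source, and an owner; anything incomplete is flagged
# as not dashboard-ready.
indicators = [
    {
        "name": "Daily signups",
        "definition": "Count of completed registrations per calendar day",
        "source": "registration system",
        "owner": "program coordinator",
    },
    {
        "name": "Visit satisfaction",
        "definition": "Mean rating (1-5) from the post-visit survey",
        "source": "post-visit survey",
        "owner": "front-desk lead",
    },
]

def incomplete(indicators):
    """Return names of indicators missing any required field."""
    required = ("name", "definition", "source", "owner")
    return [i["name"] for i in indicators if not all(i.get(f) for f in required)]

print(incomplete(indicators))  # an empty list means every indicator is fully specified
```

Walking the team through this table (and its gaps) is often the fastest way to surface the disagreements about "which metrics matter" before money is spent on the build.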

 Next, you’ll need to build it. You may already have the tools and people in your office to do this work. Power BI is commonly used to create dashboards in-house, and an external consultant may use Power BI, too. You can use Excel, but it will require more manual effort; Power BI and Tableau are better suited for handling large volumes of data and creating the high-quality visualizations dashboard users want. There are also dedicated dashboard products such as Cyfe, Klipfolio, Sisense, or Geckoboard that might meet your needs.  

Hint: while we love using Canva, if that’s the platform you’re working in, you’re probably building an infographic, not a dashboard! 

You’ll probably want to test out a few versions of the dashboard and make edits based on feedback from the intended audience. If your site managers don’t understand what they’re being shown, the dashboard won’t be useful. Gather their questions, ask them to find glitches, then adjust as needed. As your program changes, you may find that revisions are necessary; it might be helpful to schedule a dashboard review to coincide with strategic planning or annual reporting. 

What are the alternatives to dashboards?

Does continuous data collection and digital automation sound like a bit much? You might not need a live, interactive dashboard. Depending on your information needs, your program might benefit more from concise, focused, monthly or quarterly reports.  

Many healthcare, non-profit, and government programs receive data more slowly, and face decision points less frequently, than private-sector businesses. Much of the dashboard literature online speaks to finance or retail companies; those companies may need to adjust their operations daily, while a youth after-school program may only need to reflect on its data when doing annual reporting or applying for grants. Your financial team may find their QuickBooks Online (QBO) dashboard vital for tracking program costs, but your program planner might not have the same need for real-time data.

Now that you’ve read about dashboards, consider whether your needs would be met by a report that is:  

  • Short 

  • Targeted at the information that helps you make decisions 

  • Delivered on a monthly or quarterly basis 

  • Easy for program staff to create with the software they already have 
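If a periodic report fits better, the production step can be very light. Here's a minimal sketch in Python, with invented indicator names and values, of assembling that kind of short, decision-focused summary from figures your team already tracks:

```python
# A minimal sketch of a short, decision-focused quarterly report.
# Indicator names and values are invented for illustration; in practice
# they would come from your existing records or financial system.
indicators = {
    "Youth enrolled": (142, 120),           # (actual, target)
    "Sessions delivered": (58, 60),
    "Average attendance rate": (0.81, 0.75),
}

def quarterly_report(indicators, period="2021 Q3"):
    """Build a plain-text summary flagging each indicator against its target."""
    lines = [f"Program summary -- {period}", "-" * 32]
    for name, (actual, target) in indicators.items():
        flag = "on track" if actual >= target else "review"
        shown = f"{actual:.0%}" if isinstance(actual, float) else str(actual)
        lines.append(f"{name}: {shown} (target {target}) [{flag}]")
    return "\n".join(lines)

print(quarterly_report(indicators))
```

A report like this can be produced with whatever software staff already use; the point is that it answers the same decision-making questions a dashboard would, on a schedule that matches how often those decisions actually get made.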

Checklist: 

If you can check off most of the items below, your program might really need a dashboard. If not, consider concise, regular reporting instead. 

  • Defined audience 

  • Need for up-to-date information (daily or weekly) 

  • Shared understanding of what metrics should be prioritized 

  • Clear definitions of metrics 

  • Ongoing data collection processes 

  • Analytic software 

  • Budget for a consultant, or internal capacity to build the dashboard and automate data feeds 

  • Budget or internal capacity to troubleshoot issues 

 

If you’re still not sure whether your program needs a dashboard, consider a conversation with one of our evaluation coaches.  


Dashboard 

A visualization of up-to-date, meaningful quantitative data. Dashboards may contain many types of data visualizations, such as line charts, column or bar charts, pie charts, or tables. Dashboards are often interactive, allowing the user to filter and segment data. They should fit on one screen, allowing rapid understanding of the information. There should be very little text; each indicator should be clear enough that it does not require a narrative explanation. 

Dashboards require a continuous, automated data collection approach.

Dashboards can be used to support performance measurement, communication, operational decision-making, and learning and improvement.

 

Make sure to sign up for the Eval Academy newsletter for more articles sharing our evaluation consulting experience.



Written by cplysy · Categorized: evalacademy



Copyright © 2026 · The May 13 Group · Log in
