
The May 13 Group



May 13 2020

National Evaluation “Capacity and System” in the 2030 Agenda

Part of the 2030 Agenda and of the 2020 Evaluation Agenda focuses on developing National Evaluation Capacity (NEC) and National Evaluation Systems (NES). Here we try to clarify these two concepts:

A. National Evaluation Capacity (NEC) is the collective capacity of the national evaluation system (1) to achieve, align, and maintain its objectives, structure, processes, culture, human capital, and technology (2) to produce evaluative knowledge that informs practice and decision making in order to (3) improve effectiveness and accountability.

B. A National Evaluation System (NES) is (1) the set of (a) institutions, (b) people, and (c) activities, together with (d) the policies, procedures, and relationships that link and guide them, (2) involved in the demand, supply, and use of evaluation (3) to support (a) accountability, (b) learning, and (c) decision making.

The 2020 Evaluation Agenda identified four dimensions of a National Evaluation System: (1) an enabling environment, (2) institutional capacities, and (3) individual capacities for evaluation, as well as (4) integration across these three dimensions.

The steps to strengthen the enabling environment for a National Evaluation System are: (1) assess the need for evaluation, (2) map stakeholder interest, (3) identify pathways for growth, and (4) create partnerships, associations, and/or interest groups.

 

 

Written by cplysy · Categorized: TripleAD

May 13 2020

Evaluation and COVID-19

Social media is not a monologue. It’s a collection of voices seen (or unseen) based almost entirely on who you do or do not follow.

But the world we see is based, as it always has been, on the people who surround us. Historically this has been determined primarily by where we live and the communities where we interact. But increasingly, what we see is based at least somewhat on who we choose to follow in the digital realm, and who we choose to invite into our own conversations.

I am an on-and-off lurker* of the American Evaluation Association’s Evaltalk listserv and recently stumbled upon this comment by Doug Fraser.

It raises the same question Bill Fear has tried to raise a couple of times, namely: why, at this of all times, when the world faces the biggest and most vital evaluation challenge in its recent history, is the evaluation profession missing in action?

I don’t see it that way.

I see evaluators across the globe stepping up. They are using their approaches, methods, and other tools in their toolkits to make contributions. And they are using their voices to share their perspectives and guidance across the globe.

In this post I wanted to create a compilation of some of those viewpoints. And as I collected, I found that I could have kept going and going.

If you have read (or written) something recently that inspired you, please share in the comments.

*EvalTalk has a long history, but listserv technology has a way of elevating the perspectives of those members who prefer shouting to discourse. Consequently, I find many of the conversations lacking sufficient diversity in both participation and perspective. But I digress.

When Was There Certainty?

The pandemic has laid bare that which many of us (meaning people of the global majority that live in the United States) have always known: this country is designed to maintain a power dynamic that privileges white male power and wealth at the sacrifice of most everything else.

Jara Dean-Coffey in A note from EEI’s Director: “I have feelings.”

On May 13, at 12PM Eastern/3PM Pacific, Jara will be joining me for an Eval Central UnWebinar to talk about being of service when you are not essential. Join us!

The Lines Between Work and Life and Life and Work

The resources designed to help us adapt to Covid-19 don’t match up with our lives right now. Every day, I sit down at my kitchen table — the same table where I (used to) host dinners and put together puzzles — in front of a make-shift workstation where I do my job. My living space is also my workspace. We’re managing a whole new definition of work-life balance right now and it turns out work is part of our lives.

Alissa Marchant in 18 Resources helping me in work and life with Covid-19

Don’t Overthink, Just Do

@timbidey/@traversepeople Tim is an experienced qualitative researcher with a passion for helping charities explore what works (and what doesn’t) and why to inform their project design and practice.

It’s clear that many of the voluntary and community sector organisations that Traverse works with are struggling at the moment in the UK. Charities have had to adapt to continue delivering frontline services or develop new ones to meet emerging needs in a world of new public health restrictions – all amid a catastrophic loss of funding and, in some cases, lack of staff where people have been furloughed.

In some cases, evaluation has fallen fast down the list of priorities – but it’s important to remember that ‘evaluation’ in itself is not a homogenous practice. Sure, now is not the right moment for continued impact evaluation of multi-year programmes, but evaluation has many faces and need not be so traditional or comprehensive.

My advice to organisations has been to keep it simple. Evaluation is so often seen as a ‘mystifying practice’, but now, more than ever, it’s better to collect something than not collect anything through fear of being seen as unsystematic.

After all, evaluation’s value lies in its utility – it needs to serve the information needs of its users. Right now, the needs of voluntary and community sector organisations demand real-time data to inform weekly decisions about delivery, rather than demonstrating the differences that they’ve made for funders or members of the public. 

Evaluation for this purpose does not need to be theory-based with perfectly rounded edges; it just needs to capture ‘good enough’ data about the essentials on a regular basis. For new or adapted services these might include: What is the problem? Who is affected, and how? Who are we reaching or not reaching, and why? What do people think? What differences, if any, are we making? What do we need to do to improve?

A sense of intended outcomes, basic monitoring data, simple service user feedback or even reflective, anecdotal data from staff can all provide ‘good enough’ insights into these questions for the situation at hand – so long as people remain honest with themselves (and others) about how insights were generated and what limitations sit behind them. 

So don’t overthink, just do.

Tim Bidey of Traverse is part of the FreshSpectrum Panel of Experts. These were his words.

Becoming Developmental Evaluators

All evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems, preparing for the unknown, for uncertainties, turbulence, lack of control, nonlinearities, and for emergence of the unexpected. This is the current context around the world in general and this is the world in which evaluation will exist for the foreseeable future.

Michael Quinn Patton in the Evaluation Implications of the Coronavirus Global Health Pandemic Emergency

Because Our Decisions Have Consequences

 Some of us want to do everything we can to stop the spread of the pandemic and minimize the overall harm it will cause. Others of us are more concerned with managing the indirect effects of the crisis on communities or causes we care about. Still others of us are just trying to figure out the role we can and ought to play. All of us, though, can benefit from approaching these challenges with thoughtfulness and rigor.

Ian David Moss in Deciding Well in Tumultuous Times

Changing Definitions

But now, with the uncertainty of what visitation will look like over the coming months and potentially years as museums phase into reopening with limitations on visitor capacity and new social distancing measures, I wonder: what does a “representative sample” mean now?

Katie Chandler in Sampling: What does “representative” mean during and after coronavirus?

Who is Afraid of Rigor?

Bio: @b3consults/www.b3consults.com  Betsy brings all the tools of data & program evaluation in harmony with the heart and intuition-led world of coaching, to increase the impact of results-driven organizations.

My gut reaction, though I’m scared to put it out there, is twofold. One, what about the organizations that were ready for the pandemic? Two, why shouldn’t we see a call for increased rigor?

To the first: I believe some organizations planned for something along the lines of a global pandemic.  As early as 2012, some organizations were making the case that a COVID-19 like event was coming.  The crisis was foreseen, and some organizations were ready for it. 

To the second: I’ve seen it implied that we should be more adaptive and iterative, in a sense softening the rigor as organizations lean into change and adapt.
But what if COVID-19 isn’t a one-time event (and most experts will tell you it isn’t)? And what if our ability to measure, with a high degree of rigor, how organizations served critical populations during this time will be crucial to inform what we need to do next time? 

I’m an unlikely architect of highly rigorous evaluation design; I tend to focus on the PrEvaluation space. However, when I think of my clients, the ones who know that their ability to rise up and serve critical populations well NOW will prevent further exacerbating equity issues in the future, I want to hear the kind of evaluation for which they would strive. And I would hesitate to assume they want less rigor, or that they are afraid of rigor in a crisis.

Ask yourself: what if they crave rigor right now?

Betsy Block of B3 Consults is part of the FreshSpectrum Panel of Experts. These were her words.

Responsible COVID-19 Data Visualization

Amanda said, “There are different points at which we make decisions about how and what we visualize, and then how we publish and share. What are we creating and doing more for our own exploration and understanding? And what are we doing so that we can share it with the public to help others make sense of information?”

From Ann K Emery’s interview with Amanda Makulec in Visualizing COVID-19 Data Responsibly: An Interview with Amanda Makulec

Evaluation Contingency Plan

We’re often providing funders an evaluation plan that assumes a best-case scenario and prioritizes in-person interactions. Rarely do plans require the evaluation team to offer contingency options. As many of us are now tailoring methods to respond to social distancing and travel recommendations, we’re switching to virtual interviews and other methods of web-based data collection. Building contingencies into future evaluation plans will leave us better prepared to pivot and could save time and resources on creating post-hoc plans.

Martena Reed in Reflex or Reflection: Three Lessons for Evaluators Amid COVID-19

Changing Your Data Strategy

Consider what sort of data collection activities are going to help you use your data and collect data that is useful.

Michelle Molina in her Video on Nonprofit Data Adjustments to COVID-19

Forget Returning to Normal

So no, I don’t want us to return to normal. I want us to use this as an opportunity to change, to create systems and social structures that create deep and lasting equity and a world where we work together for the common good. One can dream, right? If anything, this crisis should teach us that we are all connected.

Ann Price in There are words I really hate right now.

The Importance of Our Work

We work to support better evaluation globally. Good evaluation helps people identify the information they need and make sense of it.  It helps inform decisions about what to do and how to improve results. Good evaluation is essential to guide the best use of resources and to ensure accountability and learning. During this pandemic and in the post-pandemic world our work is more important than ever.

Patricia Rogers in BetterEvaluation COVID-19 Statement

Compensating for a Lack of Monitoring

A real-time evaluation (RTE) is designed to provide immediate (real-time) feedback to those who plan or execute a project or program, so they can make improvements. This feedback is generally provided during the evaluation fieldwork, rather than afterwards.

Carlos Rodriguez-Ariza in La evaluación en tiempo real en emergencias

Access for On-Site Data Collection

On 1 April 2020, USAID and IDEAL hosted a webinar on ‘Challenges and Strategies for Monitoring and Evaluation (M&E) in the Time of COVID-19’. The virtual meeting was attended by 500 M&E professionals.

The webinar participants completed a poll that yielded the following results.

Ann-Murray Brown in A New Dawn: Monitoring and Evaluation during COVID-19

Before Expanding Boundaries

This is the smallest visible system (SVS) in which you can make a difference. Once you can act wisely on this system, you can expand the boundaries and scope to work larger.

Cameron D. Norman in Acting in Complex Times

Looking to the Desired Present

I’m looking to the desired present instead of a desired future. Not because I have no hopes or aspirations for the future, but because I don’t find it helpful right now to aim for something I can’t see. I don’t know what the future will hold. I don’t know where this moment goes. I’m hoping there’s a future out there so different from this one that I can’t even imagine it fully much less trace a path to it by design. All I want to do is find the best part of whatever moment I am in, and work with that.

Carolyn Camman in Entering the Clearing

Written by cplysy · Categorized: freshspectrum

May 13 2020

Strategic Learning and Evaluation – What Boards Need to Know

 

Recently I was asked by a client about an evaluation literacy course for its board. The client’s board members had just attended a strategic planning day and through that discussion felt they needed education on evaluation and metrics. On one hand I thought “bravo, they want to know more about evaluation!”; on the other hand I thought “shit… I’ve totally failed them as their evaluator – what have I been missing?”

Boards need quality information to make strategy and leadership decisions; however, the reality is this board isn’t getting the information it needs to inform its decisions. As their evaluator, it is my responsibility (and also opportunity) to show them the way forward, so they are no longer left with answers that are “a definite ‘maybe,’” but instead have data and insights that they can use to inform their decision making. This means a more systematic, coordinated, and intentional approach to evaluation and learning – a strategic learning and evaluation system (SLES), as described in FSG’s Building a Strategic Learning and Evaluation System for your Organization. The course I developed therefore focused not only on evaluation literacy, but also on how evaluation can support a SLES. The course has three overall objectives for board members:

  1. Understand the necessity of and advocate for strategic learning;

  2. Understand the basics of evaluation and how it can support strategic learning, and;

  3. Begin developing the building blocks for a strategic learning and evaluation system.

Here are some of the key learnings and action imperatives for the board: 

Understand that evaluation is one piece of the learning pie

While evaluation is important for learning and improvement, it is only one of many information-gathering approaches that can be used to inform decision making about strategy. Organizations also collect information through performance measurement, audits, research, case studies, discussions at the water cooler and a number of other ways. So, if the ultimate goal is for organizations to learn and use that information to improve, boards need to shift thinking from “leading with evaluation to leading with learning” (Centre for Evaluation Innovation).

Evaluation is one of many ways to gather information to inform decision-making and learning. Others include performance measures, monitoring, audit, research, cost effectiveness analysis, and case studies.



Understand and advocate for strategic learning

I am lucky that my client’s board has a desire to learn more about evaluation and how it can support strategy development. The Centre for Evaluation Innovation recently conducted a survey on evaluation and learning practices in foundations and found that senior management often communicate support for evaluation, but their behaviours do not demonstrate it. So what are the board behaviours that would demonstrate support for evaluation? And, more specifically, for evaluation for strategic learning?

The Centre for Evaluation Innovation produced a report titled Evaluation to Support Strategic Learning: Principles and Practices, in which they explain that “designing data collection and evaluation specifically to support strategy decisions requires shifts in thinking about what questions get asked, the role the evaluator plays, how data collection is timed, and the framing of the findings” (pg. 3). They go on to articulate nine principles of evaluation for strategic learning that boards can advocate for within their organization:

  1. A support for strategy

  2. Integrated and conducted in partnership

  3. Emphasizes context

  4. Client focused

  5. Places high value on use, and helps to support it

  6. Data to inform strategy can come from a wide variety of sources and methods

  7. Must take place within a culture that encourages risk taking, learning and adaptation

  8. Is flexible and timely and ready for the unexpected

  9. Is constructivist

Understand the basic evaluation terms and steps

Part of advocating for strategic learning is understanding basic evaluation concepts and terms. If learning and evaluation efforts are to inform an organization’s decision-making practices, then boards need a clear vision for evaluation – what it is and is not.

I’m not going to lie – I don’t believe in reinventing the wheel, and Chris Lovato and Kylie Hutchinson put together an Evaluation for Leaders course. Much of what I covered in the evaluation basics module of my course follows what they outline for evaluation terms, types and steps. However, as Chris and Kylie say in their course:

 

“Leaders and decision makers don’t need to be evaluation experts, just expert supporters and users.”

 

So, in my course there is an emphasis on the first evaluation step – focus. For boards it is important to understand this step. An evaluation can’t be all things to all people – focusing it provides a clear direction of who needs what information and how the user(s) are going to use it.

Understand what is credible evaluation evidence

Thinking has changed on what constitutes “credible” evaluation evidence. Many board members from this organization come from science backgrounds and believe that RCTs (randomized controlled trials) are the gold standard for evaluation evidence. This is a common misconception, but one that is particularly important to address with board members. As the Evaluation Yoda, Michael Quinn Patton, states:

 

“Despite the recognition more than 35 years ago that the reductionist approach to complex problems is likely to fail, many still persist in believing that we must rigorously apply the scientific method to problems in medicine and public health.”

 

If a board is trying to impact systems and shift the conditions that hold problems in place (i.e. systems change), then it is important that the board shift its thinking from measuring and proving against some sort of fixed model to understanding and improving – in other words, to adopt a systems thinking lens. Which evaluation methods are selected to evaluate that change comes back to how appropriate the methods are given the purpose of the evaluation, the questions it needs to answer, and how technically adequate the findings are given the time and cost constraints. As the United States General Accounting Office Program Evaluation and Methodology Division (1991, pg. 17) puts it:

 

“A strong study is technically adequate and useful – in short, it is high quality”

 

Know what you want/need and communicate it to your organization

If a board is not clear on what information it needs to inform its decisions, you can be sure the rest of the organization won’t be either. A board gets a lot of information, but that information may not:

  • Contain the right information,

  • Be presented in a useful and useable format,

  • Be received on time (i.e. it arrives only after a decision has already been made), or

  • Be connected to organizational strategy, which means the findings aren’t getting used (or at least not fully). 

An easy first step to enhance use of evaluation findings is for boards to make their timelines and reporting preferences known. A more difficult next step a board should consider when trying to enhance usability of findings is to implement a SLES.

According to FSG’s Building a Strategic Learning and Evaluation System for your Organization, a SLES contains:

  1. A clear vision for evaluation,

  2. A culture that fosters individual, group and organizational learning,

  3. A compelling and strong strategy,

  4. Coordinated evaluation and learning activities, and;

  5. A supportive environment.

Bottom line – a SLES will provide guidance and align organizations on who, what, when, where, why and how to measure and report.

I left my client a lot to chew on. As I mentioned above, implementing a SLES will be difficult, but ultimately should provide the board with an evaluation strategy that increases the value of evaluation for its organization.

Interested in learning more? Sign up for our newsletter and we’ll notify you when the online version of this course is available.




 

Written by cplysy · Categorized: evalacademy

May 12 2020

Resources for working and facilitating remotely

 

A very relevant post on the blog “Agilefacile” from Ewen LeBorgne, guru of knowledge management and facilitation, and above all an excellent person, on online facilitation: “A meta look at resources to work and facilitate online more effectively.”

In brief, he tells us that online facilitation actually follows many of the same principles as face-to-face facilitation. However, we must keep in mind some practical, logistical, design, and emotional elements:

  • Whether we need synchronous conversations (at the same time) or asynchronous ones (at different times), depending on geographic distribution.
  • The best way to divide up the time.
  • How to break the ice or “read people’s emotions.”

He points to several sources, but his star recommendation is the recently designed “Online meeting resources toolkit for facilitators,” created during the coronavirus pandemic thanks to the “Facilitators for pandemic response group,” where we can dig deeper into basic and advanced related concepts: “going all-in” on online work, virtual teams, and even facilitator artifacts for Zoom meetings during the Covid-19 response. Let’s facilitate remote facilitation in these uncertain times.

Written by cplysy · Categorized: TripleAD

May 12 2020

Visualizing COVID-19 Data Responsibly: An Interview with Amanda Makulec

In April, I sat down with Amanda Makulec, one of my longtime data and evaluation friends, to learn about visualizing COVID-19 responsibly.

Amanda is the Data Visualization Capability Lead at Excella; a co-organizer for Dataviz DC; and the Operations Director for the Data Visualization Society (DVS).

She’s also one of the most knowledgeable people around when it comes to visualizing COVID-19 data.

Listen to Our Conversation Here

What’s Inside: Amanda’s Career Path

“Ten years ago, when I finished graduate school, I couldn’t have guessed that this would be my full-time job and I could wear as many hats as I do in the data viz world. Now I think it’s really exciting because there are a lot of paths into data viz,” Amanda told me.

Excella

Amanda is currently the Data Visualization Capability Lead at Excella, a technology consulting firm. She works with organizations from the CDC to Fortune 500 companies.

“It’s been a really transformational learning experience for me to see not just different ways data and tech get used, but the different ways projects get managed. I’ve learned a lot more about Agile processes and software development and thinking about how some of those same practices actually apply when we’re building different analytical applications like dashboards,” Amanda said.


Previously, she worked in global public health.

DataViz DC

Amanda is also involved with DataViz DC. They focus on bringing people together from various disciplines, from graphic designers to software developers. They host monthly meet-ups, which have included hands-on workshops, guest speakers, and career panels. It’s a great way to connect in the DC area; there are over 8,000 members!


Data Visualization Society

Amanda is the Operations Director for the Data Visualization Society.

They are focused on bringing people together across the world and serve as a global, professional organization for dataviz professionals at any level. They have over 13,000 members from more than 130 countries around the world. They communicate through Slack, email and Fireside Chats with panels of dataviz experts.

Amanda told me that the Data Visualization Society is “a great space for actually bringing together different disciplines. Instead of focusing on one tool or tech stack, we instead said, ‘How do we bring together the people that are individually engaged in Tableau groups? Power BI groups? R groups? Graphic designers? How do we bring all those people together and create a space for people early in their career or looking to change careers and do data visualization as their full-time job and create a space for them to grow and learn and share best practices?’”

She continued, “While we use different tools and technologies in different data viz disciplines and roles, I think there are so many cross cutting best practices that once you learn and master them in one tool, it’s really easy to think about how they are used and applied in other spaces. DVS tries to create that central space to bring people together and really advance the data viz discipline as a practice profession.”


Visualizing COVID-19 Data Responsibly

Next, I asked Amanda to share tips for visualizing COVID-19 data responsibly.

She wrote “Ten Considerations Before You Create Another Chart About COVID-19” on the Data Visualization Society’s blog in March 2020, and I also wanted to know whether her guidance had evolved or shifted since writing the article.


Amanda said, “There are different points at which we make decisions about how and what we visualize, and then how we publish and share. What are we creating and doing more for our own exploration and understanding? And what are we doing so that we can share it with the public to help others make sense of information?”

Amanda went through some of the top considerations, from data quality, to data collection, to remembering the people behind the data, to color choices.

COVID-19 Data Quality Issues

Amanda said, “Consider the fact that even though the datasets are very accessible right now, that does not mean they are high quality data.”

“There are so many issues and challenges with the different ways that COVID-19 cases are counted in different states or countries. Are we including only cases that have been lab confirmed with a swab test that came back positive? Or are we also including probable cases or diagnostically confirmed cases?

How those cases are counted is very different in some states and countries. It’s really hard to make these apples-to-apples comparisons, as easy as it might seem since the data is so accessible.”

Understanding COVID-19 Data Collection

Amanda said, “If you’re going to dabble in COVID-19 data in some way, try to really understand how the data gets collected so that you have a firm understanding of what that process looks like and why there might be issues with the accuracy, timeliness or completeness of those datasets.”

She continued, “Make sure you understand how the data got collected. Just because it’s there as a nice shiny, analyzable table doesn’t mean it’s not something you should try to understand the underbelly of.

In your choices around how you analyze and design certain visualizations, be mindful that the data that we have is really incomplete. The case data is really a function of how many tests are being done. And while we have more certainty on the death counts, deaths are also a function of cases.

So we have to think about the fact that we have a lot of cases not represented in that data. We’re seeing that come out more and more in some of the retroactive reporting being done in countries that are farther along in their epidemic curves.

Remember that there’s a lot of uncertainty in the data and it’s not uncertainty that we can represent visually well. We can’t always quantify that uncertainty.

Make sure that you’re considering the ways in which your visualizations could be misinterpreted or misused.”
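Amanda’s point that reported cases are largely a function of testing volume can be made concrete with a quick sketch. The numbers below are invented for illustration only (not real COVID-19 data): two hypothetical regions report similar case counts, yet their test positivity rates tell very different stories.

```python
# Illustrative only: invented numbers showing how reported case counts
# depend on testing volume, not just on underlying infections.

def positivity_rate(cases, tests):
    """Share of administered tests that came back positive."""
    return cases / tests

# Two hypothetical regions with similar reported case counts.
region_a = {"cases": 500, "tests": 2_000}    # few tests run
region_b = {"cases": 550, "tests": 20_000}   # ten times more tests run

for name, d in (("A", region_a), ("B", region_b)):
    rate = positivity_rate(d["cases"], d["tests"])
    print(f"Region {name}: {d['cases']} cases, "
          f"{d['tests']} tests, positivity {rate:.1%}")

# Similar case counts, very different pictures: Region A's high positivity
# suggests many infections are going undetected, so comparing the two
# regions on raw case counts alone would mislead.
```

A high positivity rate is one of the signals epidemiologists use to flag undercounting, which is exactly why raw case-count comparisons across jurisdictions with different testing regimes break down.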

Be Careful with Comparisons and Reference Points

Amanda said, “One of the common comparisons I’ve seen is comparing COVID-19 to the flu.

We’ve seen that in the media and early on even public health folks were trying to make sense of this disease by comparing it to the flu.

When we look at how we collect data on the flu in the US, we have routine, structured reporting systems for that data with better quality data. We have a disease that comes in seasonally and we understand what that seasonality looks like. We don’t know that about COVID-19.

So comparing cases in March for COVID-19 and the flu [doesn’t make sense]. We’re in a very different point of that epidemic curve for COVID-19. It really isn’t an apples to apples comparison. We’d need a whole year of COVID-19 data to start to make that comparison.

So be cautious in how you start to try to create those reference points which can help us enable understanding, but also can mislead.”

Remembering the People Inside the COVID-19 Datasets

Amanda said, “Remember that every single case and every single death represents a person. As we visualize and think about health-related data, thinking about the fact that each of those cases and each of those deaths represents a person and their story makes it really important to be thoughtful and mindful about how we’re presenting that information.”

COVID-19 Data Visualization Color Choices

Amanda said, “Those red, big bubble maps can be hard to interpret, but the color choice also creates such a visceral, angry, sad response. I hope we can be thoughtful in the ways our visualizations can create emotional responses especially when visualizing such sensitive data.”
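One low-tech way to act on that advice is to encode magnitude with a light-to-dark sequential color ramp rather than uniform saturated red, so intensity reads as data rather than alarm. A minimal sketch, assuming invented regional counts and hex endpoints borrowed from the common ColorBrewer “Blues” palette (no plotting library required):

```python
# Hedged sketch: map counts onto a light-to-dark sequential blue ramp,
# an alternative to the uniform saturated red of "big bubble" maps.

def blue_ramp(value, vmin, vmax):
    """Map value in [vmin, vmax] to a hex color on a light-to-dark blue ramp."""
    t = (value - vmin) / (vmax - vmin)   # normalize to [0, 1]
    t = min(max(t, 0.0), 1.0)            # clamp out-of-range values
    # Interpolate from a very light blue (#deebf7) to a dark blue (#08519c),
    # endpoints taken from the ColorBrewer "Blues" palette.
    light, dark = (0xDE, 0xEB, 0xF7), (0x08, 0x51, 0x9C)
    rgb = tuple(round(l + (d - l) * t) for l, d in zip(light, dark))
    return "#{:02x}{:02x}{:02x}".format(*rgb)

counts = [12, 480, 95, 240]   # invented regional counts, for illustration
vmin, vmax = min(counts), max(counts)
for c in counts:
    print(c, blue_ramp(c, vmin, vmax))
```

Sequential ramps like this keep the visual emphasis on relative magnitude; the emotional register of the hue is then a deliberate design choice rather than an accident.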

Connect with Amanda Makulec

  • Data Visualization Society: DataVisualizationSociety.com/Join
  • Slack Workspace: DataVizSociety.Slack.com
  • Twitter: @ABMakulec

Written by cplysy · Categorized: depictdatastudio



Copyright © 2026 · The May 13 Group · Log in
