
The May 13 Group



cplysy

May 02 2023

Alt Text in Canva

Remember how I said you couldn’t add alt text in Canva? Well, you can now!

Transcript

Hey, so you may already know that I'm a huge fan of Canva. I use it for all sorts of things, from presentations to infographic design. And last year they came out with a web design tool, which is great, except for one thing: there was absolutely no way to add alternative text. Meaning you couldn't add any kind of descriptions, you couldn't change the header structure, all sorts of things you would need to do to make a web design accessible.

And this is really important in general, but especially if you work on government projects, it's essentially a killer. You can't do the work if you can't make it accessible. Well, I was playing around with Canva the other day, I clicked on an image, and what did I find? I found a little button that says alternative text. Yes! Now Canva at least has the bare minimum of what you need to create something accessible to share online. Just to show you, here's a picture up here somewhere of a website I was creating, just a simple little test website I made in Canva, with a picture in it.

Now, if you right-click on the picture, you'll see a little alternative text button on the menu. Let's zoom in here. Okay. If you go ahead and click on that, a box will pop up, and you have up to 250 characters of alternative text that you can add in this box. You can also click the little button to say that the image is decorative, meaning it doesn't really have a purpose other than adding decoration to the page.

So that's it. Those are really the standard requirements. Now, if you're doing serious design work, it's probably still better to work with WordPress or PDFs. If you need to do good accessibility work, there are still things you can't do in Canva, because you can't really adjust the underlying HTML code. You can't change things like the header structure.

In other words, you can't mark up H1, H2, and paragraph tags. It's getting there; I think it'll be there eventually. There is some reordering of elements. But they're coming along, and it's a long way from where it was before. And here it actually even passed a test: I went ahead and put it into an accessibility checker.

And yeah, it's the bare-minimum pass, but it passed. And that's not something I could say before. So there you go. That's it. Canva has alt text. So rejoice, and hopefully they'll continue to improve accessibility and we'll be able to use it more often in the future. All right, that's it for today. Have a great one. I'll talk to you soon. Bye.
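The accessibility checker mentioned in the post is, at its simplest, doing something like the following for images: scanning the page's HTML for `img` tags that have no alternative text at all, and noting which are explicitly marked decorative (empty `alt`). Here is a minimal sketch in Python using only the standard library; this is not Canva's or any particular checker's code, just an illustration of the bare-minimum check.

```python
# Minimal alt-text audit: flag <img> tags missing the alt attribute
# entirely, and list those marked decorative (alt=""). Illustrative
# only -- real accessibility checkers test much more than this.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []      # images with no alt attribute at all
        self.decorative = []   # images explicitly marked decorative (alt="")

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "(no src)")
        if "alt" not in attrs:
            self.missing.append(src)   # fails the bare-minimum check
        elif attrs["alt"] == "":
            self.decorative.append(src)

checker = AltTextChecker()
checker.feed('<img src="a.png" alt="Test site screenshot">'
             '<img src="b.png" alt="">'
             '<img src="c.png">')
print(checker.missing)
print(checker.decorative)
```

An image with a non-empty `alt` (like `a.png` above) passes silently; the point of Canva's new button is that designers can finally put their pages into the first or second category deliberately instead of shipping the third.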

Written by cplysy · Categorized: freshspectrum

Apr 30 2023

Types of knowledge management audit

We have previously discussed knowledge management audits.

Nick Milton describes several different types of knowledge management audit, as follows:

· An audit of your organization's knowledge management framework, to identify strengths and missing elements, so you can implement an action plan to close the gaps;

· An audit of the knowledge management culture, so you can develop a plan and strategy to strengthen the supportive cultural elements and remove the blocking ones;

· An audit of the knowledge itself, so you can identify the knowledge topics that most need attention, and prioritize and focus your KM efforts where they will add the most value and make the biggest difference;

· An audit of one or more Communities of Practice, so you can help them develop through a series of stages;

· A high-level KM maturity assessment of the organization, to get a very quick overview of strengths and weaknesses;

· An audit against a KM standard, for accreditation purposes.

Written by cplysy · Categorized: TripleAD

Apr 28 2023

A knowledge management framework or system

A knowledge management (KM) framework is a complete system of people, processes, technology, and governance that ensures knowledge management is applied systematically and effectively to improve organizational outcomes.

– KM governance: without a governance system (and assignment of responsibility/accountability) that promotes, recognizes, values, and incentivizes the sharing and reuse of knowledge, any attempt to introduce KM will be an uphill struggle (a nearly lost battle).

– People: in organizations and communities, roles and responsibilities for knowledge management must be (1) established and clarified, and (2) set up for sharing and reusing tacit knowledge; behaviors such as seeking and sharing knowledge must be incentivized and become "the way we work".

– KM processes: there has to be a tried and tested process for capturing, filtering/distilling, validating, storing, applying, and reusing knowledge, and also for innovating.

– KM technologies: people and processes must be supported by enabling technology that makes knowledge findable and accessible wherever it resides (in databases, on the intranet, in people's heads). Information Technology (IT) plays an important role in KM by providing the technology that lets people communicate.

Written by cplysy · Categorized: TripleAD

Apr 28 2023

Questions to Get You Thinking about Your Data


Data are only useful when used! They do no good buried in reports, sitting on shelves (or shared drives) hidden away. Data, particularly data from an evaluation, are begging to be discussed, contemplated, and put into action!

Let’s chat about some ways to make sure your data are used. One place to start is to think about why an evaluation was conducted to begin with. Evaluations can serve many purposes – check out these 10 reasons to evaluate.  If you can articulate why an evaluation was conducted, you can review your data with a focused lens.

Another place to look is your program goals, objectives, or intended outcomes: explore how your data can help you meet them or speak to them. Or perhaps you have targets or Key Performance Indicators that will help frame your review of the data. It goes without saying that your data should be answering your evaluation questions, but that alone doesn't move you into the "now what" phase of an evaluation.

One important step is to put together an engaged and passionate team of representatives from across your organization, as described here: Three Ways to Increase the Chances your Evaluation Results will Actually Get Used. This team can help interpret the data (a process often called sense-making) and identify actions that can be taken on the data. They can also help build organizational engagement and spread key messages.

Once you have your dream team, check out our new list of reflective questions that can help to uncover new insights buried in your data.

Trying to answer every question in this checklist at once would be too much to tackle. But pulling out, say, five can get some discussion started. I've often used these questions in my sense-making sessions, or even in final presentations to stakeholder audiences, to get people talking about the data and really thinking about what it means.

Of course, talking about the data assumes that your data are high quality and presented in a way that your audience can learn from. Here are some final tips to make sure your data are working for you: From Data to Actionable Insights


Let us know what you think about our new infographic Questions to Get you Thinking about your Data below!

Written by cplysy · Categorized: evalacademy

Apr 28 2023

Putting an ethics lens on your evaluation planning


We’ve written before about Ethical Decision Making in Evaluation, which describes those grey areas in evaluation planning, data collection and analysis, and reporting, and we’ve offered Program Evaluation Standards in Practice as a guiding tool.

We've also shared first-hand experience with real-time ethical decision-making in My Interviewee is Drinking Vodka: An Evaluation Ethics Case. So why am I writing about ethics again? Because ethical practice, to me, is a cornerstone of my work. Not only does ethical practice ensure we are doing right by everyone involved, but conducting evaluations ethically also adds to the professionalization of our field.


What is ethical practice?

Ethical practice in evaluation means ensuring your work is guided and driven by standards of conduct that promote integrity, honesty, and respect, and that minimize the potential for harm. This starts with determining whether the evaluation is worth doing at all, considering, for example, the burden placed on participants.

Ethical practice comes into play at every level of your evaluation: planning, designing data collection tools, data collection, analysis, and reporting. Take a look at each section of your evaluation plan and see if you can identify the risks. If you can't, keep reading!

The Canadian Evaluation Society offers some Guidance for Ethical Evaluation Practice. They suggest that ethical practice is based on 3 values:

  • Rights and well-being of persons and peoples

  • Truth-seeking, honesty and transparency

  • Responsibility to stakeholders and society

In addition, many ethical guidelines draw from the 1979 Belmont Report, created by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report provides three guiding principles:

  • Respect for persons, including informed consent

  • Beneficence: do no harm

  • Justice: equality, without bias or discrimination

Typically, evaluations don't require review by a Research Ethics Board, but often there is nowhere else to turn for an ethical review. In Alberta, Canada, where Eval Academy is based, we're fortunate to have a program called ARECCI that can provide ethical review for evaluation and quality improvement projects. This article is intended for those without a formalized review process, to help you apply an ethical review to your own evaluation work.


How do I ensure I apply an ethical lens to my evaluation?

The first step in ethical evaluation is awareness. Take a look at our infographic of questions to ask to ensure your evaluation is ethical.

Ethical practice is about self-awareness and honest self-reflection: knowing your own limitations, biases, assumptions, and values, and considering how they influence an evaluation. Ultimately, asking yourself “Am I doing the right thing?” will force you to consider some of these questions.


Types of Risk

Now that we know evaluations carry potential risk, and where to look for it, what exactly should you be looking for? As you'd expect, there are many types of risk. Usually, we're talking about risk to participants, so let's start there.

Risks to participants include:

  • Mental and emotional risk, including re-traumatization and distress

  • Power imbalances, real or perceived coercion

  • Reputational risk through breached confidentiality or privacy

  • Physical risk: safety concerns arising from the nature of participation

  • Legal risk through disclosure of information

  • Financial risk through disclosure of information

Vulnerable populations require additional consideration. These could include children, equity-deserving populations, those with limited capacity, or those in power imbalances, but often this includes any group that is a regular target for research, evaluation, or other data collection. The burden of participation, and the risk of exploitation, is not insignificant for many populations.

There are also risks to your project. You'll want to consider whether your evaluation plans would put the project or its funding at risk by missing timelines, or whether your data collection strategies risk going substantially over budget. This is an example of where the principle of justice comes in: your desire to gain knowledge must be balanced against what would be lost if the knowledge were not gained. That is, are your data collection strategies putting the potential learnings at risk in any way?

There are also risks to systems. Is it possible that, by participating in your evaluation, participants may access additional supports or services, creating system delays, capacity issues, or access problems for others who need those services?

The consequences of taking on high risk in a project can be significant. Participants may experience extreme distress and require access to additional supports (and they'd likely look to you to connect them with those supports). There are also risks to programs, including loss of trust or funding, negative publicity, or even fines.

Finally, but importantly, there may be other ethical guidance available to you. For example, the First Nations Principles of OCAP® offer support for information governance with respect to the ownership, control, access, and possession of data from First Nations.


Summary

All projects carry some level of risk, and eliminating risk isn't necessarily the goal. There is a balance between collecting the information you need to answer important questions (questions that drive knowledge gain, best practice, and informed decision-making) and accepting some level of risk. The question is: what tolerance for risk do you, and your clients or organizations, have? We want not only to minimize risk but also to maximize benefit. The key is that the risks have been given deliberate consideration.


What are your top tips for assessing and managing risk in evaluation? Share your ideas below!

Written by cplysy · Categorized: evalacademy


Copyright © 2026 · The May 13 Group
