
The May 13 Group



Sep 22 2021

Gen Z are Values-driven: What does this mean for Cultural Institutions?

By: Sadiya Akasha

In the first post of this series, I described how nearly infinite access to information (and a global reach), coupled with near-constant evaluation of the motives behind consumable content, has led Gen Z to become global and critical thinkers in a way quite beyond the norm for previous generations. In the second post, I suggested that Gen Z is not only the most racially diverse and multiethnic generation to date but that their identity-building goes beyond clear demarcations of race and ethnicity. Exploration and constant self-examination are foundational traits of Gen Z and underpin their values of freedom, equality, and healthcare for all. In this final post, I posit that Gen Z is values-driven and that cultural institutions will have to put their values at the forefront in order to connect and engage directly with this generation.

Engagement Through Values

Market research makes it clear that Gen Z consistently chooses to interact with brands whose values align with their own. This values-driven approach is not limited to shopping choices but is expressed by members of Gen Z in their proactive approach to civic engagement as well. Everything we’ve learned about Gen Z indicates that values are central to their engagement. 

In this call to action to fellow Gen Z’ers, the author, Cameron Katz, describes her values in action, along with the kind of reception she often receives:

“When I and other members of my generation criticize July 4th, we’re met with disbelief and offense. It’s the ultimate taboo. “Can’t you just have fun?” “Why do you have to make everything political?” However, my criticism isn’t coming from a place of hatred. On the contrary, I’m interested in how we can honor our country by better upholding the promises made during this foundational moment in its history. Liberty, equality, the pursuit of happiness— these ideals were written into the Declaration of Independence in 1776, but they aren’t yet realized for everyone, even in 2021. 

I, along with many other Gen Z’ers, want to know why.”

The Weisman Art Museum’s student group, WAM Collective, along with the Student Advisory Council at the Tang Museum, and the Agents for Creative Action (ACA) of the Williams College Museum of Art (WCMA), together convened a virtual roundtable exploring the museum of the future. The student groups described the role of museums “historically, as gatekeepers of knowledge and history.” In this roundtable, the students gave clear guidelines for what the ‘Museum of the Future’ should be, not just look like, for it to engage members of their generation.

At the most foundational level, the Gen Z student groups demanded that cultural institutions flatten their organizational hierarchies, disengage from the ‘cult of the curator’, refocus their programs to center humans rather than objects, and increase access to be more broadly and holistically inclusive. The missive from the student groups to museums is simple: put your values and principles into action. This is a direct message from Gen Z to cultural institutions, and the students provide ample examples and ideas for immediate action.

Collaborative Cultural Institutions

Made By Us is a consortium of 100+ history and civics organizations that collaborate with young adults to reframe historical events in current, politically aware ways and to support civic participation. They’re modeling a whole new approach of collaborative engagement, one in which institutions have an opportunity to engage directly with members of Gen Z. Made By Us recently kicked off an inaugural tradition called ‘Civic Season’, taking place between Juneteenth and the Fourth of July. The season focused on celebration as well as criticism, on learning as well as the sharing of diverse voices. Events and activities were held across the country, from regional institutions to digital platforms, creating broad access and a multitude of ways to engage. This new tradition is an amazing example of decentralized, democratized, human-centered, highly accessible, and values-driven programming that seeks to support and partner with Gen Z.

There are cultural institutions working hard to undertake radical shifts on their own as well. After leading community-based workshops, the Walters Art Museum has decided to publish a critical history of its founder, Henry Walters, in a full-fledged acknowledgment of his past. This impactful action exemplifies the Walters Art Museum’s values and overarching commitments to its community.

In the academic world, the president of the Massachusetts Institute of Technology, L. Rafael Reif, recently published a letter acknowledging the role that a former MIT president, Francis Amasa Walker, played in advancing the American reservation system, and the complex legacy this has left behind. To face this history head-on, MIT launched a class for undergraduate students to perform research in this area and, based on the students’ findings, is continuing the exploration in a new class, “The Indigenous History of MIT”. These classes demonstrate how members of Gen Z are driving change forward in every institution they are deeply engaged in.

University and college museums may be the first to feel the impact of Gen Z’s strongly held values, but the resulting changes will set the standard across the museum world as a whole.

Towards the Museum of the Future

This post abounds with examples of institutions that are doing transformative work and that are likely very happy to share what they’re learning on their journey.  Although there is no 1-2-3 formula to follow, as a starting point, take some time to watch the “Student Roundtable: The Future of Museums” to better understand the values that Gen Z wants cultural institutions to exhibit.

For cultural institutions that are keen to engage with members of Gen Z, this is a pivotal moment to listen deeply and act collaboratively. This can start with a simple push to redefine audience segmentation by asking Gen Z how they would identify themselves, and continue by partnering with them to focus on the ideas they say they are interested in exploring. This really does need to be a grassroots effort to engage with Gen Z from ideation to research, onwards to the development of relevant programs and initiatives, and all the way through to implementation. To undertake this journey, you have to accept that you will need to question your institution’s foundations: its mission, values, and goals. This questioning, when done in collaboration with Gen Z, will drive new avenues of growth and transformation across your organization and propel you to become the museum of the future.

About the Author

Sadiya smiles and looks at the camera. She is wearing red lipstick, a floral top, and dangly blue earrings. She's probably thinking about how cultural institutions can engage Gen Z!

Sadiya Akasha is the co-founder and Director of Product Development at Sitara Systems, a design and technology laboratory that creates interactive experiences with emerging technologies. Sadiya partners with cultural institutions to help them conceptualize and deliver technology initiatives by leveraging her background in human-centered design, agile thinking, and audience research. In her free time Sadiya enjoys exploring the rugged yet delicate landscapes of the great Southwest. 

The post Gen Z are Values-driven: What does this mean for Cultural Institutions? appeared first on RK&A.

Written by cplysy · Categorized: rka

Sep 22 2021

Comment on “Evaluador/a no hay camino, se hace camino al andar” by Mar Herrera

I feel seen; the life of the evaluator is a hard one.


Written by cplysy · Categorized: TripleAD

Sep 21 2021

How to embed Tableau dashboards without hiding them from Google.

Okay, long story short.

If you are presenting to a public audience, don’t just plop the embed code on your website. And if you don’t feel like reading about why, skip to the bottom of the page for how I would suggest you embed Tableau dashboards (and lots of other embeddable things).

Short story long? Continue reading.

What you see is not what Google sees.

Freshspectrum Cartoon. "Putting the TV in the window is clever, but isn't quite like going on a vacation."

Your web browser translates HTML and other code into a screen that you can see and read. But just because something looks like it lives on your website doesn’t mean it does.

Google calls this kind of content rich media. The star of rich media is video: most video lives on YouTube; we just watch it through our web pages.

This is something we should also consider when sharing data dashboards. We tend to be very concerned with what goes into a dashboard, but how often do we think about how it will be shared?

So if I embed a dashboard, Google won’t be able to crawl it?

Freshspectrum Cartoon. "These are my new Google glasses. They let me see a website like Google sees it."
"You know if you right click and choose 'inspect' it does the same thing."

Not exactly. Google has gotten really good at crawling the web and indexing web pages. So it’s pretty likely that they can also parse out the information on the page. But that doesn’t mean it won’t cause problems.

The biggest issue is that if you just plop an embedded dashboard onto your web page, your dashboard content is less likely to show up high in Google search. It’s fine if you don’t care whether people find your dashboard when they Google. But if you want to reach a public audience, you need to compensate.
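To make this concrete, here is a minimal sketch of what a text-based crawler can extract from a page that is nothing but an embed. The page below is hypothetical, and this is only a rough stand-in for how Google actually parses pages; it uses Python's standard-library HTML parser.

```python
from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    """Collects the visible text and any iframe sources on a page."""
    def __init__(self):
        super().__init__()
        self.text = []
        self.iframes = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            self.iframes.append(dict(attrs).get("src", ""))

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

# A page that "just plops the embed code" on: one iframe, no real content.
embed_only_page = """
<html><body>
<iframe src="https://public.tableau.com/views/SomeDashboard"></iframe>
</body></html>
"""

view = CrawlerView()
view.feed(embed_only_page)
print(view.text)     # [] — no indexable text at all
print(view.iframes)  # ['https://public.tableau.com/views/SomeDashboard']
```

All of the dashboard’s content lives behind that one iframe `src`; the page itself contributes nothing for a search engine to index.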

If you do plan to use rich media on your site, here are some recommendations that can help prevent problems.

Try to use rich media only where it is needed. We recommend that you use HTML for content and navigation.

Provide text versions of pages. If you use a non-HTML splash screen on the home page, make sure to include a regular HTML link on that front page to a text-based page where a user (or Googlebot) can navigate throughout your site without the need for rich media.

In general, search engines are text based. This means that in order to be crawled and indexed, your content needs to be in text format.

Google Developers – Rich media file best practices

For example, let’s look at North Carolina’s COVID-19 dashboard.

When I visit the page, this is what I see.

NC COVID 19 Dashboard. Accessed 9/21/2021. https://covid19.ncdhhs.gov/dashboard

Behind the dashboard, in the code, I find an iframe link.

NC COVID 19 Dashboard. Accessed 9/21/2021. With the inspect button.

IFrames are sometimes used to display content on web pages. Content displayed via iFrames may not be indexed and available to appear in Google’s search results. We recommend that you avoid the use of iFrames to display content. If you do include iFrames, make sure to provide additional text-based links to the content they display, so that Googlebot can crawl and index this content.

Google Developers – Rich media file best practices

Here, Google is most likely indexing the lead-in paragraph and the header, but not really the data.

But here is the positive with this example: NC’s “COVID dashboard” is an entire subdomain website, not just a page with an embedded Tableau dashboard. This produces a lot of text for Google to index, and as a result the dashboard ranks high in search.

For comparison, you can check out another NC Tableau dashboard, this one with opioid data. Unlike the previous example, the Opioid Action Plan Data Dashboard has just about all of its text-based information embedded within the dashboard itself.

The NC Opioid Action Plan Data Dashboard.  Screenshot taken on 9/21/2021

Ultimately, when Google looks at this page, instead of seeing all that text content it’s just going to register a few lines of iframe code.

For the sake of being able to Google this information, all of the text-based information (and the tabs) should exist within the website’s HTML, not the Tableau dashboard.

Counter Example: NY Times?

So here is the North Carolina COVID Coronavirus Map on NY Times.

NY Times Coronavirus Map, accessed on 9/21/2021. https://www.nytimes.com/interactive/2021/us/north-carolina-covid-cases.html

Let’s look at the code underneath the visual.

Unlike the last example, everything you see on the page can be found in the code.

If you have ever wondered why Google tends to spotlight data from places like the NY Times and Wikipedia in its search results, this is one of the major reasons why. The data is crawled, indexed, and visualized on its own terms. This, in turn, helps direct people back to sites like the NY Times and Wikipedia.

Screenshot from Google search of "North Carolina COVID" on 9/21/2021.
Screenshot of inspected Wikipedia COVID-19 data accessed on 9/21/2021 https://en.wikipedia.org/wiki/Template:COVID-19_pandemic_data

I know, this isn’t Tableau. But this embed thing goes well beyond Tableau.

When sharing information on the web, if we want people to find it, we have to understand what is shared behind what we see.

Bonus Tip: Yes, they are ugly websites, but they are trusted ugly websites

Freshspectrum Cartoon. "But our website is so ugly!"
"Definitely, but Google trusts it."

So another question I’ve seen is about domains and dashboards. Should we publish our dashboard under our state or organization domain?

If you work for a state government, federal government, or university you might get annoyed by your website. But even the ugliest of websites fare pretty well in Google, because these sites have a lot of authority.

So when faced with the decision of whether to put your public data dashboard on an established organization or government public-facing website versus sharing it on a new domain, generally use the established site. Even if that means a few more bureaucratic hurdles, it’s going to give your content the best chance of being found.

So, how do we embed Tableau dashboards then?

Freshspectrum Cartoon. "Don't do this. Just embedding a dashboard." "Do this instead. Header, Subheader, chart embed. etc."

Okay, here is the short answer.

If it is going to be public and on your website, stop thinking about your dashboard as a “Tableau dashboard.” Instead, think of it as a dashboard on your website that you created with the help of Tableau.

Put as much as possible into the HTML on your web page. And if that means creating multiple sub-pages using HTML, do that. Google is going to see it better, and connect it with you if it lives directly on your website.

Then embed the interactive charts/views intermixed with the text based HTML content.
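The recommended structure can be sketched as a page skeleton: real headings and paragraphs in your own HTML, with the embedded views intermixed. The headings and Tableau URLs below are hypothetical, and the tag-stripping here is only a crude proxy for what a search engine indexes.

```python
import re

# Sketch of the recommended page layout: HTML content first, embeds intermixed.
recommended_page = """
<h1>Opioid Action Plan Data Dashboard</h1>
<p>A summary of the key findings, written as plain HTML text that
search engines can crawl and index.</p>

<h2>Overdose deaths by county</h2>
<p>A short text description of what this view shows and why it matters.</p>
<iframe src="https://public.tableau.com/views/OverdoseDeaths"></iframe>

<h2>Treatment and recovery</h2>
<p>More indexable text, followed by the next embedded view.</p>
<iframe src="https://public.tableau.com/views/Treatment"></iframe>
"""

# A crude proxy for "what Google can index": drop the iframes, then strip tags.
indexable = re.sub(r"<iframe[^>]*>.*?</iframe>", "", recommended_page, flags=re.S)
indexable = re.sub(r"<[^>]+>", " ", indexable)

print(len(indexable.split()))  # dozens of indexable words rather than zero
```

Compare this with a bare embed: the same check on a page that is only an iframe would leave essentially no indexable words at all.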

Still Confused?

Let's create together. DiY Data Design.

It can get confusing. That’s why I run a workshop and hold Q&A portions with every session. Click here to join us and get $100 off the annual workshop cost.

Written by cplysy · Categorized: freshspectrum

Sep 20 2021

Evaluation Roundup – August 2021

Welcome to our monthly roundup of new and noteworthy evaluation news and resources – here is the latest.

Have something you’d like to see here? Tweet us @EvalAcademy!

New and Noteworthy — Reads

The Ultimate Guide to Effective Data Collection

A free 30-page eBook, “The Ultimate Guide to Effective Data Collection,” can be downloaded from Atlan. The eBook focuses on survey design and administration, and its nine chapters are intended to help you design a survey that will give you high-quality data.

Evaluation Plan Rubric

EvaluATE recently released an Evaluation Plan Rubric. The rubric was used as part of a study EvaluATE conducted to assess the evaluation-related content of ATE proposals. It lists criteria to consider when assessing evaluation proposals, which makes it a useful resource for those of you involved in selecting an evaluator. It is also a helpful reference document when writing evaluation proposals.

Defining and Measuring Diversity, Equity, and Inclusion

EvaluATE has also published findings from its survey of 210 ATE evaluators. One subsection of the survey examined how evaluators define and measure diversity, equity, and inclusion in their projects. The survey found that more participants indicated that they measured diversity than equity or inclusion. Surveys were the most frequently reported data collection tool across all three constructs, and diversity questions most often addressed gender, then ethnicity, then race.

New and Noteworthy — Events

Most Significant Change

Organized by: Clear Horizon Academy

Date: September 10, 2021

Instructor: Jess Dart

Program Monitoring: The Key to Successful Implementation

Organized by: EnCompass

Date: September 21 & 23, 2021

Instructor: Kerry Bruce

Data Storytelling

Organized by: EnCompass

Dates: September 27 & 30, 2021

Instructor: Andy Krackov

Written by cplysy · Categorized: evalacademy

Sep 20 2021

But really, how do I use the RE-AIM Framework?

 

Early in my career as a consulting evaluator, I landed a major contract. The contract was to evaluate a nationally-funded province-wide quality improvement program in health care. The funder specified that I evaluate using the RE-AIM framework. 

Enter: Googling about RE-AIM. 

I know we’ve all had to-do items either personally or professionally where we just want someone to “tell me what to do.” Sure, I can read all the peer-reviewed articles and evaluation textbooks, but this framework has been used in projects all over the world – hasn’t someone put together the “RE-AIM for Dummies” book? Surely someone somewhere can point me in the direction of the first steps and key lessons learned. If it existed at the time, I didn’t find it. 

Years later, I was working on an academic research team using Framework Analysis to analyze a huge set of qualitative data. Same scenario: someone please just tell me how to start and what to do! The difference this time was that someone had: I found it, it did exist, and it was awesome. Parkinson et al. had published a detailed description of their use of Framework Analysis, complete with missteps, backtracking, and all (Parkinson, Eatough, Holmes, Stapley, & Midgley, 2015).

I love these experiential descriptions. I love reading about moving from knowledge gathering to action. I love sharing failures. What better way to learn? 

So, having used RE-AIM a handful of times on a few major initiatives, here is my account of how to use the RE-AIM framework in your evaluation planning, implementation, and reporting. 


What is RE-AIM?

In quick summary, for those less familiar, RE-AIM was originally developed to assess the public health impact of interventions, based on five domains: 

  • R – REACH 

  • E – EFFECTIVENESS 

  • A – ADOPTION 

  • I – IMPLEMENTATION 

  • M – MAINTENANCE 

RE-AIM provides great structure for an intervention that is well defined, but it may not pair well with something like Developmental Evaluation. It can certainly be used with a Utilization-Focused Evaluation approach.


Step 1: Do some basic research.

I recommend the following: 

  • Where it all started, the first paper to describe RE-AIM. 

  • The academic version of this article, describing what it means to use RE-AIM.  

  • A more recent update, describing the evolution and application of RE-AIM in 2019. 

  • And, lucky readers, now there is a full website dedicated to RE-AIM, complete with a comprehensive list of resources. 

I did say I was going to be more practical than simply assigning all the academic background reading I was so desperately trying to avoid years ago; but truthfully, you can’t get around needing some background knowledge. My goal is to summarize the content on the RE-AIM website and to share some learnings from my own experience.

Step 2: Build (and implement) your evaluation plan.

The good news is that the RE-AIM framework gives you your key evaluation questions. Of course, you can (and should) supplement and add detail. 

So, where RE-AIM says, “Have I reached my target population?” you may adapt to “How many clients participated?” or “How many patients had access to the program?”  

Where RE-AIM says, “Was my intervention effective?” you may add in detail “Did my intervention improve patient-reported outcome scores?” or “Did my intervention improve [insert primary outcome measure]?” 

I like to structure the data matrix section of my evaluation plan right around the RE-AIM domains.
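A data matrix organized this way can be sketched as a simple mapping from domain to evaluation question and data sources. The questions and sources below are hypothetical, loosely based on the teacher-training example used later in this article.

```python
# Hypothetical RE-AIM data matrix for a teacher-training program.
data_matrix = {
    "Reach": {
        "question": "How many Grade One teachers participated in the training?",
        "data_sources": ["registration records", "attendance counts"],
    },
    "Effectiveness": {
        "question": "Did students' reading scores improve?",
        "data_sources": ["reading assessments", "teacher surveys"],
    },
    "Adoption": {
        "question": "How many schools supported teachers to participate?",
        "data_sources": ["school sign-up records"],
    },
    "Implementation": {
        "question": "Was the training delivered as intended?",
        "data_sources": ["trainer interviews", "session observations"],
    },
    "Maintenance": {
        "question": "Are the gains sustained a year later?",
        "data_sources": ["annual reading-score check-ins"],
    },
}

for domain, row in data_matrix.items():
    print(f"{domain}: {row['question']}")
```

Each row of the matrix then expands into indicators, data collection tools, and timelines for that domain.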

One thing to note early on is that RE-AIM is a clever acronym, ordered so that it can be read and pronounced easily, but in my opinion it’s a little misleading, implying a sequence where there is none. In my experience, the order is something more like ARIEM, which is a little less catchy. (Interestingly, while researching this article I found a small footnote on the RE-AIM website that admits exactly this!)

RE-AIM can be effective in the actual planning of the intervention. As an evaluator, I always advocate for being part of the design team. With RE-AIM your goal is to get the team thinking about these five dimensions: 

  • How will they be recruiting (Reach/Adoption)?

  • Will it be representative (Reach/Adoption)?

  • What are their goals or outcome measures (Effectiveness)?

  • Do they have a clear plan of how they will achieve those goals (Implementation)? And so on. 

I think RE-AIM lends itself well to formative and summative evaluations. Several times I have drafted formative and summative evaluation plans for a single project by splitting up the RE-AIM acronym.

Adoption, Reach, and Implementation are those things in an intervention that can be course-corrected. Think of these as your process evaluation metrics. If you aren’t reaching your target population, don’t have organizational adoption, or aren’t implementing according to plan, you won’t be effective or maintain anything of worth. This is your formative evaluation. Then, Effectiveness and Maintenance can assess outcomes and sustainability as part of your summative evaluation. 

REACH:

Reach is likely just a count, but can be supplemented with qualitative data captures for a deeper understanding. 

e.g., We designed a training program aimed at Grade One teachers. Our city has 500 teachers and 286 participated in our training. Our reach was 286, or 57%. 

You could then go on to describe the demographics and how they differed (or not) from those who did not participate. Be sure to be clear about any inclusion/exclusion criteria! Reach is also where you can include questions that address access, equity, diversity, and inclusion: are participants representative of the population? Are we reaching those who would benefit most from the intervention? 

EFFECTIVENESS:

Think of this as a “traditional” evaluation – this is: Did the program work? and What difference did it make? This domain is where you will report outcome measures. Any number of methodologies would be appropriate here, depending on your specific intervention. Effectiveness may well be the bulkiest section of your evaluation plan. As in any evaluation, triangulation is a good idea to aim for. 

e.g., Our training program had a goal of training teachers to use a new method of teaching reading to Grade One students. Our effectiveness measures may include: # of trained teachers using the method (process measure or output) and % of students with improved reading skills (outcome measure), or the actual % improvement in reading score. 

Side note: RE-AIM is not mutually exclusive with other frameworks. I have often–in the evaluation of training programs–embedded the Kirkpatrick evaluation framework into the “E” and “I” of RE-AIM. The RE-AIM website actually recommends layering with PRISM. 

ADOPTION: 

It is easy to confuse reach and adoption. I struggled with this at first. For me, it helps to think of them as the same concept but at different setting levels: Reach is about individuals or participants, whereas Adoption is about groups or organizations. Adoption is asking: What organizational support do you have? So, similar to Reach, this is likely also a count. And, like Reach, you can supplement with additional data for a deeper understanding. 

e.g., How many schools supported teachers to participate in the training? How many school boards supported the schools to support the teachers? Our city has 300 schools; 120 supported teachers to participate: 40% adoption. Our analysis shows that there was an underrepresentation of rural schools and an overrepresentation of inner-city schools.

Like Reach, you could go on to describe characteristics of these organizations and how they supported the initiative. I have often used Adoption formatively to understand why these organizations endorsed the project or bought in. This exploration can help with spread and scale, or, if things aren’t going well, it is a great way to course correct. I have also included interviews or focus groups with the organizations that did not engage, to understand key barriers. 
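The Reach and Adoption counts above reduce to the same simple rate; a minimal sketch, using the numbers from the hypothetical teacher-training example:

```python
def rate(participating: int, eligible: int) -> float:
    """Participation as a percentage of the eligible population."""
    return round(100 * participating / eligible, 1)

# Reach: individuals — 286 of 500 Grade One teachers trained.
reach = rate(286, 500)
# Adoption: organizations — 120 of 300 schools supported participation.
adoption = rate(120, 300)

print(f"Reach: {reach}%")        # Reach: 57.2%
print(f"Adoption: {adoption}%")  # Adoption: 40.0%
```

The calculation is identical for both domains; only the denominator changes level, from individuals (Reach) to organizations (Adoption).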

IMPLEMENTATION:

Implementation is huge. There are whole fields about Implementation Science. In RE-AIM evaluation, you are primarily concerned with fidelity to the plan: Was the intervention implemented as intended? What adaptations were made? How consistent was implementation? Completion rates may also be an appropriate measure here. 

e.g., Interviews or surveys with the trainers identified barriers and enablers for successful training sessions. Interviews or surveys with the operational team identified barriers and enablers for recruitment, training the trainers, developing curriculum, building engagement and buy-in, etc.  

You could definitely layer on any number of implementation science frameworks here, but likely this is not the key area of interest for your stakeholders and doing so would make this beast unwieldy and hard to manage. Key tips here include considering how each level contributes to implementation: What did the adopted organizations do? What did your organization do? Don’t forget that your own design and operational team are key data sources too! 

Your “results” here are likely descriptions of barriers and enablers along with formative lessons learned and resultant adaptations. 

MAINTENANCE:

I’ll be honest: I have definitely turned in a final evaluation report before the program has reached a stage to be evaluated for maintenance. New initiatives tend to focus on implementation and first-round outcomes. I have, however, been fortunate enough that this hasn’t always been true. In one initiative, I used annual data reviews to look at maintenance of outcomes. In this particular initiative, we were happy to see maintenance, but we also learned that there was a significant plateau or ceiling effect of both outcomes and reach. This isn’t a huge surprise given what we know about the Diffusion of Innovation. As an evaluator, I could then facilitate discussions like: How will (or should) we attempt to reach those laggards? Will they take up 80% of the resources? 

In fact, this is an example of why applying a framework to your evaluation is helpful. If you build in a maintenance evaluation from the start, your team will know that this is planned and you will have the capacity to do the work when the time comes. 

So, if you are fortunate enough to be able to evaluate maintenance, it is likely a repetition of many of the measures that came before – you may take a look at ongoing reach and adoption: have you plateaued or continued to spread? You may look at effectiveness outcomes: have you sustained the gains you made?

e.g., Annual check-in of reading scores in Grade One (and now Two) children. Updated participation counts to assess spread and scale. 

There is a handy Checklist of key questions and considerations for each domain. It’s also worth noting that there is nothing requiring you to evaluate all five dimensions. I say that begrudgingly, though, because RE-AIM was developed so that evaluators wouldn’t overlook key dimensions that are essential to program success. But sometimes there are valid reasons that one of these dimensions may not be relevant for your intervention.

Step 3: Reporting

The RE-AIM website asks you to consider quantifying or scoring the five dimensions for a visual display.

I’ve never done this. The scores may be informative to you as the evaluator, but I find that most stakeholders are less interested in the details of the evaluation framework you’ve applied and more interested in the “So what? Now what?” I certainly have used the RE-AIM structure to guide my reporting, but I don’t think it’s required. The key here is to know your audience: how aware are they of RE-AIM? If you were involved in the planning and key evaluation questions were built into the RE-AIM framework, using the five dimensions in your reporting may be appropriate. But in my experience, you can also draft a really great evaluation report based on RE-AIM without being tied to the domains as your section titles. Eval Academy has some great articles on how to draft that killer evaluation report.


Things have changed in the 10 years since I first used RE-AIM. Many more examples have been published and a lot more content is available. My goal today was to synthesize the key points in one place for you and to share some lessons from my own experience. I have found RE-AIM to be both highly structured, providing directed guidance, and flexible enough to let you explore in greater depth the key areas of interest for your evaluation.

So, hopefully, you aren’t as in the dark as I felt when I was first tasked with using RE-AIM. It’s one of many tools for evaluators to consider and one that I’ve had lots of success with! If you want to talk more about whether or how to use RE-AIM in your next evaluation project, consider booking some time with one of our evaluation coaches.  

Speak with an Evaluation Coach


Sign up for our newsletter

We’ll let you know about our new content, and curate the best new evaluation resources from around the web!



Written by cplysy · Categorized: evalacademy


Footer

Follow our Work

The easiest way to stay connected to our work is to join our newsletter. You’ll get updates on projects, learn about new events, and hear stories from those evaluators whom the field continues to actively exclude and erase.

Get Updates

Want to take further action or join a pod? Click here to learn more.

Copyright © 2026 · The May 13 Group · Log in
