From 2016-2018 I was on the board of trustees of the Wikimedia Foundation. This organization oversees the thirteen Wikimedia properties including Wikipedia, Wiktionary, Wikibooks, Wikimedia Commons, and so on. Its goal is to provide the infrastructure that makes free knowledge available to everyone.

There are more than 200,000 Wikimedia editors and contributors. More than a billion unique devices access Wikimedia projects every month. Wikipedia alone averages approximately eighteen billion page views per month.

The English version of Wikipedia contains approximately six million articles, and people add approximately 600 articles per day. In 2020 supporters donated more than $120,000,000 to the organization.

With firsthand certainty, I can tell you that leading Wikimedia is one of the hardest jobs in technology. This is because you have to make so many people happy: Wikipedia editors and contributors, end users, donors, partners, and employees. And your tools do not include offering employees stock options, selling advertising, or collecting affiliate fees.

This episode’s remarkable guest is Katherine Maher. Until April 2021 she was chief executive officer and executive director of the Wikimedia Foundation—and she did a remarkable job in this remarkable position.

She has a bachelor’s degree from New York University as well as extensive education in the Middle East.

She worked for UNICEF, the National Democratic Institute, the World Bank, and Access Now. She was chief communications officer of Wikimedia from 2014 until 2016, when she became the interim executive director.

Something that truly irritates me is that teachers tell students not to use Wikipedia as an information source, so this interview starts there. FYI, IMHO, Wikipedia is the best source of information in the world.

Listen to Katherine Maher on Remarkable People:

If you enjoyed this episode of the Remarkable People podcast, please head over to Apple Podcasts, leave a rating, write a review, and subscribe. Thank you!

Join me for the Behind the Podcast show sponsored by my friends at Restream at 10 am PT. Make sure to hit “set reminder.” 🔔


Text me at 1-831-609-0628 or click here to join my extended “ohana” (Hawaiian for family). The goal is to foster interaction about the things that are important to me and are hopefully important to you too! I’ll be sending you texts for new podcasts, live streams, and other exclusive ohana content.

Please do me a favor and share this episode by texting or emailing it to a few people. I’m trying to grow my podcast, and this will help more people find it.


I'm Guy Kawasaki, and this is Remarkable People. From 2015 to 2016, I was on the board of trustees of the Wikimedia Foundation. This organization oversees the thirteen Wikimedia properties including Wikipedia, Wiktionary, Wikibooks, Wikimedia Commons, and so on. Its goal is to provide the infrastructure that makes free knowledge available to everyone.
There are more than 200,000 Wikimedia editors and contributors. More than a billion unique devices access Wikimedia projects every month. Wikipedia alone averages approximately eighteen billion page views per month.
The English version of Wikipedia contains approximately six million articles, and people add approximately 600 articles per day. In 2020, supporters donated more than $120 million to the organization.
With firsthand certainty, I can tell you that leading Wikimedia is one of the hardest jobs in technology. This is because you have to make so many people happy:
Wikipedia editors and contributors, end users and consumers, donors, partners, and employees. And your tools do not include offering employees stock options, selling advertising, or collecting affiliate fees.
This episode's remarkable guest is Katherine Maher. Until April 2021, she was chief executive officer and executive director of the Wikimedia Foundation. She did a remarkable job in this remarkable position.
She has a bachelor's degree from New York University as well as extensive education in the Middle East. She worked for UNICEF, the National Democratic Institute, the World Bank, and Access Now. She was the chief communications officer of Wikimedia from 2014 until 2016, when she became the interim executive director.
Something that truly irritates me is that teachers tell students not to use Wikipedia as an information source. So this interview starts with that issue. FYI, IMHO, Wikipedia is the best source of information in the world.
I'm Guy Kawasaki, and this is Remarkable People. This interview took place during her last week at Wikimedia.
Guy Kawasaki: I'm the second to the last interview. Is that true? Tomorrow's the Daily Show and that's it?
Katherine Maher: Mmmhmm…
Guy Kawasaki: So it's Guy Kawasaki and Trevor Noah in the same breath?
Katherine Maher: That's right. That's right. Saving the best…
Guy Kawasaki: I have arrived!
Katherine Maher: Yeah!
Guy Kawasaki: Was Oprah busy?
Katherine Maher: I think, yes. I think she did her one interview for the year. Yeah.
Guy Kawasaki: We don't want to talk about how the Royal Family has treated Wikipedia. Let's just say you're at a family dinner and your cousin is a teacher, and she says to you, "I never let my students use Wikipedia because anyone can say anything. You can't trust Wikipedia as a source."
What do you say to her?
Katherine Maher: I'd say, "Oh, that makes complete sense that you'd have questions, and of course your students are probably using Wikipedia anyway. And so I'd love to talk to you about how you could teach them to use Wikipedia in a way that helps them learn."
Guy Kawasaki: And then what?
Katherine Maher: And then what...I'd say, "Look, all the kids are using Wikipedia. We all know it. I was never allowed to use and cite an encyclopedia when I was writing essays, even when they were, like, hardbound versions in my school library. It's a great place to get started learning. It offers all those wonderful citations at the bottom for kids to do further research."
“Once they've gotten some basic context, it's also an opportunity to teach young people about questions of digital literacy, media literacy, how to do research on the Internet, what information to trust, how to be a critical reader. And so instead of being afraid of Wikipedia, maybe the best approach is to embrace it and use it as a teaching tool for the students who are going to use it anyway.”
Guy Kawasaki: And then your cousin says, "but Katherine, anybody can change anything. How can they believe anything?" And you say...
Katherine Maher: I say, "Wikipedia is mostly accurate. With Wikipedia, sort of, statistically speaking, the majority of information in there is high-quality information. And so it is the case that sometimes we're going to get it wrong. Most places get something wrong from time to time. It's a good opportunity to think about how you evaluate the quality of information and teach folks how to do some fact-checking and how to engage in critical thinking."
"But overall, if you're looking at the population of a city or a major sort of historical fact, Wikipedia is going to be correct on these issues. And really, if you're going to be making a life decision based on Wikipedia, then we'd ask you to consult your doctor or your lawyer, or perhaps look for a second opinion."
Guy Kawasaki: And if she says, "well, explain to me how it truly, really works."
Katherine Maher: Hmm. You can't just say anything on Wikipedia. There are a couple core policies that require you to cite your information back to a reliable source. So you have to have a citation. You may have seen those “citation needed” tags. That source has to be what Wikipedians think of as reliable, which means that it has to be respected in its field.
If it's a media publication, like a national newspaper or even a regional or local newspaper, or a medical journal or some sort of other academic journal, all of the information has to come from that.
And if it's disputed, then Wikipedia will pause. They will ask for additional citations or additional sources before something can stay on Wikipedia. So generally speaking, the more controversial or the more highly viewed an article is, the more accurate it actually is.
In order to contribute to Wikipedia, people need to learn these policies. So yes, it's the encyclopedia that anyone can edit, but in reality, most of the people who contribute to it are pretty experienced in doing this kind of work and work really hard to make sure that the quality of contributions stay high and accurate so that everyone can keep trusting Wikipedia.
Guy Kawasaki: What would happen if "anybody" went to the Planned Parenthood entry at Wikipedia and said, "Planned Parenthood sells baby parts." How long would that entry last?
Katherine Maher: That would not last long. In fact, you probably couldn't make that edit because Planned Parenthood, along with many hundreds of other articles--my guess, if I were to look it up right now--is probably what is called a ‘protected’ or ‘semi-protected’ article, which would mean that you have to be an experienced Wikipedia editor in order to edit that article.
So the way we think of it is there's a little lock icon on it, and it means that you have to have advanced permission rights in order to make changes. And that's how we help safeguard articles that are either long-term sites of controversy or debate, or breaking news events--information that is sensitive around everything from pandemics to elections.
Guy Kawasaki: So it is literally not true that anybody can edit anything.
Katherine Maher: Anybody can edit anything if they learn. So I like to think of this as, in the pandemic, you may remember lots of people wanting to foster dogs. In order to foster a dog from a local shelter or an adoption agency, you had to apply to foster a dog, and you couldn't just walk right in off the street and take a dog home with you.
There's a level of screening that goes into even this really simple act that hundreds of millions of people do all the time around the world. With Wikipedia, everyone can edit, anyone can edit, but you do kind of have to go through the process of learning first. And in order to make a contribution to an important article, your peer Wikipedia editors have to trust that you're going to make a contribution that is, what we call, "good faith." That is to say that it's going to make the encyclopedia better and not worse.
Guy Kawasaki: This is the last question I have about editing. I faced this myself. So in my Wikipedia entry, there are places where I would make corrections or additions, but I can't even edit my own entry because I can't cite a source--because no source may know this fact. So what do I do?
Katherine Maher: So generally speaking, one of the first things we'll acknowledge is that the more well-known you are, the less complete your biography will feel, because you may be notable for your professional accomplishments, but perhaps Wikipedia doesn't reflect the fact that you're actually an excellent cook and a wonderful friend and a caring father. And that's probably because--maybe you're fortunate enough that someone's written that about you in the past in a magazine article or the like--but sometimes, the totality of who we are is not fully reflected.
And so for me, for example, Wikipedia thinks that I speak more languages than I do because I studied them at some point, but it's never given me a proficiency test. So it overrepresents some of the details of my life, and perhaps underrepresents other things that are true about me that I care about, simply because some information has been published and some hasn't.
For people who have articles on Wikipedia, we encourage you not to write and contribute to your article yourself because Wikipedia's other really important policy, in addition to verifiability, is neutrality. And it's very difficult to be neutral about yourself.
And so when you edit Wikipedia, editors usually can tell that it's you trying to write about yourself and they generally politely ask you not to, but if you keep doing it, they might block you. So that's not a great experience. We tell people not to do it.
Guy Kawasaki: I think that one of your great accomplishments is that you have increased the number of contributors. So how have you done that?
Katherine Maher: Yeah…hard work. So overall, when I joined Wikipedia, the number of editors per month had been in a slow decline over the course of the last few years prior to me starting at the Foundation. And it's always hard at the scale of something like Wikipedia to isolate exactly what the issues are.
You sometimes don't know what you've done right, and you sometimes don't know what you've done wrong. And sometimes you can do lots of things, and perhaps you've prevented something from being worse, and you'll never know.
In our case, there were a few really baseline, sort of table-stakes things that we needed to do. We needed to make the editing experience more pleasant and seamless, and up to the sort of “best in class” experiences people have when they're in other document-production environments, whether it's Google Docs or something else.
We did that, and now, if you're contributing on a desktop, it's a great, super easy, very reliable experience--very few bugs, really consistent. You can save your work very easily, all these really basic things. The other thing that, I think, is really important is we launched a mobile contribution experience that was, like, a native experience to mobile--high functionality, high feature set, which when I joined the foundation, we didn't have the ability to really edit on mobile at all.
Maybe you could log in through the desktop experience, but it was super kludgy and difficult to be able to do any of the advanced functions. And so we've rolled out this mobile editing experience that is just a dream!
You're going along, remember when we used to ride around on, like, subways and public transit? Well, I used to commute to the office and I would read a Wikipedia article, and I'd find something that maybe wasn't right. And, all of a sudden, I can just fix it right there on my phone.
So that opened up a whole new set of potential contributors to Wikipedia who are mobile-first. And so that's the younger generation, people who have more free time that they spend on their mobile phones. Those are people in emerging economies who are mobile-first in terms of their contributions. And we saw a tremendous increase in people's usership of these features such as mobile editing.
The last thing that I think is really important is we have made a number of changes that have been designed to address the culture of contribution to Wikipedia. And what I mean by this is we want people to be friendly and welcoming and encouraging of new editors.
And so we rolled out a code of conduct--a universal code of conduct for all of our sites that would remind people that it's all about raising the floor of what the feeling should be when you're on Wikipedia, so that newcomers feel supported and understand how to best contribute and why and how their contributions are going to be valued and recognized.
Guy Kawasaki: What percent, do you think, is “mobile” versus “desktop” today?
Katherine Maher: Oh, I'd have to go back and look at mobile editing. Mobile readership is more than fifty percent of our readership, and mobile editing was gaining rapidly, but they're almost just different use cases. People who write long-form articles are on desktop.
People who do this sort of “copy-editing,” which is really important--that happens more on mobile. So they're complementary as opposed to sort of binary, one or the other.
Guy Kawasaki: And you've also done a great deal of work about gender diversity. So how did that work out?
Katherine Maher: Yeah, I'm really proud to say that in the last year, we received data, even just in this last quarter, that indicated that we'd had a thirty percent, excuse me, fifty percent increase in the percentage of people who identify as women contributors for Wikipedia. So the reason it could be fifty percent was that it was so abysmally low to begin with. It was at eleven percent and now it's at fifteen percent.
Focusing on the contributions of women, and not just women, but really any group that has been marginalized or excluded from Wikipedia, has been a big priority for me on a personal level and then for the organization as a whole, because we have a mission to support the world with all the world's knowledge. And all the world's knowledge means that it can't just be knowledge that men care about, or knowledge that folks in the United States care about, or knowledge that English speakers care about, because the world is very large and vast and rich, and so it's important that it represents and reflects the totality of the human experience.
So women are a big part of that, right? Like fifty percent of it, a little more than that even.
And so we've been very focused on increasing women's contributions not because they only write about women or only write about things that women care about, but the expectation is, over time, that will ensure that we have better representation of all the things that all people care about. We have done a number of things to address this. I mentioned the universal code of conduct, which is really around ending harassment on Wikipedia.
Harassment is one of the biggest deterrents for women's participation online, on any platform that you go to. On Wikipedia, harassment takes a very different form than on some social platforms.
It tends to be less overt and more about creating an environment in which people feel unwelcome. And so we wanted to be very clear about what does welcoming behavior look like on Wikipedia, and what does harassment look like, and how do we ensure that it's not just about people protecting people on our platforms but also ensuring that bad behavior doesn't follow them off platform to other places where they may make their home online.
For example, social media sites and the like. This is really about creating an environment in which everyone feels welcome and everyone feels as though they can participate. So that was a huge piece of work, Guy.
You might remember that we started working on that together in 2016. And now, in 2021, that is a fully-enabled policy for all of the Wikimedia sites. And so I’m really pleased to see that come through.
There are two other areas where I think we've made significant progress in support for women. One is that the Wikimedia Foundation is a grant-making organization. So we fund efforts that enable and support free knowledge around the world.
So we have gone ahead and invested very heavily in efforts by groups of women, non-binary folks, anybody who's really focused on gender discrepancies or other racial and ethnic discrepancies around representation, to enable them to have the support and resources they need to create partnerships around free knowledge, to build community around free knowledge, to do public outreach around contributions to Wikipedia.
And we've seen really good success here. We've seen that women organizers of events--women are overrepresented in those areas of organizing--tend to engage and stick around longer and become community leaders the more support that they have over time. And so, that's a model that we're working to develop out and ensure has the resources it needs to be successful.
And then we're pairing all of this work with efforts within our actual editing interface to surface some of the things that make Wikipedia feel like a more social and community-oriented site, because our research indicates that when women editors in particular know how their contributions are valued, when they have peer mentors that they can speak to in order to learn the process of contribution, they tend to stick around longer and become more productive and more satisfied editors over time.
Guy Kawasaki: What happens to someone who violates the Universal Code of Conduct?
Katherine Maher: So, great question. Right now there are a number of different ways we're experimenting with how to best enforce this code of conduct. Because Wikipedia is so large, we have realized that what works for us is probably very different than many other social sites.
So in most places on the Internet, you have hired trust and safety professionals who review indications of harassment or unfriendly or unwelcoming behavior. We're trying to work with our global community to build their capacity to identify problematic behavior and then apply solutions so that it's something that scales to the size of the community as opposed to trying to make our relatively small nonprofit grow to scale to the size of the world.
Wikipedia has always had avenues of enforcement. You can block a user, you can ban a user over time. These are pretty intense sort of interventions, though, and they don't necessarily help people learn how to work more successfully together. And so what we're focused on is building capacity within the editing community.
So we've got editors who are learning about how to identify what unfriendly behavior looks like. We are developing skills within our editing community around what would be called “allyships.” So how to step in and intervene if they see a pattern of unpleasant or unwelcoming behavior. And then of course having other ways to be able to sanction a user and having an appeals process so that if a user is sanctioned, there's always a way to be able to understand the thinking behind it.
And of course, for truly problematic users who continue to violate our code of conduct, the global permanent ban remains an option.
Guy Kawasaki: And I didn't even know such a thing existed until a few days ago. When does the Supreme Court kick in?
Katherine Maher: Yes. So you are referring to the Arbitration Committee of Wikipedia, and there's one in a number of languages.
Guy Kawasaki: Yes.
Katherine Maher: Those kick in when somebody makes a request that the Arbitration Committee hears their case. So it's not a default for every sort of incident that occurs, but a Wikipedia editor who disagrees with another Wikipedia editor over a matter of policy, not content, can go to this Arbitration Committee, which is a volunteer committee of experienced Wikipedia editors, and ask that their case be reviewed.
The challenge with this is that it's a very public process, and so it's not appropriate always for cases of harassment because sometimes that requires hearing evidence in public in a way that can really be a deterrent from someone bringing their case to that court, as it were.
And so we've been working with that committee to develop private processes that would enable people to be able to have their cases heard in a way that both is true to Wikipedia's commitment to sort of transparency and accountability, but also enables people who are victims of harassment to not feel like they have to unearth all of that in public in order to be able to have their issues addressed.
Guy Kawasaki: Perhaps you can help the U.S. Supreme Court in that regard, but we won't go down that path. So 2016 was when I was there. In 2021, you've gone from eleven percent to fifteen percent. Why is it still eighty-five percent? Why isn't it fifty-fifty, or fifty-one forty-nine?
Katherine Maher: So Megan Smith--who is the former U.S. CTO, was at Google for a long time, and is very invested in women in technology and thinking about the ways in which to support women in the full sort of understanding of STEM--has a really nice list of issues that deter women from participating in tech. It tends to look at issues around: do people feel welcomed? Do they understand how their contributions are going to be valued? Do they feel as though there is value in contributing? So is there a net personal benefit, whether intrinsic or extrinsic, for that contribution?
Those are some of the ways in which women enter into technology and stay in technology. In the case of Wikipedia, there are a number of other factors that happen before you even get to our sites. So when we look around the world globally, women have less leisure time, which means that they have less free time to participate in hobbies like sitting, perhaps alone, with your computer contributing to Wikipedia.
They often have not just paid jobs but, sort of, jobs around the home. Most housework and child-rearing falls on women. The other piece of this includes the fact that women are often deterred from participating in knowledge in a way that is about the construction of knowledge.
So while women are educated at higher rates than men, at least in the United States, what we often see around knowledge-seeking behaviors on Wikipedia is that women come to Wikipedia less frequently than men do. So before we even get to a question of why women aren't clicking that edit button, we have to understand why women are not seeking knowledge.
And is that because we don't have information that's relevant to them, or is it because there's an actual pattern that deters women from information-seeking behaviors that is rooted in other sort of cultural questions about who owns knowledge and who gets to be an authority in our society?
So what are the issues that are off Wikipedia, which are around how women are acculturated to understand their value, their worth, their voice in society as a whole? And for all of those great stats on women graduating with more bachelor's degrees than men, we don't see that carry consistently through PhD and advanced academic programs into tenure, particularly in the areas of technology and the sciences.
So what is the sort of broader social context to be thinking about? What is the time that women have to participate in hobbies, including contributing to Wikipedia? And how does something like Wikipedia, which is quite abstract--you don't get to see the immediate effects of it in your community, in your kids' schooling and the like--how do we make that more tangible for people so that they feel as though there's real value there?
And then how do we make our platforms and sites feel as though they are welcoming and easy to contribute to in a way that once people do start, they're like, “oh, this is great,” and I really want to stay a part of it.
Guy Kawasaki: If you control for all the busy-ness and other things that might remove women from the use of information, much less contribution, I think you said an interesting thing: that women even use Wikipedia less than men. Do you have any theories, taking off busy-ness and all that? Is there anything else as to why women would not seek information as much as men?
Katherine Maher: So there are a lot, and there are a number of things that we're trying to understand about this, because we commissioned this research globally, and we found that even in societies that are considered incredibly egalitarian, like the Scandinavian nations, for example, women still visit Wikipedia less frequently throughout the course of the week than men visit Wikipedia.
So we can control for leisure time, for example, as you rightly said, but there's a question of acculturation around knowledge-seeking and the ways in which, perhaps, maybe we don't have the right type of knowledge that women are looking for--that's a potential product-market fit issue. We see that in emerging markets, where we know from talking to African Wikipedians that we have a real challenge here: many Africans are looking for information about their local pop stars and, “you don't have it, so, you guys just aren't that relevant.”
So maybe there's a product market fit there that's true for women as well, but what we actually suspect is that it's something a little bit more pernicious, which is that, in general, following formal education, women tend to drop out of the information-production space outside of formal workplace environments.
And so within the context of when a woman is employed as a professional, producing knowledge, producing and contributing, it can be very successful. But with the idea of engaging in long-form, individual knowledge-seeking and production, we just see a tremendous divergence, where men are bloggers and, sort of, Substack writers and the like, and you don't see that same representation as an overall gender balance in those spaces.
There's also this really interesting thing, because when I say this, people go, "But what about, like, Pinterest and Facebook? Women are on these sites, right?" And the answer is: absolutely.
But very often, that is a form of communal production and social labor, whereby women take on the majority of the responsibilities in their household or in their community of maintaining social ties, and those platforms are very productive around performing communal or domestic relationship labor—“I'm going to make sure that I'm friends with the kids' friend's parents,” or “I'm going to use my Pinterest board to be able to organize what do I need for this upcoming birthday party for what's going on in my community or my family.”
So these are forms of unrecognized labor, really important labor, that women perform online that are distinct from the ways in which men perform productive labor online or social labor online. So there are just really interesting, sort of, theories that underlie all of this and, unfortunately, they're all horrifically gendered because that's kind of the world that we live in.
Guy Kawasaki: You're trying to get corporations to pay for some of the information that they get from Wikipedia. So can you explain that?
Katherine Maher: Yes. We're asking people to pay for a thing we give them for free!
Guy Kawasaki: What a concept!
Katherine Maher: We are our own greatest competitor. So one of the things that I think is fascinating about Wikipedia is that most of us know it as, first and foremost, the encyclopedia itself. You click on the article page, you read the article or you skim for the information you're looking for. And in that sense, we're really an end-user consumer product.
But over the course of the last ten years, we've seen new emergent use cases for Wikipedia that are more backend--what I think of as a “knowledge as a service” function. They look at Wikipedia as a database for computational science, research, natural language processing, training databases, machine translation, taxonomy building. So there's that piece of it.
And then there's the reuse of Wikipedia, which is: if I have my voice assistant, or if I'm searching for information through a search engine, where are those sort of content-forward results coming from? So over the course of the last decade, search engines have gone from simple snippets to much more complex results.
Voice assistants now have a whole category of utilization that didn't exist ten years ago. Wikipedia is massively embedded in these experiences that are provided by these other large platforms that you're more familiar with in sort of the consumer space. So the Amazons, the Apples, the Googles of the world--we recognize this as a good because it means that free knowledge is getting out to more people.
So if you're sitting there, doing your homework, and you're like, "Hey Alexa, I've got this question," and you get Wikipedia as knowledge--fantastic. We're thrilled.
The same thing is true of search results, but there are some challenges around this as well, which is that it means that people aren't coming to our sites, which means that they're not necessarily editing or donating. We are still a nonprofit. It means that if information is cached on one of these companies' servers and redistributed, and that information is out of date, you're getting bad information or inaccurate information, and we don't want that either.
So we have been working with these larger tech companies to say, “what would it look like for you to have great, high uptime access to our APIs at an enterprise level, where you can have the most up-to-date information from Wikipedia?” If you have data on your side about where something might be inaccurate or wrong, you have a dedicated way to get that back to us.
And what would it mean if you paid for it? Because one of the things that we think is true is that these companies are getting real value out of the product that we offer. If we're building an enterprise-level API, we're going to have to maintain that service's uptime and the like, and so we think it's just fair to say, this is a transaction that benefits everyone.
It gets more free knowledge out there. You know the quality and the assurance on our side of things, and we're going to use that money to be able to support the expansion of free knowledge and the sustaining of the high quality of free knowledge. So we're going to keep doing outreach and work with women.
We're going to keep growing into emerging markets and languages that are going to be valuable to you. At some point in time, we're going to sustain the commons and invest in the growth of that sort of free knowledge base, which is ultimately going to have a net benefit to you because that's the information that people are looking for.
It has value to your users. And so this is a win-win all around.
Guy Kawasaki: And have they said, “okay”?
Katherine Maher: We are still in conversations, but yes, the indications are positive that this is a service that people really would like to use.
Guy Kawasaki: Because if they don't say, “okay,” they are morally reprehensible in my opinion, but we don't need to go down that path.
I am more interested, actually, in the Wikipedia community's reaction to that because I would guess that there are some people saying that we are selling out to big business.
Katherine Maher: Of course. Definitely. There've been questions on both sides--or not both sides. With Wikipedia, there's no both sides, there's a thousand sides.
Guy Kawasaki: Yeah. Yeah.
Katherine Maher: So we think it's all about how you do it, right? So for us, it's not that we will never sign, like, a non-compete. This is a service that's available to everyone. You can still get the product for free if you want. There are nonprofit, educational, free-use exemptions.
So if you're a university, if you're researchers, if you're the Internet Archive, we want you to have that. That's just part of our mission. This is really around an elective opportunity for folks to buy access to a service with a dedicated, sort of, relationship person that you can contact at any point in time in the day. We'll keep you up-to-date on the architectural changes that we're making.
When we're making adjustments to the service as a whole, so that folks can plan for that with their engineering teams, all of that good stuff, we're going to do that anyway. This is just a more formalized way of doing it. And I think that abiding by some of those, sort of, core commitments around transparency, around there is still a free alternative, around this idea that this is a service that offers a net benefit to a certain type of reuser, is something that helps us navigate what is, honestly, a challenging landscape, you know?
We want to be able to be as sustainable a product as possible over the long run. We want to be able to build out Wikipedia in a way that it is more resilient, more enterprise-friendly, so that over the course of the next decade, as interfaces, experiences, and end-user needs change, Wikipedia is changing too, from a sort of static, HTML-based website product into what I really think of as a corpus of knowledge; an enormous database of knowledge and insight that, frankly, we don't make the most of right now.
And so how can we invest in our own underlying platform and architecture in order for everyone to make the most of Wikipedia in the long run, in a way that is sustainable so that we can continue to thrive and people can continue to have access? When you start to talk about it like this with the community, and when we start to say, "Hey. The revenue that's going to come from this is not meant to go into the coffers of the Wikimedia Foundation, certainly not into my pocket, it is meant to go back to the community.”
It will increase the amount of resources that are available to grow our presence in South Asia. It will increase the amount of resources that are available to invest in Wikidata and Wikimedia Commons and other Wikimedia products that are of tremendous value to the world.
People are able to say, “okay, I can see the benefit in that. Let's just make sure we keep it transparent. Let's make sure that we keep it non-exclusive. Let's make sure that we're continuing to have a conversation about how we govern this incredibly important resource for the world, not just as an economic asset, but as something that has a much broader and more resonant public good.”
Guy Kawasaki: I would make the case that you are one of the most experienced people in the world at managing communities and fake news, real news, semi-fake news, those kinds of issues. So if Mark Zuckerberg or Jack Dorsey called you up and said, “Katherine, can I have some advice here? What would you do to fight fake news at Twitter or at Facebook?” What would you say?
Katherine Maher: I would say this is a really hard problem that is, unfortunately, rooted in structural incentives, and those structural incentives are encoded into the platforms and the way they've been built over many years. But hard problems are also exciting challenges. I think that there are a couple of interesting things that are happening on both of those platforms currently--around the ways that Facebook has experimented with the Oversight Board and some interesting models of governance.
I think that there's a lot more to do, and by no means is it a sure success at this point in time, but I'm interested in this idea that, I think, is very resonant with Wikipedia which is, “how do you bring in differing perspectives and opinions and create more accountability in larger groups of people to decision-making?”
I think that that is a good direction for them to explore from a policy perspective, from a content moderation perspective, and there are probably ways in which that can be applied even to product development questions themselves--when Facebook thinks about its “groups” products or its “feed” products. One of the things that we often think about is, we don't try to build for end-user outcomes. We try to build for community-based outcomes.
And so while I could build a product that keeps you on Wikipedia indefinitely, just clicking away from link, to link, to link, that might not be the best, sort of, collective outcome for the product.
With editing, we could do certain things similar and keep people editing individually in their own silos if we built the product in a way that supported that. But what we're constantly trying to do is actually increase the amount of interaction between ideas and between users because that sort of interaction creates a productive friction that allows us to say, “these are our community norms. These are our community standards. This is how we get to high-quality content. This is how we get to outcomes that benefit a broader group of people rather than an individual group of people.”
Those would be interesting questions, I think, for both Facebook and Twitter to really think about as we're building product and as we're building policy. Are we building for end user outcomes, which is how our entire revenue model is structured, or are we building for community outcomes, which is looking at overall communal satisfaction, overall, sort of, net-positive social outcome within a language environment, a market environment, a regional environment? However you care to sort of segment that out.
Guy Kawasaki: If I understood what you just said…
Katherine Maher: I'm sorry!
Guy Kawasaki: Let me paraphrase. So basically, as I understood it, that Facebook and Twitter, their entire business model is to keep you on-site, clicking, so they can have more time, more mind space, and obviously, more opportunities to advertise to you.
Katherine Maher: More data. Yeah.
Guy Kawasaki: Now you're saying that instead of addicting these end-users, you should do what's right for the community, which may in fact mean that some things that would be extremely sticky would not make the cut in terms of being on the site.
Katherine Maher: Yeah. Yes.
Guy Kawasaki: Is that what you said?
Katherine Maher: I mean, that's definitely a part of the argument, and I think you've seen that advanced in other places and by other people. I think it's also around one of the questions that keeps coming up around content moderation and therefore “fake news” is: what are the community norms around what accuracy is? Around what misinformation actually is?
And those aren't just about, “Are we keeping people on the site, clicking away, endlessly?” You've had these platforms actually come out and say, “We don't want that. That's not good in the long run for our business model. We don't want to be this 'pit' of information online that harms us with our advertisers, that harms us overall with our public valuation,” et cetera.
So I think that's certainly a piece of it, but the other piece of it is, how do you actually build a sense of community norms to building community outcomes? And so community norms around what kind of content do we want on this platform can be segmented down to what kind of content do we want on this platform for this group of language speakers? What kind of content do we want on this platform for this interest group?
And then, how do we build policies that support those norms based on those community expectations? Right now, one of the biggest challenges the internet has is we have, well, I mean, there's lots of challenges, but one of the biggest challenges around content moderation is, really, this idea that we actually don't have very good normative understandings of what I find acceptable and what you find acceptable.
And a lot of the ways in which we historically built those normative understandings around questions of speech, around questions of privacy and the like, were over many decades of jurisprudence, many hundreds of years of, sort of, civilizational evolution in towns and in villages, in nations and cultures.
We have telescoped that down into ten years, or fifteen years, or twenty years of social media. And, at the same time, we have telescoped the normative timeframe down. We have expanded the networks so massively, that now we're trying to create norms around what we accept and what we like and what we value at the scale of the globe.
And that is a really hard thing to do. So what I would be looking at is, how do we actually try to bring that back in, into smaller communities that can create the norms around content moderation, fake news, privacy, in those smaller communities and build product that helps create community outcomes as opposed to product that is focused on individual end-user outcomes.
That's how Wikipedia does it…and it works!
Guy Kawasaki: In that theory, is it feasible that there is a “community norm” for a community that doesn't believe in vaccinations, that believes the election was stolen, that believes that there's a lot of election fraud so we shouldn't be giving people water in line?
I mean, are you saying that that's an acceptable norm in some communities or that just is not, that's just beyond everything?
Katherine Maher: I think that that actually becomes an interesting question for a platform norm. So one of the conversations that's in the news right now, and on the lips of legislators, is this idea of platform liability.
CDA 230 is often what you hear, but it's a conversation here in the United States, and in Europe under a different name. There's this very interesting conversation happening right now around what the rights of platforms are to moderate their own content, what their rights are to set their own policies, and what their responsibilities are to keep up information that they may find distasteful.
So for example, when we talk about anti-vaccine information, mis- and disinformation, many of the platforms have taken a more aggressive approach to taking down false information about COVID-19 vaccines or treatments, and they are able to do so because the platforms themselves have a right to speech, because that is recognized within U.S. jurisprudence.
That right to speech can be completely congruent with what I was saying earlier. A platform can have a right to speech that says, “We're going to allow community norms around privacy issues or speech issues.” So for example, “we don't want nudity in that smaller subset of the community,” and the platform can still also have a right to say, “we don't want hate speech. We don't want Nazis. We don't want proto-Fascism. We don't want explicit forms of racism,” hopefully racism in general, but most platforms are really focusing on the explicit kind right now.
The platforms can have those norms that they create as their standards and say, “this is the floor and we don't want content to fall below it.” And then, anything above that is something that our communities within those platforms can help set as appropriate to the community itself.
So you've got communities of folks in the United States on the same platform as a community of folks in Bangladesh. Those two communities may decide that they want different types of norm expectations around information sharing, freedom of expression, privacy and the like, and that can be okay.
Guy Kawasaki: Based on what you've seen, do you think that machine learning and AI can set the floor or at least enforce the floor, or does it take rooms full of people looking at every post and every picture?
Katherine Maher: Neither solution is great right now for lots of reasons. So AI…it's been promising for so long and remains promising, but has yet to live up to the promise. And that's because some of the issues that we find most challenging are the issues that really involve human discretion and discussion.
And so what constitutes something that is outside the bounds of my standards may be something that is inside the bounds of another country's use of the exact same word, right? The English language enables the exact same word to have different meanings in different places, so it is highly contextual, and often that requires a human in the loop--closed-loop systems with a human in that loop.
So AI can be useful as a heuristic, but it is not sufficient as a solution. On the other hand, you have this idea of hundreds of thousands of people screening everything from nudity to offensive language, and that rigorous application of a few baseline policies, or even hundreds of pages of baseline policies, is never going to be consistent and applicable, and is going to require systems of appeal and transparency around how those policies are being applied.
So AI is great, okay? I’ll give you an example where AI is great. AI is great for identifying content related to violent extremism once it has been determined that an image or a video constitutes violent extremism.
You can have, essentially, the digital watermark equivalent, which enables systems to be able to flag and identify that video or that audio track or that image, and take it down. This is used very widely across platforms.
The same is true for child abuse images. So child porn, which is a thing that every single platform that I am aware of is very active about saying, “It doesn't belong. It's illegal. We're taking it down. We're investing in taking it down. This cannot stay on the platform.”
These sorts of digital watermarks work really well, and AI is the sort of thing that flags it and enables us to review it and take it down right away, and ensures that repetitive signatures of that content come down as well so it can't get uploaded. That's fine for that particular type of content. For issues of speech, and issues of political discourse, and issues of social discourse, AI is just not sufficient to the challenges that we have today to be able to address those conversations.
Guy Kawasaki: Let's suppose that the owner of the New York Times or the Washington Post says to you, "Katherine, I'm amazed at the ability of Wikipedia to get people to voluntarily pay tens of millions of dollars. So now I want you to come over and help me figure out how we change our business model to adapt to the times." Mr. Sulzberger or Mr. Bezos calls you up and says, "Katherine, I need your advice. Tell me about my business model." You say…
Katherine Maher: I would say both seem to have been doing very well recently, actually, with digital subscriptions. I think that it's a slightly different question than how we've addressed this at Wikipedia because we are meant to be a general knowledge resource. And so, one thing that is of critical importance to us is developing knowledge that's acceptable to as broad an audience as possible.
And because of our commitment to neutrality, regardless of my own political leanings, the idea is to have as broad a base of people who want to engage with our product as we possibly can, and that works really well for us and it works really well for knowledge. And I think it works really well for people who use and read Wikipedia because you do avoid that audience-targeting approach.
I think both the Times and the Post have taken an approach that is really about developing a demographic that can support their product. And that demographic is necessarily going to be more niche than a mass market audience because the overall media landscape is now so fragmented that you do need to be much more intentional about audience development. And so for the Times, I grew up in the Tri-State area, it's like that stereotypical Northeast liberal, now increasingly coastal, national approach that has a certain income point because that tends to be people who actually subscribe.
And so you're creating a product that is specific to an audience. And I think the Post is doing that as well, right? A slightly different audience that is more political in nature.
So I think the bigger question is, can you actually have a national media product that is appealing to a truly national demographic that is a truly representative national demographic, and I don't think we have that at the moment in the media landscape, and I think that that is one of the biggest challenges with regards to polarization and fragmentation today.
Guy Kawasaki: Okay. Let me ask that question in a little different way then.
Katherine Maher: Sorry. I'm sorry, Guy.
Guy Kawasaki: Let's say either of those guys call you up and say, "Katherine, from the outside looking in, what mistakes do you think I'm making?"
Katherine Maher: Hmm…I would say, "What is your readers' loyalty to your mission, and how do you make it clear that you're creating value for them every single day?" So what we know about Wikipedia is that people get value out of it. It's why they donate. It is useful to them every day. And it is useful on a range of different issues.
It is useful when you have a question about Beyoncé's discography. It is useful when you want to know what's going on with the volcanic eruption on the other side of the world. It is just a useful, in-your-pocket product, and it doesn't judge you if you have a dumb question, and it doesn't judge you if you spend hours getting lost on it, reading about esoteric stuff. It is just there in a very, sort of, neutral way.
And people love that. They develop a real relationship with that. And the same way that if you were a kid, you went to the library and you could get lost in the stacks, wandering around, that is the kind of relationship that people have with Wikipedia.
Then you layer onto that the mission and, sort of, the fact that we don't sell your data, and the fact that we don't sell you ads, and the fact that we are here for the public, and the fact that we're very transparent about where we get it wrong. And so the product has a humility and an accessibility to it that, I think, is very different from other large Internet products or platforms today.
And so you have a really deep relationship between donors and Wikipedia because they value it so much, because it's been there for them in a pinch, because it's never laughed at them for asking a dumb question. The thing that I would want to know if I were a major publisher is, how am I creating value for people every single day over time and building a relationship that is meant to last past the subscription cycle?
Guy Kawasaki: When you look at your incredible run at Wikipedia--and Wikipedia truly is a phenomenon--what's the big lesson?
Katherine Maher: Hmm. That's such a good question. I'm going to try this. It's not a product lesson for me. It's actually a lesson that speaks to an essential goodness in human character, which is: if you give people a reason to do something together and do it well, they will and can. And I think that, to me, spoke to this fundamental generosity that many of us have and a spirit of curiosity that all of us have.
So when I think about what makes us, about the beautiful parts of our humanity--and there are terrible parts of our humanity--the beautiful parts are that we are fundamentally curious beings, and Wikipedia taps into that curiosity in a way that creates a benefit for everyone, because I get curious about trains and I research them and I write an article. Or you get curious about some sort of farming and you research it and you write an article, and then we collectively win.
And so I think building these sort of systems that tap into things that are innate to our humanity and enable us to then have something as a collective good is possible. It's not easy, but it is possible. And Wikipedia is evidence of that.
Guy Kawasaki: What's the biggest threat to Wikipedia?
Katherine Maher: Complacency. The idea that it's always been there, and so therefore it always will be, is the biggest threat. We operate in the most highly capitalized, highly competitive sector that has perhaps ever existed. That is a threat whether we realize it or not. We don't have a direct competitor, but if we don't keep changing our product, our stack, our offering, our community, all of it, we will eventually be less relevant as the world keeps changing around us. And so I think complacency is the biggest threat.
I like to say that if we're not out there in the top ten, then we're just a bunch of encyclopedia enthusiasts. And the reality is that we have to be just as good and just as ruthlessly efficient, and just as perfectionist in the products that we build and ship, and just as thoughtful in the policy decisions and advocacy that we engage in as anybody else out there, but we have to do it on a fraction of the resources, and we have to do it faster and better, and we have to do it in public. And so it's really hard.
And so it's easy to be complacent because it's easier not to pull your head out of the sand and look at all these challenges, but we have to be moving fast because the world is moving fast around us, and we owe it to the world for our future and for what we offer.
Guy Kawasaki: My last question for you is: how does Katherine, who ran Wikipedia, how does she do her best and deepest thinking?
Katherine Maher: It's usually when I'm not actually working. It's when I'm able to go and take a few days off and go to a place where there's no internet and disconnect for long enough to be curious. And when I can create that “step away” from, sort of, the exhaustion or the day-to-day of the Zoom calls and have silence, I can be curious about the ideas that I've been exposed to and create the connections that create the next insight.
And so I am a really big advocate that everyone should take downtime because that's when we--that's when we do our best thinking.
Guy Kawasaki: Have you figured out what's in the future for you?
Katherine Maher: Not yet. I have been so deeply in the Wiki world with so much of my heart that I need to step away to think about what comes next, but there's so much out there. I'm excited to take my own advice and maybe go for a walk in the woods and figure out the thing that is most exciting to me right now.
Guy Kawasaki: Would you cross over into the for-profit world or are you just a not-for-profit kind of gal?
Katherine Maher: For the right mission, I think so. I think that there are challenges that can be solved in both spaces. And I would be really interested in working on some of them with more resources, you know? But I think it would all depend on what the challenge was.
Guy Kawasaki: What's a challenge you would face--what's a challenge you would enjoy facing in the for-profit world?
Katherine Maher: I'd love to work on anything related to climate and sustainability from a market-driven approach. I think that there's so much to do there. The tech is there, the sort of product-market fit might not be, and the policy space is not where it needs to be, but climate is one of the most compelling and challenging issues of our time. It's the existential issue, so that would be something that I think actually only the market can solve.
Guy Kawasaki: If Elon Musk subscribes to some of my podcasts and calls you…
Katherine Maher: I'll let you know. I'll say “thanks.”
Guy Kawasaki: And get me a deal on a Tesla.
Katherine Maher: I want one first. So lovely.
Guy Kawasaki: Thank you so much for doing this.
I hope you have a greater appreciation for the power, complexity, and quality of Wikipedia. You should not hesitate to use it as a source of information.
Kudos to Katherine, the employees of the Wikimedia Foundation, and the over 200,000 Wikimedia editors and contributors. The world would not be the same without the Wikimedia Foundation.
I'm Guy Kawasaki, and this is Remarkable People. Kudos and gratitude to Jeff Sieh and Peg Fitzpatrick for producing another remarkable episode of Remarkable People. This podcast would not be the same without the two of them. Until next time. Mahalo and aloha.