Welcome to Remarkable People. We’re on a mission to make you remarkable. Helping me in this episode is Meredith Whittaker, President of Signal and a transformative voice in tech ethics.
With a humanities background and art school education, Whittaker rose through Google’s ranks to found an AI research group. Her leadership in organizing the historic Google walkout challenged tech industry practices and highlighted the growing disconnect between Silicon Valley’s stated values and actions.
As Signal’s President, Whittaker is now building a compelling alternative to surveillance-based tech models. She explains how Signal’s nonprofit structure enables genuine privacy protection, contrasting with the data collection imperatives driving most tech companies. “We need to shield Signal from pressures of surveillance, from the pressures of data collection,” she emphasizes, demonstrating how organizational structure directly impacts mission.
On AI development, Whittaker offers a critical perspective, urging us to examine power concentration in tech giants rather than getting lost in hypothetical scenarios. She advocates for structural changes in how technology is developed and deployed, challenging superficial solutions like “pausing AI.”
For aspiring tech leaders, particularly women, Whittaker shares powerful insights: don’t wait for permission – enter rooms you’re “not supposed to be in,” take initiative, and create your own path. Her success comes from persistent curiosity and willingness to challenge norms.
Whittaker envisions a tech future where privacy isn’t sacrificed for profit and ethical considerations drive development. She calls for “rewilding the tech ecosystem” to create space for solutions that serve human needs while protecting fundamental rights.
Please enjoy this remarkable episode, Meredith Whittaker: Revolutionizing Tech Privacy and Power.
If you enjoyed this episode of the Remarkable People podcast, please leave a rating, write a review, and subscribe. Thank you!
Transcript of Guy Kawasaki’s Remarkable People podcast with Meredith Whittaker: Revolutionizing Tech Privacy and Power.
Guy Kawasaki:
I'm Guy Kawasaki, and this is the Remarkable People Podcast, and we're on a mission to make you remarkable. And today we have a really remarkable guest, total badass. And I say that as a compliment, total badass. So this is Meredith Whittaker. Meredith, welcome to the show.
Meredith Whittaker:
Thank you, Guy. It is a delight to be here with you.
Guy Kawasaki:
Even though you're on Paris time, right?
Meredith Whittaker:
I am on Paris time, but that just means I'm likely to say something more interesting, because who knows what hour it is.
Guy Kawasaki:
We offer you full editing privileges, though.
Meredith Whittaker:
Oh, bless you. Thank you.
Guy Kawasaki:
I just, when I read about your background and I heard about you, the most fascinating thing... Well, not the most fascinating, because there's many fascinating things, but tell me about the Google Walkout. I just love that story.
Meredith Whittaker:
Have you ever been so indignant that you had to just pick up and do something? Because that's kind of the origin story of that.
Guy Kawasaki:
Every day.
Meredith Whittaker:
Yeah, right? And you're a doer, so I think we relate there. I was at Google since 2006, and that was a really different time in tech. It was a really wild and fun place, and I learned most of what I know about tech at Google. I was taking classes, and I shared an office with Brian Kernighan, so I was asking questions of the author of The C Programming Language.
He was explaining, very kindly, to a novice. So I really came up inside of Google in terms of my tech education, and in some sense, I think I really believed it. There was, "Don't be evil," there was these kind of principles, and there was this idea that you could be virtuous and you could be successful. And I was part of that wave.
And as Google grew, or to use an ungenerous term, metastasized, it became just this giant with so many different divisions that culture didn't always follow. And I think I was very dedicated, almost naive in some sense, really trying to make change. I had founded an AI research group.
I was speaking around the world about some of these harms, and earnestly trying to change the conversation and the direction in ways I thought were beneficial, based on the force of ideas. And there was a time around 2017 when I realized a lot of this wasn't working. They seemed to clap at the end of my speeches, but then the decisions didn't change. And the walkout was born out of that.
It was Google moving forward to be an AI military contractor, and a lot of bad cultural trends where toxic behavior was rewarded, which you would see just eroding your teams, eroding the work that you were doing. Just really this kind of unforced error, I felt. And I wasn't the only one who felt that. So the walkout was a collective action where we all just said, "Enough."
Guy Kawasaki:
Was that around the time when our best friend got like ninety million dollars after a sexual harassment claim? How do you even explain that to yourself? You just fired a guy, and he got ninety million dollars.
Meredith Whittaker:
That was the spark that lit the fire on the moms' messaging group at Google. And people were just like, "Are you kidding me? I have been working nights and weekends. I haven't seen my baby." All of this. And then, this guy? Who everyone knew. He drove success sometimes, but when you work at a company, you know the stories, right? And it became the representative for a lot of smaller harms and problems that people felt.
But yeah, ninety million dollars. And we did the math, and we were like, "A person, a contractor working minimum wage would have to work something like 3,000 years, every single day, to make that." So there was also just a feeling of inequity that really cut against the grain of the Google that a lot of people felt they were promised.
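The "something like 3,000 years" figure holds up as back-of-the-envelope arithmetic. A quick sketch, assuming a $10-per-hour wage and eight-hour days worked every single day of the year (the exact wage she used isn't stated; the US federal minimum is $7.25, which would make the figure even larger):

```python
# Back-of-the-envelope check of the "3,000 years" claim.
# Assumptions: $10/hour wage, 8-hour days, worked every day of the year.
payout = 90_000_000                  # the reported exit package, in dollars
hourly_wage = 10                     # assumed; federal minimum is $7.25
hours_per_day = 8
earned_per_year = hourly_wage * hours_per_day * 365  # = 29,200

years = payout / earned_per_year
print(f"{years:,.0f} years")         # prints: 3,082 years
```

At the federal minimum of $7.25 the same arithmetic gives roughly 4,250 years, so "something like 3,000 years" is, if anything, conservative.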
Guy Kawasaki:
Now wasn't the walkout only about half an hour, though? Why such a short time?
Meredith Whittaker:
I don't actually remember the reasoning for some of that, but I think it was, we're efficient people. And it was, how do you stage an action in a bounded way? It was around the world. We started in Singapore, and I remember going to bed in New York. We had this Instagram account set up to share photos from the walkout, because every office was going to walk out at eleven-eleven in the morning, and we had rolling thunder.
So you start in Singapore, and it moves through the time zones, as the sun rises we're walking out, we're walking out, we're walking out. And I remember looking, we didn't know if it would work, right? "Maybe thirteen people will show up, but we're going to do this. There's momentum, there's energy. Let's go."
And I looked at the photo, going to bed in New York, and there were hundreds of people in Singapore. And I was like, "Okay, Singapore popped off. This is a thing." And I was trying to sleep and then got to the park in New York in the morning, and that was the day. And now it's five years ago.
Guy Kawasaki:
And did Google retaliate?
Meredith Whittaker:
Yeah, they did.
Guy Kawasaki:
How did they do it?
Meredith Whittaker:
They basically reshuffled my job responsibilities in a way that I couldn't accept if I wanted to pursue what I was doing. And it's classic; constructive termination is the labor law term for it.
Guy Kawasaki:
Constructive termination, that's an oxymoron.
Meredith Whittaker:
Yeah, it's lawyer language where you're like, "The literal term doesn't make sense, but I'm sure you know what that means." But I was given the choice to basically become an administrator who did budgeting for the open source office, and I was like, "I did found an AI research group, and I have all of this work, and I'm recognized publicly for this, so I'm obviously not going to take that option."
And at some point you get tired of fighting, right? It wasn't personal to me. There are a lot of people I really love and value at Google. I was just like, any human being, however much I love them, if you have that type of responsibility in the world, and that type of power, you need to live up to it. And if you have real friends, they're going to tell you that.
Guy Kawasaki:
This do no harm thing is just total bullshit, at this point.
Meredith Whittaker:
I don't know if it's total bullshit. I think there are a lot of people who feel it sincerely, but ultimately, if the incentives of these companies are driven by forever growth and forever profit, and you have a board with a fiduciary duty that is not to harmlessness but to growth and profit, you are going to prioritize those things.
And at some point the far horizons of those ethically dubious choices, that felt so far away in 2006, like military contracting, like building a surveillance version of the search engine for the Chinese market, et cetera. Or maybe the American market in the future, whatever it is, those horizons got closer and closer.
And for the people filling Google's ranks, again, "What is the objective function here? Profit and growth." And I think, with the type of power that the tech industry has now, with centralized surveillance, centralized information platforms, we really need to be looking at that incentive structure, because it's not healthy.
Guy Kawasaki:
I feel like this is the moment when I figured out there was no Santa Claus.
Meredith Whittaker:
I know. But you still got presents.
Guy Kawasaki:
Or a gift certificate at Amazon. So now you're at Signal. Tell us about Signal.
Meredith Whittaker:
Signal is so cool. I think it is the coolest tech organization in the world. It is the world's most widely used, actually private, communication service. And you can think about it almost as a nervous system for confidential communication. Militaries use it, governments use it. Any CEO who has a deal to make uses it. Sports stars use it to broker their deals. So it's truly private, and we stay truly private in an industry that pushes toward data collection by being a nonprofit.
Guy Kawasaki:
But how do you put two and two together? It's totally private, but they're helping militaries. So you're helping the same military that might be killing people.
Meredith Whittaker:
Look, this is one of the facts about communications networks. And the facts about encryption and privacy, we could also talk about in this context, but Signal has to be available for everyone to use, for it to be useful for everyone.
Guy Kawasaki:
Whether you're a priest or a pedophile.
Meredith Whittaker:
Whether you're a type of human, or another type of human. Because ultimately, I can't sell a telephone network just to the good guys. I don't know who's going to be using that. And I think we also have to separate the infrastructure from the action. There's a human being making choices.
It's not the roads that drove the car to commit the crime, it's a human being in a car who rode on those roads to go commit the crime. We don't license Signal to any one of these entities. Signal is free to use for everyone, and we don't have a military version, or a CEO version, or we don't have an enterprise business model, any of that. But the fact is that anywhere where confidentiality is valued by anyone, Signal is valuable.
Guy Kawasaki:
And if I'm listening to this, and I'm just a husband or a wife, and I have teenage kids. Is there an argument to be made that I should be using Signal?
Meredith Whittaker:
Yeah, there is.
Guy Kawasaki:
And what's the argument?
Meredith Whittaker:
I think the argument is, one, Signal is pleasant. It is lovely. We're not selling you ads, we're not farming for engagement. You're not going to get on a feed and fall into a hole of Instagram ads. So there's something crisp and clean and elegant about Signal that I just want to put forward, before we talk about any of the values, because actually it's a lovely, simple app, and it takes us back to the days before that oversaturation of everything's a bot, everything's an ad, everything's a feed.
Beyond that, I think we do need to be serious about the times we live in. Data can be indelible. The data, my Gmail from 2005, is stored in a Google server somewhere. But our political context has moved pretty dramatically since 2005, and we're living in a time right now where there is a woman living in prison because Facebook turned over messages between her and her daughter in Nebraska, after the Dobbs decision.
And the Dobbs decision is what kicked abortion rights down to the state level and allowed states to criminalize it. And she's in prison because they discussed accessing reproductive care over Facebook Messenger. Those messages were turned over to law enforcement, and she's probably cold right now, in prison, waiting to get out.
There is a reason here that is really deep, and it isn't just a reason that exists in the present moment. We need to recognize that we are in very volatile times, and I think of it simply as hygiene. Why would you want that out there? Why would you want every thought you thought ten years ago, in a database that now may be leaked or breached, or turned over?
Guy Kawasaki:
But I still need a network effect, right?
Meredith Whittaker:
Yeah.
Guy Kawasaki:
I have to convince the other five people in my family.
Meredith Whittaker:
Exactly.
Guy Kawasaki:
So you went around the edges of this. So basically, if you are a woman in Florida or Texas and you're considering an abortion, you shouldn't use Gmail or Facebook Messenger, or anything, right?
Meredith Whittaker:
No. I wouldn't Google it.
Guy Kawasaki:
And what about the relationship with WhatsApp? Isn't your technology part of WhatsApp?
Meredith Whittaker:
Yeah.
Guy Kawasaki:
So is it safe to discuss abortion on WhatsApp?
Meredith Whittaker:
This gets into threat modeling, and all the nuance there, but let's quickly go through the differences and similarities between WhatsApp and Signal. So WhatsApp licenses the Signal protocol, which is the core encryption that protects what you say. So anything you write in WhatsApp, WhatsApp can't see it, only the people you're talking to can see it. And that's great. But what they don't do is encrypt metadata.
They are keeping facts about who you talk to, who's in your contact list, when you start talking to someone, when you stop talking to them. Who's in your group chats? Really important, intimate information is not protected by WhatsApp. And that's a key differentiator between WhatsApp and Signal, to say nothing of the fact that WhatsApp is also part of Meta, and Meta could database-join that with Facebook data, and you get into some other shit.
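The content-versus-metadata distinction Whittaker draws can be sketched in a few lines. This is a conceptual illustration only, not the actual Signal or WhatsApp wire format, and `toy_encrypt` is a hypothetical stand-in for the real Signal protocol encryption:

```python
# Conceptual sketch: end-to-end encryption hides the message *content*,
# but unless metadata is also protected, whoever runs the server still
# sees who talked to whom, and when. Not an actual wire format.
from dataclasses import dataclass
import time

def toy_encrypt(plaintext: str, key: int) -> bytes:
    # Hypothetical stand-in for real encryption (XOR is NOT secure).
    return bytes(b ^ key for b in plaintext.encode())

@dataclass
class Envelope:
    sender: str        # metadata: visible to the server in this sketch
    recipient: str     # metadata: visible
    sent_at: float     # metadata: visible
    ciphertext: bytes  # content: opaque without the key

msg = Envelope(
    sender="alice",
    recipient="bob",
    sent_at=time.time(),
    ciphertext=toy_encrypt("meet at 11:11", key=0x42),
)

# The server can't read the content...
assert b"meet" not in msg.ciphertext
# ...but it can still log the social graph: who, with whom, and when.
print(msg.sender, "->", msg.recipient)
```

Signal's differentiator, as she describes it, is working to blind the service to those envelope fields as well, not just to the ciphertext.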
Guy Kawasaki:
Here comes Cambridge.
Meredith Whittaker:
Yeah.
Guy Kawasaki:
And how do you pay the bills at Signal?
Meredith Whittaker:
Donations, and that's going out to all the listeners who might be looking for a righteous tech cause. We are a nonprofit, we are funded by donations, and this is not a nice little philanthropic outfit. This is because we looked at the hard facts, and we realized the core business model of communications tech in this day and age, is collecting data.
You model that data to sell ads, you sell that data to data brokers, you use that data to farm engagement to sell ads, or train AI models. And so if we were governed by that type of fiduciary duty to the shareholders, and we were trying to stay private, we're basically rowing up against this business model.
And it would be incumbent on our board, probably at some point, to be like, "Hey, can you cut some of this privacy stuff out? Because we got to make some money." So we're very self-aware in that regard, and we're looking to shape new models for tech that are actually breaking that surveillance business model, and creating a more independent ecosystem.
Guy Kawasaki:
In a sense, are you like the Wikipedia of messaging, then?
Meredith Whittaker:
I think the similarity is that we're a nonprofit, and Wikipedia is a nonprofit. But I think of Wikipedia as an online library, so it's different.
Guy Kawasaki:
In terms of revenue model, donations.
Meredith Whittaker:
Donations. And we're exploring other models as well. Are there endowments that we could set up? Are there hybrid or tandem forms, a for-profit and a nonprofit? But the absolute primary objective here is we need to shield Signal from the pressures of surveillance, from the pressures of data collection.
Guy Kawasaki:
And what happens if the Koch brothers or Elon Musk or the Gates Foundation decides, "We want to give you a one billion dollar donation"?
Meredith Whittaker:
I sit at the table and we talk it through. So, "What are the terms of that? Signal must stay open source, it must stay independent. It must stay private. It is laser focused on its mission. And all right, let's talk about where that billion dollars goes. Is that an endowment? How do we work with that?" But it wouldn't come with control or the ability to inject code into Signal, and our principles would remain really steadfast.
And one of the things that's super lucky about us is, we came up in a very different tech era. Moxie founded Signal in the 2000s, and it developed because of the virtuosic work of Moxie and Trevor on the protocol, and a lot of the development work. A community of experts has formed around Signal, so our protocol and the encryption implementation that we use are open, and they're scrutinized by thousands and thousands of hackers and infosec folks.
We have a security community that is like a trainspotter for Signal code. When we cut a new branch and it's on GitHub, we have people on the forums and Reddit looking at it. So there's an immune system out there that would call BS on any move, and we would listen to them, because we really value that.
Guy Kawasaki:
I was on the board of trustees of Wikipedia, and what you just described is very similar. The Wikipedians are like that. I thought I knew evangelism because of Macintosh people, but oh my god, Wikipedia blows them away.
Meredith Whittaker:
Yeah. And we should trade notes sometime, because there are some men in my mentions. Very passionate.
Guy Kawasaki:
No, so now, if I'm young, and I don't have to be young, but if I'm an engineer and I'm thinking, "Wow, I'd like to go and join this Signal team." I know the answer, but I'm going to ask you. Should I think, "Oh, nonprofit, based on donations, means I'm not going to get paid a lot"? So how does this work?
Meredith Whittaker:
Let me assure you, young engineer, we pay very well. We pay as close to industry salaries as we can. We have certain benefits that I think are non-monetary. So we are a remote organization, we work in US time zones, but remotely, so you have that flexibility. And I do think it really matters, we're a very high caliber team. So you enter in there, you're literally shaping core infrastructure that human rights groups and journalists rely on.
I got off a call with a publisher this morning. I was talking about a book I'm working on, and she mentioned she published Snowden's book. And she said, "We couldn't have done that without Signal, because we had to be communicating sensitive information."
So you're contributing to something that really matters, and you personally, on this small team, have a real impact. So I think there's experience you gain, and then there's just, we have one magical life in this world. So how do we want to spend it? And I think a lot of people are weighing that right now, in this weird time we live in.
Guy Kawasaki:
In a very sick way, I would make the case that if Donald Trump got elected, Signal would explode. Because if Donald Trump gets elected, he makes Elon Musk Secretary of Efficiency, and all of a sudden, man, you really got to be careful what you say.
Meredith Whittaker:
Yeah, yeah. And what you said. Because that data is still there, on those platforms. And this is why I say, "Moms, teens, it doesn't matter if you have anything to hide, you don't want to be that weak link in your network."
Guy Kawasaki:
So speaking of this kind of threat, I understand the difference between Signal and other things, but Telegram's CEO was just arrested in France, of all places. And now, couldn't some government make the case, "Well, there's drug dealers and pedophiles on Signal, so we're going to shut that down, and we're going to arrest you"?
Meredith Whittaker:
They do make that case, sometimes. Not arrest me, we haven't gotten to that point. I think, just addressing the Telegram situation quickly, there are vast differences. So Telegram is basically an un-moderated social media platform, that has messaging bolted onto the side. They're not encrypted, they do have the data.
And under European regulations, there's a very particular threshold that social media platforms of their size, with the data they have and the public broadcast features they offer, have to meet. Signal is only messaging. We don't have any social media broadcast functions. You don't have a directory, you can't find your friends. So we actually think about, "How do we avoid those thresholds? How do we avoid culpability here, so that we are building something that's very pure, and not subject to those laws?"
So the terms of the arrests are very unlikely to hit Signal. However, I think it is notable that executives of core infrastructure tech companies are now on the playing board, in that way. And we do have to be aware of that in a geopolitically fractured world. And I think it's notable that there is a war on encryption.
There's been a war on encryption since 1976 and before, when Whit Diffie and Martin Hellman tried to publish their paper on public-key cryptography and the US government freaked out and tried to stop the publication.
We went through the Crypto Wars in the 1990s, and after Snowden you saw full-disk encryption on iOS and Android. And then suddenly, in 2015, there was a kind of manufactured crisis where James Comey tried to browbeat Apple, and the pretext then was terrorism. And now you have a new pretext around child safety.
But again, the target is always encryption and the ability for everyday people, dissidents, organizers, anyone, to communicate privately outside of government and corporate scrutiny.
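The 1976 Diffie-Hellman idea Whittaker mentions, the opening shot of these crypto wars, fits in a few lines. This is a toy sketch with tiny illustrative numbers; real deployments use large primes or elliptic curves:

```python
# Toy Diffie-Hellman key exchange, the core idea of the 1976 paper
# "New Directions in Cryptography." Parameters here are tiny and
# illustrative only; real systems use large primes or elliptic curves.
p, g = 23, 5                    # public prime modulus and generator

a_secret, b_secret = 6, 15      # each party's private exponent
A = pow(g, a_secret, p)         # Alice publishes g^a mod p
B = pow(g, b_secret, p)         # Bob publishes g^b mod p

shared_a = pow(B, a_secret, p)  # Alice computes (g^b)^a mod p
shared_b = pow(A, b_secret, p)  # Bob computes (g^a)^b mod p
assert shared_a == shared_b     # same secret, derived over a public channel
```

An eavesdropper sees p, g, A, and B, but recovering the shared secret requires solving the discrete logarithm problem, which is exactly why letting this technique loose on the public alarmed governments then and still does.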
Guy Kawasaki:
You're not giving me peace of mind, Meredith.
Meredith Whittaker:
It's not what I serve, Guy.
Guy Kawasaki:
I want to know, going back into your checkered past, tell me.
Meredith Whittaker:
Houndstooth, please.
Guy Kawasaki:
What is it like to testify in front of Congress?
Meredith Whittaker:
It's really stressful. Your amygdala is firing, you're thinking about your posture, and then suddenly it's over and you're toddling out onto the street trying to find an Uber. But you prep, you prep, you prep, you prep, you prep. You have all of your answers there. You have to make sure your numbers are right. So I remember having these little sheets, I had my flashcards with, "What are the things I'm going to cite?"
I felt, when I did it, a real responsibility, because I'm not just speaking for Meredith, I'm trying to get this message through. And I'm trying to get it through to people who are probably not thinking that hard, they probably didn't read my opening statement, their staff did, they handed them some talking points.
But if I mess up, that's the clip they're going to pick, right? And that's the narrative they're going to remember. So it does feel, to me, it's something you really prep for. And then it's over, and then you sleep on the Acela, back to New York.
Guy Kawasaki:
I would love to see you testify in front of Jim Jordan. I think you would have him for breakfast.
Meredith Whittaker:
I did, actually.
Guy Kawasaki:
It was Jim Jordan. Oh.
Meredith Whittaker:
Yeah, Jim Jordan was there, and he was trying to get at, he was like, "Is Amazon political?" Because he was trying to get at, "Tech is anti," whatever, this thing. And I remember hearing that question. And I was like, "Oh, he wants me to say tech is biased against the right," or what have you.
And I sat with it, and I was like, "I don't know how to answer that. That's a trap." And I was like, "They do hire many lobbyists," and he got so mad at that answer, he just stopped questioning me. But yeah, I'd go for round two.
Guy Kawasaki:
Let's talk about AI, now. All right, so first of all, what was your reaction when there was this movement? "Let's pause AI for six months."
Meredith Whittaker:
I think unserious is the word that comes to mind. I get it, right, but one, what is AI? We've got to answer that question. And as a pedant, I often start there. And two, it was a very splashy statement that certainly generated a lot of heat and a lot of smoke. But the reasoning for pausing AI, what a pause even meant, was never clear. Is it developing a core library that then becomes part of an AI system?
Is it collecting data that then is aggregated and cleaned to train an AI model? Is it developing an NVIDIA chip that is going to accelerate training? What is AI, here? And what do you mean by pausing it? Again, it didn't feel serious to me, and clearly it wasn't, because here we are, in a moment of increasing acceleration.
Guy Kawasaki:
And the people talking about that were seemingly intelligent, experienced people. Seriously, what does it mean to pause development? Do you turn off all the computers? What is it? It's not like, "I'm going to stop harvesting redwood trees for six months." That I can understand.
Meredith Whittaker:
That was my question as well. Are we talking about Larry and Jihad? What are we talking?
Guy Kawasaki:
No, okay.
Meredith Whittaker:
And again, this isn't a game, right? Are you sincerely worried about these threats? Then a five-line letter sent to the New York Times with no specificity is not actually a theory of change. What are we doing here?
Guy Kawasaki:
So I have a marketing question for you I want to tap into, that no one has been able to answer for me. I have a folder of Claude and Perplexity and OpenAI, and all that, like five of them. And I have zero brand loyalty to any of them. So how do these companies create brand loyalty to an LLM?
Meredith Whittaker:
I think the AI market is interesting, because ultimately it's contingent on, one, compute, so servers, and two, data. And we know this well: there are a handful of large companies that came up in the 2000s, established platforms and cloud businesses, and they now dominate. And ultimately, the path to market, whatever the app or the LLM, is through them.
So a good example is Mistral in France. They build open source large language models. They're kind of a national champion in France, I'm sure many of the listeners know them, and they do really interesting work. But they can't just IPO, right? There's a model. What are you going to do with that, right? You can post it on Hugging Face, but that's not a business model.
What do they need to do? They need to find market fit. And you either do that by licensing it to one of the cloud giants, so Google, Amazon, Microsoft, and what they did is license it to Microsoft, so now people who want to sign an Azure contract can sign up for a Mistral API. Or you go through one of the platforms, and this is Meta.
You could be acquired by Meta, and they integrate your AI into their platforms for newsfeed calibration, for advertiser services, for whatever it is. And this also helps explain the open-closed AI debate. Of course the platform companies want proprietary models, because if they have them and they're licensing them to you, that's a market advantage.
Or sorry, the cloud companies want proprietary models. You're signing up for an Azure contract, you can only get that through Azure, or what have you. But the platform companies want to integrate this. Their market is integrated into the platform that you're looking at all day, so they want open models. They want to be able to harvest from the work of people who are building on top of these open LLMs. So I think there are interesting market dynamics that help us dig into questions like that.
Guy Kawasaki:
A few minutes ago you alluded to the fact you're writing a book.
Meredith Whittaker:
Yeah.
Guy Kawasaki:
So what is this book?
Meredith Whittaker:
I just sold it, and it's an alternative history of tech that starts with Charles Babbage.
Guy Kawasaki:
You're going way back.
Meredith Whittaker:
Yeah, you did ask if I were crazy, and I'll say maybe this is a slight symptom of that. But I've done a lot of research on this, actually. And for me, I get a lot of joy in spending time in the archives, spending time with ideas. It's an itch that gets scratched every time something comes together that I didn't understand. And I'm like, "Wow, okay, now I get something." Or, "Whoa, I had completely misunderstood that."
I spent some time looking at the relationship between the Industrial Revolution, as it's called, computation, and the age of abolition. And this sort of period when Britain was looking to, and then did in some sense, abolish slavery. And plantation technologies that were then imported into the Industrial Revolution, and actually informed the blueprints for computation. As a side project, as a treat.
Guy Kawasaki:
And you're doing this while you're running Signal.
Meredith Whittaker:
I've done some of this research before, so I spent about two years reading through this, and getting that. But this is how I spend my weekends. And I love it.
Guy Kawasaki:
I'm an author, also, and I'm just curious about your attitude. I have written sixteen books, or some people say I wrote one book sixteen times.
Meredith Whittaker:
I love it.
Guy Kawasaki:
But I have made a concerted effort to get them into every LLM that I can. There are many authors who have the exact opposite reaction, which is, "I don't want my stuff in LLMs because I'm not going to get rewarded for it. They're just going to take my work and intellectual property." So where are you on this? Is your book going to be out there in LLMs, so that when people ask ChatGPT, "What did Charles Babbage do," it's going to cite you?
Meredith Whittaker:
I imagine the second someone posts it to Library Genesis, it will be in an LLM. So it's unclear that I would have that much control over that, given the web scraping that is generally creating the datasets that LLMs use, and the fact that they're too big to be auditable. That aside, I think the question to me is less like, "Is it in, is it not in? What is the intellectual property argument? Is it fair use?"
That's fine. I'm really just interested, are we cultivating an economic system in which creativity, and intellectual work, and art, continue to be rewarded? Who is getting paid, is my question. And if the answer is, "Only Microsoft," then I don't really care what we call it. That is not a system that I want to endorse.
Because I went through art school, I think art and writing and language are the way that we're able to express our place in the world to each other. It's so core to human life and human flourishing. And I dread a world in which there's no reward for that work, in which it's just reproduced by massive companies, or a simulacrum of that work is produced by these models, because I don't think we thrive as human beings without that.
Guy Kawasaki:
You could not find someone who is more optimistic about the impact of AI on society. I, in fact, think AI could save society. So that's where I'm coming from.
Meredith Whittaker:
Cool.
Guy Kawasaki:
But I want you to explain, "Guy, you're being naive. These are the existential threats that AI provide." If you believe this. So what are the threats?
Meredith Whittaker:
I want to look away from AI as a technology in a vacuum to explain where I come from, here. Because obviously finding patterns in large amounts of data, super useful. Right?
Guy Kawasaki:
Sure.
Meredith Whittaker:
Assuming the data is good, assuming the decision makers who are acting on those patterns are benevolent, all of that. But right now, when we're talking about AI, we're talking about these massive models. They rely on huge amounts of compute, and you see these sort of bank-busting data center build-outs, all of this. We haven't even talked about the environmental impact, but we're reopening Three Mile Island. We're in some weird waters.
Guy Kawasaki:
What a concept. Microsoft, running Three Mile Island.
Meredith Whittaker:
As a New Yorker, I'm like, "Excuse me." And then we're talking about the need for huge amounts of data, the kind of data that the platform companies and a handful of other companies have, and most people don't. So my concern is with centralized power, and the way that AI as a general purpose utility, threaded through our lives and institutions around the world, could enable those with that power to shape, reshape and control our lives in ways that are not beneficial.
And so I want to look at that, and think about, "Is this healthy and safe, given that the incentives driving these companies are profit and growth, and not necessarily benefit?"
Guy Kawasaki:
At some level, Apple has one of the most compelling stories of AI, because literally, it can be at the system software level. It's not something people go out and get. It's in every phone. So is Apple the best thing for AI, or the worst thing for AI?
Meredith Whittaker:
Apple is doing the on-device model, which means that there's less leakage, let's say. But let's get into that. What is the core of what is happening there? So Apple trains a model, and it's small enough to run on your device. And we need to be clear, these large LLMs and generative models are not that small, which is part of why Apple is trying to figure out a server-side, private server arrangement for the OpenAI deal.
But nonetheless, there's on-device AI, it's small enough to run, but it's also oftentimes making really key decisions. So it's, "Do you want to read this email or not?" Or, I'm just coming up with a hypothetical, but scanning your photos and saying, "This is bad. Maybe you don't want to send this," whatever. And I think it's more private, but you're still giving Apple a sort of obscure power to make decisions and determinations that I think we need to look at in the context of this privacy and agency conversation.
Guy Kawasaki:
If anybody believed that Google was going to do no harm, you should ask yourself if you believe Apple's going to do no harm, too. Because obviously things can change.
Meredith Whittaker:
I know WhatsApp, Signal, and others were pulled from the app store in China. And I completely understand, companies have to work within the laws of certain governments. But nonetheless, I think we can't treat these companies and their incentive structures as good or bad, we have to recognize they're going to be compelled to do certain things under certain conditions.
And we need to create structures and systems that act as prophylactics against harmful decisions. And this is why I'm always looking at the structural level. I'm always looking at the system. I love people. I'm very easy on people, and I'm very hard on ideas and systems, because I think we need to build for robustness, and we need to build to make sure that massive power and responsibility is not misused.
Guy Kawasaki:
Okay, last general topic. Let's suppose that I am a parent, or I am a young girl, and I'm listening to this podcast. And I'm saying to myself, or I'm saying to myself for my kids, "I want my daughter, or I want myself, to be like Meredith." So now, with everything you know, how does a woman become a leader like you, today?
Meredith Whittaker:
I will just say, you all can become better leaders than me. I think, there's no recipe, but I was very lucky to get a lot of good mentorship. And I think, find your mentors. And then, I entered into tech not knowing that much about tech. I have a humanities background. I went to art school for most of my life.
I still love that world, but I said yes to everything. I signed up for everything. I tried to learn everything, and I didn't quite understand there were rooms I wasn't supposed to be in. I'd just walk in, right? There were tables I wasn't supposed to be at.
And it's not a secret, but if you can get in, figure out where you fit. I would take notes at meetings I wasn't supposed to be in, or I wasn't invited to, or I would join initiatives to try to figure out, "Is there a place I can help? Okay, I can help by ordering the catering." Right? That's a helpful thing. And then suddenly I was in the other meeting, and I'm like, "Hey, do these things connect?" "Oh, that was a good idea." Someone recognized I had a good idea. "Okay, now I'm going to be part of doing this."
And I think there was a stubbornness and a sort of elbow-your-way-in mentality. I didn't grow up in the elite, and so I think, if you don't come from this world, don't worry. There's a lot you can do with a little bit of street smarts and a willingness to wedge your way into the room.
Guy Kawasaki:
Did you have to get over the imposter syndrome?
Meredith Whittaker:
In a sense, I was an imposter, because there's no reason I was supposed to be there, right? An art school kid with a humanities degree, plopped into the middle of Building Forty at Google. But then I was like, "Okay, I'm going to just be my own version of myself, because clearly I don't fit in with the hoodie Stanford culture. I'm going to try to figure out what I bring." So don't let the normative terms of whatever environment you're in define whether you fit or not. Make them fit you.
Guy Kawasaki:
This may seem like a dumb question, but if you were not from the Stanford hoodie culture with a PhD in computer science, how the hell did you get into Google at all?
Meredith Whittaker:
Well, at the time I was hired, if you had a super high GPA from a social sciences or humanities background, they were hiring us, and I was basically hired for customer support. Although they didn't call it that. They called it Consumer Operations Associate. And I remember getting the call and I was like, "I don't have any idea what that is, but it sounds like a business job. Okay, and I need to pay rent."
I had just graduated from Berkeley and I was like, "All right, I'll take this call." And then I had, I think, seven interviews, two kinds of personality/IQ tests, and a writing test. And I knew I was getting close to being hired, because the size of the diamonds on the engagement rings of the women who were interviewing me kept getting bigger. So I was like, "Must be going up the chain."
And then I got in, and I was like, "Oh, there's all kinds of stories." But again, I was like, "I don't know what this job is supposed to be, so I'm going to try to make it. They have 20 percent projects; they have all this stuff." I was taking the bike around campus, meeting everyone during office hours.
Guy Kawasaki:
Did you have to answer the question, how many manhole covers are there in the United States, or?
Meredith Whittaker:
No, it was ping pong balls in a Boeing 747. And I was like, "What the fuck? Is this a cult? What are we doing here?"
Guy Kawasaki:
And now, at Signal. How does Signal interview? You're not doing stuff like that.
Meredith Whittaker:
No, we don't do that. It's a little excessive, and I think it's more of a flex than an actual methodology for rigorous talent finding.
Guy Kawasaki:
Flex, as in flex in a pejorative sense.
Meredith Whittaker:
Well, flex in, "We are big enough and desirable enough, we can make you jump through as many hoops as we want, even for an entry level job." And look, that job was really cool. They hired really rad people. So it was this bullpen of hyper achieving humanities and social sciences kids, who would get all their work done in an hour and then just bounce around. It was a wild environment.
Guy Kawasaki:
There is some irony, you sitting in this chair at this conference about the masters of scaling, because one of the concepts of Reid is that when you're scaling, you put in these things that are not exactly humanitarian and warm and fuzzy. And you look at GPAs, and you look at degrees, and you just hire, hire, hire, hire, to scale. And in a sense you're saying, "Look at all the problems that can create."
Meredith Whittaker:
Yeah, I think so. Small is also a scale. So what is the problem we actually want to solve? What is it we want to do in the world? And how do we be discerning about that? One-size-fits-all doesn't always fit.
Guy Kawasaki:
This interview has turned into a big ad for Signal, which I'm okay with.
Meredith Whittaker:
I'm doing my job.
Guy Kawasaki:
That's right. You're a Signal evangelist.
Meredith Whittaker:
I genuinely love it.
Guy Kawasaki:
So since we've gone that far, I'm going to just let you for the next, I don't know, thirty or sixty seconds, just give this plug for Signal. Why people should use Signal, or join Signal, if they are looking for employment.
Meredith Whittaker:
Sure.
Guy Kawasaki:
So this is your ad.
Meredith Whittaker:
Amazing. Signal is a really special organization. I think it takes us back to an earlier day in tech, where the cool people and the weirdos had their moment, and I think we're going back there by the way. I also think Signal is at the cusp of a movement, and a growing awareness, that we need to change the models for tech.
So if you want to be part of building a new model for healthier, cooler, more private, less surveillant, less harmful tech, Signal is not only doing that, but it's shaping a model for how we can do that across the ecosystem. To quote my colleagues Maria Farrell and Robin Berjon, "to rewild the tech ecosystem."
So we don't just have a handful of giants consuming all of our data, and producing products we may or may not like because they have OKRs they have to meet. We actually have a teeming ecosystem of really smart solutions built on open source and open protocols, that are actually private, that are swimming upstream and doing it successfully. If you're into badass shit, I think you're into Signal.
Guy Kawasaki:
I must say, this is about the 260th episode of Remarkable People, and this is the only one that basically turned into an ad for the guest's company. You should be proud of that.
Meredith Whittaker:
Thank you. I'm glad it worked out that way, because again, the ad is coming from the most sincere place of my being.
Guy Kawasaki:
Your PR person over there is going to say, "Oh my God, Meredith, you just hit it out of the park."
Meredith Whittaker:
Yeah, thank you. That's for you. You know who you are.
Guy Kawasaki:
All right, listen, this has been the Remarkable People podcast, and I hope you appreciated this ad for Signal and Meredith Whittaker. And particularly if you are a woman, and want to emulate her, I think you'll learn a lot about climbing the ladder. And kicking ass.
Meredith Whittaker:
Thank you, Guy. It's been just a delight, and thank you everyone who listened to this non-targeted ad.