Full Q&A: Former Google lawyer and deputy U.S. CTO Nicole Wong on Recode Decode - Recode - All


"If you can order a chocolate cake on Amazon and have it delivered that day, then you ought to be able to get your Social Security benefits just as easily."

On this episode of Recode Decode, former deputy CTO of the United States Nicole Wong talks with Recode's Kara Swisher about the future of tech policy and why content moderation is more complicated than many people think.

You can listen to the entire conversation right now in the audio player below. If you prefer to listen on your phone, Recode Decode is available wherever you listen to podcasts — including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts and Overcast.

Below is the full transcript of the conversation. You can read a more condensed, lightly edited version here.


Kara Swisher: Today in the red chair is Nicole Wong, the former deputy chief technology officer of the United States. She served under President Obama from 2013 to 2014, but before that worked for Google and Twitter in very critical positions. I've known her for a long time. Thank you, Nicole, for coming onto Recode Decode.

Nicole Wong: I'm delighted to be here.

I have dragooned you here because there's so many topics that you are an expert on, and you've been inside the belly of the beast. You've been outside in government. There's a lot of things we've got to talk about, everything from election issues to content moderation to what these companies are doing to the legislation that's maybe coming for these tech companies — probably not, because Congress is a bunch of idiots. That's my opinion.

So, let's start first by talking about your background, so people get an idea of where you've been.

Sure.

So, you started ... You're a lawyer by trade.

I am a lawyer by trade. You know what? When I was growing up, I wanted to be you, Kara.

A journalist?

I wanted to be a journalist from like the time I was in sixth grade. I ended up going to ... I only applied to law schools that had joint law and journalism degrees.

Oh, what happened? Where did you ...

I had a graduate degree ...

Did you work for the school newspaper and things?

... in journalism. I did the school newspaper. I was an intern at the Anchorage Daily News.

Wow.

I did bear stories.

Wow, okay. So, you're from Alaska.

No, I'm from San Diego. But like, that was where I did my internship.

Okay, you went up there. Okay.

At the end of it, I kind of decided that, actually, law was going to be more for me.

Yeah, 'cause we're useless. You got that? You figured that one out?

Something about the tactics of the law appealed to me. But I ended up doing First Amendment law when I came out of school. So, the first part of my career was representing newspapers, TV stations, radio stations in the Bay Area on First Amendment issues, which was fabulous. But in the mid '90s ...

Give me your biggest First Amendment ... What was your ...

I worked on ... I don't know if you remember when the Ninth Circuit ordered ... they reversed not doing the death penalty in the United States, or in California, excuse me. And so, we did our first execution in, I think, 1994 in California. And then there was a case in California where we were going to do an execution at San Quentin and the afternoon paper, the San Francisco Examiner, got locked out. So, we went in and made the case that if you don't have an afternoon paper, if that execution happens overnight, no one's going to know about it the next morning because of the way print cycles worked, right?

Right, right.

Before the days of the internet. So, that was one of the cases I worked on. A lot of them were small. A lot of them were like, "We need access to that courtroom." "We need access to public records." Some were defamation cases, some were libel cases, some were reporter's privilege cases.

But in the mid '90s, those newspapers started going online. The partner I worked for followed them there, and so did I. And in 1997, we started getting our first pure-play internet clients. And so, Yahoo was one of the first clients. Some that you wouldn't remember, maybe like Silicon Investor, Hotmail before it was acquired by Microsoft, Netscape ...

So, you worked for a lot of them, yeah.

A bunch of them, right? It was such an exciting time to be both in the Valley but also a lawyer where the law was still unclear. So, that's how I got into that. In 2004, I left the law firm, went to Google.

Right. So, why did you do that? A lot of people have done that, a lot of people haven't.

Yeah.

I know lots of different lawyers who have done that and moved along.

Yeah.

But you were covering lots of tech companies and startups and making law, really, because with a lot of that deal-making, there's all kinds of different legal issues to deal with.

Right, right. So, the thing that I most enjoyed about counseling clients was actually helping them build the products. And so, when we first started representing Yahoo, they had only two lawyers in their legal department, neither of whom had a background in sort of publishers' responsibilities and rights.

Right.

During that time ... And I think that a lot of people forget, a lot of the folks who were moving onto the internet thought of themselves as software developers and didn't appreciate their role as publishers.

Right, and they still don't, Nicole.

There is that. So, there's that consistency. Although, I think they've gotten better in a lot of cases, 'cause they talk about issues of free expression.

Nicole, they're not media companies, they're platforms and they deserve the immunity.

We're totally going to get to that, right?

Yeah, yeah.

But I think a lot of the work that I was doing at the beginning was helping those companies appreciate what it means to be a message board. And it's not just software, right? It has responsibilities to the people who are on it and to the people who read it. So, that was a lot of the first part of my work. But I think I recognized as a lawyer, and especially when you're the outside lawyer, that you don't get to have a say on the design of the product. You don't get to have a say about, like, what is the business responsibility around this product? And that, you get by being in-house.

Right, right.

That's how you get to the table. So, I had an opportunity to go to Google in 2004.

Small, still small.

Yeah, it was like 1,100 people, I think, when I joined.

Yeah.

They hired me as "senior compliance counsel," which we immediately changed to "product counsel," because nobody invites the compliance guy to the meeting.

Yeah, yeah.

Right?

Wickedly compliant.

But if your job is styled as like, "I'm here to help you get your product out the door," then you're in the meetings.

And making it legal to do so. Yeah.

Exactly, exactly.

So, you worked on what? Talk about it.

I was responsible for the launch of all of our products globally. And so, in 2004 it was just search. But after the IPO, it very quickly became much, much more than search, right? Street View was a big thing, the acquisition of YouTube, the acquisition of DoubleClick, all of those areas that Google started to grow in over the next seven-and-a-half years were the things I was involved in.

I grew out the team of product counsel with my good friend Alex Macgillivray, and they went on to do amazing things. Then some of my responsibilities expanded, so I had responsibility for, at various times, litigation, patent, trademark, copyright, privacy and broad regulatory-type stuff.

Which grew and grew and grew at Google.

Which kept growing.

Right, right. Because Google moved into more and more things.

Yeah, yeah.

And what was the thinking at the time about lawyers? Like they just wanted to make and not think, if I recall. But they did have lawyers around all the time.

They did have lawyers. And I have to say, I think, my experience was, the lawyers who were there at the time, Kulpreet Rana, David Drummond, Alex Macgillivray, Miriam Rivera, we were all really good at finding the place where we could add value and be valued by the client, which is the thing that you have to do. You can't be the lawyer that says no all the time. You have to figure out a way ... "Not this way, but what if we designed it a different way?"

Right.

And I think that's the part that made it the most fun, which is, how do you figure out how to achieve the business and the product ends while being legally compliant?

Right, right, right. But you never got called a compliant anything, actually, though, right? So you were there for how long?

Seven-and-a-half years.

Seven-and-a-half years, which were the big go-go days, really, for Google.

It was, yeah. I mean, it was all of our big products, it was in and out of China. It was lots of new possible businesses, areas that we then closed down. I don't know if you remember Orkut, the social network?

Yes, I remember Orkut. I know Orkut. I just ran into him recently. Yeah, he's still bitter that he's not Facebook.

Yeah, so they were into all of this.

Big in Brazil. I'll never forget. I know it was Larry or Sergey. He's like, "It's big in Brazil."

Right.

And I'm like, "So what? But the rest of the world, not so much."

It was like huge in Brazil, India and Iran, right?

Right. I know.

And no one in the United States knew about us.

You know, they kept saying, "Big in Brazil." I'm like, "You keep going with Brazil. I'm sure you'll somehow catch Facebook with that." I'll never forget that, "Big in Brazil."

So, you then went to Twitter, then, after ...

So I took a year and a half off to hang out with my kids and my family, and then at the end, like around the ...

What got you out? Why did you leave?

Part of it was I was ready, and I had a team that was ready to take on the responsibility. A lot of it was, I don't know that I've ever said this to people who are not just friends. One Christmas, I asked my youngest daughter, "What do you want for Christmas?" And she said, "More time with you."

Oh, wow.

And there's only one answer to that.

Oh, God.

It's like every working mom's nightmare.

Yeah. Oh no, not that, not time!

But I was able to do it and totally enjoyed the year and a half. And after a year and a half, my daughters were literally like, "What do you do all day?"

Yeah, all right, get out of here.

I was like, "All right, so it's time for me to go back to work." So I went back. I went to Twitter to work with ...

Yeah, so how did you get there? What was the ...

Alex Macgillivray was the general counsel, and he and I had worked at Google together, and it was a pleasure to work with him again. The job was to build up their product counsel team.

Right.

And so, I was there, but I was there for literally, I think, six months before I got a call from the White House to join the Obama administration as the deputy CTO.

And why did you want to do that? Policy, right?

Part of it. And part of it's super personal. So they had actually caught me right after my grandfather had passed. And he had instilled in us the sense that like, "You have every right to be here, and you have a right to participate in your community and your government. And if you get the opportunity, it's a responsibility, and you're privileged to do so." Knowing that, when the call came in from the White House, there was no way I could not walk that walk, right?

Right.

So, the opportunity to serve, and candidly, and I don't know if you know Todd Park well ...

Yes, I do.

He's phenomenal. The vision that he had for bringing all of government services up to another level, to be able to participate at such a high impact, that was huge, and it wasn't something I could turn down.

Right, and you focused on what there?

So, my portfolio was innovation, internet and privacy policy.

That's kind of big.

Huge, and like when you go into government, they tell you ...

"Fix the government, Nicole."

Right? "Don't break the internet."

Right.

They tell you like, "Focus on three things. You gotta have three priorities, 'cause every day's an emergency, and you're gonna get distracted if you don't." So, going in April, I was getting ready to move, and I was like, "Where are the kids going to go to school?" And I was like, "Okay, so I'll do maybe internet governance and free expression, which is sort of where my passions lie, and maybe I'll do some privacy." We were planning to move in July of that summer. And June 9th, if you remember, is when the Edward Snowden disclosures started.

Yeah.

And they were like, "How soon are you getting here?"

Right.

So, I started the next week. And honestly, the first half of the time I was there was spent on privacy, surveillance issues and most importantly, I think, the work that I'm most proud of was the public policy implications of big data.

Right, right, which has been the story ever since.

Yeah.

And also, it was interesting, I was writing my column today for the New York Times, and one of the things that I left out was the damage the Snowden thing did to the relationship between the government and Silicon Valley, which had been relatively cooperative until then. And then it was broken rather badly ...

I think that's right. Like the notion that I would go and talk about internet governance or free expression at a time when the Snowden disclosures were happening, that was done.

Yeah, and they'd been spying on you.

Like you weren't going to talk about anything else.

"You've been spying on us."

Exactly.

Which the companies had assisted the government with. They said they were not.

That's right.

It just went on. It seems like with a lot of what's happened in the Russia thing, you can draw a very bright line: They were not cooperating in the way they used to, which is interesting. But that's a topic for another day. We may get into that. So, you did that, and what was the experience like? You're in the middle of that, that's all there is. That was all there was.

Yeah, yeah. No, it's all-consuming. It was fascinating.

Right.

And there's no other experience like it.

What did you take away from it?

The most important lesson and the one that I continue to try and work on is, I think that we have not done a good job of filling the ranks in government and in the public sector with technologically capable, savvy people.

Yes. Or they're in a backseat position, or they're brought in like the help, like the air conditioning repair.

They think that like you're there to fix the email servers, right?

Right, right.

As opposed to thinking about forward-thinking policy and the ramifications of what it means to use all this technology in our world. And I think we are doing ourselves a disservice by not doing that. I think that there are big and small things that I've been talking with people about. Like what should be the tech agenda for social impact? Part of it is government. Part of it is like, "Just make shit work," right?

Right, right, right.

Like if you can order a chocolate cake on Amazon and have it delivered that day, then you ought to be able to get your Social Security benefits just as easily, right?

Right, absolutely.

So there's that. Then there's the broader vision, which is like, "What's our moon shot?" Knowing what all of our capabilities are, how do we power the next ...

Why is that, from your experience in government? I have some thoughts about it, but I mean, they just don't think about it. Many of the people that go into it are not technologically literate. They operate in a very old system that resists that kind of change. And also, they're fearful of it and wary.

Well, because they don't understand it very well.

Right.

Right, so there's some wariness. I think that's going to change over time. I think that the growing ranks of those at the staff level and at the congressional level are much more savvy than they used to be. If you even just look at congressional testimony, not all the questions are great, but they're way better than they were five years ago.

Okay, if you say so. Oh, the Zuckerberg ones, I was literally like, I was screaming at the TV set.

Like they're reading a Wikipedia page? Yes.

Yeah.

But they're way better than they once were. And I think we're getting better at educating them. I think that we still face a lot of competition. If you're coming out of school with a CS degree, are you going to a private company or are you going into the public sector?

Right, you're not going ...

And there's a money thing, right?

Right.

We haven't instilled in people the sense that they should serve. Like, that this is our government, and it's only as good as we are.

Right.

And they should serve.

So, you were there for a couple of years and then came back.

Yeah.

So, what have you been doing since?

See, I knew you were going to ask me that, and I wish I had a really crisp answer.

Driving your daughters crazy?

Yeah, exactly.

"Mom, please leave." What are they, teenagers now?

They are.

"Mom, please leave."

They are both in high school.

Yeah, yeah.

It's awesome.

Yeah, my son left the other day, and I haven't seen him since. Like he was like, "Bye, Mom." I'm like, "Where did you go?" And then he showed up again for lunch.

It's an amazing time.

Yeah.

I love them both dearly, and it's an amazing time. So yeah, so part of it is being home. Part of it is, I consult for some companies. I do some work with nonprofits like Mozilla Foundation, which is doing a lot of work on the area of internet health.

Right.

Witness, which is a human-rights organization that was founded like 25 years ago by Peter Gabriel. It focuses on training people to do video documentation of human-rights abuse. And in this era when everybody has a camera, you can use this data in so many new ways. Just forward-thinking about that has been a real joy to work with them. Some of it is literally still supporting folks who are in government and giving them some guidance and trying to get more people into their ranks.

Right. And then working on other things. So, what was really interesting is Nicole sent me a whole list of things to talk about when she was here. She thought I prepared for these shows, but I don't. I'm like, "My whole life is preparation."

I was like, "Give me something to focus on, Kara."

No, I don't. That's my thing! I don't help ... Someone on the Twitter today was like, "What do you do to prep these people?" I was like, "I don't prep anybody." Well, talk to Nicole.

I will vouch for you on that.

There's so many topics we have to talk about, Nicole. I think we're going to start with this: What's going on right now in Congress? And I want to get to the wacky stuff at the end, the content moderation, things like that.

Okay.

But let's talk about the Russia stuff.

Yeah.

Because that's what Sheryl and Jack are there to talk about, and they're going to talk to a chair from Google, some Google chair. Why didn't Google just drag Larry or Sundar there?

I don't know. I don't have any reason to believe they have something to hide.

No, I know. Yeah, why? Can they just like put them on one of the Google rockets in Google time and whatever, the machine, the transporter.

Yeah, I know, right? Teleportation?

Teleportation machine they've got in the back there and just move them on over?

Yeah, I don't know why ...

Not Larry, 'cause you got to get him out of cold storage for a while and heat him up and stuff like that, but — you don't have to say anything.

Sundar.

Yeah, Sundar. You don't have to say ... so, they're going there. Not Google, but they did. Kent did some testimony.

Oh, did he submit testimony? I haven't seen it.

They don't want it. They don't care.

Yeah, yeah.

Mark Warner's like, "Screw you." Like, "You either show up or you don't." And they're going to have a wooden chair there, apparently. That's what they tell me.

Right.

Talk about the implications of this. These hearings, I think, are actually more important because they're actually ... there are serious politicians who actually know a few things about things.

I think that's true, but I think it's important to think about what are hearings good for and what are they not good for?

Right.

And I've done five, right? Hearings are super good for putting executives' or companies' feet to the fire, right?

Yes, and it's always a good thing.

It's always a good ... and educating the policymakers and their staff as well as the public about a thing, right?

Right. "Look at this."

And so, like educating on Russian disinformation campaigns, that's good for all of us, and that's an important reason to have them there.

Yeah.

It's good for, again, holding them to a schedule. So, my assumption is, if Jack and Sheryl are showing up, they have some good news to report, right? Like they've made progress since the last hearing that they did.

Right, so what they want to do is say, "Here's what we did wrong, and here's what we're doing to fix it."

"And here's what we've done to fix it."

Right.

And that's also a really good ... knowing that they'll be held accountable, that's a good function of a congressional hearing. What I think it sometimes gets used as is a platform for assigning blame, a platform for political grandstanding. That's un-useful, and anyone who thinks someone's going to go and be like, "We found the silver bullet," that's so not going to happen, because this problem is so complex and so beyond just what the tech companies can do, right? Any expectation that that's going to happen, we should kill that part now.

Right, right.

Right? To me, optimal results would be that you find some agreement about, "What's the easy stuff?" Alex Jones, whatever it is. Find the easy stuff and decide we have agreement on how we're going to handle that easy stuff, whether it's by legislation or something else. And then lean into the hard questions because there's lots of hard questions, and figure out what can we make progress on, even if imperfect? What can we not ... that this is not a tech-company solution, it's a different solution.

Right. Now, this has been a relatively new thing for tech. Now, Jerry Yang did go and get his head handed to him during ...

Got called a moral pygmy.

Moral pygmy, if you recall.

I do.

I remember saying, "Don't go. They're going to do something bad to you there." I'm like, "No, there's no winning on this one, you really did screw it up with things going on there." This was China, but it's been very little. I think Mark was the first really big ... correct?

CEO-level, yeah, that's probably right.

CEO-level big names, because most of it has been ... I think Jerry Yang was the last one I can remember that was. What else did you ...

I dunno. I got sent out a lot.

Yeah, yeah but not, not that you're not big, but you know what I mean?

No, no, but totally. Like, you send your vice president of public policy or you send your head of comms or head of product or whatever.

Right, so Mark was the first big one. How do you assess that encounter? I think it's just the beginning, this is gonna go ... I was like, "Strap on your wooden chairs, people."

That he was gonna have to do it many more times?

All of them, all of them, on everything. And by the way, it's not just Russia, it's gonna go to AI, it's gonna go to IoT, it's gonna go to ... everything, everything. Cars.

Oh yeah. For sure. But whether it'll be a CEO or not I think is up for debate. And here's the thing, you don't create solutions in a hearing. And so all the hard work and all the ... you can use the hearing to get a commitment that something will be done, but you can't actually devise the solution.

My point is that these companies have operated largely unfettered for a long, long time. And they would say not, but I don't know. If you were a broadcast company or a media company you'd be like, "Hey, get on the legal train that we've been on for years." So, how did you assess the Mark hearings?

How he did?

I thought he did well, only 'cause they were terrible. That's really pretty much a low bar.

Certainly the first day. He did worse the second day and their questions were better the second day.

Yes, yeah. For some reason the House, yes, I agree.

Which like, 10 hours in that seat, that sucks for anybody, right?

Yeah, yeah.

So I thought he did fine, and I thought ... the thing that I think really kept sort of bothering me during that was, I think on the second day, and maybe it was 'cause he was a little bit more down, he kept referring to AI as the solution. Like, "Oh, we're gonna start handling this using more AI," and this notion that we can resolve content and disinformation problems just by throwing some AI up at it, yeah it can help, but it's not gonna solve the problem.

Yeah, yeah.

I think that if you were not well-informed about how AI works, just how machine learning works, you thought that was supposed to be a silver bullet and it's not.

It's not. Absolutely.

And it could go really wrong, right? If we do it poorly, we will replicate all the mistakes we're currently making.

Absolutely. My issue with them was the, "Let's put the water under the bridge and focus on solutions." I'm like, "Let's reflect on the problems." I think there's something very good in thinking about where you went wrong.

Yeah. Although, I also remember at the beginning, Mark I think said, "We accept responsibility for ..."

"We have a broader responsibility."

Yeah, but it was also as if he accepted that the outcome of elections both here and in Europe was Facebook's issue, and it's not. Let's not, and you and I may end up disagreeing over this, the way this election turned out is not because of tech or Russian disinformation. We had 63 million people vote for a man who was blatantly misogynist, racist ...

Right, agreed, agreed.

... anti-Semitic, intolerant, right? All of those things, and he wasn't hiding it from us, and 63 million people didn't find that disqualifying.

Right, absolutely. I agree with you, I just literally made that argument. I actually agree with you. Someone was, we were talking about talking to Bannon, and I'm like, "Bannon got him elected." That's what you need to ... like, no amount of amplification by the media will make it any worse or less, 'cause he did it. Which is interesting. But, I see why people are offended by him, obviously. But one of the things that I thought was interesting though, even though 63 million people did, it's where those edge cases where things could've shifted ...

The few states where it was a few thousand ...

And I think we will never know that. I think that's the problem, is there's no way to trace it anymore. There's no way to find out. It's almost like an episode of "Scandal," it's like you're never gonna figure it out. And so, I think that will haunt this entire election process, 'cause it's never gonna be known if there were 50 ads that changed everything or not. Or you could also blame it on six things Hillary Clinton said that could've shifted it too; the "deplorables" thing didn't work very well.

Or how it echoed into the media, right, there's like a huge number of factors.

Exactly, or was it the New York Times writing this, or was it James Comey, there's all kinds of things, but there's definitely a place for ... Facebook could've been one of the shifts.

Absolutely. I think that's absolutely true. And I think figuring out how to ... what is tech's role? Right? For the coming election, especially 'cause we're less than 100 days out now. What is the role? That's super important, but I just didn't wanna lose sight of ...

No, I get that. But I do think that three things for them are:

One, clean up the fake accounts, which they let grow rampant because they wanted growth, growth, growth, fake accounts and anonymous accounts that we can't track in any way.

Two is clean up transparency in political ads, they just didn't do that. They just didn't do their job, that's it. And they're transparent everywhere else, they should be transparent there.

And the last thing is the fake news. The allowance. Again, the sloppy management of fake news.

Yeah. Yeah. And in that first category, I don't know if you were including these: bots. I think these companies ought to be labeling bots. I should know if I'm interacting with a human or an auto human.

That's right.

And I think that that's not just like a bot, but a robot. Because I think Google did a demo of its new voice, like human voice that was super convincing and eerie.

Yeah, it was creepy.

We should know that we're not interacting with a human being, because as we go forward and as this humanlike AI becomes more and more prevalent, there ought to be a clear understanding of when I'm dealing with a human and when I'm not. When I have to care about the response from the other side, or not.

Right. But that's called anticipation.

Yes.

You're anticipating. Lemme just say, tech companies did not anticipate any of this stuff, and ... didn't anticipate, and they were sloppy. Those two things are enough to — fix that, please. Which is interesting.

So these hearings. What do you expect to come out of them? Because there's one in October about antitrust. There's one later in the afternoon tomorrow which will already have been news, with Jack Dorsey by himself around content moderation. Let's talk about that one. And then I wanna know what to expect, actually. Content moderation, which is a nice way of saying censorship, or possibly not? Or First Amendment. You're a First Amendment lawyer. How do you look at this incredibly fraught situation?

Super complicated. No silver bullet.

None.

And I think I'm somewhat frustrated by the level of conversation in each of the countries that is trying to wrestle with it because they are all dealing with it as if it were not global. Right?

Explain that.

So, Facebook's in this really awkward position where it's trying to have a global platform and one set of rules imposed consistently. And the fact of the matter is that every understanding of content is incredibly nuanced from a perspective of what it is in the culture, what it is in the political system, how the legal environment handles a content problem. And so I know what they're trying to do, and I understand that that's the only way to scale it, I just think it's really hard.

And I will say that as someone who gets to say that in hindsight, 'cause I'm not that decider anymore. And in the days when I did it, it was millions, not billions of users, right? It was hundreds of ... I don't remember, it was like ... in the tens or scores of hours per minute on YouTube, not in the hundreds of hours of content on YouTube. And so, I actually had the time to say ... my folks would level up something for me to see, and I would get a day to sort of think about it and get some more information about like, "Well, what does this mean in India? What are the ramifications?" and to touch base with people in India to say, "Should I do this or that?" They appear not to have that latitude anymore, and what I'm hearing is that they have four or five seconds per piece of controversial content to make a decision.

Right.

You are gonna get so many mistakes doing that.

It's the life they chose, Nicole.

It is the life they chose.

And the billions they accepted for doing that job.

So the question is, do we wanna slow that down? Is this the moment where we have kind of like a slow food movement for the Internet?

Oh, that's a great idea.

... and just slow everything down.

So how does that work?

Well, so, here's ... I was thinking about, and I'm not sure I'm gonna directly answer that question, but when I first started at Google, I remember having conversations around the pillars of design for search. I don't think they called it exactly that, but it was like the principles on which you design search. And it might have been Matt Cutts that said there's comprehensiveness, we want all the information we can get; there's relevance, meaning we deliver the right response when someone asked a question; and speed. Those were the three pillars of search.

And then in the mid-2000s, when social networks and behavioral advertising came into play, there was this change in the principles that ... we just weren't as concerned about search anymore, instead we were focusing on this other part of the platform. And the dynamics were around personalization, which is not relevance, right?

Right. It's what you wanna see.

Not what answers your question, but what's more stuff that you like? Personalization, engagement ... what keeps you here, which today we now know very clearly. It's the most outrageous thing you can find. And speed. Right, so speed's still there, but the first two have changed, and that has, I think, propelled this crazy environment that we're in now.

You're absolutely right. That's an incredibly intelligent way of putting it.

So what if we change the pillars again? What if now everything that we've learned in the last two years, we say, "That's not the internet we want to live with"? So this is just personal for me, like, what if the pillars were accuracy, authenticity and context. And maybe that slows it down. Right? So maybe that means that things like Black Lives Matter, or Tahrir Square have a little bit more trouble getting off the ground quickly. Right? Maybe the Ferguson thing, you don't hear about that as fast as you do now, and as quickly among your ...

Which some would say is a terrible thing.

Right? So some things are gonna, there's gonna be cost to refocusing those principles, but maybe that's a different world that we actually ought to be trying to build.

Yeah. Yeah. Do you think they're thinking about it like that?

I have no idea. I hope they are.

Yeah. I don't think they are. Unfortunately. I think they're ... they've got their hair on fire. What they're thinking is, like, "Let's put out the hair."

What do you make of the ... Jack will be in front of the House members who will only talk about Diamond and Silk and everything, like being pushed down. Laura Ingraham talked about nationalizing Google and Facebook. I know, you're rolling your eyes, I'm rolling my eyes, too. But, these are being talked about by people who have serious impact on — potential people who could have serious impact. So, I'm not just gonna roll my eyes, I'm troubled by this.

I'm totally troubled, and here's what I've ...

Can you just ... Google and Facebook do not discriminate like that.

No.

They just don't. PageRank, right? From what I can understand.

Exactly. It's algorithmically-based and it is not about, like, "Hey, I like this political decision better than that political decision." No one's got time.

Right. Right, right. So how do you get rid of that if you're the tech companies, besides just saying it over and over again, without saying, "You're an idiot, stop saying that."

Yeah, yeah. I don't know, and in this environment, and because they are put on their back foot I think it's gonna be super hard. And what that might mean is that folks like me or you or others outside of that environment say, "Hey, that's not actually the thing that we think is happening, nor are we worried about that."

Do you feel there's an actual risk when you have Orrin Hatch all of a sudden, who didn't talk about antitrust, [now] he's talking about antitrust, or maybe the president tweeting that Google's trying to skew his search results?

I do worry about it, but I also think that we have to have an honest conversation about what they are looking for. Right? Because some of the solutions that I see get bandied about by folks who are not as sophisticated about understanding what's happening. Like, "Well, there should only be verified users on these systems." Right? Or, "We should have these really blunt instruments ... we just don't allow that type of content at all." Verified users, large blunt instruments of censorship, those are authoritarian government tools. Right?

And so at some point, both we and these companies are gonna need to stand up and say, "The things that you are asking us to build in service for this democracy are tools that will be used in China, in Russia, in Turkey, in Saudi Arabia," right? "Appreciate the fact that we are the global platform, and that what we build, everybody will have the right to demand."

Right, we'll get to Google in China in a minute, but ... be careful there. But when the Alex Jones thing came about, what did you think? You know, I thought they should remove him.

Yeah. I didn't ...

They've removed other people.

Like, how many strikes do you have to have before you get to it? It felt to me like they had to update their policies to mute the level of vitriol he was propagating.

How would you have dealt with it as a lawyer? They dragged their feet for a while, then all of a sudden they all flipped.

Yeah. I think it is hard ... here's what I think they were struggling with.

Actually I think they were struggling, right.

I think they were struggling.

And they were like, "Those damn Apple people did it before us."

I think part of it is this notion of how do I think about being responsible for understanding stuff happening off my platform? Like, how do I incorporate that into a policy? Because again, we're talking about a scale issue, and I may have, in the old days — I feel like I ought to be in an old rocker with a cigar or something — but in the old days, yeah, maybe I had a day to think about that, but they don't. And so how do you do the diligence of understanding the off-the-site ramifications of what's happening, or incorporate it into your policy?

What would you have done with Alex Jones?

I probably would've done the research to figure it out.

And then?

And then made a call. But I also think I probably made inconsistent, bad decisions while I was there. So your thread is that you will make it inconsistent ...

What would've been your Alex Jones call, knowing what you know?

I probably would've removed him earlier.

Yeah. Yeah, yeah. I think so, because it just got worse and worse.

Until someone's yelling at you.

So you talk about this documentary called "The Cleaners." Explain that, and then in the next section we're gonna talk about techlash and diversity and social issues.

There's a new documentary by two directors from Germany who wanted to explore what this content moderation industry is. And it was born, I think, of research done by a professor down at UCLA named Sarah Roberts. So she is the one who uncovered that there are thousands of contract workers in the Philippines cleaning up all of these social media platforms, right? And making calls that, like, if we were here in the United States, we might not quite make the same call.

So, this documentary, they actually go back and interview a bunch of contractors about what they understand to be the rules, the decisions they've had to make on terrorist content, child pornography, self-harm content and all this stuff. It's fascinating to hear from the contractor's perspective what they think their obligations are.

And what do they think?

They are trying really hard to follow the rules, and they have seconds to make these decisions on thousands of pieces of content in an hour. Right? And the interesting overlay to me was, like, this is a very Catholic country, and they bring a lot of their person and their identity to work with them when making decisions about this content.

Wow.

So when you think about the complexity of these takedowns, it is actually really hard to create rules that, when an American user posts something that's visible in Turkey and reviewed by a Filipino contract worker ... like, what is that? Right? Who's winning in that scenario?

Okay, we should just shut down Facebook ... right? And Twitter. Let's just do that.

It's really, really hard.

Let's just go back. I had Jaron Lanier in, and he was like, "There's never been a human experiment on this level, of people talking to each other in this fashion." But when you think about it, the idea of "The Cleaners" is a really interesting one because it's sort of that back room ... like, you don't wanna see how it's made. You know, my son just got a job, he's a chef, and he's young, but he cooks and he's in a restaurant and he was like, "Mom, you don't wanna eat there."

How the sausage gets made.

No, I was like, "What are you talking about?" It was over a small pickle incident, but it wasn't that bad. I was like, "Oh, I'd eat there, it's no problem." But it sort of was like, when you see how things get made, it's really ... and they are trying to rely on AI when they don't even know what that means, I think.

Yeah, and I think getting that right is gonna be so important. So, two thoughts on that. One is, when you understand the complexity of this, I think that it is super hard to hold the tech companies fully responsible and insist that they make no mistakes, which I feel like some of the rhetoric is definitely ... like, you never get to make a mistake about that piece of content, and that paralyzes a company that's trying to do the right thing.

Yeah, they're still back on that My Lai, that mass ... Vietnamese picture of the little girl running.

Exactly. And so that's one thought out of that. The second thought is on the AI piece, which is ... I do worry, and this might just be rhetorical, I worry about leaders who are saying, "We're gonna have AI fix that," and they may be like, "As protection, this is my flak jacket ... the machines will fix it."

My experience is that you need to see the content. You need to make the moral call on the Rohingyas or the child porn or whatever, because if you don't you have delegated your morality to a machine and that is wrong. We shouldn't do that.

That is really a fantastic point, Nicole. Do you think that they get this now? Because I think they don't want to be seen as media companies. They don't... They want to keep that immunity. That's a law. They're not held responsible for it so it makes you lazy and sloppy.

I think they're struggling in the sense that like, I don't think that they want to abrogate all of their responsibilities, but they don't actually want to take on all of the baggage that comes with being a media company.

Is that just too bad now? Will they lose that immunity, do you think? I think they will.

I think it started — the SESTA-FOSTA thing.

That was the recent thing around ...

Prostitution and sex trafficking. I think there are people who are gonna start chipping away at it. I think you were interviewing, was it Warner?

Warner, yeah.

About ...

Wyden.

It was Ron Wyden I'm thinking of, talking about redoing Section 230, about this immunity. So I think that's certainly a conversation that's happening and has been happening for years. I feel like the current designations we give to these companies don't fit, and so maybe we need to find a different one.

Yup. They may not be "media" but it's some kind of media. It's still social media.

But it's not a platform, right, it's not as if they were under the impression that they're just a dumb pipe. That's not it. So somehow we need to rejigger that.

Let's talk a little bit about this thing you call the "techlash." You wrote this in a note to me — techlash and tech's ethical issues. We were just talking about this. These are ethical issues that I don't think they want to take responsibility for, never understood.

I wrote a column, I think, saying Mark never took a humanities course and perhaps he should have taken one or two; a little Kierkegaard might have done him a little good. He was the one that brought up Holocaust deniers and walked right into a bad bunch of quotes about it. How do you look at this techlash? Where is it going, from your perspective? It's easy, because it's not just that, but addiction, automation, robotics, it goes on and on and on.

Yeah. It was funny, I was doing a talk for women in STEM recently, I had to do this ... Okay, I have a little bit of impostor syndrome because I'm here as a former deputy CTO, but in fact I have like every liberal arts card stacked against me. I'm an American Studies major with a minor in English and a poetry fellowship, right? Like I don't cover any of the spaces.

But maybe that's actually what tech needs right now. We need more sociologists and ethicists coming into the tech sector to talk about the broader implications of how tech gets used. And I think we're starting to see that.

So a group that I'm on the advisory board for called AI Now, which is run by Kate Crawford and Meredith Whittaker out in New York, and they are all about, what are the social implications of AI? And I think we need more of that conversation. danah boyd's Data and Society likewise, doing this really interesting hardcore research on the implications for humans when we use these technologies. That, I think, is super important and will help turn the cultures in some of these tech companies around if they're smart enough to grab onto them.

You've been close to these leaders. Are they amenable to it?

I think they are but I don't think they know what that role looks like. Right? Where do you put an emphasis ...

Well, they never liked the non-tech people, really.

Yeah.

I mean, they pretend to. They tolerate them.

I know, so I got tolerated in a lot of meetings.

Yeah.

But I think they know they're missing something, right? All the smart ones know they're missing something, and so I think they are looking to figure out how to put that into the system. But it is a system that's built on very straightforward, fast, iterative development, that doesn't necessarily do a lot of stop and think about the human aspects of it. Nor could I imagine a world right now, knowing how a lot of products get launched, that you would say, "Hey, we shouldn't do that until we understand the ethical ramifications." Right? I just don't see holding a launch for that one.

Yeah, you wouldn't get a lot of things done, would you?

Yeah, so I think if we want our companies here to be super innovative and productive ...

Because they're not doing that over in China, for sure.

Exactly. Right?

That's their argument now.

China is going to sweep the table with us.

Exactly, that's their argument now, right? When I had Mark on the podcast, he's like, "Well, if I get hindered, what about China?" And I'm like, "What? That's my choice? Xi or you? I don't feel like that's the choice I have to make."

But I do think we need to think through how we get ahead of the standard-setting that China will do otherwise, because it may not be hampered in the same way.

Talk about that sweep the table.

AI is actually the place where I'm thinking about this.

In surveillance and robotics and facial recognition.

Right? To develop really strong AI, you need a lot of data. Well, if you have an authoritarian government that says, "Hey, we're now all doing facial recognition," you suddenly have a lot of data.

Which they're doing in China.

Right. Which if you're in the United States or Europe or whatever, you have to get consent, and that consent can be withdrawn. All kinds of hurdles to collecting the data, which means we'll be slower. And I don't have an issue with that except for the fact that China has the ability to employ technology that will simply be the dominant technology if they get there first. Right? Or could be.

And what does that technology look like? It tells you everything.

Right. Exactly. What is it they're doing ... your social credit score, or something like that, based on your facial recognition.

That was a "Black Mirror" episode, do you recall it?

It's terrifying, and so the question is how are we weighing in on that? And can we weigh in if our technology can't meet theirs?

Right, right. So what do we do?

I'm not sure. I mean, part of me says it means that all of the countries that are doing this work need to get together and say, "This is acceptable use." And China's gonna be the outlier.

So a cyber doctrine.

Yeah.

What can be used, and Warner talked about this in my interview, there's cyber doctrine over what we can do and what we can't do. And if people violate it, we will lose our tools to stop it.

Which means we need to start using more of our global forums, right? To make policies.

As we did on many other issues.

Yes.

Pollution, or things like that.

Nuclear warheads, all kinds of things.

So, who's the leader then if we have an administration currently who doesn't have a CTO... and make that note, how do you feel about that?

I'm so sad for that part of our government.

There was a real estate guy there, but he's gone. No, I'm serious, he was, he's very nice.

Yes, he was.

There's nobody there, like empty. It's fascinating.

Right? At a time when we are ...

They actually let me in the White House, I can't believe it, but they did.

And did you steal M&Ms?

No, I didn't, I brought my phone in and taped everything, like Omarosa. But you could, by the way! I was such a good girl, I put it in the box, and nobody checked it, and I was like, "Wow, they used to check it." It was really kind of shocking.

Totally separate ... the internet policy issues, which I used to cover ... gone. Right? Nobody's watching that. Cybersecurity? Gone, no one's there. But honestly, I'm thinking about biotech, Zika, Ebola, climate change, all of these things where there is no ...

Input.

Scientific input. Right? Much less policy input on what to do about those things.

Thank goodness we don't have a Zika, who would tell them what to do?

How would we know?

How would we know, which is interesting. There's a new book coming out from Woodward which is frightening actually, because they're basically taking paper off his desk so he doesn't see things. But was that a good thing to have a CTO, do you think? Or deputy CTO, because why?

Because I think you need a voice for tech at the table. And this goes back to where we started the conversation. Technology is not someone who provides your email, right? Technology is actually a moving force that can deliver services well. But we're not just fixing websites, this is about reaching millions of people. Right? To make their lives better, that's where the possibility exists and you want people who are in the CTO's role who can envision that.

And give advice.

And have a seat at the table of the president. Tell him that that vision exists.

And he would just want you to help do Twitter, right? Trump, yeah. He does a pretty good job, actually.

Yeah, he seems to be a master.

He's real good at it. But I want to get back to China before we go, and then talk lastly about diversity. Google and China: You worked on the coming out of China, going in and coming out.

Yeah.

Which was a big call at the time.

It was, and I think the company struggled with it going in and coming out. And candidly, I would stand by both decisions. I would, to this day, stand by the reasons we went into China.

Which were? Explain them.

To make the services more broadly available to a significant part of the population that was — I don't think that Google would have ever said this — but was information-poor. They just didn't have access to as much as you could get. So even when you censor ... I think the way that we managed search results, when we knew that there was an area the government would not let us show ...

Mm-hmm. Falun Gong.

Right. Exactly, but we would show at the bottom a disclaimer that said, "Some results have been withheld because of government restrictions." But then, at least Chinese users knew what they were missing. And before, they had no idea; they thought they were seeing the whole world. So I think that giving them that instinct, to have a sense for what is missing, was an important ...

Although the government now might [not] let you put that warning...

Well, I think that's right. So now that we're gone, is that still the way things work? I think that that was actually a really important thing. I also think candidly, the way that Google ...

That's the optimistic ...

I know, but ...

"Candide" version.

That might have been the time that I was there, and the things that I believed.

But then coming out ...

But coming out, over the course of the years that we were there ... increasingly repressive. Increasingly censoring. And then, intrusions onto the Google systems by Chinese hackers. All of us said, this isn't worth it, right? There's no way we can protect our users and defend our mission with this happening, and so we left.

And I think that that was the right call too. And I thought what we did was we left in a noisy way, to make sure that people understood what was at stake. I think it's really interesting, the questions that I got from Congress during that period, some of them were around this issue of, "How can you undermine democratic values by acceding to that government and censorship?"

When you were there?

When we were there, and now the questions are flipped. "Why aren't you censoring more?" Right?

Yeah, for our stuff.

For our stuff!

You know what I think they're mad about, is Facebook has pulled away a lot of political and news stuff and put in more cat videos again. And they don't like that.

And they don't like that. It's taking time out.

Yeah, exactly. But that's my theory. That's my working theory.

I don't think it's wrong that they've changed their focus. I'm saddened that that's the place where it is right now.

What about going now to China? It's gotten a backlash within Google.

Yeah.

Because it has worked for the Defense Department, and has worked on all kinds of things.

I don't know enough about their decision-making around that.

What do you imagine? It's the same criteria, you're going to help China be better?

I haven't heard what they want to do. Yeah, but I think it matters a lot how. I think when we first went in, it was with this notion of, "Hey, we're gonna have disclaimers about what we removed," but we would not have authenticated users in China, so that we would never be in a position of having to deliver a user to the Chinese government. I don't know whether those are the constraints, but I would hope that they are thinking through their ...

Why did they want to go in now? They just can't miss this one?

Yeah, I mean, data, right? Huge swaths of data. And I don't disagree with its value either, right?

Dirty data. Sorry, I don't know, I know it seems ... they haven't gotten better, that country hasn't, it's gotten more authoritarian. And so has the rest of the world.

And it's not gonna get better.

There is no way they're gonna do it, and so they're just gonna have to admit what they're doing. Which is interesting. What changed in that regard, just, "We need the data"?

I don't know. I totally don't know. I will say, for China, and I don't know any background to it, Kai-Fu Lee, who actually used to work ...

Who I'm talking to tomorrow.

Oh you are? So, he used to work for Google, and now he's there. And I think he's been saying some really interesting things about AI and the proper role for AI and the proper role for humans. I actually think that that's a more interesting conversation than I'm hearing in a lot of the United States.

There's a lot of cool stuff going on in China at the same time. I do think they're gonna clean our clock in many ways.

I'll be fascinated to hear ...

But authoritarianism helps, helps to be able to do that.

So last thing, we just have a few more minutes. This diversity issue in tech is something ... you have this group of women, mostly women in AI now, is that right? Or it just happens to be ...

Oh no. I think it's across the board.

So another thing. California may rule that we have to have so many women on the board, things like that.

Yeah, super interesting. After I left the administration, the Obama administration, they kicked off an initiative, I think under Megan [Smith], which is diversity in tech, which makes total sense. I think it was also coinciding with the Ellen Pao case. So there was lot of movement at that point. Pinterest was talking about, "Everybody publish their numbers, let's put more money into it."

And then it all went quiet, and then there was #MeToo. I think that there's been a huge amount of progress by folks like Aileen Lee and All Raise and Sukhinder's theBoardlist. And Megan because she's in everything, continuing to promote diversity across the board and STEM. I have heard at the CEO and board level that the right questions are now being asked by white male folks ... "What am I gonna do to change the diversity equation in this company?" That's awesome. And I find myself impatient that it's not happening faster.

To finish, make the product case for it, for a lawyer that should have this.

Yeah, I think particularly for those products that are very social, to not have the perspective of women and marginalized people telling you "what feels dangerous to me," or "what would I really like to see" — you are missing an opportunity. I completely buy all of the data that says the financial performance of more diverse companies is significantly better. I completely believe that, because you are actually looking at your user base, within your company. And having them co-design with you a better product. That, to me is a no-brainer. The fact that we cannot seem to retain those people and get them to higher levels, I'm so ...

Because now they're just letting them in, they're not helping move them up.

Exactly, they're stuffing the pipeline. Their numbers look good there but they ... I am still hearing ... people are in rooms where they feel like they're alone and they have no voice. And whatever we've got to do to change that, we have got to do it faster.

Absolutely. This has been a riveting discussion, Nicole, I'm having you back. You're my new Chamath.

How nice.

That's a compliment, even though we all know Chamath, that's a compliment. In any case, this has been great. These are great topics, I'm gonna have you back to talk about more, thank you for coming on the show.

Thank you so much.

By the way, Nicole was pinging me on Twitter, right?

Yes, that's right!

And then I said come on the show and tell me that. So anybody who does that — except for that real mean person today — that would be great.

No mean people.

No, I'm okay with mean people.

Okay.

I like mean.

Really? I don't need any mean people in my Twitter.

I don't mind, I want to talk to Steve Bannon, so what do I know?

That's true, you are a glutton for punishment.

Yes, I am.

