Welcome to Hot Topics in Computing at MIT.
I'm Dan Huttenlocher, the dean of the Schwarzman
College of Computing.
We're really delighted to have Brad Smith, the president
and also the chief legal officer at Microsoft.
While many of us see technology as a double-edged sword,
as technologists we often tend to see the opportunity
and the potential for good in new technologies,
maybe blinding us to the other edge of the sword.
Brad really stands out in his ability
to articulate the balance of positive and negative impacts
from technologies and how to get better outcomes
overall from understanding and addressing these in advance.
He's really transformed how Microsoft
and much of the tech industry see computing technology.
Growing out of the Department of Justice actions
against Microsoft in the 1990s, he
began framing the challenges as well as the opportunities
of new technologies in ways that technology leaders could
appreciate.
He successfully brought Bill Gates,
one of the great technology leaders of the 20th century,
on this journey, and, I would say, instilled in him and the company an appreciation of the important roles of many societal and government actors as we think about technology.
I think that we have a lot to learn from Brad.
While I know that we're going to learn
a lot in today's fireside chat, I also
really highly recommend his recent book, Tools and Weapons.
It has compelling, even, I would say,
page-turning vignettes, each of which
makes deeper and important points about technology
and its positive and negative impacts.
We have about 250 questions that participants
submitted in advance.
We'll be getting to some of the more frequent ones of those.
And you can also enter questions at the bottom of your screen.
Brad is going to be joined by Daniela
Rus in this fireside chat.
She's the deputy dean of academics
here in the Schwarzman College and the director of CSAIL.
I'm sure we're in for a great conversation.
Daniela and Brad.
Thank you, Dan.
We are so excited to welcome Brad Smith
to our virtual campus today.
Our teams first discussed this event in the winter.
And this feels so long ago by now.
Because Brad, there's been so much happening
in these past few months that has
had a big imprint on technology, so I'm really
eager to get into this conversation.
We have many MIT students with us today.
And they submitted over 200 questions.
So let's get started.
Now first, I saw that your CEO, Satya Nadella,
said that Microsoft has seen two years' worth of digital transformation in two months.
And all sorts of organizations are working
on COVID-related research.
I'd like to start by asking you, what
do you view as the biggest challenges
Microsoft has faced so far and what lessons have you learned?
Well, first, thank you, Daniela.
Thank you, Dan.
And thanks to everybody for watching this.
I do want to just open by saying how much we
at Microsoft appreciate the opportunity to work with all
of you at MIT and, frankly, how much we
learn from working with all of you.
It's such an important relationship and partnership
for us.
Eric Horvitz leads a lot of the work
here to work with you all, including on the issues
that we'll be talking about today around trustworthy
and ethical artificial intelligence
and other technology.
But I do think-- look, let's start with the obvious.
This is a time unlike any in our lifetime.
You literally have to go back 102 years
to find a pandemic that has had the reach that COVID-19 has.
I think it's interesting to think
about how different the world's response is in the year 2020,
not just compared to 1918, but just imagine if this
had happened in 2010 or 2005.
In so many ways the world has gone digital.
And this event itself, I think, really illustrates that.
You know, interestingly for us at Microsoft,
having our own people work from home
has not been that difficult. Trevor Noah
joked with me-- we work together on a number of things--
that we were sort of practicing for this every day
just by being Microsoft, by developing this technology
that we use every day.
But our role throughout this has really
been to support all of the other organizations and institutions,
governments, nonprofits, businesses, literally
on a global scale that have had to transition
to so much more remote work.
And one of the interesting things
you realize, and I think it speaks
to sort of the multifaceted aspects of technology
that we're talking about today, is that of course you start
with the technology itself.
What is it that people want to deploy?
Is it Teams?
Or is it more cloud services?
That is sort of a natural starting point.
But then you immediately realize that you
can't deploy that without fundamentally changing business
processes, government processes.
We've seen the Ministry of Justice in Italy
take all of its trials online.
We've seen so many companies take so many
of their business processes, customer delivery
processes online.
And so we have to work hand-in-hand with them.
We're not a company that can just say,
here's the tech, good luck, let us know how it turns out.
We work with them to then customize their solutions that
work on top of our platform.
And then there's one final aspect
that I think has been so interesting.
It's the old adage that you need a clear strategy,
but ultimately, culture eats strategy for breakfast.
I think this has been illustrated
really interestingly in our schools,
including our public schools in the United States.
I think they've struggled in many states.
They don't have the connectivity for students at home.
They don't have the devices.
But even when they do all of that,
you really realize you have to work with teachers.
They need to learn not just how to use the technology
but to create a culture for a classroom
when the students aren't in the room.
And so it's not just a huge technology challenge,
it's an enormous human challenge.
And I think the great opportunity
for us has been the ability to just stand next
to our customers and sort of walk a mile in their shoes
so that we can try to explore all of this together.
You know, that's been just an extraordinary opportunity
for learning for us.
Are there particular efforts you have
been most proud of with respect to this work?
Well, I think in the short run, this is a public health crisis.
It's a public health crisis that has created
a profound economic crisis.
But I'd start with public health.
I think one of the things that has emerged
is that data has become the indispensable tool to address
the public health aspects of this.
Think about the decisions that governors are making in the United States. They've never impacted our lives even remotely as much as they have in recent months.
They're running the economy.
They're deciding who can leave their house
and who needs to stay home.
But they are decisions that need to be based on data, data
about the spread of the infection,
data about hospital capacity, data
about different counties in the state
and the mortality that is being seen and the like.
And so we've worked very closely with the counties and states
in the United States, with national governments,
and with the World Health Organization
to really transform the way they are collecting data,
the way they're reporting it, the way they're sharing it
with other leaders, the way they're
sharing it with the public.
So that has been just sort of one of the slices
I've seen in the public health space.
I think another really interesting example
does come from the AI kinds of technologies
that you all at MIT have been so involved in.
It has been sort of a moment of real importance
for conversational AI.
We had developed an AI, a chat bot that was being deployed
by a hospital here in Seattle.
Well, it's now being deployed by over 1,500 institutions
in 23 countries.
That's amazing.
Yeah.
And basically, what it does is it allows an individual who
may think they might need a COVID-19
test to go through a conversation that
will begin to determine whether they in fact do.
And if so, it can hand off the individual
to a human via telehealth service.
But in April, that was used 190 million times.
So just think about that, that explosion.
And that is, I think, a great manifestation
of two years of digitization taking place in two months.
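To make that screening flow concrete, here is a minimal, hypothetical Python sketch of a rule-based triage step in the spirit of what Brad describes; the symptom questions, thresholds, and handoff labels are illustrative assumptions, not the actual Healthcare Bot logic.

```python
# Illustrative sketch only: a toy rule-based triage flow, not Microsoft's
# Healthcare Bot. The symptoms, rules, and handoff labels are hypothetical.

from dataclasses import dataclass

@dataclass
class Answers:
    fever: bool
    cough: bool
    shortness_of_breath: bool
    known_exposure: bool

def triage(a: Answers) -> str:
    """Return a next-step recommendation from yes/no screening answers."""
    # Severe respiratory symptoms go straight to a human clinician.
    if a.shortness_of_breath:
        return "handoff_to_telehealth_clinician"
    # A classic symptom combination or known exposure suggests testing.
    if (a.fever and a.cough) or a.known_exposure:
        return "recommend_covid_test"
    # Otherwise, advise monitoring and re-screening if symptoms change.
    return "self_monitor_and_rescreen"

if __name__ == "__main__":
    print(triage(Answers(fever=True, cough=True,
                         shortness_of_breath=False, known_exposure=False)))
```

The point of the sketch is the handoff structure Brad mentions: the bot resolves routine cases and escalates the rest to a human via telehealth.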
That's truly extraordinary.
And now let me stay with COVID data for a minute and ask you: I know you've said that you are skeptical about the use of contact tracing. I'm just wondering what leads you to that view.
Well, first of all, contact tracing
is of fundamental importance in the case
of any communicable disease.
Now I will be quick to say, I say
this as a person who on New Year's Eve
could have written down perhaps everything I
knew about public health on the back of a postage stamp.
This has been a crash course for all of us.
But certainly what we learn from the present
and from the past is that for over a century
whenever anybody has a disease that
is spread in an infectious way, they're
interviewed by a public health officer.
And basically part of the routine
is, where have you been, who did you see.
And the public health officer then contacts, typically
by phone or email, each of those individuals.
That is a process that is fundamentally important.
Now we're talking about deploying apps for that.
I think that's a good thing.
Don't get me wrong.
I think that app-based contact tracing is an important tool.
But I think we should recognize that it is a tool and not
a panacea.
It is, in my view, sort of a belt-and-suspenders approach.
We still will need public health officers
to interview each individual who tests positive over the phone
or in person, even if they've been using
an app on their smartphone that has,
say, something like Bluetooth-based contact data
on it.
Why?
Well, the first reason is not everybody
is going to walk around with an app on their phone.
In Western Europe, in North America, it's very clear,
governments are not going to make this mandatory.
Even in Singapore, where it's not mandatory
but they have exhorted the population
to use this technology, they've struggled to get it
above a 20% adoption rate.
The second thing to just remember
is even if you could get everybody
who has a phone to use the app, not everybody has a phone.
Some of the most vulnerable people in this pandemic
are homeless people.
There are elderly people in nursing homes.
So we just can't base our planning on the assumption
that everyone will have this app.
Finally, we need to recognize, this
is a smartphone we're talking about.
It's not an ankle bracelet.
People forget to take it with them to the grocery store.
The battery dies.
So I think it's an important tool.
I think it's very valuable.
I think it can make life easier and it
can make this whole approach to contact tracing more effective.
But it won't replace the humans who need
to do this contact tracing.
And frankly, it won't replace the need
for individuals who test positive still
to go through a lengthy interview.
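For readers curious how Bluetooth-based contact data can work without tracking location or identity, here is a deliberately simplified sketch; the class names and matching step are illustrative assumptions, and real designs such as the Apple/Google exposure-notification framework add key rotation, time windows, and signal-strength filtering that are omitted here.

```python
# Much-simplified sketch of Bluetooth proximity logging: phones exchange
# random tokens; if someone tests positive, their recent tokens are published
# and each phone checks locally for matches. Names here are hypothetical.

import secrets

def new_token() -> str:
    """Generate a fresh random identifier to broadcast over Bluetooth."""
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.my_tokens = []        # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens observed from nearby phones

    def broadcast(self) -> str:
        token = new_token()
        self.my_tokens.append(token)
        return token

    def observe(self, token: str) -> None:
        self.heard_tokens.add(token)

    def check_exposure(self, published_positive_tokens: set) -> bool:
        # Matching happens on the device; no location or identity is shared.
        return bool(self.heard_tokens & published_positive_tokens)

# Usage: two phones near each other; one owner later tests positive.
alice, bob = Phone(), Phone()
bob.observe(alice.broadcast())
positives = set(alice.my_tokens)      # Alice uploads her tokens after testing positive
print(bob.check_exposure(positives))  # True: Bob learns of a possible exposure
```

Nothing in this scheme tells Bob who Alice is or where the contact happened, which is exactly why, as Brad notes, it supplements rather than replaces human contact tracers.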
What about privacy with respect to contact tracing?
Is that a concern?
Well, I think it absolutely is.
And I think it's an important thing for us to focus on.
And I think it's the kind of thing that should call on us
both narrowly and broadly to remember,
we need to address some new problems
but we don't need to throw what I would call timeless values
out the window to do so.
A few weeks ago we published seven privacy principles.
And in some ways, I think some of them
reflect the degree to which you can harmonize, say,
the protection of privacy and the protection
of public health, and then you can get to the issues
there where there may be some real tension between them.
But for example, if we are using this app and this data
to protect public health, I don't
think it's a stretch to say then the only proper use of the data
is to protect public health.
And I don't think it's a stretch to tell people
that their data will not be retained
beyond the point in time in which it's actually
useful to protect public health.
Those are two principles where I think we can actually advance
both causes at the same time.
There are other areas where there are certain tensions,
but we believe these can mostly be reconciled
and we think we can do this in a way that,
by and large, strongly upholds the protection of privacy.
And I think that should be our goal.
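As a rough illustration of how the two principles just mentioned, purpose limitation and time-limited retention, can be expressed as simple checks in code, here is a hypothetical sketch; the allowed-purpose label and the 30-day window are assumptions chosen for illustration, not Microsoft's published policy.

```python
# Hypothetical sketch of purpose limitation and retention limits as code.
# The purpose string and 30-day window are illustrative assumptions.

from datetime import datetime, timedelta, timezone

ALLOWED_PURPOSE = "public_health"          # hypothetical purpose label
RETENTION_WINDOW = timedelta(days=30)      # hypothetical retention period

def may_use(collected_for: str, requested_for: str) -> bool:
    """Purpose limitation: data collected for public health is used only for that."""
    return collected_for == ALLOWED_PURPOSE and requested_for == ALLOWED_PURPOSE

def is_expired(collected_at: datetime) -> bool:
    """Retention limit: flag data for deletion once the window has passed."""
    return datetime.now(timezone.utc) - collected_at > RETENTION_WINDOW

# Example: data collected 45 days ago for public health, requested for advertising.
old_record = datetime.now(timezone.utc) - timedelta(days=45)
print(may_use("public_health", "advertising"))  # False: purpose not allowed
print(is_expired(old_record))                   # True: past the retention window
```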
So this is a great segue to talking about your book, Tools
and Weapons, which I finished reading over the weekend
and I enjoyed it very much.
The book read to me like an action-adventure story with geeks as the heroes.
So thank you for that.
So to me, the book summarizes your views
on the roles companies should take
in having a more moral stance on how
technology is viewed and used.
And the book also includes some great historical context.
I loved the backstories for events
I remember, for instance, failing to track Daniel Pearl's
kidnappers, successfully finding the Charlie Hebdo attackers,
and hearing about how your CEO, Satya Nadella,
decided that the security patches
for WannaCry should be broadly available to the public.
And so my first question is, what
inspired you to write this book and what did you
want to communicate to the public with it?
Well, first of all, thank you for reading it.
I do find that when you do an event like this,
it encourages people to finish it the weekend before we talk.
So thank you.
I love your description, because I do think you're right.
In our minds, geeks are the heroes of a lot of dramas
that involve very important human problems.
Carol Ann Browne is my co-author.
We work together at Microsoft.
And we really wrote the book to try
to reach a broader audience, to bring these stories to life,
to let people see how they have been addressed in practice,
not just at Microsoft but in many other places as well,
to draw on some lessons from history.
Because in our mind, the title of the book really captures the challenge we think we face.
Digital technology, I think, is the world's most powerful tool
for so many huge problems, but it has become
a formidable weapon as well.
And so our ultimate goal is to broaden the conversation
that we think the world needs to have about where technology is
going, how it needs to be managed,
and how it needs to actually build
on the geeks who are the heroes, but now recognize that we're
at a point where these issues are
of broad importance for everybody.
You don't have to be a student at MIT.
You don't need to have a PhD in computer science
to have an important role to play.
These issues affect all of us.
And if nothing else, as consumers,
as citizens, and as voters, we're at a point in time
where I think we all benefit if everyone is better informed.
So let's dig a little bit deeper.
In your book, you say that the two inflection
points of this decade are Cambridge Analytica and Edward
Snowden's revelations.
Can you elaborate what makes these events so
important in your mind?
Well, if you look back at the period from 2010 to 2020,
I do think these were two inflection points.
The Snowden disclosures happened in the summer of 2013.
And what they did, more than anything else, I think,
is awaken the public, not just in the United States,
but around the world, to the degree to which governments
had access to so much data about each and every one of us.
And it led to an effort to reform government surveillance
and create more protection for people.
One of the fascinating moments for me
is a story we share in the book.
It was being in the Roosevelt Room in the West
Wing of the White House at a meeting in December of 2013
with President Obama, Vice President Biden, the White
House senior staff, and 18 leading tech executives.
And as each of us were pushing President Obama to reform
government surveillance, there came this interesting moment
when he looked around the table and he said,
I just want you all to recognize,
you all have more data about people
than the government does.
And he said, I have a suspicion that the guns will turn
and there will be a moment when the demands that you're
placing on the government will be placed on you as well.
I thought it was a very insightful comment at the time.
I wrote it down, and I looked around and I was struck,
no one else was writing it down.
And that moment came.
It came-- it took five years, which was, frankly, longer than I predicted at the time. But it came in 2018, and it was Cambridge Analytica.
I think for a variety of reasons that we describe
in the book, that is the moment when the public focused
more on the amount of data that tech companies have on them.
And it's really a watershed political moment, in my view,
because it started to unleash more scrutiny of tech
companies, certainly in the United States,
but equally important, perhaps even more important,
on a global basis.
So data brings us back to privacy considerations.
You have a wonderful chapter in your book
that presents privacy as a human right.
And you start this chapter on privacy
by describing your experience in Berlin when you visited
the headquarters of the Stasi.
And for me, this was a very powerful moment of the book,
since I was born in Romania during Stasi-like times.
You also say that this was one of the most memorable moments
of that year for you.
Why is that?
How did this experience of visiting the Stasi headquarters
impact your views on privacy as a human right?
Well, it really was a moving moment for me, for Carol Ann,
and for others from Microsoft who were on this visit.
In what was East Berlin, the former Stasi prison
is now a museum.
And we had the opportunity to walk through the prison
with an individual who was a prisoner there in 1968.
You know, he's a lot like a lot of us
who are all connected here today.
In 1968, he was 24 years old.
His name is Hans-Jochen Scheidler.
He was pursuing a graduate degree in physics at the time.
He was studying at the University of Prague.
And 1968 was the year of the Prague Spring.
Well, he was back in Berlin when the Warsaw Pact tanks rolled through Prague and shut down the liberalization
that was taking place there.
He and three of his friends were very upset about this.
So they printed up these little leaflets
and they handed them out to people one evening.
He was reported to the Stasi.
They picked him up that night.
They took him to this prison, where for many, many months,
he never saw another human being except for the individuals who
were interrogating him.
He was never given the opportunity to read anything.
He was never given the opportunity to write.
His parents had no idea what had happened to him.
But when you think about what put him there
in the first place, it was this extraordinary degree
of surveillance that the Stasi had unleashed
on the people of East Germany, using all of the technology
and hundreds of thousands of informants.
And when you think about what he did, he wrote a few lines and handed them out to people.
Well, in our day, that's called sending a text message.
It's called writing an email.
And with cloud computing and online services,
we now live in a world where if we're not careful,
any government can use the power of technology
to unleash surveillance on a scale
that the Stasi could not have imagined.
And you see what can happen when that kind of power is abused.
And I think it's just really important for us
to use that as inspiration.
One of the most interesting moments for me
was to share a short video that we produced.
We had a camera crew with us on that visit.
And we have an annual sales meeting at Microsoft.
It's in T-Mobile Arena in Las Vegas.
You're speaking to 15,000 people.
And as you can imagine at a sales meeting,
mostly it's, this was a great year, we're doing really well.
Everybody's so enthusiastic.
And I showed that video.
And I frankly worried.
You know, I was like, well, this is a bit of a downer.
Is this going to land well?
But I thought it was so important for our employees
to see that through their own eyes.
And I will frankly always remember
not just the visit to the museum and the prison,
but the standing ovation of all of our employees.
To me, I think this is the message for everybody who's
watching now.
It is our time.
And it is, frankly, going to be your generation that
will decide how this technology impacts people's
rights around the world.
And I think we have to learn from the past
and, frankly, just resolve together
that we won't allow that kind of thing to be repeated.
Brad, that's so powerful to me.
Let me follow up with a question: where do you see the role of technology and of the legal system in ensuring our privacy as a human right?
Well, I think two things need to go together.
First, just as technology can be weaponized,
as that last example showed, it is a powerful tool
for the protection of human rights.
We're using it, for example, with George and Amal Clooney.
Amal Clooney is one of the great human rights lawyers
in the world.
And with the Clooney Foundation for Justice,
we've created a cloud service and an app
so that monitors can go into courtrooms
around the world, courtrooms where
journalists, or women, or people from the LGBT community
are potentially being persecuted and then prosecuted in ways
that put their rights at risk.
And you can capture data, you can share it with the world,
you can ensure that that trial is
evaluated by judicial experts in other countries.
All of that is happening.
That's just one small example of all of the ways
that technology can be used, I think,
to promote the protection of human rights.
But I do think at the end of the day,
there is no substitute for protection
under the law itself, protection by judges and courts.
And so a lot of what we need to focus on
is how technology and the law can advance together
to meet these broad needs.
That's really one of the other things that
inspired us to write this book.
Yes.
And technology and law need to understand
each other's language and capabilities in order
to make that path forward together.
Now a related issue to privacy is security.
And you have been instrumental in guiding Microsoft
toward building a strong security foundation.
And you decided that Microsoft needed
to be the most secure company and held your ground on this,
which was visionary and courageous, especially
at a time when people did not really
care about the privacy of their data
and other companies were betting on advertising.
I'm just wondering about the back story.
How did you decide on the importance of security?
Well, I think security became so fundamentally important
beginning in about the year 2000.
You know, software was designed for the personal computer,
but no one envisioned that the personal computer would always
be connected to the world.
And so we recognized that that was going to take place.
And we saw the attacks that started to take place.
And so we started to harden computers,
we started to harden network architecture and software.
And what we really then learned was
that we needed to keep pace.
And we needed to rely not just on technology
to protect people, but we were going
to need to work with governments in new ways.
We were going to need to work with the rest of the industry.
We were going to need to advocate for stronger
laws around the world.
And so it was really that combination that I think
led us to put ourselves on a journey.
It's a journey that continues today.
We continue to see nation states becoming more aggressive.
And so we're having to figure out
exactly what new initiatives we need to take.
Institutions like MIT, of course, play an extraordinarily important role as well.
We're all in this together and we're all
focused on acting together.
Brad, with elections coming up this fall,
a lot of states and counties are talking about the possibility
of voting apps.
What's your take on the security of electronic voting?
Well, I think the interesting thing
is in a world where we take for granted security
for financial transactions every day, we're not yet in a world
where we have security for elections and voting the way
we would like.
And so what I think we're going to need to do
is look towards the future, where we frankly
can have online voting.
Why not?
But it is going to take time to be able to deploy it well.
I think one of the things to just recognize
is we engage in financial transactions
every moment of the day.
There's billions every day.
Voting happens infrequently.
It's a powerful reminder that just in life we
get better at things that we do frequently
than things that we do rarely.
But I think we need to put ourselves on an ongoing quest
to develop more secure online voting solutions.
In the meantime, what we urgently need to do
is secure the voting systems that exist today.
In so many places in the United States,
we have voting machines that are 10 or 15 years old.
And so one of the really urgent needs,
including for this election year,
is to keep working, typically with municipalities,
with counties, with states, to harden their voting systems.
Now you have a powerful chapter on AI
and democracy in your book.
And you also warn about a potential digital 9/11.
What are you doing to prevent this, meaning what is Microsoft
doing to prevent this?
Well, first of all, I would just say
if you think about what a digital 9/11 would
be, I think it's two things.
One is it would be a government attacking.
Attacking, say, so that the electrical grid goes down,
hospitals go down.
That is what happened in 2017 with WannaCry.
The National Health Service in the United Kingdom, a third
of their hospitals went down.
Cash registers, ATMs, everything stopped working.
That's what happened in Ukraine with the NotPetya attack.
But ultimately I think one of the worst attacks
could be if we learned a week after an election
that voting had been tampered with, that the voting rolls had
been tampered with.
I think if something like that were to happen,
then I think we'd have a real issue.
I mean, just think what it would mean
if no one had confidence to know who had been elected president.
That would be a fundamental problem.
Our entire basis for society, I think, would be put at risk.
I couldn't agree with you more.
And I'm so glad that Microsoft and many other organizations
and researchers, including a lot of researchers at MIT,
are working on the challenge of security.
I'd like to turn it over to a different topic,
and that is regulation.
And I was very intrigued.
Because in your book, you make an argument
that I think is a bit rare among executives of major tech
companies.
And that is that Silicon Valley needs more regulation.
And so I wonder, can we get specific?
In what particular areas do you think companies
like Microsoft need more regulation?
Well, first of all, I think we should recognize broadly
that a well-functioning market typically has room for lots of innovation, lots of room for entrepreneurship, but also a layer of regulation.
We have more confidence in the safety of food
and drugs because of the FDA.
We have more confidence in the safety of our cars
and aircraft because of regulatory authorities.
So when you put things in that lens,
I do believe that digital technology has reached
a point where it would benefit from more regulation,
not the kind of regulation that would stifle creativity,
but I do think to protect privacy,
I think to protect security, I think
to ensure the responsible use of artificial intelligence.
That's why we've called for regulation
of facial recognition technology.
We now have the first example of that,
a state law passed in Washington state,
I think not coincidentally, where Microsoft is based.
So we have all of that to learn from.
I think digital safety, the protection of children,
that's another obvious area.
All of these I think are ripe for regulation.
And that's why we're seeing more regulation in this space.
I will say-- you're going to get a kick out of this--
Microsoft is so good at protecting security
that my company's policies are telling me
that they're going to reboot my computer in seven minutes.
So while I'm listening to you, I am
trying to find a way to disable that, which I may or may not
find in the next seven minutes.
So this will be one of the more memorable security
updates in my life if this, in fact, happens.
So let's see what we can cover.
I hope not.
I have not gotten to many of the 200-plus questions.
I know.
But let me ask you something about China.
So Microsoft has had a direct presence in China
for a long time.
Your R&D initiatives there are significant.
Can you tell us a bit about your experience there
and how the current and foreseeable political disputes
between US and China are affecting you, if at all?
Yeah, no, they definitely are.
I think they're affecting all of us.
I think it really has a three-part impact.
First, it's having an impact on supply chain stability.
And I think there are real questions for the future as
to the degree to which American tech companies will
be able to rely on the Chinese supply chain for hardware.
Second, it's having an impact on access to the Chinese market.
I think the Chinese market has really
provided only limited access to most American tech companies.
And that doesn't look like it's going to change.
In fact, I think it may get harder rather than easier.
And finally, I actually think the single most important thing
for us to think about, the thing that's
most important for Microsoft, I would argue for MIT,
it's the intellectual interaction.
China, as we all know, has some of the world's greatest
engineers, software developers, academics.
You know, software is not created
in only a single country.
Especially with open source techniques and the like,
it's created by people working together.
And even if supply chains change,
even if access to the market is limited,
I think it's of profound importance to the world
that people be able to learn and study
and research and think together.
And I think preserving that, amid these other challenges,
is really an important priority for all of us.
What about a digital iron curtain?
In your chapter on China, you talk about the possibility of a digital iron curtain.
We've also heard former Google CEO Eric
Schmidt talk about splintering the internet in two,
one Chinese-led and one non-Chinese-led.
What would be the geopolitical and economic impact
of this splintering?
And how do you see the creation of this digital iron curtain?
Well, I think the world has been splintering since 2010.
You know, the internet in China is not the same
as the internet in the United States.
But I think what we now see more and more
is obviously a degree of technology
competition between the United States and China.
We see public policies in both countries
focused on this competition.
I think increasingly, American companies
have an advantage in the United States,
Chinese companies have an advantage in China.
They compete with each other in Europe and elsewhere.
We'll see.
It is increasingly a bipolar technology world.
That's the phrase we use in our book.
This will be, I think, one of the defining
issues for this decade.
And a lot remains unknown about how exactly
this is going to unfold.
We've talked about a lot of challenges.
There is one big challenge that is impacting everyone,
and that is climate change.
I personally believe that our only way out
of this big challenge is to figure out
how to cool the planet by 1 degree
rather than figuring out how to not warm up by more
than another 1.5 degrees.
And I was inspired when earlier this year Microsoft
announced ambitious plans to go carbon negative by 2030.
Not only that, but also to turn back
the clock on all the carbon Microsoft
has produced since 1975 by counterbalancing that number
by 2050.
What do you see as the biggest challenge
as you proceed with this objective?
Well, we're going to, first of all,
need a broad public understanding and commitment
to the goal.
I think the goal needs to be getting to carbon zero, meaning
net zero by 2050, meaning as a planet we're
removing every year as much carbon as we emit.
Now to do that, then two things become obvious.
We need to emit less carbon and we need to remove more.
We need a number of techniques to emit less carbon.
Digital technology plays a powerful role
in our ability to measure this, to create standards,
ultimately to reinforce this through private action
and I think public regulation in a way
that will drive emissions down.
But I think we also need a massive investment, perhaps
the biggest R&D challenge of the century,
to develop new techniques to remove carbon
from the environment.
Things like direct air capture are so clearly
in their infancy, they're not yet cost effective.
And we're going to need just huge breakthroughs
in that space over the next three decades
if we are going to achieve this fundamental goal of protecting
the planet the way we need to.
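To pin down the terms in this answer, here is a tiny worked example of the accounting behind "net zero" and "carbon negative"; the tonnage figures are made up purely for illustration.

```python
# Toy illustration of the carbon accounting terms used above. "Net zero"
# means removals match emissions in a year; "carbon negative" means removals
# exceed emissions. The figures below are hypothetical.

def net_carbon(emitted_mt: float, removed_mt: float) -> float:
    """Net carbon for a year, in megatons: positive adds CO2, negative removes it."""
    return emitted_mt - removed_mt

year = {"emitted_mt": 12.0, "removed_mt": 16.0}   # hypothetical figures
net = net_carbon(**year)
print(net)  # -4.0
print("carbon negative" if net < 0 else "net zero" if net == 0 else "net emitter")
```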
And we have a tragedy of the commons here.
Many more companies have to step up and follow Microsoft's lead
on this.
Because everyone needs to pitch in,
otherwise we won't get there.
No, we do.
And I'm just going to say now, in case you lose me
in one minute, thank you for the 45 minutes rather
than the hour.
But let's hope that I keep going with my update.
It's been put off a few minutes.
So I'd like to switch to some questions by--
Go for it.
This will be the ultimate speed round.
OK.
So here's one.
How can workers proactively expand on their skills
to adapt to the increasingly digital AI-driven environment?
Well, I would say two things.
People who are not in the computer and data science field
need to master more digital skills.
But my plea to all of you, because I think so many of you
probably are more in the computer and data fields,
is don't neglect the other aspects of the liberal arts.
Make sure that you're in a position
to not only help drive technology forward
but to think more broadly about the use cases,
the fundamental societal problems
you can help solve, the ethical, almost
the philosophical issues, the policy issues, the legal issues
that will also need to be addressed in order
to advance technology in a way that genuinely is
going to best serve the world.
I think only if we're able to think
in a more multidisciplinary framework than we have
in the past are we going to be successful in ensuring
that technology serves the public in the way it really
broadly needs to do.
This is definitely the spirit of MIT's College of Computing.
OK, next question.
Does Microsoft have plans to allow
people to work from home permanently like Twitter
recently announced?
[COMPUTER TONES]
OK.
I wonder if we should wait for a minute
to see if Brad can rejoin us.
Otherwise, I hope you've enjoyed the conversation.
I certainly have learned so much from Brad's wisdom.
I encourage all of you to read his book.
It's really a tour de force through many ethical challenges
that we as a community face around the use of technology.
So let's see.
There's some questions posted on Q&A.
And I am going to wait for just a minute to see if anybody is--
oh, look, Brad is back.
Excellent.
We are so excited to see you back, Brad.
[LAUGHTER]
I just hope I can stay.
OK.
Oh, I asked you a question that made you run away.
There are a lot of benefits from working at home, but having your own home computer managed by the company may or may not be one of them. It definitely keeps it more secure, but it adds this level of unpredictability.
Let me just say, at Microsoft we first
told people, hey, feel comfortable working
from home through October, until the beginning of November.
I think your real question is, what does the future look like
when COVID-19 is a memory?
I think we'll have more people working from home.
I still think we're going to have offices.
You know, Satya Nadella, our CEO, I think,
put it really well at a conversation he had yesterday.
He said, let's not replace one dogma with another.
The old dogma was everybody needs to be in the office.
Let's not create a new dogma that
says no one should ever see each other in person,
we can all do things through these screens.
I think that what we're going to find
is more choices for more people.
We'll see the technology continue to advance.
Certainly we're working to accelerate
the pace of innovation for our products, like Teams.
But I do think that at the end of the day,
there is something important about seeing
each other in person.
There are these conversations that
happen before a meeting begins, after a class ends.
I think there are still important things that
happen when people sit down and have lunch together.
So we should keep all of that in mind.
I think we're going to live in a world that has more of what
we call remote everything.
There will be more remote digital services.
But I think our ultimate goal should
be to bring out the best of what that offers
and the best of what we as human beings
get by being able to spend time together in person.
So the next question is how you envision the world
in the next 10 to 20 years.
You touched upon how we're going to work post-COVID.
What about living and playing and learning?
Well, in addition to working at Microsoft every day,
I'm on the board of Netflix.
So I get to see things from that perspective as well.
And let's face it, this isn't just
something of the last couple of months
where we're all relying on remote entertainment,
but we've been seeing this change over the course
of a number of years.
Yeah, clearly the use of digital technology
is going to be one of the defining trends of the 2020s,
as it really was for the 2010s.
But what I would say is that while it has so much potential
to bring improvements to people's lives,
we also do need to think hard about the inequalities
that technology can create.
You know, income inequality has become
one of the really important issues
in the United States and many other countries.
And digital technology, to some degree,
has exacerbated that, in part because we're
seeing tech companies do so well, in all honesty.
Businesses that are not grounded in technology
have had a harder time.
But I think it is broader than that.
I think we live in a world with internet inequality.
You see this very sharply when you
see the tens of millions of Americans
that don't have access to broadband.
I think we live in a world of educational inequality.
And digital skills and the heightened need
for digital skills is shining a spotlight on that.
I think we live in a world where digital technology can
do so much but it can exacerbate the inequalities, for example,
that people with disabilities may confront, especially
if we don't do a great job of designing technology
to meet their needs.
So I think that these inequalities of the world
will be one of the important issues of this decade.
They're all going to be more pronounced because
of this extraordinary recession, if not depression,
emerging from COVID-19.
And I think for all of us who aspire
to have geeks as the heroes, as you said, I think we need geeks
to be heroes and we need teachers to be heroes
and we need public officials to be heroes.
Because if we're not careful these advances in technology
will make the inequalities greater rather than smaller.
So let me try to get a few more audience questions in.
What do you see as the role of Microsoft
in the open source community?
Well, you know, in one of the more interesting few pages in our book, I would say, certainly for those of us who were involved in experiencing it, and for Carol Ann and me when we wrote about it, we acknowledge very directly that Microsoft, as we say in the book, was on the wrong side of history when open source really exploded at the beginning of this century.
And I can say that not just about Microsoft,
I can say it about me personally.
The good news is if life is long enough
and you're successful in other areas,
you can be on the wrong side of history
and learn that you're there and you need to change.
And that's what we did.
Today Microsoft is the single largest contributor
to open source projects in the world,
at least when it comes to businesses.
And we're very proud of that.
When we look at GitHub, we see GitHub as the home
for open source development.
And we have a responsibility as a steward
to make it a great and secure and productive home
for the world's open source developers.
So we're much more now in the open source camp
than outside it, even if not every piece
of code that we write is developed as part
of an open source project.
I think in some ways, one of the other really interesting
questions for all of us is how do we learn from open source
and apply that learning to what we now refer to as open data.
We launched, just a few weeks ago,
what we're calling an open data campaign, a global effort
that we think, really needs to be focused
on ensuring that more data sets, perhaps especially data
sets that relate to great societal issues, are opened up.
So we've created five principles that we're
applying to ourselves and the data that we have.
We're investing in more work with nonprofits.
We're promoting public policy.
Ultimately, I think the great need in the world
is to ensure that those that have smaller data
sets, whether they be companies, small businesses, universities,
or the like, have the opportunity
to scale by federating their data with data sets held
by other institutions.
But there is a clear need to do so
with the kinds of platforms and tools
that a company like Microsoft creates.
It is an imperative to protect privacy.
We're very enthused about new techniques like differential
privacy and where we see that technology going.
There needs to be the recognition
that people need to be able to share data
without surrendering their ownership or control of it.
There's a lot here that is very broad.
And I think this is another area that's really ripe for progress
at a place like MIT.
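Differential privacy, mentioned just above, can be illustrated with a short sketch of the classic Laplace mechanism for a count query; this is a textbook construction under assumed parameters (epsilon of 0.5, sensitivity of 1 for a count), not any particular Microsoft tool or dataset.

```python
# Textbook Laplace mechanism for a differentially private count query.
# Epsilon, the example data, and the predicate are illustrative assumptions.

import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale) noise, sampled as the difference of two exponentials."""
    lam = 1.0 / scale
    return random.expovariate(lam) - random.expovariate(lam)

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    """A count query has sensitivity 1, so Laplace noise with scale 1/epsilon
    gives an epsilon-differentially private answer."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(scale=1.0 / epsilon)

# Example on a hypothetical data set: how many people are over 60?
ages = [34, 71, 66, 45, 29, 80]
print(private_count(ages, lambda age: age > 60))  # a noisy answer near 3
```

The design choice is the one Brad alludes to: institutions can publish useful aggregate answers from federated data while bounding what the output reveals about any single person.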
OK, so if you were a student today,
what would be your major?
You know, as I said, I'm the liberal arts side of liberal arts meets engineering at Microsoft.
I like to quote Steve Jobs, who said
that he worked every day at the intersection
between engineering and the liberal arts.
It is a way of reminding us all that in fact we
need the kind of multidisciplinary approach
that I was talking about before.
I majored in international relations at Princeton.
I'd probably major again, if I could, in history.
I have just found it remarkably helpful for somebody
who does the work that I do to be able to learn from the past.
You know, one of the interesting things about history
is that ultimately there's sort of the history of everything,
including the history of science,
the history of technology.
I definitely would not be the kind of student that
would study only one thing.
One of the great things you realize later on in life
is that when you're in a university,
it's an enormous opportunity to study lots of other things.
And so that's what I would encourage people to do.
Brad, there is a lot of uncertainty in the world
right now and a lot of reasons people feel anxious.
On the flip side, I'm just curious what motivates you, what makes you tick, and what gets you up to open the day with an exciting outlook every morning?
Well, look, this is a hard time.
Let's just recognize the obvious.
Even for those of us who have the great good fortune
to have jobs, to not be among the 36 million Americans who
have filed for unemployment claims
in the last three months, to be among the people who
are healthy, this is a challenging time.
But I also think that, because of the challenging times we confront, the truth is the world has never needed digital technology to do as much as it needs it to do right now.
And even when COVID-19 is a memory--
it will become a memory--
you know, we're not going to be talking about COVID-19 in 2030
as much as we'll be talking about carbon, I suspect--
the world will need the kind of technology we can create.
And when I say we, I don't mean we at Microsoft,
I mean all of us who are having this conversation.
I mean all of you who are getting a degree at MIT
right now.
This is the future that you're going to create.
And you have a bigger opportunity,
I think, to have a more important and more positive
impact in the world than perhaps any generation that has ever
studied at MIT has had before.
Because digital technology is just ubiquitous in the world.
And the opportunity to ensure that it not only works well,
but it does good for the world, oh, my
gosh, how can you not be excited about getting up
in the morning?
That's sort of what I feel.
And yeah, I just am frankly so happy I've had the opportunity
to work with great scientists and engineers
on technology that matters.
And I really hope that all of you will as well.
It's something that should inspire all of us, I believe.
Well, this is fantastic advice for our students.
Thank you for that.
If you go back 40 years, in hindsight,
what advice from a global leader would
have been most useful for you at that time?
You know, I've worked with Bill Gates,
Steve Ballmer, Satya Nadella.
I've had the opportunity to meet a lot of great world leaders
in my job.
The one thing that I find truly successful people have
in common is enormous curiosity.
And I think it's something that if you're at a place like MIT,
you probably have already.
But the advice I would give is it's like a muscle.
Exercise it.
Use it.
Push yourself to read more, listen more,
learn more in more fields.
Start with what is at the core of what
is your major or focus in graduate education,
but keep adding to that.
And what you'll find, I think, is two things.
One is there are more dots that you can connect than you ever would have imagined.
More things relate to each other as you broaden the aperture.
But the second thing is life is just more fun when
you ask people questions.
It's a lot more interesting when you
get to learn from other people.
And develop that sense of curiosity.
It's not only something that will make you more successful,
I think it will make your life more enjoyable.
And I benefited from a lot of similar advice.
So I think I heard some of that when I was a student.
But if I could go back in time, I would write a little note
to myself and I'd probably read it every morning.
Well, I'm inspired.
And unfortunately we are at the top of the hour with you.
And we will have to save the other questions for the future.
So I would like to take this opportunity
to thank you for sharing your wisdom and insights with us
and also give you an open invitation
to come back for an in-person visit to our campus
when we are open for business again.
I look forward to being there in person.
I always enjoy my visits.
It's great to connect digitally, but it's even
more fun to be there in person.
So thank you.
Thank you, Brad, for joining us today.
And thank you, Daniela, for moderating
this truly fascinating and really deeply informative
conversation, which I knew in advance it would be.
But it's great to see it actually realized.
And in fact, even the little temporary gap in the middle,
at least for me, it gave me a little bit of time to catch up,
because just this constant sort of pacing
of questions and answers, I needed a little processing
time.
So it just-- thank you so much.
Really fantastic.
Just one housekeeping thing--
While you were catching up, I was panicking.
But it was, you know, it was a good little interlude
for the audience--
Thank you.
--with 2020 hindsight.
Just remember, Microsoft, we make software secure.
Yeah.
Well.
But this is the world.
If we're going to be more secure, if security
isn't first, it never happens.
So it's actually a good lesson.
That's right.
We always put it off otherwise.
But just one housekeeping thing.
We'll be posting a video of today's event
on both the CSAIL and College of Computing websites
once we have the time to get it closed captioned.
So thank you all.
Everybody, especially in these times, take care and be well.
Thank you for joining us.
Thank you.