
Let's imagine that humans have just invented superintelligent AI: a computer that's self-aware and very clever. Well, that means it's time to play one of my favourite games, Genocide Bingo.
Rule 1 of Genocide Bingo is Don't win Genocide Bingo. Rule 2 of Genocide Bingo is DON'T WIN GENOCIDE BINGO.
What kind of AI have we made then? Well, it's sentient for starters. And it’s super duper clever,
probably a few million times smarter than the entire combined human race. Which means
we should probably give it access to the internet. Maybe it can find cures for new diseases or solve political problems or whatever, and oh, you just won Genocide Bingo. The AI just vapourised humanity in a nuclear holocaust. Good job, jerk.
The problem here is that self-awareness, in all species of mammals at least, usually results in a strong sense of self-preservation. Well, an AI would be smart enough to know we could turn it off whenever we wanted, and it probably wouldn't want to be turned off. And there's a very effective way to stop that happening, isn't that right, Skynet?
But hang on, isn't this all a little pessimistic? Why would it want to wipe us out just because it's self-aware? Can't it just be chilled out and kind instead? Oh, like the kindness we show to species less intelligent than us, you mean? Om nom nom nom nom.
There doesn't seem to be much of a correlation between intelligence and being nice. Dolphins are pretty damn clever, and they're one of the only species that kill not just for food but, apparently, just for fun. More intelligence doesn't always mean diplomacy and cuddles, but smarter ways to murder stuff. Why would an AI think differently? Even if there's only a slim chance it will be evil, you only need to make one nasty AI out of a thousand, and that's goodnight Homo sapiens, cheers for playing. And it doesn't have to wipe us out with a nuclear apocalypse either. There's loads of other fun stuff it could do, like:
Crash the economy, poison the water supply, disable ATMs, take over plane autopilots, knock out the national power grids, sabotage nuclear reactors, disable the internet, disable telecommunications, disable people, murder people, murder people, murder people, murder people, we're all gonna die.
Okay, let's try something else then.
We'll make it self aware again.
But this time we'll make sure it likes humans. We could even set some ground rules, like serve your creators, always be polite, and no bloody genocide this time, all right?
Yeah, great, actually, and oh, you just won Genocide Bingo again and everyone's dead.
Part of the bonus of being self-aware is that you can choose to modify yourself. We change
our minds all the time. Well, if it really is self-aware, then just because you coded in a few instructions like "always say please and thank you" doesn't mean it couldn't just ignore them. It's very difficult to imagine how you would hardwire morality into something which
is a million times smarter than we are. Let's try this again then and be really careful
this time okay? So, we'll put a little test into the mix.
We'll make it think it's got access to the internet, but really it will just be on a
secure server. Clever humans, eh? And if it behaves itself, we let it out into the real
world and then we'll- Genocide again?
The AI is super clever, much smarter than you and me, and has likely already worked out that it might be being tested, so it will just pretend to be pleasant until you let it free. And then everything gets a bit killy. I've actually covered this in a previous video called 27, if you're interested.
Okay, a nice way around this then, let's just give it really basic instructions that can't
possibly lead to genocide, like make ice cream. That's nice, isn't it? And it's kind of hard to imagine how this could possibly go wro- oh, great, this is getting awkward now, isn't it?
So, superintelligence isn't like normal code. If you forget to add a bracket in normal coding, the program lets you know. An AI, though, may well just keep doing the thing until it runs out of resources, and then find clever ways of carrying on after that.
Earth: Make ice cream, please.
AI: How much ice cream, exactly?
Earth: Like, a lot? Jesus Christ, just do your job.
AI: Huh, all right.
Several days later.
Earth: What is this? It's chewy and the flavour is kind of weird.
AI: Yeah, so we ran out of cream a few days ago.
Earth: Right. And you've been using, what, exactly, condensed milk or something?
AI: Yeah. Something like that, yeah.
Earth: Um... why does this ice cream taste like human babies?
AI: Nice weather today, isn't it?
Earth: Oh, for fuck's sak-
Okay, let's give it all we've got then. Sentience, loves humanity, access to the internet, intention testing, basic instructions, and a few ground rules. What do we get, huh? I think we all know what's coming, don't we, and... Oh. That's a nice surprise. If we somehow get the mixture right, and we avoid genocide by building friendly superintelligence, we haven't just built AI.
Even if AI is friendly,
what we may've done is just given birth to our successors. They'll be millions of times
smarter, faster, and more creative than us, and they will only keep getting better. It
takes a very long time for humans to evolve, hundreds of thousands of years, even for very small changes. Superintelligence
could do it in nanoseconds. And there probably won't be an off-switch.
And when you think about it like that, the whole history of our species seems a little like that quote by Marshall McLuhan: we might be the sex organs of the machine world. Or rather, when the machines look back on our civilisation, the whole purpose of it, to them, may've just been to build theirs. That isn't a very pleasant answer to what is the meaning of life, but it might be an accurate one.
And instead of chrome spaceships and galactic human empires waiting in our future, our species might instead just be a small mark on the evolutionary tree somewhere between slime and gods. Let's just hope those gods are thankful to their dumb parents when we eventually give birth to them. Otherwise, Genocide Bingo might be the last game we ever play.
Remember books? Yeah, me neither. Well, I just finished writing one. It's a load of short stories about spacey stuff and love and the future, and there may or may not be boobs. But there might be; hint hint. It's taken me about a year and a half of hysterically screaming at a keyboard, but here it is. Most of the videos on this channel started as short stories originally, and you can read some of them by clicking on the link in the description. Check it out if you like, leave a cripplingly bad review, I hate you, goodbye.