
So typically when people talk about probability, they think about a nice probability distribution like the bell curve, the Gaussian curve. That means it's most likely that you get something close to zero, and then it's less and less likely that you get something very positive or very negative. It's a rather nice-looking curve.
However, many things in the world turn out to have much nastier probability distributions. A lot of disasters, for example, have a power-law distribution. So if this axis is the size of a disaster and this one is the probability, they fall off like this. That doesn't look very dangerous at first: most disasters are fairly small, there's a high probability of something close to zero and a low probability of something large. But it turns out that the probability of getting a really large one can become quite big.
So suppose this one has alpha equal to one. That means the chance of getting a disaster of size 10 is proportional to 1 in 10; a disaster ten times as large has just a tenth of that probability, and one ten times as large again has a tenth of that. That means we have quite a lot of probability of getting very, very large disasters. With the Gaussian, getting something very far out here is exceedingly unlikely, but with power laws you can actually expect to see some very, very large outliers.
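A quick numerical sketch (my own illustration, not part of the talk) makes the contrast concrete: with alpha equal to one the tail probability falls off like 1/x, so each tenfold increase in size costs only a factor of ten in probability, while a Gaussian tail collapses almost immediately.

```python
import scipy.stats as stats

# Sketch, assuming a power-law (Pareto) tail with alpha = 1 and
# minimum size 1, so P(X >= x) = 1/x, versus a standard normal tail.
alpha = 1.0

for x in [10, 100, 1000]:
    power_law_tail = x ** (-alpha)     # survival function of the power law
    gaussian_tail = stats.norm.sf(x)   # survival function of N(0, 1)
    print(f"size {x:>4}: power-law tail {power_law_tail:.0e}, "
          f"Gaussian tail {gaussian_tail:.0e}")

# The power-law probabilities go 1e-1, 1e-2, 1e-3: large events stay
# plausible.  The Gaussian tail is already ~8e-24 at 10 and then
# underflows to zero, which is why normal intuitions miss big events.
```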
So if you think about the times at which various disasters happen: they happen irregularly, and occasionally one goes through the roof, and then another one. You can't, of course, tell when they will happen, that's random, and you can't really tell how big they are going to be, except that they are going to be distributed in this way.
The real problem is that when something is bigger than any threshold you imagine, it's not just going to be a little bit bigger, it's going to be a whole lot bigger.
So if we're going to see a war, for example, larger than the Second World War, we shouldn't expect it to kill a million people more. We should expect it to kill tens of millions more, most likely hundreds of millions, or maybe even a billion people more, which is a rather scary prospect.
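The "whole lot bigger" point can be checked with a small simulation (again my own sketch, assuming a Pareto tail with alpha = 1 and minimum size 1): conditional on clearing any high threshold, the median exceedance is about twice the threshold, not just slightly above it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: inverse-transform sampling of a Pareto(alpha=1) variable
# with minimum size 1, i.e. P(X >= x) = 1/x for x >= 1.
alpha, n = 1.0, 1_000_000
samples = (1.0 - rng.random(n)) ** (-1.0 / alpha)

threshold = 100.0
exceedances = samples[samples > threshold]

print("fraction above threshold:", len(exceedances) / n)         # roughly 1/100
print("median size, given it exceeds:", np.median(exceedances))  # roughly 200

# Samples that clear a high bar don't cluster just above it: the
# typical one is about twice the threshold, and the largest ones
# dwarf it -- the heavy tail repeats itself at every scale.
```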
So the problem here is that disasters seem to have these heavy tails. A heavy tail, in probability slang, means that the probability mass over here, the chance that something very large happens, falls off very slowly.
And this is of course a big problem, because we tend to think in terms of normal distributions. Normal distributions are nice; we say they're normal because a lot of the things in our everyday life are distributed like this. The height of people, for example: very rarely do we meet somebody who is a kilometre tall. However, when we meet people and think about how much they're making or how much money they have, well, Bill Gates is far, far richer than just ten times you and me. He's actually far out here.
So when we get to the land where we have these heavy tails, both the riches, if we're talking about rich people, and the dangers, if we're talking about disasters, tend to be much bigger than we can normally think about.
(Off camera) Hmm, yes, definitely unintuitive.
Mmm, and the problem is of course that our intuitions are all shaped by what's going on here in the normal realm. We have experience of what has happened so far in our lives, and once we venture out here and talk about very big events, our intuition suddenly becomes very bad. We make mistakes, we don't really understand the consequences, cognitive biases take over, and this can of course completely mess up our planning. So we invest far too little in handling the really big disasters, and we're far too uninterested in going for the big wins in technology and science.