We have shown that, in the context of decomposition,

Linear dependence immediately implies lack of uniqueness.

That's because there's a nontrivial linear combination that equals zero.

And we can add any amount of the special linear combination to any decomposition

and that would change the coefficients of the decomposition

Without changing its value. So without changing the end result.

And that means that if there exists one decomposition of some vector,

there exist not only more than one, but actually infinitely many, linear combinations

that give that vector.
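To make this step concrete, here is a sketch of the argument in symbols (the names λ and t are mine, not from the lecture):

```latex
% Suppose d has a decomposition and a, b, c are linearly dependent:
%   d = \alpha a + \beta b + \gamma c,
%   \lambda_1 a + \lambda_2 b + \lambda_3 c = 0,
% with at least one \lambda_i \neq 0. Then for every scalar t,
\[
  d = (\alpha + t\lambda_1)\,a + (\beta + t\lambda_2)\,b + (\gamma + t\lambda_3)\,c,
\]
% so each value of t gives a decomposition with different coefficients:
% infinitely many decompositions in all.
```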

We have never shown, however, that linear independence implies uniqueness.

And it's a totally different fact that requires justification.

Now it's quite obvious, or at least intuitive, when it comes to geometric vectors,

but it needs to be formally justified for general vectors.

and that's what we are about to do,

and this proof will be slightly technical, so you should really try to follow and understand it

if you're interested in the strength of the logical framework of linear algebra.

but if this video confuses you, then you can just ignore it and move on to the next video.

so here's how the argument goes,

we will show that

if there's more than one decomposition of the vector d in terms of a, b, and c,

then that would mean that the vectors a, b, and c are linearly dependent.

So, if they're independent, there cannot be more than one decomposition of the vector d,

so independence would mean uniqueness.

so let's show that if there are two different decompositions of the vector d,

then the vectors a, b, and c are linearly dependent.

so suppose one of the decompositions has coefficients alpha, beta, and gamma

and another one, a different one has coefficients

alpha 1, beta 1, and gamma 1.

and at least one of these coefficients needs to be different from the original coefficients

so alpha-1 a, plus beta-1 b, plus gamma-1 c, still equals d

so now we have two different decompositions that give the vector d

and at least one of these coefficients is different.

we don't know which one, but at least one of them is.
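In symbols, the two decompositions assumed so far are:

```latex
\[
  d = \alpha a + \beta b + \gamma c,
  \qquad
  d = \alpha_1 a + \beta_1 b + \gamma_1 c,
\]
% where at least one of the pairs (\alpha, \alpha_1), (\beta, \beta_1),
% (\gamma, \gamma_1) consists of two different numbers.
```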

so if we subtract one of these identities from the other,

then on the right hand side we'll have the zero vector

on the left hand side we were able to combine vectors a,b, and c

and we have the linear combination (alpha minus alpha 1) a, plus (beta minus beta 1) b,

and finally (gamma minus gamma 1) c

now we assumed that the linear combinations are different

so at least one of these coefficient pairs doesn't match,

which means that at least one of these differences is not zero.

and what do we have? we have a nontrivial linear combination that equals zero.

and therefore the vectors a, b, and c are linearly dependent.
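Written out, the subtraction step just described looks like this:

```latex
% Subtracting the second decomposition from the first:
\[
  (\alpha - \alpha_1)\,a + (\beta - \beta_1)\,b + (\gamma - \gamma_1)\,c
  = d - d = 0,
\]
% and since at least one of the coefficient differences is nonzero,
% this is a nontrivial linear combination equal to the zero vector,
% i.e. a, b, and c are linearly dependent.
```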

so in conclusion,

linear dependence implies lack of uniqueness,

while linear independence implies uniqueness.