I think the Singularity is the most
likely non-catastrophic event for the
near future. That doesn't mean that it's certain, and it doesn't mean that
terrible things couldn't happen instead.
You are contemplating
something that could replace the most competitively effective feature humans have
(intelligence),
so it's entirely natural that there would be some real uneasiness about this.
The nearest analogy in the history of the earth is the rise of humans within the animal kingdom.
It is very unsettling to realize that we may be entering an era where questions like:
"What is the meaning of life?"
are practical engineering questions.
There's no doubt that that should be very unsettling. On the other hand, I think it might be kind of healthy if we can sit down and look at the things we really want,
and at what they would mean if we could actually get them.
Humans are better characterized not as the tool-creating animal,
but as the animal that has figured out how to outsource its cognition,
how to spread its cognitive abilities into the outside world.
We have only been doing that for a little while,
maybe ten thousand years.
Reading and writing is an outsourcing of memory,
a substitution of fragments of human cognition into the outside world.
If human responsibility, occupational responsibility,
becomes more and more focused on areas of judgment that haven't yet been automated,
then what you're seeing is rather like a rising tide of this cognitive outsourcing.