Sunday, February 17, 2019

When intellect fails



Let's start with the proposition that this refers to myself, and what I believe.  Long ago,
when I was in high school, I wrote a paper about aesthetics, if memory serves.  (Funny
thing about memory is that it isn't always reliable.  Therefore, I think it was about
aesthetics.)  Since I didn't trust emotions, and still don't, I wrote my paper along those
lines.  This provoked a response from the teacher, who made it clear that she didn't agree, but said my grade wouldn't be affected, because the paper was supported by consistent argument.

So, what is the point?  Mainly, that I don't trust emotions, and that hasn't changed.

But also this-- that I believed one could arrive at valid conclusions based upon an
intellectual exercise.  But now, I am not so sure.  Not that I have arrived at the opposite
conclusion and would now agree with my teacher of long ago.  Only that the intellect can fail.

Perhaps the cause of the failure is a poor intellect.  But even the greatest intellect will
be capable of error.  Intellect can go just so far.

Reminds me of a PBS program that discussed physics.  It went like this: no matter how hard you try, you cannot teach physics to a dog.  The reason is the wide chasm of intellectual capacity between dogs and humans.  If you accept that, then a far greater intellect could potentially exist somewhere in the universe.  Ours, then, may not be the ultimate intellect in the universe.  Unless we are alone in the universe.

It may even be possible to create an intellect greater than our own, which, in turn, will
surpass us and replace us.  This forms the conceptual framework of the "singularity", if I
may be so bold.

I can think of one possibility, and that is artificial intelligence combined with
quantum computing.  Quantum computers, if perfected and combined with other advances, could someday make today's computers seem like Tinkertoys.

Should we be concerned about this?  Some folks are.  But humans are probably headed toward some sort of apocalypse anyhow, whether from this or from something else.  The point being that anything human, or derived from humans, is subject to error.
