AI and cognitive psychology rant (getting more and more OT - tell me if I should shut up)

Bengt Richter bokr at oz.net
Mon Nov 3 16:02:06 EST 2003


On Mon, 03 Nov 2003 01:48:23 +0000, Stephen Horne <steve at ninereeds.fsnet.co.uk> wrote:

>On 3 Nov 2003 00:13:58 GMT, bokr at oz.net (Bengt Richter) wrote:
>
>>On 1 Nov 2003 22:19:11 -0800, mis6 at pitt.edu (Michele Simionato) wrote:
>>
>>>Stephen Horne <steve at ninereeds.fsnet.co.uk> wrote in message news:<uhe2qv0gff8v17trs4cj6mg88t2o7smq9b at 4ax.com>...
>>[...]
>>>> The evidence suggests that conscious minds exist
>>>> within the universe as an arrangement of matter subject to the same
>>>> laws as any other arrangement of matter.
>
>>If there is some "stuff" whose state can eventually be shown to have 1:1 relationship
>>with the state of a particular individual's conscious experience
>
>No-one has ever shown that. Actually, exactly the opposite. Our
 ^^^^^^^^^^^^^^^^^^^^^^^^^^-- fits well with "can eventually," no? ;-)

>consciousness is a very poor substitute for reality. You may see your
 who proposed consciousness as a _substitute_? ;-)

>conscious awareness as detailed, but that is only because as soon as
>you shift your focus to some detail it naturally enters the
>consciousness (indeed it often gets backdated, so that you think it
>was in your consciousness at a time that it simply wasn't there at
>all).
I am sure there is a lot to say about how conscious experience deviates
from some concept of what it would be if it were better, in someone's judgement,
but that is not what I was trying to get at ;-)
>
>What would be a particularly valued aspect of consciousness? How about
>'agency' - the sense of owning and controlling our own actions - the
>sense of free will?
Well, that too is an interesting topic, but not my attempted focus.
>
>Well, if electrodes are placed on your brain in the right place, they
>can directly move your arm. So what? Well, you will be completely
>unaware of the remote control - unless you are told about it, you will
>claim that you chose to move your arm of your own free will. You will
>even have an excuse for why you moved your arm which you believe
>implicitly.
I've heard similar things about post-hypnotic suggestion.

>
>In fact you don't even need electrodes on the brain - the same effect
>can be seen with people whose left and right brains are separated
>(corpus callosum cut). Hold up a card saying 'get a drink' so that it
>is visible only to one eye and they will go to get a drink. Hold up a
>card saying 'why did you get a drink?' to the other eye and they will
>show no awareness of the first card, insisting they just felt thirsty
>or whatever.
>
>Quite simply, consciousness is nothing special. It is a product of
If it is "nothing special," would you give it up?! (forever, I don't mean a nap ;-)

>information processing in the brain. Sometimes that information
I am uncomfortable with "information processing" as a summary of what produces
consciousness. "Information processing" is a bit dry for my feeling about consciousness ;-)

>processing goes wrong for some reason or another, and consciousness
>gets distorted as a result.
Well, you are getting close to my subject, but haven't yet focused on it, AFAICS.
I don't say that to be rude, or accuse you of any lack, I'm just trying to trace
the failure of my attempted communication (which, after this post, I may have to
decide to live with, not having infinite time for interesting threads ;-)

>
>The 'transducers' are our senses, providing information about reality
>(imperfectly) to our brains.
I don't think the buck stops there. Certainly our senses are the first level of
transducers, but I was proposing that the brain itself was a "transducer." I.e., in
current experience, the brain is so far a sine-qua-non for conscious experience.
But why? How? IMO talking about the brain as if that were the final zoom setting
on attention to consciousness is like the early Greek talk about atoms.
We have to get to sub-atomic particles and waves etc., at least.

>
>>From the fact that my consciousness goes out like a light almost every night, I speculate that
>>that what persists day to day (brains, synaptic connections, proteins, etc) is not the _essential_
>>basis of my experience, but rather, those persistent things somehow _shape_ the medium through
>>whose state-changes my conscious experience arises.
>
>What if the thing that woke up the next day was a perfect copy of you,
Well, close again. I was trying to explore the notion of a distinction between
the "thing" as physical shape holder, and something non-physical that could be given
a particular shape as a consequence, like an electric or magnetic field in the
neighborhood. (Or, what if consciousness is a peculiar dynamic thing like lasing,
something that happens under certain conditions, which as evolution would have it,
occurs in brains a lot).

As a metaphor for the essential medium of consciousness, imagine that water had
consciousness associated with it, namely a consciousness of its physical boundaries
and distribution in space, but of _nothing outside_ of the given connected blob of water.

Now if you pour this water into a bottle, it will experience bottle-shape, but the
essential concept that I am trying to get across is that it would be its _own_
bottle shape that it was experiencing, not the bottle's bottle shape, even though
the latter might be the current cause of the water's configuration.

So, metaphorically, I am suggesting that our brains are like flexible bottles that change
shape due to influences from the real world (and are a physical part of the real world),
but the brains themselves are not the "water," and not the ultimate basis for experience.
Rather, the experience comes from the "water," which is experiencing its _own_ shape
as it flows and follows the changes in its brain_as_container's shape.
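
For the Python-minded, here is a playful toy sketch of that distinction -- the classes
and names are invented purely for illustration, not a claim about how brains work:

    # Toy model: the bottle *causes* the water's configuration, but the
    # water's "experience" is a function of its own shape, not the bottle's.
    class Bottle:
        def __init__(self, shape):
            self.shape = shape          # the container's own shape

    class Water:
        def __init__(self):
            self.shape = None           # the water's own shape, as yet unformed

        def pour_into(self, bottle):
            # the container shapes the water ...
            self.shape = bottle.shape
            # ... but from here on the experience tracks the water's own
            # shape, not a reference back to the bottle

        def experience(self):
            return "I feel %s-shaped" % self.shape

    b = Bottle("bottle")
    w = Water()
    w.pour_into(b)
    print(w.experience())       # the water reports its _own_ shape
    b.shape = "dented bottle"
    print(w.experience())       # unchanged: the bottle was the cause, not the content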

An experiment to explore this might be to get some of this "water" into contact
with an alternative shape-influence at the same time as being largely influenced by
normal brain function. The experience would presumably be normal reality with some
artificial content. Electric stimulation of brain areas is getting into the ballpark,
but finding where the puddle of magic water in the ballpark is will require closer investigation,
I think. Also, the probings will cause persistent changes in the "bottle" so it will be a subtle
matter to distinguish water-shape from container-shape.

This is obviously a simplistic metaphor, the trick being to find something better than "water" to
talk about as something that is/can be modulated to create conscious content, and then
to demonstrate its modulation by means other than normal brain processes. IWT it would have
to be by extension of normal consciousness processes in order to have an i/o path for the effects
-- i.e., a site of ordinary consciousness associated with a person who can talk to us about their
experience -- but it will be tricky to demonstrate that e.g., some nano-device is playing the full
role of "bottle" for some part of the total conscious experience, as opposed to merely deforming
the old bottle in some subtle way.

>complete with the same memories, rather like Arnie in the Sixth Day?
>No - not a clone. A clone is at best an identical twin with a
>different age as well as different memories, and identical twins do
>not have identical brains even at birth - there isn't enough
>information in our DNA to give an exact blueprint for the initial
>connections between the neurons in our brains.
>
>But assume a perfect copy of a person, complete with memories, could
>be made. How would it know that it wasn't the same self that it
>remembered from yesterday?
The id() function returns a different number, being a different instance,
even though == says True ;-)
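
To put the quip in concrete terms, a trivial sketch (the dicts and their contents are
made up purely for illustration):

    original = {"memories": ["yesterday", "the day before"]}
    perfect_copy = {"memories": ["yesterday", "the day before"]}

    print(original == perfect_copy)         # True  -- identical "memories"
    print(original is perfect_copy)         # False -- two distinct instances
    print(id(original), id(perfect_copy))   # two different id() numbers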

Seriously, the ego is probably somewhat illusory ;-)

>
>Now consider this...
>
>The brain really does change during sleep. In a way, you literally are
>not the same person when you wake up as when you went to sleep.
The physical state-holder is not identical, but what do you identify with?
The more your body changes as you age, the more you have to recognize that
your body's persistence is more like the persistence of an eddy near a rock
in a stream than the persistence of the rock. What then of your consciousness,
which perhaps is not even associated with the same atoms after some years? IOW,
it looks to me like the important aspect is essentially form, not substance.
>
>More worryingly, the continuity of consciousness even when awake is
>itself an illusion. Think of the fridge light that turns off when the
>fridge is shut - if you didn't know about fridge lights, and could
>only see it when the door is open, you would assume the light was
>always on. Similarly, whenever you try to observe your state of
>consciousness it is inherently on so it appears to be always on and
>continuous, but science strongly suggests that this appearance is
>simply wrong.
This doesn't in the least clash with e.g. the concept of some kind of field-like
basis for experience. It could flicker on and off and flit around. No prob.

>
>So do we have any more claim to our conscious sense of self than this
>hypothetical copy would have?
If the field hypothesis were a useful description of how things work,
then the two identical physical copies would be identical transducers,
and given the identical contact with reality (a little difficult if not
physically congruent ;-) they would presumably give rise to identical
"field effects" hypothetically tracking respective conscious experiences,
however flitty etc.
>
>The fact is that no-one has shown me anything to make me believe that
>we have 'experience' separate from the information processing capacity
>of the brain. So far as I can see, the copy would have as much claim
>to the conscious sense of self, 'continuing on' from prior memory, as
>the original.
Yes, if I understand you correctly. But I am interested in whether we should
be trying to look with sensitive devices for some kind of field/energy processes
that correlate with reported subjective conscious experience. (I.e., the "water"
in the metaphor above.) I feel blocked if I have to limit myself to talking about
"information processing capacity of the brain."

While interesting, IMO it is not on the path to discovering the _basis_ for consciousness
in some measurable events/processes/transformations/relationships/etc. Information processing
is more of a modulator than a medium in my view.

Talking about information processing is like talking about chopping vegetables.
It is only one factor in giving taste to the soup, and the subject is how we experience
the taste of soup. And what we are, that we can have a tasting-soup experience without being soup.

>
>>Now consider the experience of being "convinced" that a theory "is true." What does that mean?
>
>Science suggests that all subjective meaning is linked to either
Who is Science? Some person made some observations and concocted a story about them,
that's my take ;-)

>direct senses or action potentials in the brain. If you think of the
>concept 'democracy', for instance, you may actually generate the
>action potentials for raising your hand to vote (depending on your
>particular subjective understanding of that abstract term) - though
>those potentials get instantly suppressed.
Sounds like a stretch to apply this linkage story that broadly.
>
>In fact a key language area (Broca's area IIRC) is also strongly linked
>to gesture - hence sign language, I suppose.
IMO, language is only limited by your imagination and that of the other in your
communication. That's why we speak of body language, etc. Anything perceptible
can serve, if the two are in tune (~ at the same point in a context-sensitive parse).
The fact that there is generally muscular effort in creating a perceptible signal
should not IMO be taken to mean that our understanding must be limited to things
with associated 'action potentials' ;-)

>
>This makes good sense. Evolution always works by modifying and
>extending what it already has. The mental 'vocabulary' has always been
>in terms of the bodily inputs and outputs, so as the capacity for more
"Always" is a way to prune your thought tree. Careful ;-)

>abstract thought evolved it would of course build upon the existing
>body-based 'vocabulary' foundations.
Ditto about "of course."
>
>I can easily suggest possible associations for the term 'convinced' by
>referring to a thesaurus - 'unshakeable', for instance, is a clear
>body/motion related metaphor.
>
>Or maybe it relates to the body language action potentials associated
>with the appearance of being convinced?
ISTM you are building up a pattern of words, and stirring a lot of
lingual action potentials ;-) But what happens when you stop talking to
yourself? How do you _feel_ when you _feel_ convinced? Lies may be written
with the same ink as truths. And when the squiggles on paper become
nano-squiggles in/amongst your brain molecules and somehow your conscious
experience is affected, what is the difference between what happens when
you are deceived and when you are convinced of a truth?

>
>At which point I'm suddenly having an a-ha moment - maybe the verbal
>and nonverbal communication deficits in Asperger syndrome and autism
>are strongly linked. Maybe a person who is unable to associate an idea
>to its body language, for instance, loses a lot of the intuitive sense
>of that idea. Thus the idea must be learned verbally and the verbal
>definition inherently gets taken too literally.
>
I am afraid I can't follow your thoughts re Asperger syndrome as I don't
know anything about it beyond the label (unless I have it to some
degree, but then I would have to know how to classify my experience as
related to that or not ;-)

>Basically, if a person has to give the concept a new, abstract,
>internal symbol instead of using the normal body-language associated
>internal symbol, then any innate intuitions relating to that concept
>will be lost. The neurological implementation of the intuitions may
>exist but never get invoked - and therefore it will tend to atrophy,
>even if its normal development doesn't depend on somatosensory
>feedback in the first place.
ISTM the brain is fairly adept at selecting metaphor-stand-in players to keep
a show going when there seems to be a need for some new role-player
on the stage. IOW, I suspect that our core ability is not in the linguistics
of stage directions, but in putting on a mental show (sometimes silent mime ;-)
with some useful relation to "reality-out-there."

If some show-production capability has failed to develop (e.g., presenting other
humans as more than cardboard cartoon cliches), I suppose it could be due to some
bit players being locked in their dressing rooms and having starved to death, but
I think I would look for a reason other than a building contractor's mistake to figure out
why the lockup happened. Perhaps one of the bit players early on created a traumatic disturbance
on the stage, and the theatre owner just decided no more of that, perhaps being too young
and inexperienced to manage a frightening crew incident at the time.

Of course, I'm just playing with a metaphor here without knowing anything about how
_your_ theater is run, and just a little about my own ;-)

>
>That could explain some odd 'coincidences'.
>
>Hmmmm.
>
>I think I might post this idea somewhere where it is actually on topic
>;-)
>
>>Is it pain and pleasure at bottom
>
>Nope - the innate circuitry of our brain brings a lot more than that.
I am not sure what to make of your apparent confidence in asserting truths ;-)

>'Pain' and 'pleasure' are actually quite high level concepts, things
If you experience pleasure only as a high level concept, you are missing something ;-)
Ok, re pain, it's nice to be able to withdraw to a conceptual refuge, but what's
going on at the place you are avoiding is not high level, ISTM.

>which we experience as 'good' or 'bad' only because we have the
>information processing machinery that makes those associations.
I think I know it _feels_ good or _feels_ bad without having to have words for it.

OTOH, I do think we can channel different signals to whatever generates the good or
bad feelings (perhaps accounting for some of those tastes for which there is no accounting ;-)

>
>>I think we will find out a lot yet. Beautiful, subtle stuff ;-) Too bad we are
>>wasting so much on ugly, dumb stuff ;-/
>
>'Ugly' and 'dumb' are themselves only subjective perceptions. If you
I disagree with the 'only' part ;-) Subjective perceptions participate in
feedback loops that include effects in the real world. When I say 'ugly' and
'dumb' of course it is from my POV, and related to how I am currently wired
for pain and pleasure, etc. Would that Hitler's (yes it was bound to come up ;-/)
subjective perceptions had affected 'only' his delusional internal world, and
likewise wrt some current personages.

>really want to know the truth, you must accept that it will not always
>be what you want to hear.
Sure, but there is a part of truth that you create by the way you think and
the way it guides you to participate in the world (so hurtful self-fulfilling
delusions in the powerful cause much unnecessary misery). The straight-out mean
predators who have found a way to wire themselves (disconnecting some compassion
circuits) for comfort with that way of being presumably exist too. I don't like
to think of that as normal for evolved humans even though obviously there are relatively
successful models for predatory survival in the animal world (and in human history,
depending on your definition of success). I view it as being stuck in an unenlightened
mode of existence, for humans. As for dealing with bad stuff, it is hard to avoid
"having to do" bad stuff in response I suppose, but IMO attitude is important, and should
shape response, and IWT probably should best be based on trying to understand the quotation,
"Forgive them, for they know not what they do." -- especially since one may not really
know what one is doing oneself ;-)

>
>>>I am also quite skeptical about IA claims.
>>Yes, but AI doesn't have to be all that "I" to have a huge economic and social impact.
>
>I thought IA != AI, though I have to admit I'm not sure what IA stands
>for. Instrumentalist something-or-other?
Don't know. Sorry, I took it for a typo.
>
>As for AI, I'd say I agree. Having a human-style consciousness is not
>necessarily a practical asset for an intelligent machine.
>
Whatever human-style means ;-)

Regards,
Bengt Richter



