AI and cognitive psychology rant (getting more and more OT - tell me if I should shut up)

Stephen Horne steve at
Wed Nov 5 10:24:54 CET 2003

On 3 Nov 2003 21:02:06 GMT, bokr at (Bengt Richter) wrote:
>On Mon, 03 Nov 2003 01:48:23 +0000, Stephen Horne <steve at> wrote:

Right, continuing off from yesterday...

>Well, close again. I was trying to explore the notion of a distinction between
>the "thing" as physical shape holder, and something non-physical that could be given
>a particular shape as a consequence, like an electric or magnetic field in the
>neighborhood. (Or, what if consciousness is a peculiar dynamic thing like lasing,
>something that happens under certain conditions, which as evolution would have it,
>occurs in brains a lot).

My major thought is that you are effectively invoking the god of the
gaps. There is no evidence of this 'field' - you are free to
hypothesise it, but until some kind of evidence is found I am free to
ignore it.

In comparison, there is a huge body of evidence that the brain is an
information processing machine and that consciousness is a function of
the brain.

Even altered states of consciousness have been both measured and
invoked, for instance, in ways that are perfectly consistent with a
neurological information processing basis for consciousness.
Electrically stimulate the temporal lobe in a way that mimics temporal
lobe epilepsy, for instance, and the end result is normally described
as a feeling of being in the presence of god.

Even moderately high levels of dopamine (well below those needed to
cause schizophrenia) are consistently associated with a more
'spiritualistic' outlook. Of course if I claim that any 'spiritualist'
viewpoint is an aberration caused by excess dopamine, the obvious
retort is that I must be suffering from dopamine deficiency - but this
misses the point. The processes that regulate dopamine in the brain
are pretty well understood. Dopamine is simply an aspect of the
information processing that the brain does - a metaphorical cog in the
machine. The spiritualistic outlook is *generated* by the information
processing in the brain.

And if you look at the social evolution of the human species, it is
not hard to believe that spirituality is an innate feature with an
important social purpose.

And as for our valuing consciousness, our valuing ourselves clearly
has an important evolutionary purpose - despite the myth, even
lemmings aren't suicidal.

The perception of self is pretty central within most people's concept
of consciousness, so it is unsurprising that most people see
consciousness as important and valuable.

>>The brain really does change during sleep. In a way, you literally are
>>not the same person when you wake up as when you went to sleep.

>The physical state-holder is not identical, but what do you identify with?
>The more your body changes as you age, the more you have to recognize that
>your body's persistence is more like the persistence of an eddy near a rock
>in a stream than the persistence of the rock. What then of your consciousness,
>which perhaps is not even associated with the same atoms after some years? IOW,
>it looks to me like the important aspect is essentially form, not substance.

And yet consciousness itself changes dramatically, sometimes from
second to second, and we still feel a sense of continuity.

How can we explain this sense of continuity?

One way is to appeal to the 'god of the gaps' - to claim some
currently unknown entity which we cannot yet observe must provide the
real experience of consciousness.

The other way is to look at the pragmatics of social behaviour. A huge
amount of social behaviour depends on that sense of continuity - that
feeling that (me today) == (me tomorrow) == (me yesterday).

Why, for instance, should I return a favor if I believe that I don't
owe anything because the recipient of the favor was, in effect,
someone other than me? How can it be fair to punish the person today
if he is not the same person who yesterday committed a crime? For that
matter, why worry about punishment if the person who will be punished
is not the same person who is committing the crime?

Having a sense of continuity of self is, in other words, pretty
directly linked to the practical information processes involved in
maintaining a reputation and social status.
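As a toy Python analogy of my own (not anything from neuroscience), the continuity intuition behaves more like value equality than object identity - the 'eddy in a stream' point above. Two selves can compare equal by content while being entirely different objects underneath:

```python
# Illustrative sketch only: 'same person' judged by continuity of
# content (value equality), not by sameness of substance (identity).

class Self:
    def __init__(self, memories):
        self.memories = memories

    def __eq__(self, other):
        # Social reasoning cares about continuity of content...
        return self.memories == other.memories

# Different objects ('atoms'), same remembered content ('form').
me_yesterday = Self(["owes Bengt a favour"])
me_today = Self(["owes Bengt a favour"])

print(me_yesterday == me_today)   # True  - same form
print(me_yesterday is me_today)   # False - different substance
```

The point of the sketch: (me today) == (me yesterday) can hold even though no underlying 'thing' persists.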

>>More worryingly, the continuity of consciousness even when awake is
>>itself an illusion. Think of the fridge light that turns off when the door closes.
>This doesn't in the least clash with e.g. the concept of some kind of field-like
>basis for experience. It could flicker on and off and flit around. No prob.

Yes, and so long as you can keep your theory in the gaps and well away
from scientific probing, no-one can prove it wrong, can they?

Whereas the information processing concept is actively being probed as
I type, revealing important clues about how consciousness works on a
regular basis.

>Yes, if I understand you correctly. But I am interested in whether we should
>be trying to look with sensitive devices for some kind of field/energy processes
>that correlate with reported subjective conscious experience. (I.e., the "water"
>in the metaphor above) I feel blocked if I have to limit myself to talking about
>"information processing capacity of the brain."
>While interesting, IMO it is not on the path to discovering the _basis_ for consciousness
>in some measurable events/processes/tranformations/relationships/etc. Information processing
>is more of a modulator than a medium in my view.

OK, so let's consider the implications of this 'field'...

Let's consider the 'alters' in multiple personality disorder. This has
been studied under a brain scanner. When alters switch, the activity
in the hippocampus changes dramatically. If the 'primary' personality
is active, the hippocampus is more active.

In both of our models, the hippocampus does not generate consciousness
itself. The information processing model tends to primarily implicate
the prefrontal cortex. In your model, obviously you invoke the field.

All the hippocampus provides is a set of memories - explicit memories,
to be precise.

An alter can suddenly emerge after years of not being expressed, and
with a perfectly intact sense of continuity of self - as if that self
had just been transported to the future with a time machine. And the
same thing applies to the primary personality when he/she returns,
perhaps days, weeks or longer later - an intact sense of continuity of
self despite the apparent jump in time.

If the brain contains a single 'field' which persists the whole time,
how come there are two (or more) independent personalities with
separate senses of continuity of consciousness?

If there is a distinct 'field' for each personality, how are the two
kept distinct in the same container? You can claim that there is some
awareness of time lost in sleep, but it is much harder to justify an
awareness of time lost in this case. So why are the fields not aware
(in general) of having lost control of the 'host' body, brain
included, for perhaps very long periods of time?

The fields only really make sense if each field arises as the
personality arises, taking the shape set out by the memories that
become active. But think what that means. If the field is created and
shaped by the activity of neurons (and presumably influences the
neurons in turn as a feedback process), but it has *no* independent
existence of its own - then it is nothing more than another cog in the
information processing machine.

Does this really satisfy your need for the mind to be something more
than information processing?

The information processing theory can take MPD in its stride, of
course. If you keep your memories in several self-consistent but
distinct chunks then, even if the same neural machinery handles the
processing, you get a different 'self' by simply switching which sets
of memories are active.
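A toy sketch of that memory-switching idea (the names and data are mine, purely illustrative): the same processing machinery, parameterised only by which explicit-memory store is active, reports a different 'self'.

```python
# Illustrative only: one shared processing routine, several distinct
# self-consistent explicit-memory stores.

MEMORY_SETS = {
    "primary": ["went to school here", "holds down a job"],
    "alter":   ["a different name", "a different history"],
}

def report_self(active_set):
    # Identical 'machinery' for every personality; only the active
    # memory set differs.
    memories = MEMORY_SETS[active_set]
    return "I am the person who remembers: " + ", ".join(memories)

print(report_self("primary"))
print(report_self("alter"))   # a different 'self' from the same code
```

No second 'field' is needed; switching the active memory set is enough to switch the reported identity.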

There remain important questions to answer about how and why MPD
occurs (not the traumatic-experience-etc why but more the question of
if there is an evolutionary benefit to maintaining several distinct
'selves', or whether it is simply a kind of breakdown), but there is
no sign as yet of them being unanswerable.

Basically, as I said before, you seem to be appealing to the 'god of
the gaps'. You find the idea of the mind as an information processor
unpalatable, but - as with many people who claim not to be religious
per se - you don't want to appeal to an actual god or soul. So instead
you appeal to an undiscovered type of physics.

But even if this undiscovered type of physics exists, once discovered
and understood it would just be another piece of the science jigsaw.
Just like quantum computing, it would simply be another way of doing
the information processing.

Can I explain why people aren't comfortable with being a part of
the physical world, and with having minds that are explainable in
information processing terms? Obviously there's the sense of agency
and illusion of free will, but while they might explain a
misinterpretation they don't explain a continuing strong conviction,
so what else?

There is a concept called the 'expanding circle' which relates to who
is and who is not considered a 'person'. I put that in quotes because
it is far from literal - humans are often excluded, and (particularly
in modern times) animals etc are very often included.

Basically, it refers to an intuitive sense of who has rights, who you
can empathise with, etc. When you can torture and kill 'the enemy'
without being traumatised yourself (assuming you are not simply a
sociopath) it is a clear sign that 'the enemy' are outside of your
circle, for instance.

This intuitive sense has clear practical evolutionary value - being
willing to kill others in your tribe without batting an eyelid would
obviously not suit a social species, yet it would be hard to carry out
intertribal warfare if you empathised with your enemies. And this is
complicated by the fact that it seems tribes did make both short and
longer term alliances - you couldn't rely on an individual being a
stranger, but needed to make more flexible classifications.

This isn't the only intuitive classification the mind makes, of
course. There appear to be clear intuitive distinctions between
animals, plants, and inanimate objects for instance. These
distinctions seem to be innate and instinctive, though there is
flexibility in them.

If these are general intuitive principles, it is no surprise that when
you introspect you find it hard to accept that your mind is not
distinct from an information processing machine. Your mind naturally
classifies your self in the intuitive category of 'person'.

Basically, it would be surprising if most people didn't resist the
idea of being, in principle, little different to a computer.

As for me, well perhaps those classification systems don't work so
well in people with autism spectrum disorders. Maybe that is why we
are overliteral in applying moral rules as well as other areas - we
don't really have the same intuitive sense that everyone else has of
the differences in how people, animals and inanimate objects should be
treated.

Maybe that is a part of the 'theory of mind' issue.

So when I introspect, it doesn't particularly bother me to see myself
as a (broken) machine designed and built to carry genes from one
generation to the next, and neither does it particularly bother me to
see my mind and consciousness as a product of information processing.

>Talking about information processing is like talking about chopping vegetables.
>It is only one factor in giving taste to the soup, and the subject is how we experiece
>the taste of soup. And what we are, that we can have a tasting-soup experience without being soup.

In what way is 'experience' not information processing?

I know, I know. There is all this information being integrated from
senses, working memory, associations from longer term memory,
emotional colouring etc, but 'the experience itself must be something
else' as the claim typically goes.

But of course you are aware of 'experience' - that's just 'higher
order perception'. The brain is perfectly capable of several 'meta's
in front of 'experience', 'thought' or whatever.

To put it another way, when looking at a picture you may feel that you
are aware of the whole picture, but you are not. Big changes can be
made to the picture without you noticing, as long as they don't affect
the precise detail that you are focussing on at that precise moment.
It is called change blindness.

A big part of the sense that something is 'conscious' is actually
illusory in itself - things that can be immediately brought into
consciousness tend to be classified by the mind as currently conscious
because, in the practical sense, they may as well be. So with change
blindness, you 'experience' the whole picture mainly because you
intuitively know that you can focus on any part of it in an instant.

The same applies to the soup, really. At the moment you may only be
focussing on the texture, or the temperature, or for that matter the
fly you just spotted in your bowl - but there is a higher level
perception that you are having the 'whole soup experience' because you
unconsciously know that you can tune into any aspect of it at will.

The 'whole soup experience' is basically an illusion, or maybe a
'collective noun' for the lower level perceptions that you had (or
didn't have) while eating the soup.
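In Python terms (my analogy, not the original post's), the 'whole soup experience' works like lazy evaluation: the whole feels present because any detail can be fetched on demand, even though only the attended details are ever actually processed.

```python
# Toy analogy: the 'whole scene' feels experienced because any part
# can be computed on demand - like a lazily evaluated mapping.

class LazyScene:
    def __init__(self, detail_source):
        self._source = detail_source   # the world itself
        self._attended = {}            # what has actually been processed

    def focus(self, region):
        # Details are only computed when attention lands on them.
        if region not in self._attended:
            self._attended[region] = self._source(region)
        return self._attended[region]

scene = LazyScene(lambda region: "details of " + region)
scene.focus("centre")

# Only one region has ever been processed, yet any region *could* be:
print(len(scene._attended))   # 1
```

Change blindness is then exactly what you'd expect: unattended 'regions' were never computed, so changes to them go unnoticed.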

BTW - what is this thing about 'without being soup'? If someone
claimed to 'be one with the soup' I would suspect them of having a
rather odd case of altered consciousness. The 'being one with
everything' sense seems to be an exaggerated case of something quite
ordinary, by the way.

Pick up a pencil and run it over a rough surface. Most people will
quite quickly start experiencing the bumps as being at the tip of the
pencil, as if the pencil were an extension of the body.

Phantom limbs also have some significance to this.

Basically, a part of our sense of self is a sense of the boundary
between ourselves and the rest of the world. Normally this matches
pretty well with our bodies, but there is a degree of flexibility to
put the bounds where they are needed, and sometimes things just plain
go wrong.

The experience of tasting soup is certainly more than just the result
of the input from the taste buds, of course. There are other senses
involved, there are emotional associations and associations from
memory - the whole soup-eating experience. 

Basically, there is nothing here which I don't see as information
processing.

>>>Now consider the experience of being "convinced" that a theory "is true." What does that mean?
>>Science suggests that all subjective meaning is linked to either
>Who is Science? Some person made some observations and concocted a story about them,
>that's my take ;-)

You might like to read some Rita Carter stuff - this is my easiest
reference because, as I mentioned before, I'm currently reading her
book "Consciousness".

The specific field of science, in this case, would be neuroscience.

>>In fact a key language area (Brocas area IIRC) is also strongly linked
>>to gesture - hence sign language, I suppose.

>IMO, language is only limited by your imagination and that of the other in your
>communication. That's why we speak of body language, etc. Anything perceptible
>can serve, if the two are in tune (~ at the same point in a context-sensitive parse).
>The fact that there is generally muscular effort in creating a perceptible signal
>should not IMO be taken to mean that our understanding must be limited to things
>with associated 'action potentials' ;-)

Body language is quite distinct from the kind of gesture that occurs
in sign language. As I understand it, most body language is not
generated by Broca's area.

Also, the use of muscles isn't what this is about. Even when you hear
a word, or think about the concept, your brain generates the
associated action potentials - but inhibits the actual muscular
action.

>>This makes good sense. Evolution always works by modifying and
>>extending what it already has. The mental 'vocabulary' has always been
>>in terms of the bodily inputs and outputs, so as the capacity for more

>"Always" is a way to prune your thought tree. Careful ;-)

It is also very much the right word in this case. Sophisticated useful
abilities in complex organisms do not arise through fluke mutations,
much as broken fragments of china do not suddenly join themselves into
a perfect vase and leap up onto your wobbly shelf - that's the second
law of thermodynamics at work.

>>abstract thought evolved it would of course build upon the existing
>>body-based 'vocabulary' foundations.
>Ditto about  "of course."

Evolution doesn't throw away what it already has and start again from
scratch. It seeks local maxima - it cannot anticipate long term
payoffs.

Assuming that the initial 'mental vocabulary' was spelled using
body-based 'hieroglyphs', then it is very hard to believe that
evolution would ignore this foundation. And besides, as I already
mentioned, the body-based vocabulary is still alive and well in modern
human minds.

>How do you _feel_ when you _feel_ convinced?

I ask the same of you - how do _you_ feel when _you_ feel convinced
that there is something more than information processing?

As I mentioned already, I don't have an emotional reaction to the idea
of my mind being an information processing machine. I mostly see the
science as a means of understanding myself better and trying to solve
some very real problems that I have.

But I have a number of reasons for expecting most people to have quite
a strong emotional reaction against the idea of their minds being
information processors. Reasons that themselves arise out of the
information processing model.

So who is being more objective? The person who looks to the theory
that is currently generating interesting results and new ways to look
at things by the bucket-load? Or the person who looks to an
'undiscovered' idea as a way to avoid accepting ideas he finds
unpalatable?

>>Basically, if a person has to give the concept a new, abstract,
>>internal symbol instead of using the normal body-language associated
>>internal symbol, then any innate intuitions relating to that concept
>>will be lost. The neurological implementation of the intuitions may
>>exist but never get invoked - and therefore it will tend to atrophy,
>>even if its normal development doesn't depend on somatosensory
>>feedback in the first place.
>ISTM the brain is fairly adept at selecting metaphor-stand-in players to keep
>a show going when there seems to be a need for some new role-player
>on the stage.

Of course. That seems to me to be key to the body-based mental
vocabulary idea. How else could abstract terms be represented but as
metaphors?

For instance, just because my mental spelling of 'democracy' involves
putting up my hand to vote, doesn't mean I'm not going to put a tick
in the box (or push the button on the voting machine or whatever) when
I'm in that booth.

But consider how you look in a dictionary. It's hard to find the right
word if you don't know how it is spelled.

Now consider what happens if innate concepts are stored in such a
'dictionary', dependent on using the standard 'spellings' for the
mental concept to look them up.
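A minimal sketch of that 'dictionary' point (the keys and values are hypothetical, purely for illustration): if innate intuitions are keyed by the standard body-based 'spelling' of a concept, an idiosyncratic internal symbol simply fails the lookup.

```python
# Hypothetical sketch: innate intuitions keyed by the standard
# body-based 'spelling' of a concept. A non-standard internal symbol
# means the intuition exists but is never invoked.

INNATE_INTUITIONS = {
    "raise-hand-to-vote": "fairness intuitions about democracy",
}

def lookup(mental_spelling):
    # Wrong 'spelling' -> None: the entry is there, but unreachable.
    return INNATE_INTUITIONS.get(mental_spelling)

print(lookup("raise-hand-to-vote"))   # the intuition is triggered
print(lookup("abstract-symbol-42"))   # None - never invoked
```

Which is the atrophy scenario above: the neurological implementation may exist, but a concept filed under the wrong key never reaches it.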

> IOW, I suspect that our core ability is not in the linguistics
>of stage directions, but in putting on a mental show (sometimes silent mime ;-)
>with some useful relation to "reality-out-there."

Of course. I wouldn't expect the mental vocabulary to be spelt in
verbal terms (speaking action potentials, or sound-of-word) except as
a last resort, when it is too abstract to assign a 'metaphor'. And I
never said that the mental 'words' are organised into sentences - that
is almost certainly pushing the metaphor way too far. I meant
'vocabulary' in the sense of a set of symbols manipulated in an
information processing machine.

>Perhaps one of the bit players early on created a traumatic disturbance
>on the stage, and the theatre owner just decided no more of that, perhaps being too young
>and inexperienced to manage a frightening crew incident at the time.

If that's a metaphor for the causes of autistic spectrum disorders, it
is an extremely dated one. Autism isn't caused by traumatic
experience. It is a neurological disorder, with an extremely large
degree of genetic causation. There is a significant environmental
causation, but emotional trauma at least isn't it.

The trauma comes later, when autistic symptoms wrongly convince others
that the person is rejecting them, is being deliberately disruptive,
is untrustworthy etc etc, and when people react in turn by rejecting
the autistic person.

Autistic symptoms can typically be seen when the child is three or
even younger - when the brain is toward the end of its 'initial
development', and particularly when the prefrontal cortex comes fully
online.

The trauma doesn't typically start until school age, though
occasionally it can start earlier when doctors jump to conclusions
about abuse and separate the child from the environment and people
that he/she is reliant upon, destroying the sense of predictability
that autistic children need.

>>>Is it pain and pleasure at bottom
>>Nope - the innate circuitry of our brain brings a lot more than that.
>I am not sure what to make of your apparent confidence in asserting truths ;-)

This is simple fact. But I'm tiring of explaining everything etc.

Start with Rita Carter's books 'Mapping the Mind' and 'Consciousness'.

Steven Pinker's books (particularly 'How the Mind Works') are also IMO
very good, with the possible exception of 'Words and Rules', which on
first reading droned on far too long and didn't seem that relevant to
a wider understanding of the mind anyway - though I may go back and
re-evaluate that idea now.

Joseph LeDoux's books (The Emotional Brain and Synaptic Self) are
simply essential reading if you are serious about understanding how
the brain works.

There is a BBC book-of-a-series called 'Human Instinct' by Robert
Winston which may be worth reading, but keep your critical faculties
operating - he doesn't seem to be an expert himself, and he
occasionally seems to miss the point of the theories he's describing.

My first-choice textbooks to have around are...

Neuroscience - Exploring the Brain
  Mark F. Bear, Barry W. Connors, Michael F. Paradiso
  ISBN 0-7817-3944-6 (2nd edition)
  ISBN 0-683-00488-3 (1st edition)

  Obviously concentrates on the neurology, but nevertheless essential
  reading if you ask me - the second edition really is much improved
  over the first, too.

Social Psychology
  Robert A. Baron, Donn Byrne
  ISBN 0-205-34977-3 (10th edition)
  ISBN 0-205-31131-8 (9th edition)

  Definitely the one I trust most in social psychology, and truly
  fascinating. Not a great deal has changed between editions, but I'd
  pay far more for far less when it's this good.

Cognitive Neuroscience - The Biology of the Mind
  Michael S. Gazzaniga, Richard B. Ivry, George R. Mangun
  ISBN 0-393-97219-4 (1st edition)

  I'm in two minds about this one. It is certainly very interesting,
  but the cognitive and neurology levels are often quite weakly tied
  together. Probably an artifact of this being a relatively new field,
  and the two source perspectives being less than perfectly welded.

  Great for understanding the prefrontal cortex in particular, though.
  I found the stuff on executive function particularly enlightening.

Psychology - The Science of Mind and Behaviour
  Richard Gross
  ISBN 0-340-64762-0 (3rd edition)

  There is some good stuff in here, though there is obviously overlap
  (and in general less detail) with the more specialised fields.
  Pretty sure this has been updated.

Cognitive Psychology
  Michael W. Eysenck, Mark T. Keane
  ISBN 0-86377-551-9 (4th edition)

  Essential to have around, but not the most interesting. That is
  probably unfair, though - I am very sceptical of pure cognitive
  theories that have no link back to neurology.

  There is (at least) a fifth edition now.

Abnormal Psychology
  Ronald J. Comer
  ISBN 0-7167-4083-4 (4th edition)
  ISBN 0-7167-2494-4 (2nd edition)

  The second edition was excellent, and still stands as my first point
  of reference for abnormal psychology stuff - though sadly its stuff
  on autistic spectrum disorders is pretty poor. The fourth edition is
  updated a bit here and there, but the main change seems to be to
  remove as much detail as possible.

With social psychology and abnormal psychology in particular, there
are plenty of other choices and it's always good to have more than one
perspective.

Obviously there is a big dose of appealing to authority in this list,
but if you are really interested in what science is revealing about
the mind and you can spare the money and (much more significantly) the
time, there is a lot of good stuff in that list. Obviously the
textbooks would wait until you're *really* taking it seriously.

I'll be very brief with the rest...

>>'Pain' and 'pleasure' are actually quite high level concepts, things
>If you experience pleasure only as a high level concept, you are missing something ;-)

I think you're confusing 'high level' with 'abstract'. Emotional
experience is very high level. As Joseph LeDoux is keen on pointing
out, our experience of emotions is not the same as the processing that
generates them. Most of the processing is relatively low level, in the
limbic system and brain stem, but that is not the same as the
experience of emotion which occurs in the prefrontal cortex.

The prefrontal cortex does seem to be the center of rational thought
and planning, but that is a long way from being its whole job.

You will find it hard to find someone who knows more about neurology
than LeDoux - if your first impression is that his books are pop
science, try counting the references to his papers in the textbooks.

Which is not to say that his books aren't readable, I must emphasise.

Anyway, it is quite possible that I am missing something in
pain/pleasure terms - but how can I know for sure? For all I know,
everyone feels pain the same as me but you're all a bunch of whiners
and crybabies ;-)

>>'Ugly' and 'dumb' are themselves only subjective perceptions. If you
>I disagree with the 'only' part ;-)

OK - I take the point.

Steve Horne

steve at ninereeds dot fsnet dot co dot uk
