[Serious Phil] The Meanings of "Consciousness"
SWMirsky at aol.com
Mon Feb 20 10:45:42 CST 2012
--- In Phil-Sci-Mind at yahoogroups.com, "andy_kappa" <Philscimind at ...> wrote:
> I use the word 'mental' in contrast to my use of the word 'physical'. Additionally I use the word 'mind' without being hoodwinked into thinking there is a referent. The upshot is that I don't understand the connection between 'consciousness' and 'mind or mental'. I use the word 'conscious' in a variety of ways, but are we being similarly hoodwinked into thinking that the word 'consciousness' has a referent?
Why do you think there's no referent for "mind" or "consciousness" just because there's no discernible entity that corresponds with it?
Does "Yale University" have no referent? Does a named atomic particle (like a proton or quark) have no referent? Do verbs (names for actions) have no referents? Does the baseball game played on such and such a date in such and such a place by such and such teams not have a referent by which we can determine whether the phrase picks something out in the world (is true) or not?
Perhaps you are construing "referent" so narrowly as to exclude such things but then we are still faced with the fact that we DO refer to such things in many contexts, that referring works in those contexts. If we CAN refer, and be deemed correct or incorrect in doing so, then there must be a referent that we have in mind and mean to convey to others by the act of referring.
If "mind" and "consciousness" are such words, as I think they are, then the fact that their referents aren't like the referents of "rock" or "tree" or "baseball" does not mean they are non-existent. Must all existents be physical (delineated in terms of discernible sensory parameters)?
Anyway, I think the issue ought not to be whether "mind" picks out a referent at all but whether it picks out an entity-like referent, and, if it doesn't, what kind of referent is involved?
> Also, I find difficulty in any explanation of the meaning of 'consciousness' by appeal to 'experience' since I don't know how the uses of these two words differ (as they are being used here). Similarly for 'awareness'.
"Consciousness" is a hard word to nail down, hence Minsky's calling it a "suitcase word," which I think is largely correct. Chalmers makes a point along these lines by noting that we often seem to have both observable and private phenomena in mind as referents for the words we use for the mental sphere. I think that's also sensible. Sometimes by "consciousness" all we mean is the behaviors of wakefulness which we observe in others somewhat like ourselves, while other times we mean the various phenomena we notice in ourselves when we introspect (e.g., having perceptions in particular or in general, being aware of particular aspects of something in our line of sight, recalling a certain moment or feeling, forming an idea, working through a problem, feeling a wave of emotion, grasping a symbol's meaning).
In discussions like this it's important to get clear on what we mean in the relevant case(s) and then to agree to use the same meanings in the same contexts and so forth. Otherwise we can quickly bog down in dueling examples which seem to contradict one another, and that can only add to confusion.
> Is Searle using the words 'consciousness' and 'understanding'
> synonymously? That would be strange.
Searle mostly avoids reference to "consciousness" (though not exclusively so). Most of the time he prefers to make his case through various verbal proxies. For instance, the Chinese Room Argument focuses on understanding as in "understanding Chinese" vs. non-comprehending symbol manipulation (what he calls "formal" or "syntactical" operations). But Searle never really elucidates what he thinks understanding consists of except to say that it amounts to "semantic" content which the person who understands Chinese grasps when he or she sees the Chinese ideograms (symbols) but which the individual who has learned and is following rules for manipulating the symbols doesn't.
Searle's argument trades on the intuition that something is present in understanding that is absent when mere symbol manipulation (which he designates "syntax") is all that's happening. But he never essays to tell us what it is -- or even seems to recognize the importance of doing so. By omitting an account (or even the effort to give such an account) of this, he makes his argument using an inadequately explicated term.
> When I say that I'm conscious of something, I mean something quite different to when I say that I understand something.
Absolutely. In the case of Searle's CRA, he is working with the proxy term "understanding," though in at least one of his books, The Mystery of Consciousness, he explicitly connects the matter of understanding, in terms of its absence in the CRA, to consciousness.
Stevan Harnad argues that what Searle has flagged in invoking this intuition is a feeling or what Harnad calls "feltness" which is present in every instance of understanding but which even the "smartest" computer system must invariably lack. But Harnad does no better at explicating what his "feltness" consists of than Searle did in explicating "understanding", leaving us back in the same place: that these aspects of consciousness are inherently mysterious and irreducible to something more basic than themselves.
This just shows that the word "consciousness" is one of the particularly slippery ones in our language. But that doesn't mean that we can't talk about it -- if we can come to agreement concerning what the term means in the context of our discussion.
On the other hand, if we keep slipping meanings, if we slide from "consciousness," qua being generally aware of things, to "consciousness" as being aware of some particular thing in a field of which we are generally aware, or, alternatively, "consciousness" as being a certain class of brain waves discernible in brains or certain kinds of eye movements discernible in a subject, then we are going to run into constant trouble.
By "consciousness" in the present context, ALL I mean is those features of our private or subjective experience that we recognize in ourselves in the course of introspection and which we take others to have based on behavioral criteria.
We can and should recognize all the different uses but we have to keep them distinct for a meaningful discussion to go on.
(Sean raised the question to me off-line re: what it was that I objected to in Searle's CRA and I answered and he put that on this list to presumably spark discussion. I'm prepared to join in but I want to be clear from the outset as to my meanings so that any discourse we have will not be clouded by ongoing meaning slippage.)
> So the claim that Searle's CRA "assumes that consciousness is not reducible to anything not itself conscious" is still quite opaque to me.
Not surprising since my summary, which Sean posted here, did not fully explicate the four points on which I have criticized his CRA. The issue you're troubled by hangs on this:
In order for Searle to argue (as he does in the CRA) that no possible configuration of the constituent elements of the Chinese Room (CR) can do what the CR cannot, itself, do, he has to take the position that the absence of any evidence of understanding (his proxy for consciousness in this case) in ANY of the constituent elements is evidence for THAT conclusion.
But that means he must be arguing that understanding (the proxy for consciousness in this case) is an irreducible something, i.e., that it is not the product of anything else that is not, itself, already an instance of understanding. If that were not the case, then there would be no reason, in principle, to assume that the absence of understanding in the CR implies anything at all about the underlying constituent elements themselves. But this view, that understanding cannot be present in a system if it isn't there in one or more of the system's constituents, is fundamentally dualistic, i.e., it assumes that whatever understanding is, it cannot be reduced to anything else in the universe. (By "reduced" I mean explained AS the function of something that is not, itself, an instance of understanding.)
My own view is that what we call understanding (the proxy here for consciousness) may be adequately understood and explained as a complex phenomenon and not as an irreducibly simple one. Thus it can be seen as a SYSTEM-LEVEL feature rather than as a feature of some system constituent(s). In THAT case, there would be NO reason why the fact that there's no understanding in the CR should be taken to imply anything about any other configuration of the same type constituents.
Of course it doesn't guarantee that a more complex system of some particular type would succeed. But it doesn't have to. Searle's CRA only argues that a computationally based understanding would be impossible.
My argument against the CRA is that it doesn't support that conclusion at all (because of the four basic mistakes which I've alluded to in the e-mail which Sean posted here to start things off).
> --- In Phil-Sci-Mind at yahoogroups.com, Philscimind@ wrote:
> > "consciousness" = what we usually mean by "mind" or "mental", etc.
> > Specifically, in a discussion like this, it means the various phenomena or
> > events we think of as our mental lives.
> > In Wittgensteinian terms, it means the private sphere of our experience,
> > that which occurs TO us which is not shareable by replication in others
> > (though they may have similar or even comparable experiences of their own). We
> > can and do refer to it via language, describing it and, sometimes, through
> > the media of the arts, we endeavor to evoke it in others.
> > It does NOT mean having a particular sort of experience as in raising
> > someone's "consciousness" in the sense used by Marx or Marcuse, say.
> > It does mean the state or condition of experiencing, itself, i.e., of
> > having experience.
> > Searle's reference generally is to minds though he often substitutes
> > "consciousness" in the sense I've suggested above. Dennett DOES generally use
> > the term "consciousness" in lieu of mind since the former term seems to be the
> > more generic one.
> > Another way to get at this is to think of it as what anyone means by
> > "awareness" as in the condition of being aware, having awareness.
> > Marvin Minsky calls it a "suitcase word" without any fixed meaning, but
> > that actually applies to lots of words and is perhaps better understood in
> > terms of Wittgenstein's notion of family resemblances.
> > "Consciousness" just happens to have a lot of softness in its particulars,
> > no doubt because the very nature of the referents it picks out are
> > generally of a non-public nature, again reflecting the very Wittgensteinian insight
> > that language breaks down when we try to apply it to the private sphere
> > (as in his point about the impossibility of private language).
> > SWM