How to Read a Mind

We think we know our own minds - are we deluded?

In Julian Barnes’s novel Staring at the Sun, teenage Jean Serjeant is struck by the firmness of her parents’ moral views. Their opinions seem to her like ‘honking frogs’ compared with her own ‘twitching, vulnerable tadpoles’. How can people be so sure of what they think, Jean wonders: ‘How could you know your own mind without using your mind to discover your mind in the first place?’ It seems almost circular, putting Jean in mind of ‘a dog circling in pursuit of its own cropped tail’.[1]

Jean’s questions provoke further questions. How do we discover what we think? If we must use our minds to discover our minds, can we make mistakes about them? Can you be wrong about what you think, just as you can be wrong about what somebody else thinks? I express liberal views on most political and social issues, but can I be sure I really believe the things I say? Perhaps I just say them to fit in and win my friends’ approval.

The suggestion that we might make mistakes about our own minds runs against common sense. We assume that our minds are, as it were, transparent to us -- that we can tell what’s in them directly and infallibly. Yet there are reasons to doubt that common sense is right about this. It is possible to have a thought without knowing that you have it. Infants and non-human animals have beliefs (for example, that Daddy is close by or that there is food on the table) without knowing that they have them. They have beliefs but do not have beliefs about their beliefs. Some further process is required (some ‘use of the mind’) to gain that self-knowledge, and the process might not always be reliable.

Moreover, there is experimental evidence that we do in fact make mistakes about our own minds. It is well established that people’s choices can be influenced by factors of which they are not consciously aware. For example, when offered a choice of identical items, people tend to opt for the one furthest to the right. But if asked why they chose that item, they do not mention its position; instead they offer some plausible reason, such as that they thought it was the best quality. Similar effects have been observed in other experimental situations. It seems that when people do not know why they performed an action, they unconsciously confabulate an explanation for it, attributing to themselves mental states they do not really have.[2]

So how, then, do we get to know about our own minds? Let’s start with a related question: How do we get to know about other people’s minds? Sometimes, of course, people tell us what they think, but we can also infer a person’s mental state from nonverbal evidence (and, indeed, we sometimes use this evidence to discount what they tell us, as when we suspect deceit). We know how situations typically affect people’s minds (that people perceive certain aspects of their surroundings and form corresponding beliefs about them) and how their minds affect their behaviour (people try to get things they want, avoid things they dislike, and so on), and we are skilled at detecting signs of emotion and mood. Indeed, many psychologists believe that the human mind has a special system for working out people’s mental states - a ‘mindreading’ system. As social animals, we need to know what others think and want so that we can anticipate their behaviour, co-operate with them, and avoid being tricked, and there would have been strong evolutionary pressure for the development of a special mental subsystem dedicated to the task. (Studies of the development of mindreading abilities in infants also provide strong evidence for the existence of such a system.) This system, it is claimed, operates rapidly and unconsciously, generating beliefs about the mental states of people we encounter.[3]

How is this relevant to knowing our own minds? Well, there is a strong case for thinking that self-knowledge is produced by this same mindreading system - a case made by the philosopher Peter Carruthers in his 2011 book, The Opacity of Mind. Carruthers argues that in time our ancestors started to apply their mindreading abilities to themselves, forming beliefs about their own mental states as they did about other people’s, and that this is the basis for our self-knowledge. However, since the mindreading system was originally designed for understanding other people, it does not give us direct access to our own thoughts, and we must infer them from observations of our circumstances and behaviour, interpreting ourselves just as we interpret others. If we are more knowledgeable about our thoughts than about other people’s, this is simply because we have more evidence to go on, including that provided by experience of our own bodily states and of the words and images that pass through our conscious minds (sensory states like these are directly available to the mindreading system). This theory nicely explains why we sometimes misattribute beliefs to ourselves. When we confabulate reasons for our actions, we are reporting self-interpretations unconsciously generated by our mindreading systems.

If this account is on the right lines, then there is a moral in it for us. We may be less self-aware than we think - and more hypocritical. Perhaps I don’t really believe all the fine sentiments I express. My mindreading system hears me say them and interprets me as believing them, but maybe the system gets it wrong. How can I tell? How can I check the accuracy of my own mindreading system? (Recall Jean’s image of a dog chasing its own tail.) Must we accept that we can never really know what we think? That would be a depressing conclusion. But I don’t think it follows. For I can at least improve the reliability of my beliefs about myself. I just need to give my mindreading system more data, by observing myself more carefully. Is my behaviour really consistent with the beliefs I profess? Do I sometimes speak or act in ways that belie them? Do I sometimes feel uncomfortable when expressing my views, and do I express them with the same firmness in private as in public? Paying attention to these things should help my mindreading system to generate a more accurate interpretation of my own mental state, either confirming that I genuinely believe what I say or revealing the presence of some doubt or insincerity. In short, we can improve our self-knowledge by being less complacent and more self-observant.

Why, then, are some people so sure of what they think, as Jean’s parents are? Part of the answer, of course, is that most of us are unaware that our self-knowledge is fallible; we assume that we have transparent access to our own minds. (Indeed, Carruthers argues that this assumption is built into the mindreading system itself; in order to simplify its calculations, the system assumes that people know their own mental states, even if they sometimes conceal them. It is, as it were, programmed to ignore its own existence.) I suspect that some of our social practices also foster this self-assurance. In Western society, at least, there is a strong tradition of public argument and debate, often of a confrontational kind. We are encouraged, and even expected, to have clear views on social, moral, and political questions and to be prepared to state them in public. People who ‘know their own minds’ and assert their opinions forcefully are listened to and admired. It wouldn’t be surprising if this environment often induced us to express our views in a stronger and less qualified form than we would otherwise do. This may then lead our mindreading systems to interpret us as strongly convinced of the views we express, which may in turn incline us to assert them even more forcefully, and so on, in a self-reinforcing cycle. In this way, our culture may encourage complacency, rapidly boosting twitching tadpoles into honking frogs -- with negative consequences for the quality of public debate.

Jean’s questions are salutary, then. Reflecting on them and studying the mechanisms of self-knowledge will certainly help us to understand ourselves better, and it may mitigate the negative effects of our culture of debate, encouraging self-scrutiny, reducing complacency, and quietening the frogs a little.



[1] Barnes, J. (1986), Staring at the Sun (Knopf), p. 20.

[2] For more details of these studies, see Carruthers, P. (2011), The Opacity of Mind (Oxford University Press), chapter 11.

[3] For a review of the evidence for a mindreading system, see Carruthers (2011), chapter 8.
