Chapter 6

Consciousness

Consciousness is one of the most important mental phenomena of the mind. It can rightly be considered the supreme mental phenomenon because it makes us sentient in a special way that differentiates us from present-day artificial intelligence (AI). Without it, no matter how intelligent we were, we would be just robots of some kind, roaming this planet without awareness and experiences of what it is like to be ourselves, to see colorful flowers, to hear melodious music, to feel a tender touch, to feel happiness, and so on in our minds.

However, consciousness is a term that has various meanings [1–21], and the meaning used above is just one of them. This theory investigates consciousness only in this meaning to answer questions about it. First, this chapter clarifies what consciousness in this meaning is and how it relates to consciousness in other meanings. The next chapter resolves the questions of its nature: what it is physically and ontologically, why and how it occurs, and other related questions.

6.1 Consciousness: Definition

If our mind is examined while we are not engaged in any specific activity, it will be found to have many mental processes going on concurrently as resting-state mental processes—for example, thinking casually, experiencing a light mood, and perceiving trivial sensations (such as non-exciting visions, soft sounds, and light touches) from the uneventful surroundings. When there is a strong enough stimulus, either external (a flash of light, a loud call, a needle prick, etc.) or internal (a popping-up thought of an urgent task, a sudden reliving of a past exciting incident, a pang of sorrow, etc.), our attention will be directed to that stimulus, and our concentration will focus on it. In addition, the mind will become less aware or even unaware of the resting-state mental processes that are unrelated to the stimulus. For example, when exciting news appears on television and we turn to watch it, we may become less aware or unaware of the sounds in the surroundings, not register pain in our backs, and not feel any emotion that we felt before.

Thus, normally, when we are awake, many mental processes occur in a dynamic state—some become dominant for some time, and some become marginal or even disappear for some time. However, whether we are doing nothing or concentrating on something, some mental processes are always at work if we are awake and in a normal condition. One of them functions to be aware of and experience mental phenomena (such as images, sounds, and emotions) with awareness and experiences of what those mental phenomena are like (such as what the image of flowers is like, what the sound of music is like, and what the feeling of happiness is like) occurring.

Because this mental process functions to be aware of and experience mental phenomena with awareness and experiences of what the mental phenomena are like occurring, this mental process functions to be consciously aware of and consciously experience mental phenomena (see Section 3.1), and the awareness and experiences of what those mental phenomena are like that occur are called conscious awareness and conscious experiences, respectively (see Section 3.4). Accordingly, in this theory, this mental process will be called the consciousness mental process and can be defined as:

Definition: The consciousness mental process is the mental process that functions to be consciously aware of and consciously experience mental phenomena.

Equivalently, it can also be defined as:

Definition: The consciousness mental process is the mental process that functions to create conscious awareness and conscious experiences.

However, under normal conditions, the mind is consciously aware of and consciously experiences several mental phenomena (such as images, sounds, touch, emotions, and thoughts) simultaneously. Hence, under normal conditions, several kinds of conscious awareness and conscious experiences occur concurrently. Therefore, under normal conditions, the consciousness mental process functions to create a composite of all conscious awareness and conscious experiences. If we define consciousness as what the consciousness mental process functions to create, then consciousness is the composite of all conscious awareness and conscious experiences. This is its definition in this theory:

Definition: Consciousness is the composite of all conscious awareness and conscious experiences. 

According to the definitions and discussions of conscious awareness and conscious experiences in Section 3.4, the above definition means in detail that consciousness is the composite of all awareness and experiences of what qualia are like. Equivalently, it means that consciousness is the composite of awareness and experiences of what all occurring qualia are like. In real life, this means that consciousness is the composite of awareness and experiences of, among other things, what an image of a house, a sound of a song, an odor of a flower, a feeling of happiness, and a thought of one’s self are like. Since qualia are representations of things in our lives (see Section 3.7), in plain language, consciousness is the composite of awareness and experiences of what the things in the outside world are like in their various representations (image, sound, smell, taste, touch, etc.). In this theory, when conciseness is required, this last description will be shortened to “consciousness is awareness and experiences of what things are like.”

Also, because the composite that is consciousness comprises only conscious awareness and conscious experiences, to be concise, the shorter terms—the composite, conscious awareness and conscious experiences, and conscious awareness and experiences—will be used (instead of the full, longer term: the composite of …) interchangeably to mean consciousness in general discussions about it. Which term will be used depends on the aspect being focused on at that point. Also, when we investigate consciousness in a specific situation, such as perception of an image, sound, or smell individually, the term consciousness of or (be) conscious of, such as consciousness of a red color quale, consciousness of a musical sound quale, or (be) conscious of a flower smell quale, will be used instead of the base terms consciousness and conscious to denote this specificity. Consciousness of in this usage is just conscious awareness and a conscious experience of that quale; it is not consciousness in general, which is the composite of all conscious awareness and conscious experiences. 

Finally, it is important to note that consciousness by this definition involves phenomenality or manifestation of what itself is like in the mind in two aspects:

     I. Because awareness and experiences of what qualia are like (what an image of a house, a sound of a song, an odor of a flower, etc. are like) occur when we are conscious of those qualia, it means that awareness and experiences of the phenomenality of those qualia occur when we are conscious of them.

     II. Because we can be consciously aware of and consciously experience what our consciousness is like, our consciousness itself is consciously experienceable and manifests what itself is like in the mind; thus, consciousness itself is a quale and has phenomenality.

Sometimes, to emphasize the involvement of phenomenality, the term phenomenal consciousness [2,4,8–10,12,15–18,22–37] will be used, but it means the same as consciousness.

6.2 Other Definitions of Consciousness

The definition of consciousness in the previous section is just one definition among several others. Consciousness is an artificial concept. The mind comprises numerous interrelated mental processes and has various aspects. It is arbitrary for anyone to choose, according to his or her opinion, which aspect to focus on and which one or ones of those mental processes should be called or grouped together to be called “consciousness” [7]. This results in the term “consciousness” having varied meanings and referring to related but not identical entities. Hence, to avoid confusion in the following discussions, which involve consciousness of varied definitions, consciousness (regular typeface) is used in this chapter to represent a term that is used generally and can have varied meanings, whereas consciousness (bold and italics) is used to denote a term that has a fixed meaning as defined in the previous section. Their adjective forms are conscious and conscious, respectively. In other chapters, only consciousness (regular typeface) is used to avoid difficulties in reading; most of the time, it refers to consciousness as defined by this theory, but sometimes it refers to consciousness in other meanings, depending on the context.

     Some of the most common meanings of the terms consciousness and conscious are as follows:

1. A creature’s state of being awake, responsive, and capable of performing mental functions. For example, he lost consciousness in a car accident and was unresponsive afterward, with no eye opening, vocalization, or motor response to any stimuli; however, a few hours later, his consciousness returned to normal—he became awake, responsive, and capable of performing mental functions normally, with eye opening, vocalization, and motor responses to stimuli, as usual. 

Compared with consciousness, consciousness in this meaning is different because it does not require that awareness and an experience of what something that one is conscious of is like must occur; just being awake, responsive, and capable of performing mental functions is enough for consciousness to occur. Therefore, any animal that is awake, responsive, and able to perform mental functions can rightfully be said to be conscious and have consciousness in this meaning. Similarly, functioning robots are, manifestation-wise, awake, responsive, and capable of performing electronic functions comparable to human mental functions, such as sensing stimuli, processing information, and commanding their effectors (e.g., sound-producing devices, arms, or legs). These functions result in human-like behaviors, such as sensing things (seeing, hearing, smelling, etc.), assessing situations, making decisions, remembering things, and performing human-like activities (moving around, talking, working, etc.). One can therefore argue that, at least manifestation-wise, robots are conscious and have consciousness in this meaning.

2. A mental state of being aware of something. In this meaning, consciousness is typically used in its adjective form, conscious. For example, he is conscious of the smell in the room; she is conscious of the pain in her body; and we are conscious of what is going on around us. 

Compared with conscious and consciousness, conscious and consciousness in this meaning are different because they do not require that awareness and an experience of what something that one is conscious of (such as the smell or pain) is like must occur; just being aware of something is enough to be conscious and for consciousness to occur. Therefore, one can argue that animals and robots are also conscious of things and have consciousness in this meaning because they are aware of things around them, such as images, sounds, smells, and other things, and can react to them accordingly.

3. A mental state of being aware of something with attention, resulting in detailed information about that thing gaining access to the brain’s workspace, where it is distributed to various parts of the brain (or becomes globally available) to be processed by other processes (such as analyzing, memorizing, and reporting processes). Because, in this case, consciousness occurs when something’s information gains access to the brain’s workspace, many authors call this type of consciousness “access consciousness” or “a-consciousness” [4,8,10,12,16,18,22–25,32,38–43]. For example, when a man was walking through a crowd in a mall, his attention was caught by a loud announcement; he became conscious of the message that there would be a show at the convention hall on the third floor at 4 pm; after thinking for a while, he decided to attend, remembered the place and time, and called his friends to tell them about it. This means that the detailed information about the message had gained access to his brain’s workspace and was distributed globally to other processing processes with the occurrences of, in addition to access consciousness of the message, several other activities, including analyzing, decision-making, memorizing, and reporting. This contrasts with the sounds of the crowd and other things in the mall that may have entered his mind in another way: without attention and detailed information. Although he was aware of them, as evidenced by the fact that he was able to tell that there were sounds in the surroundings other than the sound of the announcement, his consciousness of these background sounds lacked the important characteristics mentioned above—further processing of the details of those sounds (such as analyzing, memorizing, and reporting) did not occur (because the detailed information did not enter the brain’s workspace and, thus, was not distributed globally for wider processing).
In this case, some kind of consciousness of the surrounding sounds (such as consciousness in the second or fifth meaning) must have occurred, but it was not access consciousness like the one that occurred for the announced message. 

Compared with consciousness, consciousness in this meaning or access consciousness is different because it does not require that awareness and an experience of what something that one is conscious of (such as the announcement sound) is like must occur; just being aware of something with attention with the result that its detailed information becomes globally available and can be processed in various ways, as mentioned above, is sufficient for consciousness or access consciousness to occur. Therefore, when used in this meaning, one can argue that animals and robots also have consciousness or access consciousness because they can be aware of things with attention so that detailed information about those things becomes globally available in their minds or processing systems (in the case of robots)—this is evident from their behaviors, which reflect acquisition, integration, and utilization of the available detailed information.

4. A composite of mental processes that performs complex functions, resulting in complex abilities and behaviors. These complex functions are numerous, such as binding several components of sensory inputs (such as shape, color, and brightness components of a visual input) into a single unified percept (such as a complete image), selecting a certain stimulus from various stimuli to process, directing attention to and maintaining concentration on the selected stimulus, creating and modulating emotions, recognizing oneself and others’ selves, remembering things, searching past memories, utilizing past experiences, analyzing situations, comparing choices, making decisions, planning actions, executing flexible, goal-directed behaviors, and creating and using tools [1,2,18,44–49]. For example, healthy people are conscious in this meaning because they can perform the complex mental functions just mentioned, as is evident from their complex behaviors, which require complex mental functions.

Compared with consciousness, consciousness in this meaning is different because it does not require that awareness and an experience of what something that one is conscious of is like must occur; just being able to perform complex mental functions resulting in complex abilities and behaviors is enough for consciousness to occur. Therefore, when used in this meaning, one can argue that several types of animals, such as insects, fish, reptiles, birds, and mammals, have consciousness because they exhibit complex abilities and behaviors that require complex mental functions. Many people regard these complex mental functions as markers of consciousness in animals or even as animal consciousness itself [43,44,50–53]. Similarly, one can argue that robots also have consciousness in this meaning because functioning robots have electronic processing functions that are equivalent to these complex mental functions, resulting in similar complex abilities and behaviors. For example, they can perceive complex stimuli, remember and recall things, analyze situations, make decisions, plan actions, command their effectors to work, and play complex games (such as chess, Go, and bridge) [54–58].

5. A mental state of being aware of and experiencing something with awareness and an experience of what that thing is like occurring. For example, when we look at a red color and a red color quale occurs in our minds, awareness and an experience of what the red color quale is like occur; that is, consciousness of the red color quale occurs. Because awareness and an experience of what the red color quale is like—which are awareness and an experience of the red color quale’s phenomenality—occur, many authors call this kind of consciousness “phenomenal consciousness” or “p-consciousness” [2,4,8–10,12,15–18,22–37]. 

Compared with consciousness, consciousness in this meaning is similar because it requires that awareness and an experience of what something that one is conscious of is like must occur. In this meaning, we cannot be certain or argue with strong evidence that animals have phenomenal consciousness because they cannot report to us whether they are aware of and experience what something that they are conscious of is like in their minds. Also, it is certain that present-day robots do not have phenomenal consciousness because they are definitely unaware of and do not experience what something that they are aware of is like in their processing systems. This is because, in a robot, every function requires a specific circuit or software—no function in a robot is possible without a specific circuit or software. But a present-day robot simply does not have a specific circuit or software for it to be aware of and experience what something is like (because we, its creators, do not build such circuits and software for it as we, at present, plainly do not know how to do so). Therefore, it is clear that current robots do not have phenomenal consciousness (a more detailed discussion on this matter is provided in Chapter 9, Section 9.3.1: Do computers and robots have qualia and consciousness?).

People usually use the terms conscious and consciousness in the first, second, or fourth meanings. The terms in the third meaning are mostly used by neuroscientists. Several neurological theories of consciousness, such as the Global Workspace Theory [59–64], Global Neuronal Workspace Hypothesis [65–70], and extended theory of the global workspace of consciousness [71], are about consciousness in this meaning. Animal cognition and robotics scientists usually use consciousness in the fourth meaning. Most philosophers, as well as some neuroscientists, discuss the terms in the fifth meaning; the hard problem [1–3,8,16,17,26,29,36,37,72–78] and explanatory gap of consciousness [1–3,8,15,16,25,27,29,35–38,73,79–83] are the problems about consciousness in this meaning. Consciousness investigated in this theory, consciousness, is also consciousness in this fifth meaning.

It should be noted that these types of consciousness are not mutually exclusive. In fact, they usually occur together. For example, the consciousness that occurs when one attends to a stimulus (such as the announcement sound in the example above) is consciousness in all five meanings: consciousness in the first meaning because one is awake, responsive, and capable of attending to the announcement; in the second meaning because one is conscious of the announcement; in the third meaning (or a-consciousness) because one pays attention to the announcement with the result that its detailed information gains access into the mind and becomes globally available for various processing processes; in the fourth meaning because one is able to perceive and interpret the announcement (which requires complex mental functions); and in the fifth meaning (or p-consciousness) because one can be aware of and experience what the quale of the announcement sound is like. However, sometimes, the consciousness that occurs can be consciousness in only some but not all meanings. For example, in the example of the man in the mall, consciousness of the background sounds can be considered consciousness in all but the third meaning, as discussed previously.

6.3 Other Meanings and Related Terms

There are still other but less common usages of the word consciousness in other meanings. For example, some use the term consciousness interchangeably with the mind [12,13,84], with all mental processes that function to evaluate and model the world [85], or with a non-material entity that pervades the universe, a usage found in some beliefs and religions [18,85–89]—all of these meanings are probably too broad. Some use consciousness to mean thought or self-consciousness, which is obviously too narrow a meaning [4,15–18,20,36]. Also, some use consciousness to mean knowledge—that is, “one is conscious of something” means “one has knowledge of it”; this meaning is certainly too narrow as well [4,17,18,36]. Moreover, things are even more complicated because other words are sometimes used to refer to consciousness, such as awareness, feelings, the first-person perspective, phenomenality, phenomenal experience, qualia, and subjective experience [74,90]. Therefore, the reader should be careful about the meanings of the terms consciousness, conscious, and the other wordings mentioned when encountering them in the literature.

Regarding the content of consciousness, because consciousness in some usages (such as that in the first and fourth meanings) does not require that, to be conscious or have consciousness, one must manifestly be conscious of something, some authors use the word “intransitive consciousness” [16,24,45,91] to describe such consciousness. On the other hand, the word “transitive consciousness” [12,16,24,45,91–93] is used to describe consciousness that overtly has content (such as that in the second, third, and fifth meanings), in which case one can be said to be conscious of something. In addition, some authors consider consciousness to be “state consciousness” or “mental-state consciousness” if the consciousness is just a state of the mind (such as that in the second, third, and fifth meanings) but “creature consciousness” if the consciousness is a property of the whole creature (such as that in the first and fourth meanings) [12,16,24,45,91,92].

Finally, many specific terms are used by different authors for different types of consciousness that have different types of awareness content. The most basic consciousness, which has awareness only of sensory perception, is usually called “primary consciousness” [27–29,94–100], “core consciousness” [101,102], “sensory consciousness” [28,51], “perceptual consciousness” [94,96], or “anoetic consciousness” [100,103]. A more complex consciousness that has awareness of not only sensory perception but also other things is called “higher-order consciousness” or “higher-level consciousness” [94,95,99]. More specifically, in addition to awareness of sensory perception, a higher-order consciousness that has awareness of executive control, decision making, voluntary action, one’s own thoughts, and one’s own awareness is called “noetic or reflective consciousness” [94–96,100,103], and a higher-order consciousness that also has awareness of one’s own identity over time is called “autonoetic consciousness” [100,103]. It should be noted that the definitions of these terms do not specify phenomenality as a part of consciousness. Thus, even with current technology, we can make computers or robots aware of the content of these types, and one can argue that such computers or robots have consciousness as defined in these terms. On the contrary, this theory requires phenomenality to be part of consciousness but does not specify the kind of its content. The content of consciousness can be simple awareness of sensory perception or complex awareness of executive control, decision making, voluntary actions, etc. The determining point is that, if phenomenality occurs, it is consciousness; if not, no matter how complex its content is, it is not consciousness.

In summary, the meaning of consciousness is similar to consciousness in the fifth meaning. To reiterate, consciousness is the composite of all conscious awareness and conscious experiences—the composite of all awareness and experiences of what qualia are like. This theory deals with consciousness in this meaning because only in this meaning is consciousness phenomenal consciousness, which differentiates sentient from non-sentient beings and separates us from AI, as noted previously. Without awareness and experiences of what qualia are like, even the most highly intelligent and capable brains would function and be just like central processing units in present-day computers and robots. Other authors may hold that consciousness should also include complex mental functions [18,45–49], such as those exemplified in the fourth meaning of consciousness. However, those functions can be set up even in today’s computers, robots, and other types of AI, and AI can be even better (faster, more accurate, more stable, less exhaustible, more knowledgeable, etc.) at those functions than we are. On the other hand, AI cannot be aware of and experience what qualia are like, while, contrastingly, we can. Thus, those functions do not distinguish us from AI, but awareness and experiences of what qualia are like do. In addition, opinions may vary on how many of those functions and which ones of them should be included in the entity that one wants to call consciousness. Therefore, because the term consciousness, as defined in Section 6.1, differentiates us from AI and suffices for discussions on the most important problems of consciousness, such as the hard problem and explanatory gap, it is adequate and will be used as the working definition in this theory. This definition can be considered the minimum definition of consciousness; a wider definition that includes some complex mental functions is possible but is not needed in this theory.
It will be seen that any wider definition will not affect the conclusions and predictions of theorems in this theory, which uses this minimum definition.

6.4 Other Aspects of Consciousness 

There are some other aspects of consciousness that should be discussed to understand consciousness more comprehensively. They are as follows:

     I. It should be noted that the consciousness mental process does not function to be consciously aware of and consciously experience all mental phenomena (or products of mental processes). Therefore, conscious awareness and experiences, and thus consciousness, do not occur for all mental phenomena, as discussed in Sections 3.4 and 3.5. Some mental processes do not have mental phenomena or other components that the consciousness mental process can be aware of and experience such that awareness and experiences of what something in these mental processes is like occur. Thus, there is no consciousness of these mental processes. These mental processes are called subconscious or unconscious mental processes [18,104–135]. Mental processes whose mental phenomena the consciousness mental process can be consciously aware of and consciously experience are called conscious mental processes (which differ from the consciousness mental process). It is only the phenomena (or products) of these conscious mental processes, not how they perform their functions, that the consciousness mental process is consciously aware of and consciously experiences [18,76]. For example, the final-stage visual-perception, emotion-generation, and thought-formation mental processes are conscious mental processes because they have phenomena (or products)—mental images, emotions, and thoughts—that the consciousness mental process can be consciously aware of and consciously experience, not because the consciousness mental process can be aware of and experience how these mental processes perform their functions to produce their phenomena. In other words, we are conscious of the products of mental processes, not of their processing processes [18,76].

     Mental processes other than those in the above examples are also conscious mental processes if they have phenomena that are consciously experienceable. They can be classified into two groups, C1 and C2, as discussed in Section 3.4. They are listed here again for ease of reading: 

     C1. Mental processes in the final stages of all external sensory perception processes and of some internal sensory perception processes, such as those in the final-stage visual, auditory, olfactory, vestibular, and visceral pain perception processes, which have phenomena or products that are consciously experienceable, such as mental images, sounds, smells, equilibrioception (sense of balance), and visceral pain.

     C2. Mental processes in the final stages of some of the highest-level emotion-cognition-execution processes, such as those in the final stages of emotion-generating, thinking, planning, decision-making, and motor-commanding processes, which have phenomena or products that are consciously experienceable, such as emotions, thoughts, mental plans, decisions, and motor commands. 

     However, as discussed in Section 3.5, the vast majority of mental processes are unconscious mental processes; they do not have phenomena or any other components that are consciously experienceable, and consciousness does not occur with these unconscious mental processes. They can be classified into two groups, UC1 and UC2, as discussed in Section 3.5. They are also listed here again for ease of reading:

UC1. Mental processes in a) the early stages of all sensory perception, b) all stages of non-sensory stimulus perception, and c) some special situations. Some examples are a) those in the early stages of visual, auditory, and somatosensory perception, b) those in all stages of perception of blood constituent levels, vessel pressure, and reflex stimuli, and c) those in cases of subliminal perception [135–139], binocular rivalry [140–147], attentional blinks [148–154], inattentional blindness [155–161], and blindsight [162–165]. We are not aware of and do not experience what something is like in those mental processes.

UC2. Mental processes in a) the cerebral hemispheres except those specified in C1 and C2 and b) all other structures of the nervous system (the brainstem, cerebellum, spinal cord, and peripheral, autonomic, and enteric nervous systems). Mental processes in this group are numerous. Examples include those that function for unconscious attention, unconscious thinking, unconscious reading, unconscious visual perception and face recognition, unconscious mathematical operations, unconscious decision making, unconscious control of muscle tone and motor coordination, unconscious control of spontaneous breathing, unconscious control of cardiac functions and vessel contraction, unconscious control of blood constituent levels, unconscious control of gastrointestinal tract secretions and movements, and unconscious control of all reflexes [18,104–135]. We are not aware of and do not experience what something is like in those mental processes.

II. Because consciousness in the second, third, and fifth meanings has content, it always occurs with other mental processes (i.e., conscious mental processes C1 and C2) that provide the content (e.g., sensory perception, emotion, and thought) for the consciousness. In other words, consciousness never occurs alone in these meanings [93].* Therefore, it is possible to view consciousness and the content (provided by the conscious mental processes) as parts of a larger entity, which may be called “consciousness plus” or other comparable terms. However, many still call this larger entity “consciousness,” as in the third and fourth meanings. Because consciousness in this larger form is commonly referred to simply as consciousness, both in the literature and by the media and general public, one must be mindful that the term consciousness found in the literature or media may have this meaning, not the meaning defined by this theory or other definitions.

[*Some authors believe that consciousness can occur alone without content and call this type of consciousness “minimal phenomenal experience” (MPE), “consciousness as such,” or “pure awareness” [166–168]. However, this is, at present, controversial.]

     III. Regarding the operation of consciousness mental processes, it should be noted that, although, at any moment under normal conditions, consciousness consists of conscious awareness and experiences of several mental phenomena simultaneously, the intensity and accuracy of the awareness and experiences of these phenomena are usually not equal. During any period, only one or a few mental phenomena have primary conscious awareness and experiences, whereas the others have secondary conscious awareness and experiences. The former are intense and accurate, whereas the latter are less intense and less accurate. For example, when we are watching a movie, our primary conscious awareness and experiences will be of the movie-related visual perception, auditory perception, and emotions, while our conscious awareness and experiences of non-movie-related sensory perception, emotions, time, etc. will become less intense and less accurate. Even for a mental phenomenon that has primary conscious awareness and experience, the intensity and accuracy of the awareness and experience are usually not equal for all its parts. For example, the intensity and accuracy of the awareness and experience of an image are maximal at the center of the visual field but diminish drastically toward the periphery. Reports of parts that have secondary awareness and experiences are usually inaccurate or sometimes outright wrong. These inaccuracies commonly occur in daily life. Readers can notice this themselves: they cannot accurately describe things outside the central visual field, such as the letters surrounding the text being read right now, nor accurately report what has happened in the surroundings during the past five minutes while they were reading this chapter.
Such inaccuracies in secondary conscious awareness and experiences can be demonstrated more objectively in experiments involving phenomena such as attentional blinks, inattentional blindness, and change blindness [169–173].

     IV. It is undeniable that these kinds of secondary awareness and experience exist and that we are not unconscious of those objects that do not have primary awareness and experiences. Evidently, we can be aware that, apart from the thing at the center of our interest and attention, there is something in the background or periphery that we register inaccurately; the background and periphery are not blank [174–176]. Obviously, we do not live in a world consisting only of primary awareness and experiences. Because we are conscious of these background objects in the sense that awareness and experiences of what their qualia are like occur in our minds, conscious awareness and experiences, or phenomenal consciousness, must have occurred. However, because we cannot accurately report these background objects in detail, their detailed information certainly does not enter the global workspace to be widely distributed and used in various ways, as previously discussed in the third meaning of consciousness. Thus, access consciousness in the sense discussed there (the sense used in the Global Workspace or Global Neuronal Workspace Theory [59–70]) certainly does not occur in these kinds of secondary awareness and experience. Nevertheless, if one relaxes the definition of access consciousness to mean that some information has gained access to the mind, has been registered, and is at least roughly reportable, then access consciousness in this relaxed sense must occur for these kinds of secondary awareness and experience.

     V. Because the details of objects that we are aware of in secondary awareness can vary from completely wrong to partially correct to almost completely correct, consciousness content in secondary awareness can be present in a graded fashion [177–180]; it need not be present in the all-or-none fashion that the consciousness content of attended objects has been shown to exhibit [181,182]. The consciousness content of something in an unattended situation can be degraded by being fragmented (only some parts of the thing being registered), less salient (having less effect on impression, memory, emotion, etc.), too generic (classifiable only in a general category, not a specific one, such as a skittering mouse identifiable only as a small animal), or flash-like (appearing very briefly) [178]. At the representational level, the factors that can affect perceptual representations have been found to be the object's intensity, precision, stability, and availability for working memory [178]. Relatedly, several recent studies have suggested that consciousness can admit of degrees and have proposed a graded perceptual awareness scale for objective quantification [24,45].

     VI. Another interesting but controversial issue is whether phenomenal consciousness and access consciousness must occur together or whether one can occur without the other [26,32,38,183–185]. In addition, the mechanism by which phenomenal consciousness could occur without access consciousness is still being debated, especially the much-discussed overflow argument [26,38,186–189]. However, the discrepancy in the occurrence of the two types of consciousness is real and does exist. For example, in the case of the man in the mall discussed previously in the third meaning of consciousness, phenomenal consciousness of the background sound occurs, but access consciousness of it does not. Again, the discrepancy seems to lie in the definition of "access" in access consciousness (as discussed in IV). If "access" must involve attention and result in the accessed information becoming globally available, processed in various ways, and reportable in detail (i.e., "access" in the strict sense), then phenomenal consciousness can sometimes occur without access consciousness. On the other hand, if "access" means merely that information about something has entered the mind's processing system, resulting in registration of that thing such that it is at least roughly reportable (i.e., "access" in the relaxed sense noted in IV), then phenomenal consciousness always occurs with access consciousness.

6.5 Conscious Alertness and Awareness

Functionally, consciousness in most meanings, including the meaning defined by this theory, can be considered to have two major aspects: a) alertness (wakefulness or arousal) and b) awareness and experiences (registration and feelings of something, i.e., consciousness content) [4,18,20,21,111,183–185,190–194]. Alertness depends on a neural network called the ascending reticular activating system (ARAS) [20,102,183,192,193,195–198], whereas awareness and experiences depend on the consciousness neural circuit (discussed in detail in the next section). The ARAS is a neural network in the core of the brain extending from the rostral pons to the thalami and bilateral basal forebrains, from which it sends extensive axonal projections to the cerebral cortices, including those of the consciousness neural circuit. There can be no conscious alertness without stimulation from the ARAS, even if the consciousness neural circuit is intact and able to function, such as in cases of extensive brainstem hemorrhage, infarction, or injury. The level of conscious alertness varies with the activation from the ARAS: the stronger the activation, the higher the level of alertness. Thus, alertness can vary on a continuum of states, from the lowest to the highest level. Frequently used terms for some of these states, from the lowest to the highest level, are coma, stupor, drowsiness, normal alertness, and heightened alertness.

Regarding the content, conscious awareness and experiences can vary in both the quantity and quality of their content. The quantity of content depends on the functioning extent of the consciousness network. When an impairment of the consciousness network decreases its functioning extent, such as in cases of cerebral infarction, injury, or hypoxia, the content's quantity decreases, and the extent of the decrease depends on the extent of the impairment. Thus, like conscious alertness, the quantity of consciousness content can vary on a continuum of states, from consciousness with minimal content to consciousness with normal content. Frequently used terms for some of these states, from minimal to normal content, are vegetative state/unresponsive wakefulness syndrome (VS/UWS), minimally conscious state (MCS), akinetic mutism [21,90,183,184,191,194,199–213], dementia [214–217], various cerebral neglect syndromes [218–221], and normal consciousness. In the first two categories, which represent severe conditions, patients can open their eyes and have some reflex responses, such as blinking, chewing, and yawning, but show no (in VS/UWS) or minimal (in MCS) signs of conscious awareness of themselves and the environment (which can be investigated by clinical testing or by special investigations using EEG, evoked potentials, and fMRI) [21,183,191,194,199,202,203,205,206,222].

Concerning the quality of content, consciousness can vary in its content's quality (or complexity): from consciousness of basic content, such as sensory perception, to consciousness of more complex content, such as one's own mental activities, to consciousness of even more complex content, such as one's own self and being conscious. Several authors designate consciousness with these specific kinds of content by specific terms, namely, primary consciousness, higher-order consciousness, and reflective consciousness, as discussed in detail in the third paragraph of Section 6.3. Some authors call these kinds of differences "different degrees" or "different levels" of consciousness [56,94–96,103], which actually refer to different degrees or levels of consciousness content quality. Unfortunately, these phrases can be confused with the different degrees of awareness accuracy discussed in Paragraph V of Section 6.4 or with the different levels of alertness discussed in the first paragraph of this section. Accordingly, one must be careful about what kind of degree or level is being referred to when the term is used without specification or contextual clues.

These variable abnormalities in consciousness show that consciousness (in most meanings with the probable exception of access consciousness) is not an all-or-none but a graded phenomenon [48,51,85], depending on both the strength of stimulation from the ARAS and the functioning of the consciousness neural circuit. Conventionally, the graded level of conscious alertness and awareness can be scored using various scales, such as the Jouvet coma scale, Glasgow Coma Scale (GCS), Full Outline of Unresponsiveness (FOUR) score, Bozza-Marrubini scale, and Coma Recovery Scale-Revised (CRS-R) [213,223–231].
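To make the idea of graded scoring concrete, here is a minimal sketch (an illustration only, not part of this theory) of how one such instrument, the Glasgow Coma Scale, sums three graded component responses—eye opening (1–4), verbal response (1–5), and motor response (1–6)—into a single total between 3 and 15:

```python
# Illustrative sketch of Glasgow Coma Scale (GCS) scoring.
# Component ranges follow the standard clinical scale; the function
# name and structure are this example's own, not from any library.

GCS_RANGES = {"eye": (1, 4), "verbal": (1, 5), "motor": (1, 6)}

def gcs_total(eye: int, verbal: int, motor: int) -> int:
    """Sum the three GCS components; the total ranges from 3 to 15."""
    for name, value in (("eye", eye), ("verbal", verbal), ("motor", motor)):
        lo, hi = GCS_RANGES[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name} score must be in [{lo}, {hi}], got {value}")
    return eye + verbal + motor

# A fully alert patient scores the maximum of 15,
# whereas the deepest coma scores the minimum of 3.
assert gcs_total(eye=4, verbal=5, motor=6) == 15
assert gcs_total(eye=1, verbal=1, motor=1) == 3
```

The point of the sketch is only that such scales place conscious alertness and awareness on an ordered numeric continuum rather than treating them as all-or-none.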

Presently, many scientists hold the opinion that consciousness is a multidimensional phenomenon with important aspects, dimensions, or features beyond alertness and awareness [48,206,232–236], such as subjective awareness, cognition, and behavioral aspects [233]; phenomenal, semantic, physiological, and functional aspects [48]; qualitative richness, situatedness, intentionality, integration, and dynamics and stability [206]; and p-richness, e-richness, unity, temporality, selfhood, and many other features [44,235]. Thus, they believe that these elements should also be included in a complete assessment of consciousness [44,48,206,233,235]. Another important issue is that measurements of consciousness usually require signaling (e.g., verbal, ocular, or manual) from the patient, which can make the measurements inaccurate if the patient has motor-system problems that impair signaling. Therefore, objective consciousness measurements that are independent of the subject's signaling ability and cover more than overall alertness and awareness are being developed, such as the perturbational complexity index (PCI), which employs transcranial magnetic stimulation [236–238], and the local-global paradigm (LGP) [236,238–240]. However, at present, no method that objectively assesses consciousness in these various facets is standard or widely accepted [48,206,233].

Despite these remarkable developments, it should be noted that this kind of multidimensional consciousness comprises various complex perceptual, emotional, and cognitive-executive functions, not only alertness, awareness, and experiences. As previously discussed, although these complex functions usually occur with consciousness, they are not consciousness as defined in this theory, which is only awareness and experiences of what qualia are like. Hence, the reader should observe that the proposed consciousness measurements above, although clinically and scientifically valuable in assessing overall complex brain functions, do not measure conscious awareness and experiences, or consciousness, as the sole target of assessment. This fact must be considered when employing these measurements or interpreting their results.

6.6 Consciousness Neural Process and Circuit

Because the consciousness mental process is a mental process, it cannot occur by itself but must occur with some neural process (Theorem I). In this theory, this neural process is called "the consciousness neural process," and its neural circuit, "the consciousness neural circuit." Both function to create conscious awareness and experiences, or consciousness, as the consciousness mental process does. From the properties of consciousness, it can be predicted that some important characteristics of the consciousness neural circuit and process must be as follows:

  1. The circuit must be an extensive network that connects to all neural circuits that have qualia so that awareness and experiences of all qualia are possible. These qualia neural circuits are neural circuits that function with conscious mental processes (C1 and C2, Section 6.4, which have qualia). However, it is not connected to the lower neural circuits that perform more basic functions, such as neural circuits in the basal ganglia, brainstem, cerebellum, and autonomic nervous system, in such a way that conscious awareness and experiences can occur. Consequently, there is neither conscious awareness nor conscious experience of anything in these neural circuits.
  2. It must be able to read signals from the neural circuits that have qualia so that it can obtain the qualia’s information to process and create awareness and experiences of the qualia, i.e., the consciousness of the qualia.
  3. It must have its signals fed back to itself so that it can be aware of and experience the created consciousness of the qualia (created as described above), which means its signaling state must be a reentrant signaling state.

Since the consciousness neural circuit and its neural processes are central to creating consciousness, a large amount of research has been conducted on them. In the field of neuroscience, they are usually referred to as the neural correlates of consciousness, a term usually defined as the minimal neuronal mechanisms that are jointly sufficient for any one specific conscious percept [4,6,26,49,68,75,97,111,125,193,241–266]. Although conclusive answers have not yet been obtained, some models are strongly supported by experimental evidence and are widely accepted. Regarding the neural circuit of consciousness, as far as one can conclude from the present evidence, it seems very likely that the default mode network, or resting-state network, is the neural network for background or resting-state consciousness [190,267–288]. This network includes several brain regions (the medial prefrontal cortex, posterior cingulate cortex, retrosplenial cortex, precuneus, and parts of the parietal lobe, temporal lobe, cerebellum, and basal ganglia) and has the anterior medial prefrontal cortex and posterior cingulate cortex as its hubs. It functions to create consciousness for usual background stimuli, both external and internal, that enter the mind concurrently and continuously and do not require specific attention and concentration. It overlaps with the network that functions to create consciousness for episodic stimuli that gain access to the mind at times and attract specific attention [4,8,10,12,16,18,22–25,32,38–43].
At present, evidence shows that the network of the Global Workspace Theory proposed by Baars [59–64] or the network of the Global Neuronal Workspace Hypothesis of Dehaene [65–70], both of which include a cortico-thalamic (C-T) core and a network of neurons with long-range axons densely distributed in the prefrontal, fronto-parietal, parieto-temporal, and cingulate cortices, is the network of this latter kind of consciousness, usually called access consciousness. Some studies point to similar networks in the superior parietal and dorsolateral prefrontal cortices [244,265,289], whereas others stress the posterior hot zone, comprising the temporal, parietal, and occipital cortices [245,256]. Specifically, in the case of vision, ample evidence shows that cortical neural synchronization between neural processes in the occipital areas and those in the parietal, temporal, or frontal areas is necessary for visual conscious experiences to occur [67,97,244,260,265]. There are also separate neural correlates for the visuospatial, emotional-face, and written-letter processing components: cortical neural synchronization between the dorsal occipital and parietal areas, between the parietal and frontal areas, and between the occipital, parietal, and temporal areas, respectively [97]. As for the complete neural network of consciousness, that is, the one that functions to be conscious of both continuous background stimuli and episodic accessing stimuli, it is likely to be some combination of the two types of consciousness networks just discussed. Some proposed networks are the neural network in the extended theory of the global workspace of consciousness proposed by Song [71], the overlapping network of a-consciousness and p-consciousness with two functional aspects introduced by Frigato [290], and the "functional rich club" presented by Deco et al. [291].

Finally, it should be repeated that the functions of the two major consciousness networks—one for episodic, attention-demanding stimuli and the other for continuous, non-attention-demanding stimuli—are not mutually exclusive. That is, the two networks do not work alternately in such a way that one is completely shut off while the other is working. For example, when the former network is functioning, such as when a subject in a consciousness experiment is attending to a visual stimulus on a screen, the latter network is still functioning, as evidenced by the fact that the subject is aware not only of the target or masks he or she is attending to but also of other parts of the experiment (e.g., the signaling device to be used and when and how to signal) and of the surroundings (e.g., the background vision, the ambient sound, and the room's smell). Evidently, the subject can use the device to signal as instructed and can later report some characteristics of the surroundings, which means he or she is aware of more than just the target. However, the awareness of these background components is less intense and less accurate than that of the attended stimulus, and they cannot be reported in detail or with accuracy. Still, it is not the case that, while the mind is attending to something, such as the target in an experiment, it is aware of nothing else and everything else vanishes [174–176]. Undeniably, we regularly have similar experiences in our daily lives. For example, when we are concentrating on listening to a song or to what somebody is saying to us on the phone, the outside visual world does not disappear into nothingness even though we do not pay attention to it; we are still conscious of it to some degree and, hence, can describe it, at least roughly.
In searching for the complete consciousness neural network, it is of paramount importance that these facts be taken into account.

6.7 Remarks

In conclusion, consciousness has various meanings. This theory chooses to use consciousness in the meaning defined in Section 6.1 because only consciousness in this meaning differentiates us from AI and is the crux of the hard problem and the explanatory gap of consciousness, which this theory aims to resolve.

The next chapter will answer fundamental questions about what this consciousness is physically and ontologically, why it occurs, how it occurs from physical systems, and other related questions.

⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓

Among all the things that happen in the mind,

the only one thing that makes us fundamentally different from AI

is awareness and experiences of

what an image of a house, what a sound of a song, what an odor of a flower, what a feeling of happiness, what a thought of oneself, and so on

are like …

the phenomenal consciousness.

⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓



References

  1. Chalmers DJ. Consciousness and its place in nature. In: Chalmers DJ, editor. Philosophy of mind: Classical and contemporary readings. Oxford: Oxford University Press; 2002. ISBN-13: 978-0195145816 ISBN-10: 019514581X. http://consc.net/papers/nature.html
  2. Chalmers DJ. Facing up to the problem of consciousness. J Conscious Stud. 1995;2(3):200–219. http://consc.net/papers/facing.html
  3. Chalmers DJ. Moving forward on the problem of consciousness. J Conscious Stud. 1997;4(1):3–46. http://consc.net/papers/moving.html
  4. De Sousa A. Towards an integrative theory of consciousness: Part 1 (neurobiological and cognitive models). Mens Sana Monogr. 2013 Jan–Dec;11(1):100–150. doi: 10.4103/0973-1229.109335. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3653219/
  5. Dehaene S. The brain mechanisms of conscious access and introspection. In: Neurosciences and the Human Person: New Perspectives on Human Activities. Pontifical Academy of Sciences, Scripta Varia 121: Vatican City; 2013:1–20. https://paperzz.com/doc/6764027/the-brain-mechanisms-of-conscious-access-and-introspection
  6. Dehaene S, Changeux JP. Experimental and theoretical approaches to conscious processing. Neuron. 2011 Apr 28;70(2):200–227. doi: 10.1016/j.neuron.2011.03.018. https://www.researchgate.net/publication/281109453_Experimental_and_Theoretical_Approaches_to_Conscious_Processing
  7. Hohwy J, Seth A. Predictive processing as a systematic basis for identifying the neural correlates of consciousness. Philos Mind Sci 2020;1:1–34. doi: 10.33735/phimisci.2020.II.64. https://philosophymindscience.org/index.php/phimisci/article/view/8947/8521 https://researchmgt.monash.edu/ws/portalfiles/portal/341186169/335240211_oa.pdf
  8. Gennaro RJ. Consciousness. In: Internet Encyclopedia of Philosophy. http://www.iep.utm.edu/consciou/
  9. Kriegel U. Chapter 3. Philosophical theories of consciousness: Contemporary western perspectives. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:35–66. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  10. Mashour GA, Alkire MT. Evolution of consciousness: Phylogeny, ontogeny, and emergence from general anesthesia. Proc Natl Acad Sci U S A. 2013 Jun 18;110(Suppl 2):10357–10364. doi: 10.1073/pnas.1301188110. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3690605/
  11. Pennartz CMA, Farisco M, Evers K. Indicators and criteria of consciousness in animals and intelligent machines: an inside-out approach. Front. Syst. Neurosci. 2019;13(25). doi: 10.3389/fnsys.2019.00025. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6660257/
  12. Rosenthal D. Concepts and definitions of consciousness. In: Banks WP, editor. Encyclopedia of Consciousness. Amsterdam: Elsevier; 2009:157–169. https://www.davidrosenthal.org/DR-Concepts-Dfns.pdf
  13. Seager W. Chapter 2. A brief history of the philosophical problem of consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:9–33. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  14. Stoerig P. Chapter 25. Hunting the ghost: Toward a neuroscience of consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:707–730. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  15. Sturm T. Consciousness regained? Philosophical arguments for and against reductive physicalism. Dialogues Clin Neurosci. 2012 Mar;14(1):55–63. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341650/
  16. Van Gulick R. Consciousness. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2017 edition). https://plato.stanford.edu/archives/sum2017/entries/consciousness
  17. Velmans M. Chapter 1. How to separate conceptual issues from empirical ones in the study of consciousness. In: Banerjee R, Chakrabarti BK, editors. Progress in brain research. Vol. 168. Models of brain and mind physical, computational and psychological approaches. Amsterdam: Elsevier B.V.; 2008:1–10. https://www.researchgate.net/publication/28764945_HOW_TO_SEPARATE_CONCEPTUAL_ISSUES_FROM_EMPIRICAL_ONES_IN_THE_STUDY_OF_CONSCIOUSNESS
  18. Velmans M. How to define consciousness—and how not to define consciousness. J Conscious Stud. 2009;16(5):139–156. http://cogprints.org/6453/1/How_to_define_consciousness.pdf
  19. Velmans M. Is human information processing conscious? Behav Brain Sci. 1991;14:651–726. https://pdfs.semanticscholar.org/1bca/4e316885e05bda693868c7ce49cfbf206dba.pdf
  20. Zeman A. Consciousness. Brain. 2001 Jul;124(Pt 7):1263–1289. https://academic.oup.com/brain/article-pdf/124/7/1263/802709/1241263.pdf
  21. Zeman A. What do we mean by "conscious" and "aware"? Neuropsychol Rehabil. 2006 Aug;16(4):356–376. doi: 10.1080/09602010500484581. https://www.ncbi.nlm.nih.gov/pubmed/16864477
  22. Block N. On a confusion about a function of consciousness. Behav Brain Sci. 1995;18(2):227–287. https://www.academia.edu/download/49640917/On_A_Confusion_About_a_Function_of_Consc20161016-5993-740vtb.pdf
  23. Boly M, Seth AK. Modes and models in disorders of consciousness science. Archives italiennes de biologie. 2012;150:172–184. doi: 10.4449/aib.v150i2.1372. https://www.researchgate.net/publication/233724791_Modes_and_models_in_disorders_of_consciousness_science
  24. Carruthers P. The problem of animal consciousness. Proceedings and Addresses of the APA. 2018b;92:179–205. http://faculty.philosophy.umd.edu/pcarruthers/The%20problem%20of%20animal%20consciousness.pdf
  25. Chalmers DJ. The conscious mind: In search of a fundamental theory. Oxford: Oxford University Press;1996. https://personal.lse.ac.uk/ROBERT49/teaching/ph103/pdf/Chalmers_The_Conscious_Mind.pdf
  26. Cohen MA, Dennett DC. Consciousness cannot be separated from function. Trends Cogn Sci. 2011 Aug;15(8):358–364. doi: 10.1016/j.tics.2011.06.008. http://www.michaelacohen.net/uploads/5/9/0/7/59073133/1-s2-0-s1364661311001252-main.pdf
  27. Feinberg TE, Mallatt J. Phenomenal consciousness and emergence: Eliminating the explanatory gap. Front Psychol. 2020;11:1041. doi: 10.3389/fpsyg.2020.01041. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7304239/
  28. Feinberg TE, Mallatt J. The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago. Front Psychol. 2013;4:667. doi: 10.3389/fpsyg.2013.00667. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3790330/
  29. Feinberg TE, Mallatt J. The nature of primary consciousness. A new synthesis. Conscious Cogn. 2016;43:113–127. doi: 10.1016/j.concog.2016.05.009. https://www.gwern.net/docs/psychology/2016-feinberg.pdf
  30. Jacob P. Intentionality. In: Zalta EN, editor. The Stanford encyclopedia of philosophy (Winter 2019 edition). https://plato.stanford.edu/archives/win2019/entries/intentionality/
  31. Kind A. Qualia. In: Internet Encyclopedia of Philosophy. http://www.iep.utm.edu/qualia/
  32. Kouider S, de Gardelle V, Sackur J, Dupoux E. How rich is consciousness? The partial awareness hypothesis. Trends Cogn Sci. 2010 Jul;14(7):301–307. doi: 10.1016/j.tics.2010.04.006. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.378.5885&rep=rep1&type=pdf
  33. Pernu TK. The five marks of the mental. Front Psychol. 2017;8:1084. doi: 10.3389/fpsyg.2017.01084. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5500963/
  34. Revonsuo A. Inner presence: Consciousness as a biological phenomenon. Cambridge: MIT Press. 2006:29–68.
  35. Tye M. Qualia. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Winter 2017 edition). https://plato.stanford.edu/archives/win2017/entries/qualia/
  36. Velmans M. Understanding consciousness. 2nd ed. Hove, East Sussex: Routledge; 2009. https://dl.uswr.ac.ir/bitstream/Hannan/130278/1/0415425158.Routledge.Understanding.Consciousness.Second.Edition.Apr.2009.pdf
  37. Weisberg J. The hard problem of consciousness. In: Internet Encyclopedia of Philosophy. https://www.iep.utm.edu/hard-con/
  38. Block N. Consciousness, accessibility, and the mesh between psychology and neuroscience. Behavioral and Brain Sciences. 2007;30(5–6):481–499. https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/Block_BBS.pdf
  39. Block N. Biology versus computation in the study of consciousness. Behav Brain Sci. 1997;20:159–165. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.10.4577&rep=rep1&type=pdf
  40. Block N. How can we find the neural correlate of consciousness? Trends Neurosci. 1996 Nov;19(11):456–459. doi: 10.1016/s0166-2236(96)20049-9. https://www.academia.edu/4757334/How_can_we_find_the_neural_correlate_of_consciousness
  41. Block N. How not to find the neural correlate of consciousness. Royal Institute of Philosophy Supplement. 1993;43. https://www.researchgate.net/publication/28762320_How_Not_To_Find_the_Neural_Correlate_of_Consciousness
  42. Chalmers D. Availability: The cognitive basis of experience. Behav Brain Sci. 1997;20:148–149. http://consc.net/papers/availability.html
  43. Naccache L. Why and how access consciousness can account for phenomenal consciousness. Phil Trans R Soc B. 2018;373:20170357. doi: 10.1098/rstb.2017.0357. https://royalsocietypublishing.org/doi/10.1098/rstb.2017.0357
  44. Birch J, Schnell AK, Clayton NS. Dimensions of animal consciousness. Trends Cogn Sci. 2020;24(10):789–801. doi: 10.1016/j.tics.2020.07.007. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7116194/?report=classic
  45. Carruthers P. Comparative psychology without consciousness. Conscious Cogn. 2018a;63:47–60. doi: 10.1016/jxoncog.2018.06.012. http://faculty.philosophy.umd.edu/pcarruthers/Comp%20psych%20without%20consciousness.pdf
  46. Baars BJ. Chapter Ten. The functions of consciousness. In: A cognitive theory of consciousness. New York: Cambridge University Press; 1988. http://bernardbaars.pbworks.com/f/++++Functions+of+Consciousness.pdf
  47. Dehaene S. Chapter 3. What is consciousness good for? In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:89–114.
  48. Jonkisz J, Wierzchoń M, Binder M. Four-dimensional graded consciousness. Front Psychol. 2017;8:420. doi: 10.3389/fpsyg.2017.00420. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5359253/?report=classic
  49. Seth AK, Baars BJ. Neural Darwinism and consciousness. Conscious Cogn. 2005 Mar;14(1):140–168. http://ccrg.cs.memphis.edu/assets/papers/2004/Seth%20&%20Baars,%20Neural%20Darwinism-2004.pdf
  50. Doerig A, Schurger A, Herzog MH. Hard criteria for empirical theories of consciousness. Cogn Neurosci. 2021;12(2):41–62. doi: 10.1080/17588928.2020.1772214. https://www.tandfonline.com/doi/full/10.1080/17588928.2020.1772214
  51. Irwin LN. Renewed perspectives on the deep roots and broad distribution of animal consciousness. Front Syst Neurosci. 2020;14:57. doi: 10.3389/fnsys.2020.00057. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7438986/?report=classic
  52. Kotchoubey B. Human consciousness: Where is it from and what is it for. Front Psychol. 2018;9:567. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5924785/
  53. Mallatt J, Feinberg TE. Multiple routes to animal consciousness: Constrained multiple realizability rather than Modest Identity Theory. Front Psychol. 2021 Sep;12:732336. doi: 10.3389/fpsyg.2021.732336. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8497802/?report=classic
  54. Cadell C. Google AI beats Chinese master in ancient game of Go. Reuters. May 23, 2017. https://www.reuters.com/article/us-science-intelligence-go-idUSKBN18J0PE
  55. Guinness World Records. First AI to beat a 9-dan professional Go player. https://www.guinnessworldrecords.com/world-records/523455-first-ai-to-beat-a-9-dan-professional-go-player
  56. Guinness World Records. First computer to beat a world chess champion under regular time controls. https://www.guinnessworldrecords.com/world-records/100873-first-computer-to-beat-a-world-chess-champion-under-regular-time-controls
  57. Petkauskas V. Rise of the machines: when a computer beats a chess master. Cybernews. May 16, 2022. https://cybernews.com/editorial/rise-of-the-machines-when-a-computer-beats-a-chess-master/
  58. Spinney L. Artificial intelligence beats eight world champions at bridge. The Guardian. Mar 29, 2022. https://www.theguardian.com/technology/2022/mar/29/artificial-intelligence-beats-eight-world-champions-at-bridge
  59. Baars BJ. Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Prog Brain Res. 2005;150:45–53. doi: 10.1016/S0079-6123(05)50004-9. https://www.cs.helsinki.fi/u/ahyvarin/teaching/niseminar4/Baars2004.pdf
  60. Baars BJ. How does a serial, integrated and very limited stream of consciousness emerge from a nervous system that is mostly unconscious, distributed, parallel and of enormous capacity? Ciba Found Symp. 1993;174:282–290; discussion 291–303.
  61. Baars BJ, Franklin S, Ramsoy TZ. Global workspace dynamics: Cortical “binding and propagation” enables conscious contents. Front Psychol. 2013;4:200. doi: 10.3389/fpsyg.2013.00200. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3664777/
  62. Baars BJ, Geld N, Kozma R. Global Workspace Theory (GWT) and prefrontal cortex: Recent developments. Front Psychol. 2021;12:749868. doi: 10.3389/fpsyg.2021.749868. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8660103/
  63. Dehaene S, Kerszberg M, Changeux JP. A neuronal model of a global workspace in effortful cognitive tasks. Proc Natl Acad Sci U S A. 1998;95(24):14529–14534. doi: 10.1073/pnas.95.24.14529. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC24407/
  64. Newman J, Baars BJ, Cho SB. A Neural Global Workspace Model for Conscious Attention. Neural Netw. 1997 Oct 1;10(7):1195–1206. doi: 10.1016/s0893-6080(97)00060-9. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.453.6016&rep=rep1&type=pdf
  65. Dehaene S, Changeux JP, Naccache L. The Global Neuronal Workspace Model of conscious access: From neuronal architectures to clinical applications. In: Dehaene S, Christen Y, editors. Characterizing consciousness: From cognition to the clinic? Research and Perspectives in Neurosciences. Berlin, Heidelberg: Springer-Verlag; 2011. https://doi.org/10.1007/978-3-642-18015-6_4 http://www.antoniocasella.eu/dnlaw/Dehaene_Changeaux_Naccache_2011.pdf
  66. Dehaene S, Charles L, King JR, Marti S. Toward a computational theory of conscious processing. Curr Opin Neurobiol. 2014 Apr;25:76–84. doi: 10.1016/j.conb.2013.12.005. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5635963/
  67. Dehaene S, Naccache L. Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition. 2001 Apr;79(1–2):1–37. https://www.jsmf.org/meetings/2003/nov/Dehaene_Cognition_2001.pdf
  68. Dehaene S, Sergent C, Changeux JP. A neuronal network model linking subjective reports and objective physiological data during conscious perception. Proc Natl Acad Sci U S A. 2003 Jul 8;100(14):8520–8525. doi: 10.1073/pnas.1332574100. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC166261/
  69. Mashour GA, Roelfsema P, Changeux JP, Dehaene S. Conscious processing and the global neuronal workspace hypothesis. Neuron. 2020;105(5):776–798. doi: 10.1016/j.neuron.2020.01.026. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8770991/
  70. Sergent C, Dehaene S. Neural processes underlying conscious perception: Experimental findings and a global neuronal workspace framework. J Physiol Paris. 2004 Jul–Nov;98(4–6):374–384. doi: 10.1016/j.jphysparis.2005.09.006. https://pdfs.semanticscholar.org/ae61/178a998b4e08851af8ba80e7815fd2c9e6d9.pdf
  71. Song X, Tang X. An extended theory of global workspace of consciousness. Prog Nat Sci. 2008 Jul 10;18(7):789–793. https://doi.org/10.1016/j.pnsc.2008.02.003  https://www.sciencedirect.com/science/article/pii/S100200710800138X
  72. Brogaard B, Gatzia DE. What can neuroscience tell us about the hard problem of consciousness? Front Neurosci. 2016;10:395. doi: 10.3389/fnins.2016.00395. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5013033/
  73. Chalmers DJ. The puzzle of conscious experience. Sci Am. 1995 Dec;273(6):80–86. http://s3.amazonaws.com/arena-attachments/2382142/9247d5f1a845e5482b1bd66d82c3a9bf.pdf?1530582615
  74. Chis-Ciure R, Ellia F. Facing up to the hard problem of consciousness as an integrated information theorist. Found. Sci. 2021:1–17. doi: 10.1007/s10699-020-09724-7. http://philsci-archive.pitt.edu/20787/1/Chis-Ciure%20%26%20Ellia%20-%20Facing%20up%20to%20the%20Hard%20Problem%20as%20an%20Integrated%20Information%20Theoriest%20%28final%20preprint%29%20%282021%29.pdf
  75. Crick F, Koch C. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences. 1990;2:263–275. https://authors.library.caltech.edu/40352/1/148.pdf
  76. Dennett DC. Facing up to the hard question of consciousness. Philos Trans R Soc Lond B Biol Sci. 2018;373(1755):20170342. doi: 10.1098/rstb.2017.0342. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074080/?report=classic
  77. Loorits K. Structural qualia: A solution to the hard problem of consciousness. Front Psychol. 2014;5:237. doi: 10.3389/fpsyg.2014.00237. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3957492/
  78. Syamala H. How vedanta explains conscious subjective experience. Scientific GOD Journal. 2015;6:194–207. https://www.researchgate.net/publication/313870517_How_Vedanta_Explains_Conscious_Subjective_Experience
  79. Block N. Chapter 77. Comparing the major theories of consciousness. In: Gazzaniga MS, editor. The Cognitive Neurosciences. 4th ed. Cambridge, MA: MIT Press; 2009:1111–1122. https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/Theories_of_Consciousness.pdf
  80. Byrne A. Inverted qualia. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Winter 2016 Edition). https://plato.stanford.edu/archives/win2016/entries/qualia-inverted/
  81. Chalmers DJ. Phenomenal concepts and the explanatory gap. In: Alter T, Walter S, editors. Phenomenal concepts and phenomenal knowledge: New essays on consciousness and physicalism. Oxford University Press; 2006. https://www.sciencedharma.com/uploads/7/6/8/0/76803975/pceg.pdf
  82. Levine J. Materialism and qualia: The explanatory gap. Pacific Philosophical Quarterly. 1983;64:354–361. https://hope.simons-rock.edu/~pshields/cs/cmpt265/levine.pdf
  83. Papineau D. Mind the gap. Philosophical Perspectives. 1998;12:373–389. https://sas-space.sas.ac.uk/878/1/D_Papineau_Gap..pdf http://www.davidpapineau.co.uk/uploads/1/8/5/5/18551740/mind_the_gap.pdf
  84. Prabhu HR, Bhat PS. Mind and consciousness in yoga—Vedanta: A comparative analysis with western psychological concepts. Indian J Psychiatry. 2013;55(Suppl 2):S182–S186. doi: 10.4103/0019-5545.105524. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3705680/?report=classic
  85. Farisco M, Laureys S, Evers K. The intrinsic activity of the brain and its relation to levels and disorders of consciousness. Mind Matter. 2017;15:197–219. https://orbi.uliege.be/bitstream/2268/249262/1/FULLTEXT01.pdf
  86. De Angelis A. Collective and universal consciousness: Is individuality an evolutionary accident? Journal of Humanities and Social Sciences. 2022;5(2):198–199. https://philarchive.org/archive/DEACAU
  87. Petrenko VF. Contact with universal consciousness through the research of human mentality. Essentiafoundation.org. 2021 Apr 23. https://www.essentiafoundation.org/contact-with-universal-consciousness-through-the-research-of-human-mentality/reading/
  88. Koch C. Is Consciousness Universal? Scientificamerican.com. 2014 Jan 1. https://www.scientificamerican.com/article/is-consciousness-universal/
  89. Haanel CF. The Master Key System. Psychology Publishing. 1916. http://newthoughtlibrary.com/haanel-charles/Master-Key-System/audioEbook/master-key-systemRAW.pdf
  90. Koch C. The quest for consciousness: A neurobiological approach. Roberts & Company Publishers; 2004. https://www.researchgate.net/publication/232296537_The_Quest_for_Consciousness_A_Neurobiological_Approach
  91. Black D. Analyzing the etiological functions of consciousness. Phenomenol Cogn Sci. 2021;20(2). doi: 10.1007/s11097-020-09693-z. https://www.researchgate.net/publication/343685406_Analyzing_the_etiological_functions_of_consciousness
  92. Rosenthal DM. State consciousness and transitive consciousness. Conscious Cogn. 1993;2(4):355–363. https://www.davidrosenthal.org/DR-State-Transitive.pdf
  93. Winters JJ. The Temporally-Integrated Causality Landscape: Reconciling neuroscientific theories with the phenomenology of consciousness. Front Hum Neurosci. 2021;15:768459. doi: 10.3389/fnhum.2021.768459. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8599361/
  94. Butler AB, Cotterill RMJ. Mammalian and avian neuroanatomy and the question of consciousness in birds. Biol Bull. 2006;211(2):106–127. doi: 10.2307/4134586. https://www.journals.uchicago.edu/doi/10.2307/4134586
  95. Morin A. Levels of consciousness and self-awareness: A comparison and integration of various neurocognitive views. Conscious Cogn. 2006;15:358–371. doi: 10.1016/j.concog.2005.09.006. https://www.researchgate.net/publication/7507190_Levels_of_consciousness_and_self-awareness_A_comparison_and_integration_of_various_neurocognitive_views
  96. Pepperberg IM, Lynn SK. Possible levels of animal consciousness with reference to grey parrots. Amer Zool. 2000;40:893–901.
  97. Babiloni C, Marzano N, Soricelli A, et al. Cortical neural synchronization underlies primary visual consciousness of qualia: Evidence from event-related potentials. Front Hum Neurosci. 2016;10:310. doi: 10.3389/fnhum.2016.00310. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4927634/
  98. Edelman DB, Baars BJ, Seth AK. Identifying hallmarks of consciousness in non-mammalian species. Conscious Cogn. 2005 Mar;14(1):169–187. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.83.4676&rep=rep1&type=pdf
  99. Edelman GM. Naturalizing consciousness: A theoretical framework. PNAS. 2003 Apr 29;100(9):5520–5524. https://doi.org/10.1073/pnas.0931349100. https://www.pnas.org/content/100/9/5520
  100. Fabbro F, Aglioti SM, Bergamasco M, Clarici A, Panksepp J. Evolutionary aspects of self- and world consciousness in vertebrates. Front Hum Neurosci. 2015 Mar 26;9:157. doi: 10.3389/fnhum.2015.00157. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4374625/
  101. Panksepp J. The affective brain and core consciousness: How does neural activity generate emotional feelings? In: Lewis M, Haviland-Jones JM, Barrett LF, editors. Handbook of Emotions. 3rd edition. New York: Guilford; 2008:47–67. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.935.3519&rep=rep1&type=pdf
  102. Parvizi J, Damasio A. Consciousness and the brainstem. Cognition. 2001;79(1–2):135–160. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.845.1580&rep=rep1&type=pdf
  103. Panksepp J. Cross-Species affective neuroscience decoding of the primal affective experiences of humans and related animals. Sirigu A, editor. PLoS One. 2011 Sep 7;6(9):e21236. https://doi.org/10.1371/journal.pone.0021236  http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0021236
  104. Ansorge U, Kunde W, Kiefer M. Unconscious vision and executive control: How unconscious processing and conscious action control interact. Conscious Cogn. 2014 Jul;27:268–287. doi: 10.1016/j.concog.2014.05.009. https://www.psychologie.uni-wuerzburg.de/fileadmin/06020300/user_upload/Kunde/Ansorge_Kunde_Kiefer_CC_2014.pdf
  105. Axelrod V, Bar M, Rees G, Yovel G. Neural correlates of subliminal language processing. Cereb Cortex. 2015 Aug;25(8):2160–2169. doi: 10.1093/cercor/bhu022. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4494027/?report=classic
  106. Bargh JA, Morsella E. The unconscious mind. Perspect Psychol Sci. 2008 Jan;3(1):73–79. doi: 10.1111/j.1745-6916.2008.00064.x. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2440575/
  107. Bergström F, Eriksson J. Neural evidence for non-conscious working memory. Cereb Cortex. 2018 Sep;28(9):3217–3228. https://doi.org/10.1093/cercor/bhx193 https://academic.oup.com/cercor/article/28/9/3217/4058206
  108. Borsook D, Youssef AM, Barakat N, Sieberg CB, Elman I. Subliminal (latent) processing of pain and its evolution to conscious awareness. Neurosci Biobehav Rev. 2018 May;88:1–15. doi: 10.1016/j.neubiorev.2018.02.015. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5985199/
  109. Creswell JD, Bursley JK, Satpute AB. Neural reactivation links unconscious thought to decision-making performance. Soc Cogn Affect Neurosci. 2013 Dec;8(8):863–869. doi: 10.1093/scan/nst004. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3831563/
  110. Custers R, Aarts H. The unconscious will: how the pursuit of goals operates outside of conscious awareness. Science. 2010 Jul 2;329(5987):47–50. doi: 10.1126/science.1188595. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.599.8946&rep=rep1&type=pdf
  111. Dehaene S, Changeux JP, Naccache L, Sackur J, Sergent C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn Sci. 2006 May;10(5):204–211. doi: 10.1016/j.tics.2006.03.007. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.65.3821&rep=rep1&type=pdf
  112. Dehaene S. Fathoming unconscious depths. In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:47–88.
  113. Dijksterhuis A. First neural evidence for the unconscious thought process. Soc Cogn Affect Neurosci. 2013 Dec;8(8):845–846. doi: 10.1093/scan/nst036. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3831564/
  114. Dijksterhuis A, Bos MW, Nordgren LF, Van Baaren RB. On making the right choice: The deliberation-without-attention effect. Science. 2006 Feb 17;311(5763):1005–1007. doi: 10.1126/science.1121629. http://www1.psych.purdue.edu/~gfrancis/Classes/PSY392/Dijksterhuisetal.pdf
  115. Dijksterhuis A, Strick M. A Case for thinking without consciousness. Perspect Psychol Sci. 2016 Jan;11(1):117–132. doi: 10.1177/1745691615615317. https://www.researchgate.net/publication/292188707_A_Case_for_Thinking_Without_Consciousness
  116. Dijksterhuis A, Nordgren LF. A theory of unconscious thought. Perspect Psychol Sci. 2006;1:95–109. https://doi.org/10.1111/j.1745-6916.2006.00007.x. https://journals.sagepub.com/doi/pdf/10.1111/j.1745-6916.2006.00007.x
  117. Evans JSBT. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278. doi: 10.1146/annurev.psych.59.103006.093629. https://sites.ualberta.ca/~francisp/Phil488/EvansDualProcessing2008.pdf
  118. Garrison KE, Handley IM. Not merely experiential: Unconscious thought can be rational. Front Psychol. 2017 Jul 6;8:1096. doi: 10.3389/fpsyg.2017.01096. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5498519/
  119. Halligan PW, Oakley DA. Giving up on consciousness as the ghost in the machine. Front Psychol. 2021 Apr;12:571460. doi: 10.3389/fpsyg.2021.571460. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8121175/
  120. Hassin RR. Yes it can: On the functional abilities of the human unconscious. Perspect Psychol Sci. 2013 Mar;8(2):195–207. doi: 10.1177/1745691612460684. https://scinapse.io/papers/2171395624
  121. Horga G, Maia TV. Conscious and unconscious processes in cognitive control: A theoretical perspective and a novel empirical approach. Front Hum Neurosci. 2012;6:199. doi: 10.3389/fnhum.2012.00199. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3458455/
  122. Kiefer M, Ansorge U, Haynes JD, et al. Neuro-cognitive mechanisms of conscious and unconscious visual perception: From a plethora of phenomena to general principles. Adv Cogn Psychol. 2011;7:55–67. doi: 10.2478/v10053-008-0090-4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3259028/
  123. Kihlstrom JF. Cognition, unconscious process. Elsevier Ltd. 2007. Retrieved Nov 28, 2019 from http://www.baars-gage.com/furtherreadinginstructors/Chapter08/Chapter8_Cognitive_Unconscious.pdf
  124. Kihlstrom JF. The cognitive unconscious. Science. 1987 Sep 18;237(4821):1445–1452. doi: 10.1126/science.3629249. https://www.ocf.berkeley.edu/~jfkihlstrom/PDFs/1980s/1987/ScienceCogUncog.pdf
  125. Lamme VAF. Visual functions generating conscious seeing. Front Psychol. 2020 Feb 14;11:83. doi: 10.3389/fpsyg.2020.00083. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7034432/
  126. Morsella E, Poehlman TA. The inevitable contrast: Conscious vs. unconscious processes in action control. Front Psychol. 2013;4:590. doi: 10.3389/fpsyg.2013.00590. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3767904/
  127. Nisbett RE, Wilson TD. Telling more than we can know: Verbal reports on mental processes. Psychol Rev. 1977 May;84(3):231–259. https://doi.org/10.1037/0033-295X.84.3.231  https://deepblue.lib.umich.edu/bitstream/handle/2027.42/92167/TellingMoreThanWeCanKnow.pdf
  128. Pockett S. The Neuroscience of Movement. In: Pockett S, Banks WP, Gallagher S, editors. Does consciousness cause behavior? Cambridge, Massachusetts: The MIT Press; 2006:9–24. https://pdfs.semanticscholar.org/29f6/6625a81f8e5a0142baed0ab9a9979678c3a1.pdf
  129. Ritter SM, Dijksterhuis A. Creativity – The unconscious foundations of the incubation period. Front Hum Neurosci. 2014;8:215. doi: 10.3389/fnhum.2014.00215. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3990058/
  130. Salti M, Monto S, Charles L, King JR, Parkkonen L, Dehaene S. Distinct cortical codes and temporal dynamics for conscious and unconscious percepts. eLife. 2015;4:e05652. doi: 10.7554/eLife.05652. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4467230/
  131. Squire LR, Dede AJO. Conscious and unconscious memory systems. Cold Spring Harb Perspect Biol. 2015 Mar;7(3):a021667. doi: 10.1101/cshperspect.a021667. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4355270/
  132. Strick M, Dijksterhuis A, Bos MW, Sjoerdsma A, van Baaren RB, Nordgren LF. A meta-analysis on unconscious thought effects. Soc Cogn. 2011 Dec;29:738–763. https://doi.org/10.1521/soco.2011.29.6.738  https://www.researchgate.net/profile/Madelijn_Strick/publication/277426531_A_Meta-Analysis_on_Unconscious_Thought_Effects/links/59e706540f7e9b13acac6add/A-Meta-Analysis-on-Unconscious-Thought-Effects.pdf
  133. Umiltá C. Chapter 12. Consciousness and control of action. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:327–351. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  134. van Gaal S, Lamme VA. Unconscious high-level information processing: Implication for neurobiological theories of consciousness. Neuroscientist. 2012 Jun;18(3):287–301. doi: 10.1177/1073858411404079. https://pdfs.semanticscholar.org/b9af/d0e8d460ba73cf197a96e2dd8b524e05390c.pdf
  135. Vlassova A, Donkin C, Pearson J. Unconscious information changes decision accuracy but not confidence. Proc Natl Acad Sci U S A. 2014 Nov 11;111(45):16214–16218. doi: 10.1073/pnas.1403619111. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4234611/
  136. Ionescu MR. Subliminal perception of complex visual stimuli. Rom J Ophthalmol. 2016 Oct–Dec;60(4):226–230. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5711286/  
  137. Jacobs C, Sack AT. Behavior in oblivion: The neurobiology of subliminal priming. Brain Sci. 2012 Jun;2(2):225–241. doi: 10.3390/brainsci2020225. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4061795/?report=classic
  138. Elgendi M, Kumar P, Barbic S, Howard N, Abbott D, Cichocki A. Subliminal priming—state of the art and future perspectives. Behav Sci (Basel). 2018 Jun;8(6):54. doi: 10.3390/bs8060054. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6027235/?report=classic  
  139. Ruch S, Züst MA, Henke K. Subliminal messages exert long-term effects on decision-making. Neurosci Conscious. 2016;2016(1):niw013. doi: 10.1093/nc/niw013. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6204644/?report=classic  
  140. Blake R, Brascamp J, Heeger DJ. Can binocular rivalry reveal neural correlates of consciousness? Philos Trans R Soc Lond B Biol Sci. 2014 May 5;369(1641):20130211. doi: 10.1098/rstb.2013.0211. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3965165/
  141. Blake R, Wilson H. Binocular Vision. Vision Res. 2011 Apr 13;51(7):754–770. doi: 10.1016/j.visres.2010.10.009. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3050089/  
  142. Dieter KC, Brascamp J, Tadin D, Blake R. Does visual attention drive the dynamics of bistable perception? Atten Percept Psychophys. 2016 Oct;78(7):1861–1873. doi: 10.3758/s13414-016-1143-2. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5014653/
  143. Doesburg SM, Green JJ, McDonald JJ, Ward LM. Rhythms of consciousness: Binocular rivalry reveals large-scale oscillatory network dynamics mediating visual perception. PLoS One. 2009;4(7):e6142. doi: 10.1371/journal.pone.0006142. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2702101/
  144. Kang MS, Blake R. An integrated framework of spatiotemporal dynamics of binocular rivalry. Front Hum Neurosci. 2011;5:88. doi: 10.3389/fnhum.2011.00088. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3171066/
  145. Lin Z, He S. Seeing the invisible: The scope and limits of unconscious processing in binocular rivalry. Prog Neurobiol. 2009 Apr;87(4):195–211. doi: 10.1016/j.pneurobio.2008.09.002. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2689366/
  146. Rees G. Neural correlates of the contents of visual awareness in humans. Philos Trans R Soc Lond B Biol Sci. 2007 May 29;362(1481):877–886. doi: 10.1098/rstb.2007.2094. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2430003/
  147. Sterzer P, Stein T, Ludwig K, Rothkirch M, Hesselmann G. Neural processing of visual information under interocular suppression: A critical review. Front Psychol. 2014;5:453. doi: 10.3389/fpsyg.2014.00453. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4032950/  
  148. Chun MM, Potter MC. A two-stage model for multiple target detection in rapid serial visual presentation. J Exp Psychol Hum Percept Perform. 1995;21(1):109–127. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.588.1550&rep=rep1&type=pdf
  149. Dux PE, Marois R. How humans search for targets through time: A review of data and theory from the attentional blink. Atten Percept Psychophys. 2009 Nov;71(8):1683–1700. doi: 10.3758/APP.71.8.1683. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2915904/
  150. Nieuwenstein M, Van der Burg E, Theeuwes J, Wyble B, Potter M. Temporal constraints on conscious vision: on the ubiquitous nature of the attentional blink. J Vis. 2009;9(9):18, 1–14. doi: 10.1167/9.9.18. https://iovs.arvojournals.org/article.aspx?articleid=2122359
  151. Raymond J, Shapiro KL, Arnell KM. Temporary suppression of visual processing in an RSVP task: An attentional blink? J Exp Psychol Hum Percept Perform. 1992;18(3):849–860. http://wexler.free.fr/library/files/raymond%20(1992)%20temporary%20suppression%20of%20visual%20processing%20in%20an%20RSVP%20task.%20an%20attentional%20blink..pdf
  152. Shen D, Vuvan DT, Alain C. Cortical sources of the auditory attentional blink. J Neurophysiol. 2018 Aug 1;120(2):812–829. doi: 10.1152/jn.00007.2018. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6139462/
  153. Willems C, Martens S. Time to see the bigger picture: Individual differences in the attentional blink. Psychon Bull Rev. 2016;23(5):1289–1299. doi: 10.3758/s13423-015-0977-2. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5050248/
  154. Zivony A, Shanny S, Lamy D. Perceptual processing is not spared during the attentional blink. J Cogn. 2018;1(1):18. doi: 10.5334/joc.20. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6634390/
  155. Hyman IE, Jr., Sarb BA, Wise-Swanson BM. Failure to see money on a tree: Inattentional blindness for objects that guided behavior. Front Psychol. 2014;5:356. doi: 10.3389/fpsyg.2014.00356. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4005951/
  156. Kreitz C, Furley P, Memmert D, Simons DJ. Inattentional blindness and individual differences in cognitive abilities. PLoS One. 2015;10(8):e0134675. doi: 10.1371/journal.pone.0134675. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4530948/
  157. Mack A, Rock I. Inattentional blindness. Cambridge, MA: MIT Press; 1998. http://www.thatmarcusfamily.org/philosophy/Course_Websites/Contemporary/Readings/Mack_Rock.pdf
  158. Pugnaghi G, Memmert D, Kreitz C. Loads of unconscious processing: The role of perceptual load in processing unattended stimuli during inattentional blindness. Atten Percept Psychophys. 2020;82(5):2641–2651. doi: 10.3758/s13414-020-01982-8. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7343742/pdf/13414_2020_Article_1982.pdf
  159. Simons DJ, Chabris CF. Gorillas in our midst: sustained inattentional blindness for dynamic events. Perception. 1999;28:1059–1074. https://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=38916520883902727885B25ECD0C5D70?doi=10.1.1.65.8130&rep=rep1&type=pdf
  160. Usher M, Bronfman ZZ, Talmor S, Jacobson H, Eitam B. Consciousness without report: Insights from summary statistics and inattention ‘blindness.’ Philos Trans R Soc Lond B Biol Sci. 2018 Sep 19;373(1755):20170354. doi: 10.1098/rstb.2017.0354. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074079/?report=classic
  161. Wright TJ, Roque NA, Boot WR, Stothart C. Attention capture, processing speed, and inattentional blindness. Acta Psychol (Amst). 2018 Oct;190:72–77. doi: 10.1016/j.actpsy.2018.07.005. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6309252/
  162. Ajina S, Bridge H. Blindsight and unconscious vision: What they teach us about the human visual system. Neuroscientist. 2017 Oct;23(5):529–541. doi: 10.1177/1073858416673817. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5493986/
  163. Carota A, Calabrese P. The achromatic ‘philosophical zombie,’ a syndrome of cerebral achromatopsia with color anopsognosia. Case Rep Neurol. 2013 Jan-Apr;5(1):98–103. doi: 10.1159/0003510. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3656676/
  164. Leopold DA. Primary visual cortex, awareness and blindsight. Annu Rev Neurosci. 2012;35:91–109. doi: 10.1146/annurev-neuro-062111-150356. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3476047/
  165. Silvanto J. Why is “blindsight” blind? A new perspective on primary visual cortex, recurrent activity and visual awareness. Conscious Cogn. 2015 Mar;32:15–32. doi: 10.1016/j.concog.2014.08.001. https://www.sciencedirect.com/science/article/pii/S1053810014001329
  166. Metzinger T. Minimal phenomenal experience: Meditation, tonic alertness, and the phenomenology of “pure” consciousness. Philos. Mind Sci. 2020;1:1–44. doi: 10.33735/phimisci.2020.I.46. https://philosophymindscience.org/index.php/phimisci/article/download/8960/8538
  167. Wiese W. The science of consciousness does not need another theory, it needs a minimal unifying model. Neurosci Conscious. 2020;2020(1):niaa013. doi: 10.1093/nc/niaa013. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7352491/
  168. Windt JM. Consciousness in sleep: How findings from sleep and dream research challenge our understanding of sleep, waking, and consciousness. Philos Compass. 2020;15:e12661. https://researchmgt.monash.edu/ws/portalfiles/portal/309812613/301226033_oa.pdf
  169. Jensen MS, Yao R, Street WN, Simons DJ. Change blindness and inattentional blindness. Wiley Interdiscip Rev Cogn Sci. 2011 Sep;2(5):529–546. doi: 10.1002/wcs.130. https://www.researchgate.net/publication/275100327_Attention_Change_Blindness_and_Inattentional_Blindness
  170. Rensink RA. Attention: Change blindness and inattentional blindness. In: Banks WP, editor. Encyclopedia of Consciousness. Academic Press; 2009:47–59. https://doi.org/10.1016/B978-012373873-8.00006-2 https://www2.psych.ubc.ca/~rensink/publications/download/EncycConsc-CB-IB-rr.pdf
  171. Rensink RA. To have seen or not to have seen: A look at Rensink, O’Regan, and Clark (1997). Perspect Psychol Sci. 2018;13:230–235. doi: 10.1177/1745691617707269. https://philarchive.org/archive/RENTHS-2
  172. Simons DJ, Levin DT. Change blindness. Trends in Cognitive Sciences. 1997;1(7):261–267. https://home.cs.colorado.edu/~mozer/Teaching/syllabi/3702/readings/SimonsLevin1997.pdf
  173. Simons DJ, Rensink RA. Change blindness: Past, present, and future. Trends Cogn Sci. 2005 Jan;9(1):16–20. doi: 10.1016/j.tics.2004.11.006. https://philpapers.org/archive/SIMCBP.pdf
  174. Cohen MA, Dennett DC, Kanwisher N. What is the bandwidth of perceptual experience? Trends Cogn Sci. 2016;20(5):324–335. doi: 10.1016/j.tics.2016.03.006. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4898652/
  175. Koch C, Tsuchiya N. Attention and consciousness: Two distinct brain processes. Trends Cogn Sci. 2007;11:16–22. doi: 10.1016/j.tics.2006.10.012. https://www.cogsci.msu.edu/DSS/2007-2008/Koch/koch-tsuchiya-07.pdf
  176. Koculak M, Wierzchoń M. Consciousness science needs some rest: How to use resting-state paradigm to improve theories and measures of consciousness. Front Neurosci. 2022;16:836758. doi: 10.3389/fnins.2022.836758. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9002124/?report=classic
  177. Bar M, Tootell RB, Schacter DL, et al. Cortical mechanisms specific to explicit visual object recognition. Neuron. 2001 Feb;29(2):529–535. doi: 10.1016/s0896-6273(01)00224-0. Erratum in: Neuron 2001 Apr;30(1):299. https://www.cell.com/neuron/fulltext/S0896-6273(01)00224-0?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0896627301002240%3Fshowall%3Dtrue
  178. Fazekas P, Overgaard M. A multi-factor account of degrees of awareness. Cogn Sci. 2017 Apr 10. doi: 10.1111/cogs.12478. https://onlinelibrary.wiley.com/doi/full/10.1111/cogs.12478
  179. Nieuwenhuis S, de Kleijn R. Consciousness of targets during the attentional blink: A gradual or all-or-none dimension? Atten Percept Psychophys. 2011 Feb;73(2):364–373. doi: 10.3758/s13414-010-0026-1. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3037489/
  180. Overgaard M, Rote J, Mouridsen K, Ramsøy TZ. Is conscious perception gradual or dichotomous? A comparison of report methodologies during a visual task. Conscious Cogn. 2006 Dec;15(4):700–708. doi: 10.1016/j.concog.2006.04.002. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.316.365&rep=rep1&type=pdf
  181. Asplund CL, Fougnie D, Zughni S, Martin JW, Marois R. The attentional blink reveals the probabilistic nature of discrete conscious perception. Psychol Sci. 2014 Mar;25(3):824–831. doi: 10.1177/0956797613513810. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3954951/
  182. Sergent C, Dehaene S. Is consciousness a gradual phenomenon? Evidence for an all-or-none bifurcation during the attentional blink. Psychol Sci. 2004 Nov;15(11):720–728. doi: 10.1111/j.0956-7976.2004.00748.x. https://www.unicog.org/publications/SergentDehaene_AllOrNoneBlink_PsychScience2004.pdf
  183. Boly M, Seth AK, Wilke M, et al. Consciousness in humans and non-human animals: Recent advances and future directions. Front Psychol. 2013;4:625. doi: 10.3389/fpsyg.2013.00625. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3814086/
  184. Cavanna AE, Shah S, Eddy CM, Williams A, Rickards H. Consciousness: A neurological perspective. Behav Neurol. 2011;24(1):107–116. doi: 10.3233/BEN-2011-0322. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5378000/pdf/BN-2011-24-1-645159.pdf
  185. Han ME, Park SY, Oh SO. Large-scale functional brain networks for consciousness. Anat Cell Biol. 2021;54(2):152–164. doi: 10.5115/acb.20.305. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8225483/?report=classic
  186. Naccache L. Why and how access consciousness can account for phenomenal consciousness. Philos Trans R Soc Lond B Biol Sci. 2018;373(1755):20170357. doi: 10.1098/rstb.2017.0357. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074081/?report=classic
  187. Overgaard M. Phenomenal consciousness and cognitive access. Philos Trans R Soc Lond B Biol Sci. 2018;373(1755):20170353. doi: 10.1098/rstb.2017.0353. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074085/
  188. Phillips I. The methodological puzzle of phenomenal consciousness. Philos Trans R Soc Lond B Biol Sci. 2018;373(1755):20170347. doi: 10.1098/rstb.2017.0347. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074091/
  189. Block N. Perceptual consciousness overflows cognitive access. Trends Cogn Sci. 2011;15:567–575. doi: 10.1016/j.tics.2011.11.001. https://philarchive.org/archive/BLOPCO-2
  190. Bachmann T, Hudetz AG. It is time to combine the two main traditions in the research on the neural correlates of consciousness: C = L × D. Front Psychol. 2014 Aug 22;5:940. doi: 10.3389/fpsyg.2014.00940. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4141455/
  191. Laureys S. The neural correlate of (un)awareness: Lessons from the vegetative state. Trends Cogn Sci. 2005;9:556–559. doi: 10.1016/j.tics.2005.10.010. https://orbi.uliege.be/bitstream/2268/2393/1/Laureys%20trends%20cogn%20scie2005.pdf
  192. Modolo J, Hassan M, Wendling F, Benquet P. Decoding the circuitry of consciousness: From local microcircuits to brain-scale networks. Netw Neurosci. 2020;4(2):315–337. doi: 10.1162/netn_a_00119. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7286300/?report=classic
  193. Nani A, Manuello J, Mancuso L, Liloia D, Costa T, Cauda F. The neural correlates of consciousness and attention: Two sister processes of the brain. Front Neurosci. 2019;13:1169. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6842945/
  194. Perri CD, Thibaut A, Heine L, Soddu A, Demertzi A, Laureys S. Measuring consciousness in coma and related states. World J Radiol. 2014 Aug 28;6(8):589–597. doi: 10.4329/wjr.v6.i8.589. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4147439/
  195. Fuller PM, Sherman D, Pedersen NP, Saper CB, Lu J. Reassessment of the structural basis of the ascending arousal system. J Comp Neurol. 2011 Apr 1;519(5):933–956. doi: 10.1002/cne.22559. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3119596/
  196. Reinoso-Suárez F, de Andrés I, Garzón M. Functional anatomy of the sleep-wakefulness cycle: Wakefulness. Adv Anat Embryol Cell Biol. 2011;208:1–128.
  197. Schwartz JRL, Roth T. Neurophysiology of sleep and wakefulness: Basic science and clinical implications. Curr Neuropharmacol. 2008 Dec;6(4):367–378. doi: 10.2174/157015908787386050. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2701283/
  198. Schwartz MD, Kilduff TS. The neurobiology of sleep and wakefulness. Psychiatr Clin North Am. 2015 Dec;38(4):615–644. doi: 10.1016/j.psc.2015.07.002. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4660253/
  199. Bender A, Jox RJ, Grill E, Straube A, Lulé D. Persistent vegetative state and minimally conscious state. A systematic review and meta-analysis of diagnostic procedures. Dtsch Arztebl Int. 2015 Apr;112(14):235–242. doi: 10.3238/arztebl.2015.0235. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4413244/
  200. Bernat JL. Chronic consciousness disorders. Annu Rev Med. 2009;60:381–392.
  201. Bernat JL. Chronic disorders of consciousness. Lancet. 2006;367:1181–1192. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.330.3843&rep=rep1&type=pdf
  202. Blume C, Del Giudice R, Wislowska M, Lechinger J, Schabus M. Across the consciousness continuum-from unresponsive wakefulness to sleep. Front Hum Neurosci. 2015;9:105. doi: 10.3389/fnhum.2015.00105. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4354375/?report=classic
  203. Bruno MA, Vanhaudenhuyse A, Thibaut A, Moonen G, Laureys S. From unresponsive wakefulness to minimally conscious PLUS and functional locked-in syndromes: Recent advances in our understanding of disorders of consciousness. J Neurol. 2011 Jul;258(7):1373–1384. doi: 10.1007/s00415-011-6114-x. http://www.academia.edu/9792867/From_unresponsive_wakefulness_to_minimally_conscious_PLUS_and_functional_locked-in_syndromes_recent_advances_in_our_understanding_of_disorders_of_consciousness
  204. Cavanna AE, Cavanna SL, Servo S, Monaco F. The neural correlates of impaired consciousness in coma and unresponsive states. Discov Med. 2010 May;9(48):431–438. http://www.discoverymedicine.com/Andrea-Eugenio-Cavanna/2010/05/09/the-neural-correlates-of-impaired-consciousness-in-coma-and-unresponsive-states/
  205. Demertzi A, Soddu A, Laureys S. Consciousness supporting networks. Current Opinion in Neurobiology. 2013 Apr;23(2):239–244. https://doi.org/10.1016/j.conb.2012.12.003  https://orbi.uliege.be/bitstream/2268/156514/1/Demertzi_CurOpNeurob_2013.pdf
  206. Farisco M, Pennartz C, Annen J, Cecconi B, Evers K. Indicators and criteria of consciousness: Ethical implications for the care of behaviourally unresponsive patients. BMC Med Ethics. 2022 Mar;23:30. doi: 10.1186/s12910-022-00770-3. https://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-022-00770-3
  207. Giacino JT, Ashwal S, Childs N, et al. The minimally conscious state: Definition and diagnostic criteria. Neurology. 2002 Feb 12;58(3):349–353. http://n.neurology.org/content/58/3/349.long
  208. Giacino JT. The minimally conscious state: Defining the borders of consciousness. Prog Brain Res. 2005;150:381–395.
  209. Gosseries O, Bruno MA, Chatelle C, et al. Disorders of consciousness: What’s in a name? NeuroRehabilitation. 2011;28(1):3–14. doi: 10.3233/NRE-2011-0625. https://www.researchgate.net/publication/49849902_Disorders_of_consciousness_What’s_in_a_name
  210. Hirschberg R, Giacino JT. The vegetative and minimally conscious states: Diagnosis, prognosis and treatment. Neurol Clin. 2011 Nov;29(4):773–786. doi: 10.1016/j.ncl.2011.07.009. https://www.researchgate.net/profile/Joseph_Giacino/publication/255971880_Hirschberg_Giacino_VS_MCS_Dx_Px_Tx_Neurol_Clinics_2011/links/0c9605212e924b0a19000000/Hirschberg-Giacino-VS-MCS-Dx-Px-Tx-Neurol-Clinics-2011.pdf
  211. Hodelín-Tablada R. Minimally conscious state: Evolution of concept, diagnosis and treatment. MEDICC Review. 2016 Oct;18(4):43–46. http://www.medicc.org/mediccreview/index.php?issue=41&id=566&a=vahtml
  212. Leisman G, Koch P. Networks of conscious experience: Computational neuroscience in understanding life, death, and consciousness. Rev Neurosci. 2009;20(3–4):151–176. https://www.researchgate.net/profile/Gerry_Leisman/publication/41449921_Networks_of_Conscious_Experience_Computational_Neuroscience_in_Understanding_Life_Death_and_Consciousness/links/0fcfd5023c1d28270d000000.pdf
  213. Kondziella D, Bender A, Diserens K, et al. European Academy of Neurology guideline on the diagnosis of coma and other disorders of consciousness. Eur J Neurol. 2020;27:741–756. https://onlinelibrary.wiley.com/doi/10.1111/ene.14151
  214. Aalten P, van Valen E, Clare L, Kenny G, Verhey F. Awareness in dementia: A review of clinical correlates. Aging & mental health. 2005;9:414–422. doi: 10.1080/13607860500143075. https://www.researchgate.net/publication/7721201_Awareness_in_dementia_A_review_of_clinical_correlates
  215. Mograbi DC, Huntley J, Critchley H. Self-awareness in dementia: A taxonomy of processes, overview of findings, and integrative framework. Curr Neurol Neurosci Rep. 2021 Nov 24;21(12):69. doi: 10.1007/s11910-021-01155-6. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8613100/
  216. Rankin KP, Baldwin E, Pace-Savitsky C, Kramer JH, Miller BL. Self awareness and personality change in dementia. J Neurol Neurosurg Psychiatry. 2005 May;76(5):632–639. doi: 10.1136/jnnp.2004.042879. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1739614/pdf/v076p00632.pdf
  217. Requena-Komuro MC, Marshall CR, Bond RL, et al. Altered time awareness in dementia. Front Neurol. 2020 Apr 21;11:291. doi: 10.3389/fneur.2020.00291. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7186333/
  218. Doricchi F, Thiebaut de Schotten M, Tomaiuolo F, Bartolomeo P. White matter (dis)connections and gray matter (dys)functions in visual neglect: gaining insights into the brain networks of spatial awareness. Cortex. 2008 Sep;44(8):983–995. doi: 10.1016/j.cortex.2008.03.006. https://www.academia.edu/68795233/White_matter_dis_connections_and_gray_matter_dys_functions_in_visual_neglect_gaining_insights_into_the_brain_networks_of_spatial_awareness
  219. Driver J, Vuilleumier P. Perceptual awareness and its loss in unilateral neglect and extinction. Cognition. 2001 Apr;79(1–2):39–88. doi: 10.1016/s0010-0277(00)00124-4. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.513.2844&rep=rep1&type=pdf
  220. Ros T, Michela A, Mayer A, et al. Disruption of large-scale electrophysiological networks in stroke patients with visuospatial neglect. Netw Neurosci. 2022 Feb 1;6(1):69–89. doi: 10.1162/netn_a_00210. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8959119/
  221. Urbanski M, Thiebaut de Schotten M, Rodrigo S, et al. Brain networks of spatial awareness: Evidence from diffusion tensor imaging tractography. J Neurol Neurosurg Psychiatry. 2008 May;79(5):598–601. doi: 10.1136/jnnp.2007.126276. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2386830/
  222. Faugeras F, Rohaut B, Weiss N, et al. Probing consciousness with event-related potentials in the vegetative state. Neurology. 2011 Jul 19;77(3):264–268. doi: 10.1212/WNL.0b013e3182217ee8. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3136052/
  223. Bordini AL, Luiz TF, Fernandes M, Arruda WO, Teive HAG. Coma scales: A historical review. Arq Neuropsiquiatr. 2010 Dec;68(6):930–937. https://www.scielo.br/scielo.php?script=sci_arttext&pid=S0004-282X2010000600019&lng=en&nrm=iso&tlng=en
  224. Foo CC, Loan JJM, Brennan PM. The relationship of the FOUR Score to patient outcome: A systematic review. J Neurotrauma. 2019;36(17):2469–2483. doi: 10.1089/neu.2018.6243. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6709730/pdf/neu.2018.6243.pdf
  225. Jouvet M. Coma and other disorders of consciousness. In: Vinken PJ, Bruyn GW, editors. Handbook of clinical neurology. Vol. 3. Amsterdam: North-Holland Publishing Company; 1969. http://sommeil.univ-lyon1.fr/articles/jouvet/hcn_69/contents.php http://sommeil.univ-lyon1.fr/articles/jouvet/hcn_69/p7.php
  226. Heydari F, Azizkhani R, Ahmadi O, Majidinejad S, Nasr-Esfahani M, Ahmadi A. Physiologic scoring systems versus Glasgow coma scale in predicting in-hospital mortality of trauma patients; A diagnostic accuracy study. Arch Acad Emerg Med. 2021;9(1):e64. doi: 10.22037/aaem.v9i1.1376. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8628642/
  227. Kalita J, Misra UK. In search of a better measuring scale of consciousness. Ann Card Anaesth. 2019 Apr–Jun;22(2):149–150. doi: 10.4103/aca.ACA_193_18. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6489384/
  228. Lucca LF, Lofaro D, Pignolo L, et al. Outcome prediction in disorders of consciousness: the role of coma recovery scale revised. BMC Neurol. 2019;19(1):68. doi: 10.1186/s12883-019-1293-7. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6472098/
  229. Matis G, Birbilis T. The Glasgow Coma Scale—A brief review. Past, present, future. Acta neurologica Belgica. 2008;108:75–89. https://www.researchgate.net/publication/23712880_The_Glasgow_Coma_Scale_-_A_brief_review_Past_present_future
  230. Schnakers C, Giacino JT, Kalmar K, et al. Does the FOUR score correctly diagnose the vegetative and minimally conscious states? Ann Neurol. 2006;60(6):744–745. https://orbi.uliege.be/bitstream/2268/249922/2/Schnakers_fourscores_2006_PPA.pdf
  231. Wijdicks EF, Bamlet WR, Maramattom BV, Manno EM, McClelland RL. Validation of a new coma scale: The FOUR Score. Ann Neurol. 2005;58:585–593. https://www.medischcontact.nl/web/file?uuid=22e45a5d-5290-43ee-bf83-b31ffac45198&owner=1e836119-cfd1-4e33-a731-da3efbb2a701&contentid=39312
  232. Bayne T, Carter O. Dimensions of consciousness and the psychedelic state. Neurosci Conscious. 2018;2018(1):niy008. doi: 10.1093/nc/niy008. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6146157/
  233. Bayne T, Hohwy J, Owen AM. Are there levels of consciousness? Trends Cogn Sci. 2016;20(6):405–413. https://doi.org/10.1016/j.tics.2016.03.009  https://www.sciencedirect.com/science/article/pii/S136466131630002X
  234. Gamez D. The measurement of consciousness: a framework for the scientific study of consciousness. Front Psychol. 2014;5:714. doi: 10.3389/fpsyg.2014.00714. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4091309/?report=classic
  235. Ponte G, Chiandetti C, Edelman DB, Imperadore P, Pieroni EM, Fiorito G. Cephalopod behavior: From neural plasticity to consciousness. Front Syst Neurosci. 2022;15:787139. doi: 10.3389/fnsys.2021.787139. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9039538/?report=classic
  236. Walter J. Consciousness as a multidimensional phenomenon: Implications for the assessment of disorders of consciousness. Neurosci Conscious. 2021;2021(2):niab047. doi: 10.1093/nc/niab047. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8716840/?report=classic
  237. Casali AG, Gosseries O, Rosanova M, et al. A theoretically based index of consciousness independent of sensory processing and behaviour. Sci Transl Med. 2013 Aug 14;5(198):198ra105. doi: 10.1126/scitranslmed.3006294. https://orbi.uliege.be/bitstream/2268/171542/1/A%20theoretically%20based%20index%20of%20consciousness%20independent%20of%20sensory%20processing%20and%20behavior.pdf
  238. Rosanova M, Gosseries O, Casarotto S, et al. Recovery of cortical effective connectivity and recovery of consciousness in vegetative patients. Brain. 2012;135(4):1308–1320. doi: 10.1093/brain/awr340. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3326248/
  239. Bekinschtein TA, Dehaene S, Rohaut B, Tadel F, Cohen L, Naccache L. Neural signature of the conscious processing of auditory regularities. Proc Natl Acad Sci U S A. 2009 Feb 3;106(5):1672–1677. doi: 10.1073/pnas.0809667106. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2635770/
  240. King JR, Faugeras F, Gramfort A, et al. Single-trial decoding of auditory novelty responses facilitates the detection of residual consciousness. Neuroimage. 2013 Dec;83:726–738. doi: 10.1016/j.neuroimage.2013.07.013. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5635957/
  241. Engel AK, Singer W. Temporal binding and the neural correlates of sensory awareness. Trends Cogn Sci. 2001 Jan 1;5(1):16–25. doi: 10.1016/s1364-6613(00)01568-0. http://andreas-engel.com/engel_2001_tics.pdf
  242. Aru J, Bachmann T, Singer W, Melloni L. Distilling the neural correlates of consciousness. Neurosci Biobehav Rev. 2012 Feb;36(2):737–746. https://doi.org/10.1016/j.neubiorev.2011.12.003 https://www.sciencedirect.com/science/article/pii/S0149763411002107
  243. Baars BJ, Laureys S. One, not two, neural correlates of consciousness. Trends Cogn Sci. 2005 Jun;9(6). http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.367.9177&rep=rep1&type=pdf
  244. Block N. Two neural correlates of consciousness. Trends Cogn Sci. 2005;9:46–52. http://www.nyu.edu/gsas/dept/philo/faculty/block/papers/final_revised_proof.pdf
  245. Boly M, Massimini M, Tsuchiya N, Postle BR, Koch C, Tononi G. Are the neural correlates of consciousness in the front or in the back of the cerebral cortex? Clinical and neuroimaging evidence. J Neurosci. 2017 Oct;37(40):9603–9613. doi: 10.1523/JNEUROSCI.3218-16.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5628406/
  246. Chalmers DJ. What is a neural correlate of consciousness? In: Metzinger T, editor. Neural correlates of consciousness: Empirical and conceptual questions. Cambridge: The MIT Press; 2000. https://pdfs.semanticscholar.org/35c4/ecd86863e84d2b2b0a31294b7b0223d7204e.pdf
  247. Cosmelli D, Lachaux JP, Thompson E. Chapter 26. Neurodynamical approaches to consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:731–772. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  248. Crick F, Koch C. Consciousness and neuroscience. Cereb Cortex. 1998 Mar;8(2):97–107. https://authors.library.caltech.edu/40355/1/feature_article.pdf
  249. Crick F, Koch C. Some reflections on visual awareness. Cold Spring Harb Symp Quant Biol. 1990;55:953–962. https://authors.library.caltech.edu/40351/1/61.pdf
  250. Dehaene S. Chapter 4. The signature of a conscious thought. In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:115–160.
  251. Del Cul A, Baillet S, Dehaene S. Brain dynamics underlying the nonlinear threshold for access to consciousness. PLoS Biol. 2007 Oct;5(10):e260. https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0050260
  252. Edelman GM. Neural Darwinism: Selection and reentrant signaling in higher brain function. Neuron. 1993 Feb;10:115–125. http://brainmaps.org/pdf/edelman1993.pdf http://www.acamedia.info/letters/an_Peter_von_Salis/references/neurosciences_institute/edelman1993.pdf
  253. Edelman GM, Gally JA, Baars BJ. Biology of Consciousness. Front Psychol. 2011;2:4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3111444/
  254. Fink SB. A deeper look at the “Neural correlate of consciousness.” Front Psychol. 2016;7:144. doi: 10.3389/fpsyg.2016.01044. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4960249/
  255. Frith C, Perry R, Lumer E. The neural correlates of conscious experience: An experimental framework. Trends Cogn Sci. 1999 Mar;3(3):105–114. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.575.6134&rep=rep1&type=pdf
  256. Koch C, Massimini M, Boly M, Tononi G. Neural correlates of consciousness: Progress and problems. Nat Rev Neurosci. 2016;17:307–321. https://puredhamma.net/wp-content/uploads/Neural-correlates-of-consciousness-Koch-et-al-2016.pdf
  257. Lamme VAF. Can neuroscience reveal the true nature of consciousness? http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.217.2789  https://www.nyu.edu/gsas/dept/philo/courses/consciousness05/LammeNeuroscience.pdf
  258. Lamy D, Salti M, Bar-Haim Y. Neural correlates of subjective awareness and unconscious processing: An ERP study. J Cogn Neurosci. 2009 Jul;21(7):1435–1446. doi: 10.1162/jocn.2009.21064. https://www.researchgate.net/publication/23170315_Neural_Correlates_of_Subjective_Awareness_and_Unconscious_Processing_An_ERP_Study
  259. Melloni L, Molina C, Pena M, Torres D, Singer W, Rodriguez E. Synchronization of neural activity across cortical areas correlates with conscious perception. J Neurosci. 2007 Mar 14;27(11):2858–2865. https://doi.org/10.1523/JNEUROSCI.4623-06.2007 http://www.jneurosci.org/content/27/11/2858.long
  260. Morales J, Lau H. The neural correlates of consciousness. In: Kriegel U, editor. Penultimate draft: Forthcoming in The Oxford Handbook of the Philosophy of Consciousness. Oxford University Press; 2018 Jan. https://philpapers.org/archive/MORTNC-7.pdf
  261. Owen M. Neural correlates of consciousness and the nature of the mind. In: Guta MP, editor. Consciousness and the Ontology of Properties. 1st ed. New York: Routledge; 2018. https://www.newdualism.org/papers-Jul2020/Owen-OWENCOv3.pdf
  262. Owen M, Guta MP. Physically sufficient neural mechanisms of consciousness. Front Syst Neurosci. 2019 Jul 4;13:24. doi: 10.3389/fnsys.2019.00024. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6622321/
  263. Polák M, Marvan T. Neural correlates of consciousness meet the theory of identity. Front Psychol. 2018;9:1269. doi: 10.3389/fpsyg.2018.01269. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6066586/
  264. Pollen DA. On the neural correlates of visual perception. Cereb Cortex. 1999;9(1):4–19. https://doi.org/10.1093/cercor/9.1.4  https://academic.oup.com/cercor/article/9/1/4/314915/On-the-Neural-Correlates-of-Visual-Perception
  265. Rees G, Kreiman G, Koch C. Neural correlates of consciousness in humans. Nat Rev Neurosci. 2002;3(4):261–270. https://www.researchgate.net/publication/11399830_Neural_correlates_of_consciousness_in_humans
  266. Tononi G, Koch C. The neural correlates of consciousness: An update. Ann N Y Acad Sci. 2008;1124:239–261. doi: 10.1196/annals.1440.004. https://authors.library.caltech.edu/40650/1/Tononi-Koch-08.pdf
  267. Alves PM, Foulon C, Karolis V, et al. An improved neuroanatomical model of the default-mode network reconciles previous neuroimaging and neuropathological findings. Commun Biol. 2019 Oct 10;2:370. doi: 10.1038/s42003-019-0611-3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6787009/
  268. Andrews-Hanna JR. The brain’s default network and its adaptive role in internal mentation. Neuroscientist. 2012 Jun;18(3):251–270. doi: 10.1177/1073858411403316. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3553600/
  269. Andrews-Hanna JR, Smallwood J, Spreng RN. The default network and self-generated thought: component processes, dynamic control, and clinical relevance. Ann N Y Acad Sci. 2014;1316(1):29–52. doi: 10.1111/nyas.12360. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4039623/
  270. Buckner RL, Andrews-Hanna JR, Schacter DL. The brain’s default network: Anatomy, function, and relevance to disease. Ann N Y Acad Sci. 2008 Mar;1124:1–38. doi: 10.1196/annals.1440.011. https://www.researchgate.net/publication/5451668_The_Brain’s_Default_Network
  271. Calster LV, D’Argembeau A, Salmon E, Peters F, Majerus S. Fluctuations of attentional networks and default mode network during the resting state reflect variations in cognitive states: Evidence from a novel resting-state experience sampling method. J Cogn Neurosci. 2017 Jan;29(1):95–113.
  272. Christoff K, Gordon AM, Smallwood J, Smith R, Schooler JW. Experience sampling during fMRI reveals default network and executive system contributions to mind wandering. Proc Natl Acad Sci USA. 2009 May;106(21):8719–8724. doi: 10.1073/pnas.0900234106. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2689035/  https://www.pnas.org/doi/10.1073/pnas.0900234106
  273. Fox MD, Snyder AZ, Vincent JL, Corbetta M, Van Essen DC, Raichle ME. The human brain is intrinsically organized into dynamic, anticorrelated functional networks. Proc Natl Acad Sci U S A. 2005 Jul 5;102(27):9673–9678. doi: 10.1073/pnas.0504136102. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1157105/
  274. Greicius MD, Krasnow B, Reiss AL, Menon V. Functional connectivity in the resting brain: A network analysis of the default mode hypothesis. Proc Natl Acad Sci U S A. 2003;100(1):253–258. doi: 10.1073/pnas.0135058100. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC140943/
  275. Greicius MD, Supekar K, Menon V, Dougherty RF. Resting-state functional connectivity reflects structural connectivity in the default mode network. Cereb Cortex. 2009;19(1):72–78. doi: 10.1093/cercor/bhn059. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2605172/
  276. Hagmann P, Cammoun L, Gigandet X, et al. Mapping the structural core of human cerebral cortex. PLoS Biol. 2008 Jul;6(7):e159. https://doi.org/10.1371/journal.pbio.0060159. http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0060159
  277. Leech R, Kamourieh S, Beckmann CF, Sharp DJ. Fractionating the default mode network: Distinct contributions of the ventral and dorsal posterior cingulate cortex to cognitive control. J Neurosci. 2011;31(9):3217–3224. doi: 10.1523/JNEUROSCI.5626-10.2011. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6623935/
  278. Mason MF, Norton MI, Van Horn JD, Wegner DM, Grafton ST, Macrae CN. Wandering minds: The default network and stimulus-independent thought. Science. 2007;315(5810):393–395. doi: 10.1126/science.1131295. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1821121/
  279. Menon V, Uddin LQ. Saliency, switching, attention and control: A network model of insula function. Brain Struct Funct. 2010;214(5–6):655–667. doi: 10.1007/s00429-010-0262-0. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2899886/
  280. Raichle ME, MacLeod AM, Snyder AZ, Powers WJ, Gusnard DA, Shulman GL. A default mode of brain function. Proc Natl Acad Sci U S A. 2001 Jan 16;98(2):676–682. doi: 10.1073/pnas.98.2.676. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC14647/
  281. Salomon R, Levy DR, Malach R. Deconstructing the default: Cortical subdivision of the default mode/intrinsic system during self-related processing. Hum Brain Mapp. 2014;35(4):1491–1502. doi: 10.1002/hbm.22268. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6869590/
  282. Sporns O. Structure and function of complex brain networks. Dialogues Clin Neurosci. 2013 Sep;15(3):247–262. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3811098/
  283. Sridharan D, Levitin DJ, Menon V. A critical role for the right fronto-insular cortex in switching between central-executive and default-mode networks. Proc Natl Acad Sci U S A. 2008;105(34):12569–12574. doi: 10.1073/pnas.0800005105. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2527952/
  284. Toro R, Fox PT, Paus T. Functional coactivation map of the human brain. Cereb Cortex. 2008;18(11):2553–2559. doi: 10.1093/cercor/bhn014. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2567424/
  285. Uddin LQ, Kelly AM, Biswal BB, Castellanos FX, Milham MP. Functional connectivity of default mode network components: Correlation, anticorrelation, and causality. Hum Brain Mapp. 2009;30(2):625–637. doi: 10.1002/hbm.20531. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3654104/
  286. Vanhaudenhuyse A, Demertzi A, Schabus M, et al. Two distinct neuronal networks mediate the awareness of environment and of self. J Cogn Neurosci. 2011 Mar;23(3):570–578. doi: 10.1162/jocn.2010.21488. https://www.researchgate.net/publication/44642456_Two_Distinct_Neuronal_Networks_Mediate_the_Awareness_of_Environment_and_of_Self/link/02faf4f86c485956a0000000/download
  287. Vanhaudenhuyse A, Noirhomme Q, Tshibanda L, et al. Default network connectivity reflects the level of consciousness in non-communicative brain-damaged patients. Brain. 2010;133:161–171. doi: 10.1093/brain/awp313. https://www.researchgate.net/publication/40774069_Default_network_connectivity_reflects_the_level_of_consciousness_in_non-communicative_brain-damaged_patients/link/02e7e521eff1173292000000/download
  288. Zhang M, Bernhardt BC, Wang X, et al. Perceptual coupling and decoupling of the default mode network during mind-wandering and reading. Elife. 2022 Mar;11:e74011. doi: 10.7554/eLife.74011. https://elifesciences.org/articles/74011
  289. Odegaard B, Knight RT, Lau H. Should a few null findings falsify prefrontal theories of conscious perception? J Neurosci. 2017 Oct 4;37(40):9593–9602. doi: 10.1523/JNEUROSCI.3217-16.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5628405/
  290. Frigato G. The neural correlates of access consciousness and phenomenal consciousness seem to coincide and would correspond to a memory center, an activation center and eight parallel convergence centers. Front Psychol. 2021;12:749610. doi: 10.3389/fpsyg.2021.749610. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8511498/?report=classic
  291. Deco G, Vidaurre D, Kringelbach ML. Revisiting the global workspace orchestrating the hierarchical organization of the human brain. Nat Hum Behav. 2021;5(4):497–511. doi: 10.1038/s41562-020-01003-6. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8060164/
