Theorem V: Consciousness is a Special Kind of Reentrant Signaling State
Consciousness is one of the most important functions of the mind. It can even be considered the supreme mental function because it makes us sentient in a special way and differentiates us from present-day artificial intelligence. Without it, no matter how immensely intelligent we were, we would be just another kind of present-day robot, existing in this world without the awareness and experience of what it is like to be ourselves, to see a red rose, to hear music, to feel a tender touch, to feel happiness, to think of something, and so on in our mind.
But "consciousness" is a term that has different meanings for different people. What does consciousness in the sense used in the preceding paragraph refer to? Let's examine consciousness in this sense, which is the sense that will be used in this theory, in detail as follows:
6.1. Definition of consciousness
If we inspect our mind while we are not doing anything in particular, we will find that there are many mental processes going on simultaneously as resting-state mental processes, such as thinking casually, experiencing a light mood, perceiving trivial sensations (inactive vision, soft sound, light touch, etc. from the uneventful surroundings), being aware of space and time, being aware of self, and being aware of these ongoing conscious mental processes. When there is a strong enough stimulus, either external (a flash of light, a loud call, a sharp pain, etc.) or internal (a popping-up thought of an urgent task, a sudden reliving of a past exciting incident, a pang of sorrow, etc.), there is a mental process or mental processes that register the stimulus, direct our attention to, and keep our concentration on, that stimulus, and our mind becomes less aware of the resting-state mental processes that are not related to the stimulus. For example, when there is exciting news on the television and we turn to watch it, we may not be aware of the sounds in the surroundings, not register the pain in our back, and not feel any emotion that we had before. So, normally, when we are conscious, there are many mental processes going on in a dynamic state – some become dominant for some time and some become marginal or even disappear for some time. But, whether we are doing nothing in particular or concentrating on something, there is one mental process that is always working when we are conscious. That mental process functions to be aware of and experience the functions of other mental processes (such as a visual perception, an auditory perception, a touch perception, an emotion perception, and a recollection of a past event) and of itself, with three important characteristics as stated in section 3.1. The three important characteristics are as follows:
#1. The awareness of the existence of the mental processes’ functions occurs (that is, the existence of the mental processes’ functions is registered into the information and the processing systems of the mind).
#2. The awareness and experiences of what the mental processes’ functions are like (e.g., what the vision of the house is like, what the sound of the song is like, what the odor of the rose is like, what the thinking of something is like, what the feeling of happiness is like, etc.) occur.
#3. The information of the mental processes’ functions becomes available to the mind’s various parts that include the cognition part, the symbolizing part, and the storage part (thus, the mind can think about, analyze, compare, and do other mental activities with those mental functions intentionally; can directly represent those mental functions with symbols – written signs, sounds, gestures, etc.; and can intentionally memorize and recall those mental functions, at least with some details and for some time).
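As an illustrative analogy only – not a claim about neural implementation, and with all names hypothetical – the three characteristics can be sketched as a toy data-flow model:

```python
# Toy model (hypothetical, purely illustrative): a mental process's function
# is registered (#1), carries what-it-is-like content alongside it (#2),
# and is made available to the mind's various parts (#3).

class ToyMind:
    def __init__(self):
        self.registered = []  # #1: the existence of functions is registered
        self.parts = {"cognition": [], "symbolizing": [], "storage": []}

    def become_conscious_of(self, function_name, what_it_is_like):
        # #1: register that the mental process's function exists
        self.registered.append(function_name)
        # #2: the what-it-is-like content accompanies the registration
        event = {"function": function_name, "phenomenal": what_it_is_like}
        # #3: broadcast the information to the mind's various parts
        for part in self.parts.values():
            part.append(event)
        return event

mind = ToyMind()
mind.become_conscious_of("vision_of_house", "red roof, white walls")
```

The point of the sketch is only the information flow: without step #2 the "what it is like" content would never be present, and without step #3 it could not be thought about, symbolized, or memorized.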
Thus, according to the definition in section 3.1, this mental process functions to be consciously aware of and consciously experience the functions of other mental processes and of itself. In this theory, this mental process will be called the consciousness mental process, and it can be defined as:
Definition. The consciousness mental process is the mental process that functions to be consciously aware of and consciously experience mental process functions.
Now, when the consciousness mental process functions to be consciously aware of and consciously experience something, a special kind of mental awareness and experience occurs – the mental awareness and experience that are associated with the three important characteristics stated above – which, according to the definitions in section 3.2, is called conscious awareness and conscious experience, respectively. Thus, in other words, the consciousness mental process functions to create conscious awareness and conscious experiences. But, in a normal condition, the consciousness mental process creates conscious awareness and conscious experiences of several mental processes' functions (such as those of a visual perception, an auditory perception, a touch perception, an emotion perception, and a recollection of a past event) simultaneously. So, in a normal condition, the consciousness mental process creates the composite of all conscious awareness and conscious experiences. If we define "consciousness" as what the consciousness mental process creates, then "consciousness" is the composite of all conscious awareness and conscious experiences. This is the definition of "consciousness" in this theory:
Definition. Consciousness is the composite of all conscious awareness and conscious experiences.
It is important not to lose sight of the fact that this in effect defines consciousness as the composite of all mental awareness and experiences that have the three important characteristics stated above. Also, as noted in section 3.2, conscious awareness is mental awareness of a quale, and a conscious experience is a mental experience of a quale; therefore, another equivalent definition of consciousness is: consciousness is the composite of all mental awareness and experiences of qualia. If we re-check these three definitions of consciousness, we can see that they indeed define the same entity.
Phenomenality and other definitions of consciousness
Characteristic #2 specifies that the awareness and experiences of what mental processes’ functions are like (e.g., what the vision of the house is like, what the sound of the song is like, what the odor of the rose is like, what the thinking of something is like, what the feeling of happiness is like, etc.) occur in consciousness, that is, the awareness and experiences of phenomenality of mental processes’ functions occur in consciousness. Moreover, because we can be consciously aware of and experience our consciousness*, consciousness itself is consciously experienceable. And, because we can be aware of and experience what consciousness is like in the mind, consciousness itself must manifest what it is like in the mind. Thus, consciousness itself has phenomenality.
(*That is, we can be aware of the existence of our consciousness, can be aware of and experience what our consciousness is like, and can share the awareness and experiences of our consciousness to the mind’s various parts that include the cognition part, the symbolizing part, and the storage part – we can intentionally think about our consciousness, can represent it with symbols, can intentionally memorize and recall it, etc.)
Consciousness in this theory, therefore, involves phenomenality in two aspects: a) the awareness and experiences of the phenomenality of mental processes' functions occur in consciousness, and b) consciousness itself has phenomenality. Sometimes, to emphasize this involvement of phenomenality, the term phenomenal consciousness [1-4] will be used. However, in this theory, the terms consciousness and phenomenal consciousness have identical meanings. The term consciousness will be used in general cases, and the term phenomenal consciousness will be used when the involvement of phenomenality is to be emphasized.
The above is the definition of consciousness in this theory, the definition that specifies phenomenality as an important aspect of consciousness. However, in the literature, "consciousness" is a term with varied meanings [1-3,5-12], many of which differ significantly from the meaning used in this theory. In some of the most frequent usages, the term consciousness refers to the following:
- a mental state of being awake and sentient (such as, he lost consciousness in the accident and regained consciousness after a while),
- a mental state of being aware of something (such as, he is conscious of the girl’s presence and her scent, or he is conscious of the pain in his body),
- a mental state of being aware of and experiencing something with the awareness and experiences of what that something is like occurring (such as consciousness is the awareness and experience of the vision, sound, emotion, and thought with the awareness and experiences of what the vision, sound, emotion, and thought are like occurring), and
- a command center or workspace mental process that integrates other mental processes and enables them to function together (such as consciousness is the center or workspace that functions to direct attention to some signal, keep concentration on that signal, amplify and integrate that signal, and make information of that signal available to other mental processes).
Neurologists and physicians in general usually use the terms conscious and consciousness in the first meaning. Laypeople usually use the terms in the second meaning. Philosophers, as well as some neuroscientists, usually discuss the terms in the third meaning, and the hard problem of consciousness [1-4,5,6,13,14] is the problem about consciousness in this meaning. The terms in the last meaning are used mostly by neuroscientists, and the Global Workspace theory [15-19], the Global Neuronal Workspace theory [20-25], and the extended theory of global workspace of consciousness are also about consciousness in this meaning.
There are other, less frequent usages of the word consciousness. For example, some use consciousness interchangeably with the mind, which is considered too broad a usage, and some use consciousness to mean self-consciousness [3,7,11,27], which is obviously too narrow a usage. Therefore, one should be careful about what the terms consciousness and conscious mean when one reads, writes, or discusses them.
In this theory, the meaning of consciousness is similar to the third meaning above, that is, consciousness is the composite of all conscious awareness and conscious experiences – the composite that has the awareness and experiences of phenomenality (of what things are like) and that itself has phenomenality (manifests what it is like in the mind). This theory deals with consciousness in this meaning because only consciousness in this meaning is phenomenal consciousness, which differentiates sentient beings from non-sentient beings and differentiates us from artificial intelligence. Without having the awareness and experiences of phenomenality (of what things are like) and without the consciousness itself having phenomenality (manifesting what it is like in the mind), even the most highly integrated and capable brain would function just like the electronic processes in present-day computers/robots. Other authors may hold that the term consciousness should also include some other integrative/command functions (directing attention, maintaining attention, making information widely available, distributing information, sensory binding, creating awareness of self, making decisions, etc. [15,27-29]). But, then again, these functions can be set up even in today's computers and robots – they do not differentiate us from current artificial intelligence (AI). Moreover, AI can even be better at these functions than we are, so these integrative/command functions are not the elements that make consciousness "phenomenal consciousness" as we experience it in our mind and do not make us different from computers, robots, or other AI. Also, opinions may vary on how many of these functions, and exactly which ones, are to be included in consciousness.
Therefore, as the term consciousness as defined above is the one that differentiates us from computers, robots, and other AI and suffices for the discussions in this theory, it will be used as the working definition in this theory. This definition can be considered the minimum definition of consciousness; a wider definition that includes other functions is possible but not needed in this theory. And, in this theory, when the mental processes that perform the integrative/command functions are to be referred to, they will be collectively called the integration-center mental process, not consciousness.
Other aspects of consciousness
It is to be noted that the consciousness mental process does not function to be consciously aware of and consciously experience all mental process functions. Some mental processes are not accessible to the consciousness mental process, and the consciousness mental process can neither be aware of nor experience their functions [30-37]. These mental processes are called sub-conscious or unconscious mental processes. Mental processes whose functions the consciousness mental process can be aware of and experience are called conscious mental processes (which are different from the consciousness mental process). And it is only the functions of these conscious mental processes that the consciousness mental process functions to be consciously aware of and consciously experience.
Regarding the operation of consciousness, it should be noted that, although, at any moment in a normal condition, consciousness consists of conscious awareness and experiences of several mental processes' functions simultaneously, the intensity and accuracy of awareness and experiences of these mental processes' functions are usually not all equal. Only one or a few mental processes' functions have the primary conscious awareness and experiences at a time; the others have only secondary conscious awareness and experiences. The former have intense and accurate awareness and experiences while the latter have less intense and less accurate awareness and experiences. For example, when we are watching a movie, our primary conscious awareness and experiences will be of the movie-related visual perception, auditory perception, and emotion, while our conscious awareness and experiences of other sensory perceptions, other emotions, time, etc. will become less intense and less accurate.
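This primary/secondary weighting can be caricatured in a toy sketch (names and intensity values are hypothetical, purely illustrative of the idea of unequal awareness intensities):

```python
def allocate_awareness(processes, primary, hi=1.0, lo=0.2):
    """Toy model: assign a high awareness intensity to the primary
    process(es) and a low intensity to all the others; the numbers
    are arbitrary placeholders, not measurements."""
    return {p: (hi if p in primary else lo) for p in processes}

# Watching a movie: movie-related perceptions dominate,
# while the back pain and ambient sound become marginal.
state = allocate_awareness(
    ["movie_vision", "movie_sound", "back_pain", "ambient_sound"],
    primary={"movie_vision", "movie_sound"},
)
```

The secondary processes are not switched off entirely; they merely receive a lower weight, matching the observation that background awareness persists but becomes less intense and less accurate.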
Regarding the content of consciousness, it should be noted that the awareness that occurs in consciousness can vary in content: from basic awareness, such as awareness of sensory perception, to complex awareness, such as awareness of one's own mental activities, awareness of one's own self, and awareness of being aware. Some authors call this "difference in content" of conscious awareness "different levels" of consciousness [38-40], but it must be understood that this difference in levels is a difference in levels of awareness content, not a difference in levels of alertness of consciousness (which will be discussed in "conscious alertness" in the next section). Several authors have specific terms for consciousness with different contents of awareness. The most basic consciousness, which has only the awareness of sensory perception, is usually called "primary consciousness" [39,41-43], "sensory consciousness", "perceptual consciousness" [39,40], or "anoetic consciousness". The more complex consciousness is called "higher-order consciousness" [39,41]. In addition to awareness of sensory perception, the higher-order consciousness that has awareness of executive control, of decision making, of voluntary action, of one's own thoughts, and of one's own awareness is called "reflective consciousness", and the higher-order consciousness that also has awareness of one's own identity over time is called "autonoetic consciousness" [38,44]. To be noted is that the definitions of these terms do not specify that phenomenality must be part of consciousness. So, as discussed in the previous paragraph, even with current technology, we can make computers or robots that have consciousness with these kinds of awareness content. By contrast, this theory is about "phenomenal consciousness", which requires that phenomenality be part of consciousness but does not specify what its content must be.
The consciousness's content can be just simple awareness of sensory perception or complex awareness of executive control, of decision making, of voluntary action, etc., but as long as phenomenality occurs in the consciousness, it is consciousness in this theory.
6.2. Consciousness neural process
As the consciousness mental process is a mental process, it cannot occur alone by itself but must occur with some neural process (Theorem I). In this theory, this neural process will be called the consciousness neural process. From the properties of consciousness, it can be predicted that some important characteristics that the consciousness neural process and its circuit must have are as follows:
- It must be an extensive network that connects with all neural processes that have qualia so that conscious awareness and experiences of all the qualia are possible. These qualia neural processes are important neural processes: all external sensory perception neural processes, some internal sensory perception neural processes (mostly involved in the control of balance and volitional movement, and the perception of damage or dysfunction of internal organs), all the highest cognitive and executive neural processes (thinking, reasoning, planning, deciding, language processing, etc.), emotion neural processes, volitional movement neural processes, etc. But it is not connected, or not connected in a way that allows conscious awareness and experiences to occur, to the lower neural processes that perform more basic functions, such as those in the basal ganglia, brainstem, cerebellum, and autonomic nervous system, so there are no conscious awareness and experiences of the functions of these processes.
- It must be able to read and/or synchronize with the neural processes that have qualia so that it can get information from them to process and then create the conscious awareness and experiences for these neural processes.
- It must have its signals fed back to itself so that it can be aware of the function of itself, that is, its signaling state must be a reentrant signaling state.
- Its signaling patterns must be special signaling patterns (SSPs) because the conscious awareness and conscious experiences, which form the consciousness, themselves are qualia (i.e., conscious awareness and conscious experiences themselves have phenomenality) and because all qualia are SSPs (Theorem IV).
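The reentrant requirement in particular can be illustrated with a toy loop (all names hypothetical; this is an analogy for feedback of a process's own output, not a model of neural circuitry):

```python
def reentrant_step(state, external_input):
    """Toy reentrant update: the next state depends on the external
    input AND on the process's own previous output, which is fed
    back (re-entered) as part of the new state."""
    return {"input": external_input, "self_report": dict(state)}

state = {}
state = reentrant_step(state, "red rose")  # first pass: registers the rose
state = reentrant_step(state, "red rose")  # second pass: also carries its own prior state
```

After the second pass, the state contains a record of the process's own earlier signaling, which is the minimal sense in which a reentrant state can "be aware of the function of itself."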
The question of "by what neural circuits and by what neural mechanisms can the consciousness neural process create conscious awareness and conscious experiences, or what are the neural correlates of consciousness – the minimal neuronal mechanisms that are jointly sufficient for any one specific conscious percept [45-59]?" is being studied by many neuroscientists, and, although conclusive answers are not settled yet, some models are strongly supported by experimental evidence and are widely accepted. Regarding the neural circuits of consciousness, as far as one can conclude from the evidence at present, it seems very likely that the Default Mode Network or Resting State Network [60-66], which includes the medial prefrontal cortex, posterior cingulate cortex, precuneus, hippocampal formation, parahippocampal cortex, retrosplenial cortex, posterior inferior parietal cortex, temporoparietal junction, and lateral temporal cortex, with the anterior medial prefrontal cortex and posterior cingulate cortex as the hubs, is the neural network for background consciousness – when the mind is in the state of passive internal mentation, not performing cognitively demanding tasks, and not concentrating on outside stimuli. This network overlaps with the network that functions for access consciousness (consciousness of episodic stimuli that gain access into consciousness, as opposed to background consciousness, which deals with continuous background stimuli). At present, evidence shows that the network of the Global Workspace theory proposed by Baars [15-19] or the network of the Global Neuronal Workspace hypothesis of Dehaene [20-25], which includes the cortico-thalamic (C-T) core and a network of neurons with long-range axons densely distributed in the prefrontal, fronto-parietal, parieto-temporal, and cingulate cortices, is the network of access consciousness.
Other studies also point to similar areas, i.e., the superior parietal and dorsolateral prefrontal cortices [47,58], while others stress the posterior "hot zone", including the temporal, parietal, and occipital cortices, as playing a direct role in the occurrence of consciousness [48,54]. So, the complete neural network of consciousness, which functions to be conscious of both continuous background events and episodic accessing events, is likely to be some combination of these networks, such as the neural network in the extended theory of global workspace of consciousness proposed by Song.
It should be noted that the functions of the two kinds of networks are not mutually exclusive, that is, the two kinds of networks do not work alternately in the way that one is shut off completely while the other is working. For example, when the Global Workspace or the Global Neuronal Workspace is functioning, such as when attending to a visual stimulus on a computer screen, the Resting State Network or Default Mode Network is still functioning, as is evident from the fact that the subject is still aware of the background vision, the ambient sound, one's own self, etc., although the awareness of these background stimuli becomes less intense and less accurate. It is not that, while the mind is attending to something, it is not aware of anything else at all and everything else blacks out.
Conscious alertness and awareness
Functionally, it has been found that the network of the consciousness neural process requires another neural network – the ascending reticular activating system (ARAS) – for its alertness, or the level of its function. The ARAS is the neural network in the core of the brain extending from the rostral pons to both thalami and the basal forebrain, and from there it sends extensive axonal projections to the cerebral cortex, including the consciousness neural network [7,67-69]. There can be no consciousness if there is no stimulation from the ARAS, even if the consciousness neural network is intact and able to function, such as in cases of extensive brainstem hemorrhage, infarct, and trauma. And the level of conscious alertness depends on the activation from the ARAS – the more activation, the higher the level of alertness. This results in various states of conscious alertness, such as (from lower to higher level of alertness) coma, stupor, drowsiness, normal alertness, and heightened alertness.
Conversely, if the ARAS is working normally but the consciousness neural process is not, there can be alertness without awareness, or with only minimal or partial awareness. The degree of impairment of awareness depends on how extensive and severe the dysfunction of the consciousness neural process is. For example, in cases of damage to the consciousness neural process from diffuse cerebral hypoxia, extensive bilateral cerebral infarcts, or diffuse cerebral cortical injury, the results can be various states of abnormal conscious awareness that range in severity, such as (from mild to severe) acute confusional state, akinetic mutism, minimally conscious state (MCS), and vegetative state (VS) [70-77]. In the latter two categories, which are severe conditions, the patients can open their eyes and have some reflex responses, such as blinking, chewing, and yawning, but show no (in VS) or minimal (in MCS) signs of conscious awareness of self and the environment (by clinical testing or by special investigations such as EEG, evoked potentials, and fMRI) [70,71,74,77-81].
Both kinds of variable abnormality in consciousness show that consciousness is not an all-or-none phenomenon but a graded one, depending on both the strength of stimulation from the ARAS and the quality and quantity of functioning of the consciousness neural process.
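This two-factor, graded dependence can be caricatured in a toy function. The thresholds and the product rule here are entirely hypothetical and carry no clinical meaning; the sketch only illustrates that the outcome varies continuously with both factors:

```python
def toy_conscious_state(aras_activation, network_integrity):
    """Toy model: conscious state as a graded function of ARAS drive
    and consciousness-network integrity (both scaled to [0, 1]).
    The product rule and cutoffs are illustrative, not clinical."""
    if aras_activation == 0:
        return "coma"  # no ARAS drive: no consciousness at all
    level = aras_activation * network_integrity
    if level < 0.25:
        return "severely impaired awareness"
    if level < 0.75:
        return "reduced awareness"
    return "normal alertness"

toy_conscious_state(0.0, 1.0)  # intact network but no ARAS stimulation
toy_conscious_state(1.0, 0.1)  # normal ARAS but damaged network
```

Either factor alone can abolish or degrade consciousness, mirroring the two clinical patterns described above.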
6.3. Theorem V
As consciousness is the composite of all conscious awareness and conscious experiences and as both conscious awareness and conscious experiences are some kinds of reentrant signaling state (or reentrant signaling states in some specific forms, see section 5.2), consciousness is the composite of all reentrant signaling states of some kinds. But how can some kinds of signaling state be conscious awareness and conscious experiences, and thus consciousness? To answer this question, let’s consider various signals in Figure 6.1.
Figure 6.1 Signals representing various information
In Figure 6.1A, the signal’s information is “House”. When this signal is read, those who can read this signal will get the information of merely “House”, without anything else, such as what the house looks like. Likewise, in Figure 6.1B, the information of the signaling pattern of a certain neural process, such as some subconscious neural process, is “House”. It may have some other subconscious physical information, such as the values of the size, the shape, and the color of the house, in this signaling pattern, but if it does not have the information of what the house looks like, other neural processes, including the consciousness neural process, that read the signal from this signaling pattern will get information of only “House” and some other physical information but no information of what the house looks like. So, conscious awareness and a conscious experience of what the house looks like cannot and do not occur.
In Figure 6.1C, the signal’s information is “House and what the house looks like”. When this signal is read, those who read the signal will get the information of “House and what the house looks like”; they will not get the information of “House” alone. Likewise, in Figure 6.1D, the information of the signaling pattern of a neural process is “House and what the house looks like”. Other neural processes that read this signaling pattern will get the information of “House and what the house looks like”. For the consciousness neural process, because of the available information and because of its specialized ability, the signaling state in Figure 6.1E, which is the conscious awareness and experience of the house and of what the house looks like, can be generated. This is how the house can appear in our mind with all its phenomenality (Figure 6.1D), and this is how awareness and experience of the phenomenality of the house occur in our mind (Figure 6.1E). This is similarly true for all qualia and all conscious awareness and experiences of qualia.
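The contrast between the two kinds of signal can be mimicked in a toy reader (the dictionary structure is hypothetical, used only as an analogy): a reader can extract only the information that a signal actually carries, no more:

```python
# Toy signals: one carries only the label "House" (as in Figures 6.1A-B);
# the other also carries what-it-is-like information (as in Figures 6.1C-D).
signal_label_only = {"label": "House"}
signal_with_phenomenal = {
    "label": "House",
    "what_it_looks_like": "two-story, red roof",
}

def read_signal(signal):
    """A reader gets exactly the fields the signal encodes, no more."""
    return dict(signal)

read_signal(signal_label_only)       # "House" only: no phenomenal information
read_signal(signal_with_phenomenal)  # "House" plus what the house looks like
```

No reading process, however capable, can recover what the house looks like from the first signal, because that information is simply not encoded in it; this is the sense in which conscious experience of phenomenality requires the phenomenal information to be present in the signaling pattern.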
Now, the question is how a signaling pattern that has the information of what the house looks like (the signaling pattern in Figure 6.1D) and a signaling state that has the information of the conscious awareness and experience of what the house looks like (the signaling state in Figure 6.1E) can be created. The answer is: because it is evident that they can be created, as can be seen in billions of humans everywhere, now and in the past, it must be basically possible for some signaling patterns and signaling states to be so, and for some neural processes – the visual perception neural process and the consciousness neural process – to be able to create such signaling patterns and signaling states, respectively. It is a brute fact.
If we consider the matter in general, it is not very surprising or improbable that these things happen. There have been innumerable kinds of neural process with myriad kinds of signaling pattern and signaling state for various neural functions in various parts of the nervous system since the nervous system appeared in this world hundreds of millions of years ago: to maintain basic, vital functions, to keep various blood constituents within normal ranges, to process various kinds of sensory signal, to perform diverse kinds of cognitive function, to control different types of effector, and so forth. All of these need evolved, specialized neural processes with specific signaling patterns and specific signaling states of their own. The quale neural processes are just neural processes with their own specific function – to create signaling patterns that are qualia – and the consciousness neural process is just one neural process with its own specific function – to create signaling states that are conscious awareness and conscious experiences of qualia. It is just the fundamental nature of this universe that these are possible: some signaling patterns can be qualia, and some signaling states can be conscious awareness and conscious experiences of qualia; some neural processes can evolve to create such signaling patterns, and some neural process – the consciousness neural process – can evolve to create such signaling states.
Now, reentrant signaling states also occur in other neural processes [82-86], but, unlike in the consciousness neural process, there are no conscious awareness and conscious experiences occurring in them. This means that the reentrant signaling states in the consciousness neural process must be in some special forms that are conscious awareness and conscious experiences. For conciseness, this theory will assign the term “special reentrant signaling state” to this specific kind of reentrant signaling state.
Definition. A special reentrant signaling state is a reentrant signaling state that is conscious awareness and a conscious experience.
As stated at the beginning of this section, consciousness is the composite of all reentrant signaling states that are conscious awareness and conscious experiences. But special reentrant signaling states are reentrant signaling states that are conscious awareness and conscious experiences. Therefore, the theorem for consciousness can be stated as:
Theorem V: Consciousness is the composite of all special reentrant signaling states.
Basically, however, the composite of all special reentrant signaling states is just a special kind of reentrant signaling state. Therefore, the basic form of Theorem V can be stated as:
Theorem V: Consciousness is a special kind of reentrant signaling state.
Information-wise, because any signaling state of a neural process is information existing in that neural process, consciousness is basically just a special kind of information existing in the consciousness neural process. This information describes conscious awareness and experiences of some things (which include the awareness and experiences of what those things are like) and of the conscious awareness and experiences themselves. Therefore, when the consciousness neural process reads this information by re-entering it into its process, conscious awareness and experiences of those things (including the awareness and experiences of what those things are like) and of the conscious awareness and experiences themselves naturally and inevitably occur in it.
Now one may ask whether a special reentrant signaling state has the properties or features that some neuroscientists hold that consciousness must have, such as widespread brain effects, informative conscious contents, rapid adaptivity and fleetingness, or being dynamic, unitary and integrated, enormously diverse and differentiated, temporally ordered, serial, and changeable [87-88]. The answer is yes. Because any signaling state is the information of that neural process, and because the information of a neural process is the physical entity that determines the neural process's functions, any special signaling state is the entity that is responsible for whatever functions the consciousness neural process has. Now, because the properties and features of any neural process, including the consciousness neural process, come from its functions, the special signaling states have the properties and features of the consciousness neural process, such as those described above.
6.4. Effects of consciousness
Like qualia, which are proved in section 5.4 to have physical effects, consciousness has physical effects. This can be proved as follows.
(IC-A = integration center A, IC-B = integration center B)
(NP = neural processes, * = different signals, different effects)
Figure 6.2 Effects of integration centers without and with consciousness
Consider two integration-center mental processes in Figure 6.2. The integration-center mental process A (IC-A) functions to receive information from various mental processes, analyze the information, distribute appropriate information to appropriate mental processes, enhance and maintain information input from a certain mental process while suppressing information input from unrelated mental processes, etc. It may have unconscious awareness of these functions too, but it does not have conscious awareness and conscious experiences of these functions; that is, no awareness and experiences of what these functions are like occur in its operations – it does not have phenomenal consciousness – like the cerebellum or present-day computers.
On the other hand, the integration-center mental process B (IC-B) performs the same functions as the integration-center mental process A (IC-A) does, but in addition, it has conscious awareness and conscious experiences of these functions, that is, there are awareness and experiences of what these functions are like occurring – it has phenomenal consciousness – in its operations. Thus, IC-B has information about conscious awareness and conscious experiences in its process, while IC-A does not have this information. As IC-A and IC-B have different information, they require different signaling patterns and different signaling states in their neural processes to represent different information. Consequently, the physical characteristics of the two neural processes and their physical effects on other neural processes must be different, at least because of the differences in different signaling patterns and different signaling states. Thus, it can be concluded that consciousness has physical effects.
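The core of this argument – that carrying extra information necessarily entails a different signaling state, and hence different physical effects – can be sketched as a toy illustration. This is purely illustrative; the function names and state representations are hypothetical and are not a model of real neural circuits.

```python
# Toy illustration (hypothetical): two integration centers perform the same
# function, but IC-B additionally carries information representing conscious
# awareness of that function, so its state (and thus its physical signaling)
# must differ from IC-A's.

def integrate(inputs):
    # Shared function: combine inputs into one integrated signal.
    return sum(inputs) / len(inputs)

def ic_a_state(inputs):
    # IC-A's signaling state encodes only the integration result.
    return {"integrated": integrate(inputs)}

def ic_b_state(inputs):
    # IC-B's state encodes the same result PLUS information about being
    # aware of the integration itself.
    state = {"integrated": integrate(inputs)}
    state["aware_of"] = ("integrated", state["integrated"])
    return state

inputs = [0.2, 0.8, 0.5]
a, b = ic_a_state(inputs), ic_b_state(inputs)
assert a["integrated"] == b["integrated"]  # same function performed
assert a != b  # but different information, hence different signaling states
```

Because `a != b`, any downstream process reading these states receives physically different signals, which is the sense in which consciousness is argued to have physical effects.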
So, all other functions being the same, the integration center with consciousness will have different effects from the one without consciousness. At present, there is not enough evidence to state conclusively what these different effects are. But evolving anything (including the integration center) by adding a new function (such as consciousness in this case) must cost a being some resources (material to build the circuit, energy to maintain the process, time to handle additional signals, etc.); if the new function does not yield enough useful effects to make its overall effects beneficial to the being, the new function will result in a disadvantage for that being. The same is true for its species as a whole: the species will likely lose out because of this disadvantage and become extinct in competition with species that are similar but do not have the new function, which in this case is consciousness. But this kind of extinction has not happened to human beings, who definitely possess consciousness, or to many high-level animal species that probably have consciousness [8,41,70,89-98]. On the contrary, these species seem to be thriving and dominating in the evolutionary process. Therefore, it can be deduced that consciousness must have overall effects that are beneficial to the beings that possess it. Similar to the effects of qualia on perception, it is probable that consciousness augments the functions of the integration-center mental process, making them more effective [15,28,29,99].
6.4.1. But why does it have to be consciousness to gain these augmenting effects?
This question is similar to the one regarding qualia, which has been answered in section 5.4.1. The answer in the case of consciousness will be very similar. It will be repeated here again with some modifications and in an abridged form. (If already content with the answer in section 5.4.1, the reader may just as well skip the next three paragraphs to the next question.)
Before the augmentation of the nervous system functions by creating consciousness emerged, the nervous system had been augmenting its functions with other means all along since it appeared on this planet: by developing a more and more complex and more and more capable nervous system. Even after consciousness had emerged, other parts of the cerebral cortex that functioned without consciousness occurring in their processes did not stop evolving: they have been continuing to evolve to be more complex and more capable [100,101], resulting in higher intelligence, more language facilities, more manual skill abilities, etc. (these abilities do not need consciousness to function, as can be seen from the fact that they exist in computers, robots, and other automated machines, and these abilities are anatomically and functionally separate from consciousness, as is evident from the fact that they can be interfered with or destroyed without affecting consciousness). Even the cerebellum and other parts of the brain have been shown to continue to co-evolve with the cerebral cortex [102,103] (which is the seat of consciousness) all the time, even though they do not have consciousness. Thus, the nervous system has always been augmenting its functions with various means, notwithstanding the absence or presence of consciousness.
Creation of consciousness is just one possible path in the nervous system evolution. Consciousness emerged in this world simply because, at a certain stage in the nervous system evolution, it was possible – the neural processes were advanced enough to evolve the new kind of neural processes that was capable of creating consciousness. After consciousness began to appear in this world, it has been selected to persist up to the present time possibly because it yields some advantages to the animals that possess it. That is why today it still exists in humans and probably in some other animals too [8,41,70,89-98]. And consciousness emerged because of no other cause than evolution; that is why it took more than two billion years after life had appeared on earth and hundreds of millions of years after the nervous system had emerged before sufficiently-evolved neural processes that were able to create consciousness could come into existence.
It remains to be seen whether creating consciousness is the end of this evolutionary path or whether neural processes with consciousness can evolve further into new kinds of neural process with new capabilities that are even more phenomenal than consciousness. To augment its functions, the nervous system has been evolving many kinds of neural process, signaling pattern, and signaling state all along. The generation of consciousness is thus not the necessary path, the most effective path, or the final path, but just one possible path, in augmenting the nervous system functions.
6.4.2. Then, why not augment all mental processes with consciousness, why augment only some of them?
This question is similar to the question regarding qualia in section 5.4.2; the answer in the case of consciousness will also be similar.
The cerebral cortex has evolved to be a broadly adaptable system that does not depend on fixed hard wiring of neural circuits alone but is also able to make use of information gathering and information processing: gathering encountered information, extracting the essence of each piece of information, storing it, retrieving it, and dealing with it cognitively (i.e., by processes of forming possible scenarios, retrieving experiences in the past, comparing available choices, using rules that have been learned, etc.). But this system requires an integration center that can coordinate signal communications between several cortical areas: those involved in the final-stage perceptions of all external sensations and some internal sensations, cognitive perceptions (of space, time, self, non-self, etc.), other cognitive and executive functions (thinking, recalling past events, planning, deciding, language processing, etc.), emotion, and volitional motor control. It can certainly operate without consciousness, as is evident from the fact that the computer and the cerebellum can also do this kind of integration without consciousness, but as shown at the beginning of this section, the integration center without consciousness has different effects from the one with consciousness, with the latter having the potential to add something beneficial to the system. Thus, creating consciousness in such a vital system, with conscious awareness and experiences of all important mental processes, can yield critical advantages to the organism. It can enhance perception, cognition, emotion, memory functions, volitional motor functions, etc.
It is likely that, because perceiving situations in the outside world correctly, analyzing them and forming responses to them effectively, and executing purposive motor movements appropriately are the most crucial mental functions of the being, it is beneficial and advantageous to augment the mental processes that perform these crucial functions with a novel function: consciousness. This is why consciousness evolved to occur in these mental processes, and this is why some mental processes have consciousness while some do not – consciousness in some mental processes is beneficial and can yield advantages, while consciousness in other mental processes is unnecessary and may create disadvantages.
6.5. The hard problem of consciousness
The hard problem of consciousness [1-6,13,14] is the problem of why consciousness has phenomenality in its functions:
- why consciousness has awareness and experiences of what things are like (of things’ phenomenality) – why consciousness does not have just awareness and experiences of things without awareness and experiences of what things are like (of things’ phenomenality), and
- why consciousness itself has phenomenality (manifests what it is like in the mind) – why it is not the case that consciousness itself does not have phenomenality (does not manifest what it is like in the mind).
Another important problem that is closely related to the hard problem is the explanatory gap: how immaterial phenomenal consciousness can arise from the material neural process. These questions are not expected to be answered by scientific experiments alone; some novel basic concepts are needed to answer them. On the contrary, other aspects of consciousness, such as what kind of neural circuits can function to create phenomenal consciousness and what kind of signaling among those neural circuits can create phenomenal consciousness, are “not hard” in the sense that they can theoretically be found out eventually by scientific investigations. Without having phenomenality in its functions, the consciousness neural process is just a complex neural process that is not functionally different from electronic processes in complex integrated circuits of computers or robots nowadays … or even from complex neural processes in other parts (the cerebellum, brainstem, basal ganglia, etc.) of the same brain, and there would not be the hard problem of why there is phenomenality in consciousness and the explanatory gap of how non-material phenomenal consciousness can arise from the material neural process.
6.5.1. What is consciousness?
Consciousness is a special kind of reentrant signaling state. This kind of signaling state is a special kind of neural process information – the kind of information that, in the neural language, means conscious awareness and experiences of some things (which include the awareness and experiences of what those things are like) and of the conscious awareness and experiences itself. Therefore, when the consciousness neural process reads this information by the process of reentering this information in its process, conscious awareness and experiences of those things (which include the awareness and experiences of what those things are like) and of the conscious awareness and experiences itself naturally and inevitably occur in it. This is how consciousness occurs in the mind.
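The reentrant reading described above – a process that receives its own signaling state back as part of its next input, so that its state comes to contain information about the process itself – can be sketched as a toy loop. This is a hypothetical illustration of the reentry idea only, not a claim about actual neural implementation.

```python
# Toy sketch (hypothetical) of a reentrant signaling state: the process's
# output state is fed back as part of its own next input, so on every cycle
# the process "reads" the information of its own previous state.

def process(external_input, reentrant_state):
    # The new state encodes the current content AND the previous state of
    # the process itself (awareness that includes awareness of itself).
    return {"content": external_input, "self": reentrant_state}

state = None  # no prior state before the loop starts
for stimulus in ["red", "red", "sound"]:
    state = process(stimulus, state)

# After three cycles, the state contains nested information about itself:
assert state["content"] == "sound"
assert state["self"]["content"] == "red"
assert state["self"]["self"]["content"] == "red"
assert state["self"]["self"]["self"] is None
```

The nesting of `"self"` keys is the toy analogue of the claim that the signaling state describes not only the things attended to but also the awareness of them itself.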
It is very important to emphasize that this special kind of signaling state itself is consciousness – nothing else is created to be consciousness. Because a signaling state is inherent in its neural process, consciousness is inherent in the consciousness neural process. This explains the hard problem of consciousness and the explanatory gap of how non-material consciousness arises from the material consciousness neural process. Strictly considering, consciousness does not arise out of anything in the consciousness neural process, consciousness is inherent and always there in the consciousness neural processes – naturally existing in the consciousness neural process whenever and wherever there is the consciousness neural process – it is the information part or the information aspect of the consciousness neural processes.
6.5.2. Why does consciousness exist?
Consciousness exists in this universe because it is fundamentally possible to have consciousness (conscious awareness and experiences) in this universe. And it exists in the nervous system – in the composite of all special reentrant signaling states – because, in this universe, it is fundamental in the nature of neural signaling states that some of them are consciousness (conscious awareness and experiences): some kinds of neural signaling state in this universe mean, in the neural language, consciousness (conscious awareness and experiences). It is a brute fact.
6.5.3. Why does consciousness occur?
At first, consciousness occurred because, at some evolutionary stage about 520 – 315 million years ago (see the note at the end of section 5.5.3), it was possible for the nervous system, which was advanced enough then, to evolve a new kind of neural process that was capable of creating special reentrant signaling states, which were conscious awareness and experiences. Thus, consciousness occurred simply because it was evolutionarily possible for it to occur.
Consciousness still occurs today because the kind of neural process that is capable of producing conscious awareness and experiences is still functioning. That kind of neural process is the consciousness neural process. Evidently, the fact that the consciousness neural process still exists today means that it has been preserved in the evolutionary process since it appeared in this world. This, in turn, indicates that the consciousness that the consciousness neural process produces must yield some effects that increase the chance of survival of the species that possess it. Thus, consciousness still occurs today because it must somehow help increase the chance of survival of the species that possess it.
N.B. 6.5.2 and 6.5.3 answer the hard problem of consciousness of why consciousness in the form of phenomenal consciousness exists in this universe. Summarily: phenomenal consciousness exists in this universe because it is fundamental in the nature of this universe that phenomenal consciousness exists in this universe and that phenomenal consciousness exists in some kinds of neural signaling state, and phenomenal consciousness still exists and occurs today because it very probably helps increase the survival chances of the species that have it.
6.5.4. How does consciousness occur?
As discussed in 6.5.1, consciousness occurs by the functioning of a specific neural process (the consciousness neural process). Specifically, consciousness occurs by this neural process functioning in a specific signaling state: the composite of all special reentrant signaling states. This signaling state is the consciousness – no additional entity occurs to be the consciousness. Again, this answers the hard problem of consciousness and the explanatory gap of how non-material consciousness arises from the material consciousness neural process, as discussed in 6.5.1.
6.5.5. When does consciousness occur?
As discussed in 6.5.3, consciousness occurred in this world around 520 – 315 million years ago when the nervous system was advanced enough to evolve the neural process that was able to produce the composite of all special reentrant signaling states. This neural process is the consciousness neural process. After that, consciousness occurs whenever the consciousness neural process produces this composite. It should also be noted that it is impossible for consciousness to not occur when the consciousness neural circuit circulates its signals in the form of special reentrant signaling state because the special reentrant signaling state itself is consciousness.
6.5.6. Why can there not be just nervous systems without consciousness?
There can be just nervous systems without consciousness, and this had been the case for a long time – hundreds of millions of years – from the time the nervous system appeared on this planet to the time when evolving neural processes became advanced enough to produce conscious awareness and experiences. When that time came, consciousness emerged simply because it was possible. It is only after that time that there are nervous systems with consciousness. But it should be noted that, even nowadays, there still are nervous systems without consciousness (as we know it), such as the nervous systems of animals that do not have a well-formed cerebrum (e.g., cnidarians, ctenophores, and flatworms).
6.5.7. Why can the consciousness neural process not produce conscious awareness and experiences but still preserve the same particular effects?
Because it cannot do that – those particular effects result from the conscious awareness and experiences themselves. When the consciousness neural process produces the conscious awareness and experience of something, it creates a certain reentrant signaling state – a special reentrant signaling state – which is the conscious awareness and experience of that thing. This specific reentrant signaling state (the special reentrant signaling state) is the very neural activity that yields those particular effects to other neural processes (Figure 6.3A).
(SP = signaling pattern, A = SP that is a quale, B = SP that is not a quale)
(UA = unconscious awareness, CA = conscious awareness, SRSS = special reentrant signaling state)
Figure 6.3. Consciousness with and without a special reentrant signaling state
Therefore, the consciousness neural process cannot have those particular effects without creating this specific signaling state, that is, without producing the conscious awareness and experience (Figure 6.3B). Even if it produces the unconscious awareness and unconscious experience of the same thing (i.e., the awareness and experience of that thing without the awareness and experience of what that thing is like occurring), the signaling state will be different, and its effects will accordingly be different.
In conclusion, consciousness as it is, i.e., phenomenal consciousness, occurs in the nervous system because, in this form, it has particular effects on the nervous system and thus has particular functions in the nervous system. If it were not in this form (i.e., not phenomenal consciousness), these particular effects and functions would not occur.
6.6. Consciousness in other beings, other entities, and other mental states
6.6.1. Do animals have consciousness?
Because similar neural processes have similar signaling states, any animals that have similar consciousness neural processes to ours have similar signaling states of the consciousness neural processes to ours. And, because consciousness is just a kind of signaling state, these animals should have similar consciousness to ours too. Animals that have fairly similar brain structures and neural processes to ours, such as mammals (chimpanzees, elephants, dolphins, dogs, etc.), are likely to have fairly similar consciousness to ours. Animals that have less similar brain structures and neural processes to ours, such as reptiles or amphibians (crocodiles, frogs, etc.), are likely to have less similar consciousness to ours. Animals that have very different brain structures and neural processes from ours, such as arthropods (ants, bees, flies, etc.) or cephalopods (octopus, squid, nautilus, etc.), may have different kinds of consciousness from ours or may not have consciousness at all. This can be definitely answered if the kind of neural circuits and signaling states that create consciousness in humans and the probable consciousness neural circuits and their signaling states in other animals can be identified and compared.
6.6.2. Do computers and robots have consciousness?
Computers and robots have parts that function similarly to parts of the nervous systems of animals – sensors (mouse, keyboard, camera, microphone, etc.), central processor and controller (computer motherboard, robot mainboard, etc.), and effectors (display monitor, speaker, robotic arm, etc.) – and they use electrical signals in processing and communicating information between these parts, and thus have electrical signals representing various kinds of information circulating in their circuits. It is therefore logical to speculate that there might be qualia-like phenomena (like the red color or the sound of note C in our mind) occurring in their systems, because some aspects of their signaling might be qualia-like phenomena, just as some of the human neural processes’ signaling patterns (the special signaling patterns) are qualia. At present, it cannot be answered definitely whether there are qualia-like phenomena occurring in their signaling processes because we still do not know the specific characteristics of signaling patterns that can be qualia and qualia-like phenomena.
However, if qualia-like phenomena do occur in the computers and robots, they definitely have no physical effects on any of the computers’ and robots’ circuits because, evidently, all the circuits always operate in the expected way according to the program that is running. It has never been found that the circuits in the computers and robots perform any operations that have not been programmed to occur or that their operations are affected by some unplanned, unknown phenomena. And it is definite that present-day computers and robots, operating with the present-day programs, do not and cannot have conscious awareness and experiences of qualia-like phenomena that may occur in them. This is because conscious awareness and experiences of anything, including qualia-like phenomena, are additional functions and involve additional information. So, computers and robots need specific, dedicated programs and even some new specific, dedicated circuits to perform these specific functions and manage the new information that occurs in the processes. But there are no such programs and circuits in the present-day computers and robots – all the programs and circuits in the present-day computers and robots are created to perform certain other predetermined, specific tasks, such as word-processor or spreadsheet operations, picture- or video-related operations, and the control of their peripheral devices (e.g., disc drives, mechanical motors, robotic arms, etc.). None are built to be aware of and experience qualia-like phenomena that may occur in their circuits, and none are built to be aware of and experience something with the awareness and experiences of what that thing is like occurring. (At present, we do not know how to build such programs and circuits yet.)
Therefore, it is definite that present-day computers and robots, with their current programs and circuits, do not and cannot have conscious awareness and experiences of anything (i.e., no awareness and experiences of what things are like); therefore, they do not and cannot have phenomenal consciousness. Similarly, this conclusion is true for all other present-day electronic devices/machines and all current devices/machines that process information in other ways: a thermometer, an analog odometer, a mobile phone, an industrial robot, an automated industrial machine, etc. – these present-day devices/machines, with their existing programs and circuits, do not and cannot have phenomenal consciousness – again, this is simply because they do not have programs and circuits for it to occur.
N.B. At this point, some may argue that consciousness is a non-material phenomenon and may occur in the computer by something that is not the function of the computer’s electronic circuits or may occur out of nothing by itself. However, if that is the case, then this kind of consciousness will not and cannot have physical effects on the computer; or, if it does have physical effects on the computer, the effects will be non-specific and unstructured, like the non-specific, unstructured effects from heat, electrical short-circuiting, or mechanical force. This is because specific, structured physical effects in the computer occur via only the functions of its electronic circuits. If this kind of consciousness does not have physical effects, or if its physical effects are non-specific and unstructured, then it cannot affect the functions of the computer at all or can affect the computer’s functions in only non-specific, unstructured ways. Thus, such consciousness will be different from the consciousness that occurs in us, because the consciousness that occurs in us affects us in very specific, structured ways, resulting in various complex, structured behaviors. For example, we think, talk, and write about it; we have conferences and debates on it; and we do research and experiments on it. Therefore, although consciousness that does not occur from the function of the computer’s electronic circuits is theoretically or philosophically possible, it is not the kind of consciousness that we have and that is discussed in this theory.
How do we build computers or robots that have phenomenal consciousness as we do?
As discussed above, to have phenomenal consciousness as we do, computers and robots must have awareness and experiences in which the awareness and experiences of what things are like occur, and the awareness and experiences themselves must be consciously experienceable and have phenomenality. But to have such awareness and experiences requires specifically arranged circuits with specifically functioning processes that can create these special kinds of awareness and experience. Therefore, no matter how complex the circuits of the computers or robots are, if they do not have the correct specific characteristics, they will not be able to create those special kinds of awareness and experience, and phenomenal consciousness (phenomenal conscious awareness and experience) like the one that occurs in us will not occur. Complexity of the circuit is not the determining factor for phenomenal consciousness to occur; correct specificity is.
Although some principal characteristics that are necessary for a neural circuit to generate phenomenal consciousness are outlined in section 6.2, those characteristics are not sufficient. To have phenomenal consciousness in the neural circuit, the circuit must create a signaling state that, in the neural language, means phenomenal consciousness. However, the exact signaling state that means phenomenal consciousness in the neural language and the exact details of the circuit that can create such a signaling state are not yet known at present. Theoretically, if they are known and if the possibility of creating such a circuit and such a signaling state is not restricted to only biological processes but extends to electronic processes, then computers and robots that have phenomenal consciousness can be built by imitating the neural circuit and the signaling state. This is in accordance with the principle of organizational invariance of Chalmers [1,6,104].
Can we build robots or can there be automatons that behave identically to us but do not have qualia and consciousness?
If they do not have qualia and consciousness, they cannot behave identically to us, because the lack of conscious awareness and experiences of qualia will cause a lack of information about these phenomena. Therefore, they will not be able to behave identically to us in activities that depend on or are related to this kind of information, because they do not know what behaviors are to be generated – for example, in answering the question “Are there extra phenomena occurring in your perception of the red color compared with your perception of the level of your metabolism?”. It is obvious that their answer will be “no”, which is different from our answer “yes”, because, for them, no extra phenomena occur in the perception of the red color. Another obvious example is that, if they do not have qualia and consciousness, they will not talk about, discuss, write books about, have conferences on, and do experiments on qualia and consciousness, while we obviously do, so their behavior cannot be identical to ours. Thus, the answer to this question is that we cannot build robots/automatons that do not have qualia and consciousness but behave identically to us.
6.6.3. When we are asleep in NREM stages, under general anesthesia, or in a deep coma, do we have consciousness?
Obviously, under those conditions, the consciousness neural process is not functioning at the alert level – the composite of reentrant signaling states that is generated is not signaling alert consciousness; thus, there is no alert consciousness in such conditions. However, although the consciousness neural process is not functioning to produce alert consciousness, it is not dead and is still functioning at the sleep or depressed level, generating the composite of reentrant signaling states that signals sleep or depressed consciousness. So, the consciousness does not completely disappear but is at a different alertness level and changes form. Also, the state of no alert-consciousness will be only temporary if there is no fatal condition involved. After those conditions disappear, the consciousness neural process will resume producing the composite of normal special reentrant signaling states, and the alert consciousness will return. So, physiologically, consciousness can function at various alertness levels and in various forms, and it can be considered that we still have consciousness, but not at the alert level or in the normal form, when we are asleep in NREM (non-rapid eye movement) stages, under general anesthesia, or in a deep coma.
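The graded-alertness picture above – consciousness functioning at various levels and in various forms rather than as an on/off switch – can be sketched as a toy model. The function name, activity scale, and thresholds below are entirely hypothetical and serve only to illustrate the graded, non-all-or-none claim.

```python
# Toy illustration (hypothetical names and thresholds): the form of
# consciousness as a graded function of the consciousness neural process's
# activity level, rather than an all-or-none switch.

def consciousness_form(activity):
    """Map a normalized activity level (0.0-1.0) to a form of consciousness."""
    if activity <= 0.0:
        return "none"              # process not functioning at all
    if activity < 0.3:
        return "deeply depressed"  # e.g., deep coma, deep anesthesia
    if activity < 0.7:
        return "sleep/depressed"   # e.g., NREM sleep, light anesthesia
    return "alert"                 # normal waking consciousness

assert consciousness_form(0.9) == "alert"
assert consciousness_form(0.5) == "sleep/depressed"
assert consciousness_form(0.0) == "none"
```

On this sketch, waking up gradually corresponds to `activity` rising continuously through the intermediate forms, matching the claim in 6.6.4 that consciousness returns gradually rather than switching on like a light bulb.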
6.6.4. When does consciousness appear in a being?
Consciousness does not come suddenly once the brain begins to form in a fetus, nor does it come when an infant is born. Consciousness will appear when the consciousness neural process begins to function and produce the composite of special reentrant signaling states. Obviously, the mature consciousness cannot appear all of a sudden in the fetus but develops gradually as the consciousness neural circuit and process, and the composite of special reentrant signaling states that it produces, mature little by little**. This is similarly true for a person who slowly wakes up from sleep or gradually recovers from general anesthesia, cerebral concussion, hypoglycemia, etc. – consciousness returns gradually; it does not reappear suddenly like a light bulb that lights up when switched on. The consciousness neural process functions not in an all-or-none manner but at various graded levels, as discussed previously, and so does the consciousness, which occurs with the consciousness neural process.
(** The developmental anlage of the thalamus is present from around day 22 or 23 post-conception, and thalamocortical connections are thought to be formed by week 26 of gestation. Around the same time (weeks 25-29 of gestation), electrical activity from the cerebral hemispheres shifts from an isolated to a more continuous pattern, with sleep–wake distinctions appreciable from week 30 of gestation. Thus, “both the structural and functional prerequisites for consciousness are in place by the third trimester”. Also, “consciousness cannot emerge before 24 gestational weeks when the thalamocortical connections from the sense organs are established. Thus, the limit of legal abortion at 22-24 weeks in many countries makes sense”. The concept of the gradual emergence of consciousness in fetal and neonatal life has been detailed before, such as in ref .)
- Consciousness (the composite of conscious awareness and experiences) can be identified, quantified, monitored, created, modified, tested, or destroyed by performing the respective action on only the composite of all special reentrant signaling states. The actions on the composite are both necessary and sufficient for the corresponding actions on the consciousness to occur, and these actions on anything else without having the actions on the composite will not result in the corresponding actions on the consciousness.
- In any event or experiment, all predictions that are valid for the composite of all special reentrant signaling states, such as that the composite will occur, change, or disappear, will be identically valid for the consciousness, that is, the changes that occur in the composite and those that occur in the consciousness will be identical in all aspects (quality, quantity, temporal pattern, etc.). For example, if the composite of all special reentrant signaling states changes abruptly from the composite of special reentrant signaling states of vision, hearing, and thinking to the composite of special reentrant signaling state of only emotion, the changes in the consciousness will be identical in all aspects, such as identical changes from conscious awareness and experiences of vision, hearing, and thinking to those of only emotion (quality), from the composite of three signaling states to the composite of one signaling state (quantity), and abruptly (temporal pattern).
- Regarding the consciousness neural process, it will be found that it has connections to all neural processes that have qualia, that it feeds its signals back to itself, and that its signaling pattern is the special signaling pattern when it functions to be consciously aware of and experience something.
N.B. All of the above predictions can be verified by clinical events and experiments on conscious human subjects. As in the cases of mental processes and qualia, isolated actions on the special reentrant signaling state, without actions on anything else, can be achieved by electrical and/or magnetic stimulations that specifically affect the reentrant signaling state of the consciousness neural process. Up to the present time, the predictions of this theorem have been borne out by, or are in agreement with, everyday clinical events and experimental outcomes.
Presently, there are several theories about neural correlates of consciousness – the minimal neuronal mechanisms that are jointly sufficient for any one specific conscious percept [45-59]. Many of them involve the concepts that consciousness arises from the integrated activities of widespread cortical and subcortical areas and that it involves bottom-up and top-down signaling, reentrant information processing, synchronous electrical oscillatory activities, and resonant states; examples are a neurobiological theory of consciousness [51,52], the neuronal basis for consciousness, The Dynamic Core or Neural Darwinism hypothesis [29,87,88], The Global Workspace theory [15-19], The Global Neuronal Workspace hypothesis [20-25], an extended theory of global workspace of consciousness, The Adaptive Resonance Theory (ART) [108-111], and the hypotheses of Lamme VAF and van Gaal S [112,113] and of other authors [43,114-116]. Other theories rest on various different concepts: The Information Integration Theory of Consciousness [117-121] involves integration of information, The Orchestrated Objective Reduction (Orch OR) Theory [122-124] involves quantum vibrations of microtubules inside brain neurons, The Operational Architectonics theory [125,126] involves the brain's structure of operational architectonics, The Conscious Electromagnetic Information (Cemi) Field Theory [127,128] involves the brain's electromagnetic information field, and other theories [129-133] involve other concepts. Some of these theories have been instructively reviewed by several authors [such as ref 3,7,134-137]. As it is not an objective of this theorem to discuss the existing theories of consciousness in detail, the author would like to refer interested readers to these works.
Regarding the question of the exact nature of consciousness (what consciousness truly is), this theorem asserts that consciousness is a composite of special reentrant signaling states. Concepts similar or close to this assertion have existed before; that is, the concept that consciousness is a kind of brain process or brain activity (like the signaling state that this theorem asserts) predates this theory. For example, Place (1956) wrote that consciousness is a process in the brain and can be identified with a pattern of brain activity, Feigl (1958) stated that consciousness is identical with certain (presumably configurational) aspects of the neural processes, and Loorits (2014) argued that consciousness as a whole can be seen as a complex neural pattern, that phenomenal consciousness simply is a certain complex pattern of neural activity, a pattern of patterns of some simple neural events, and that the entire structure of consciousness will prospectively be discovered in some patterns of neural activity. To be more specific about reentrant signaling states, some special kinds of which this theorem asserts are consciousness, many theories (see the previous paragraph) have proposed very similar concepts before. Notably, the concept in this theorem, Theorem V, is almost identical to that of The Adaptive Resonance Theory (ART) [108-111], which states that all conscious states are resonance states. ART gives comprehensive, strongly evidence-based accounts of which resonant states can be conscious, why not all resonant states are conscious, and why all conscious states must be resonant.
However, ART and the other theories holding that reentrant signaling states or complex patterns of neural activities are consciousness, or are essential for consciousness to occur, do not give a solid rationale for why phenomenality should occur in those resonant states or complex patterns of neural activities (that is, why we are not just highly complex and highly intelligent but phenomenality-deprived biological robots). Nor do they show how non-material phenomenality should arise out of material resonant states or out of complex patterns of material neural activities. In these theories, phenomenality and non-material phenomenal consciousness "just occur" in these material neural networks, resonant states, or patterns of neural activities.
On the other hand, the present theorem presents the required rationales. The special reentrant signaling states are the information that means phenomenal consciousness in the neural process language, and, because they mean phenomenal consciousness in that language, they are interpreted to be phenomenal consciousness when they are read by the consciousness neural process through the process of reentrant signaling; phenomenal consciousness thus naturally and inevitably occurs in the consciousness neural process. The reason reentrant signaling states can be information that means phenomenal consciousness in the neural process language is that this capability is inherent in them: it is a brute fact that some signaling states mean and are "phenomenal consciousness" in the neural process language and that special reentrant signaling states simply are such signaling states. Special reentrant signaling states occur because it became possible for them to occur once the nervous system had evolved to a sufficiently advanced state for neural processes to produce them, and they still remain nowadays very likely because they yield an advantage to the animals that have them.
Another important point to be noted concerns the concept that consciousness is a kind of information. Again, although this theorem asserts that consciousness is some kind of information, this is not a novel idea either. Other theories, such as the double-aspect theory of information [1,6], The Integrated Information Theory (IIT) [117-121], and The Conscious Electromagnetic Information (Cemi) Field Theory [127,128], have proposed this concept before. In the double-aspect theory of information, Chalmers (1995) noted that "there is a direct isomorphism between certain physically embodied information spaces and certain phenomenal (or experiential) information spaces" and formed the hypothesis that "Information (or at least some information) has two basic aspects, a physical aspect and a phenomenal aspect. … Experience arises by virtue of its status as one aspect of information, when the other aspect is found embodied in physical processing … We might say that phenomenal properties are the internal aspect of information". Notably, this is very similar to the concepts of the identity between information processing processes and the mind, between the information of signaling patterns and qualia, and between the information of reentrant signaling states and consciousness (which form Theorem II – the mind is the composite of all information processing processes, Theorem IV – qualia are special kinds of signaling pattern, and Theorem V – consciousness is a special kind of reentrant signaling state). However, Chalmers considered the double-aspect theory of information extremely speculative and underdetermined and did not develop it into a full-fledged theory. Regarding The Integrated Information Theory (IIT), the theory asserts that a conscious experience is a maximally irreducible conceptual structure (MICS), which corresponds to a local maximum of integrated conceptual information (a local maximum of Φ, or Φmax).
It uses these quantities to predict, fairly successfully, which system is or is not conscious in a variety of cases, such as not being conscious during sleep or generalized seizures or in the cerebellum. It also provides mathematical formulations to calculate Φ. However, it does not give a rationale for why such a MICS, which corresponds to a local maximum of Φ, should become a phenomenal conscious experience, or why phenomenality should occur in a MICS. Phenomenality and phenomenal consciousness just occur in the MICS.
This theorem, on the other hand, asserts that information in some specific forms (the special reentrant signaling states, which are very likely, but not necessarily, highly complex) means phenomenal consciousness in the neural process language. So, when these kinds of information are read and interpreted (through the process of reentrant signaling) by the consciousness neural process, phenomenal consciousness (not non-phenomenal consciousness) naturally and unavoidably occurs in the consciousness neural process. Therefore, the most crucial factor in whether any information is or can be phenomenal consciousness is that its form must be a specific form that means phenomenal consciousness when interpreted by the consciousness neural process; that is, the specificity, not the complexity, of the form of information is the determining factor***. The reason why some forms of information should be phenomenal consciousness is that it is basically possible in this universe for information in some forms to mean phenomenal consciousness in the neural process language. And such information still exists today probably because information in these forms has physical effects that differ from those of information in other forms that means non-phenomenal consciousness (see section 6.4) and because these differential effects can help increase the survival chances of the species that has phenomenal consciousness.
(*** Information can be very, very complex yet still not be consciousness because, not having the right form, it does not mean phenomenal consciousness in the neural process language and is thus not interpreted as phenomenal consciousness.)
To be noted is that the predictions of this theorem conflict with, or are not exactly the same as, those of the theories referred to above. For example, The Orchestrated Objective Reduction (Orch OR) Theory [122-124] proposes that consciousness arises at the quantum level from the quantum vibrations of microtubules inside brain neurons, so anything that significantly affects the proposed quantum vibrations without significantly affecting anything else should significantly affect consciousness. On the contrary, according to this Theorem V, (a) experiments that significantly interfere with the signaling state of the consciousness neural process without interfering, or without significantly interfering, with the quantum vibrations of the microtubules in the brain will significantly affect the consciousness, and (b) experiments that significantly interfere with the quantum vibrations of the microtubules in the brain without interfering, or without significantly interfering, with the signaling state of the consciousness neural process will not affect, or will not significantly affect, the consciousness. So, these experiments will be able to tell which theory is incorrect. These kinds of differential experiments can be done to check the predictions of this theory against those of other theories as well. For example, experiments that significantly affect the composite of special reentrant signaling states without significantly affecting the Φmax there will be able to differentiate between this theorem and the IIT: according to this theorem, consciousness will be significantly affected by such experiments, but according to the IIT, it will not. Differentiating between this theorem and the ART is the most difficult because the concepts of the two theories are very similar; the exact characteristics of the reentrant signaling states that are consciousness must be known to differentiate the two theories.
If those characteristics are as predicted by this theorem, that is, if all special reentrant signaling states have the same special form that is categorically different from those of other signaling states, this theorem will be validated; but if those characteristics are as predicted by the ART, the ART will be validated. The results of these differential experiments, however they come out, will certainly help us refine the theory of consciousness, and we will be able to get to the perfect theory of consciousness someday.
Like a spectrum of light, in which some frequencies elicit a color in the brain and some do not, even though they all belong to the spectrum of the same light, some information does not elicit qualia and consciousness and some does, even though it is all information in the same mind.
Qualia and consciousness are just parts of the spectrum of information, the parts that have phenomenality.
- Chalmers DJ. Facing up to the problem of consciousness. J Conscious Stud. 1995;2(3):200-219. http://consc.net/papers/facing.html
- Gennaro RJ. Consciousness. Internet Encyclopedia of Philosophy. Retrieved 2017 Apr 18 from http://www.iep.utm.edu/consciou/
- Van Gulick R. Consciousness. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2017 Edition). Retrieved 2017 Sep 8 from https://plato.stanford.edu/archives/sum2017/entries/consciousness/
- Weisberg J. The hard problem of consciousness. The Internet Encyclopedia of Philosophy. Retrieved 2018 Jan 29 from https://www.iep.utm.edu/hard-con/
- Chalmers DJ. Consciousness and its place in nature. In: Chalmers DJ, editor. Philosophy of mind: Classical and contemporary readings. Oxford: Oxford University Press; 2002. ISBN-13: 978-0195145816 ISBN-10: 019514581X. Retrieved 2017 Sep 20 from http://consc.net/papers/nature.html
- Chalmers DJ. Moving forward on the problem of consciousness. J Conscious Stud. 1997;4(1):3-46. http://consc.net/papers/moving.html
- De Sousa A. Towards an integrative theory of consciousness: Part 1 (neurobiological and cognitive models). Mens Sana Monogr. 2013 Jan-Dec;11(1):100–150. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3653219/
- Mashour GA, Alkire MT. Evolution of consciousness: Phylogeny, ontogeny, and emergence from general anesthesia. Proc Natl Acad Sci U S A. 2013 Jun 18;110(Suppl 2): 10357–10364. DOI: 10.1073/pnas.1301188110. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3690605/
- Rosenthal D. Concepts and Definitions of Consciousness. In: Banks WP, editor. Encyclopedia of Consciousness. Amsterdam: Elsevier; 2009:157-169. https://www.researchgate.net/publication/280529005_Concepts_and_Definitions_of_Consciousness_in_Encyclopedia_of_Consciousness_ed_William_P_Banks_Amsterdam_Elsevier_2009_pp_157-169
- Sturm T. Consciousness regained? Philosophical arguments for and against reductive physicalism. Dialogues Clin Neurosci. 2012 Mar;14(1):55–63. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341650/
- Zeman A. Consciousness. Brain. 2001 Jul;124(Pt 7):1263-1289. https://academic.oup.com/brain/article-pdf/124/7/1263/802709/1241263.pdf
- Zeman A. What do we mean by “conscious” and “aware”? Neuropsychol Rehabil. 2006 Aug;16(4):356-376. DOI: 10.1080/09602010500484581. https://www.ncbi.nlm.nih.gov/pubmed/16864477
- Brogaard B, Gatzia DE. What can neuroscience tell us about the hard problem of consciousness? Front Neurosci. 2016;10:395. doi: 10.3389/fnins.2016.00395. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5013033/
- Loorits K. Structural qualia: A solution to the hard problem of consciousness. Front Psychol. 2014;5:237. DOI: 10.3389/fpsyg.2014.00237. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3957492/
- Baars BJ. Chapter Ten. The functions of consciousness. In: A cognitive theory of consciousness. New York: Cambridge University Press; 1988. http://bernardbaars.pbworks.com/f/++++Functions+of+Consciousness.pdf
- Baars BJ. Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Prog Brain Res. 2005;150:45-53. DOI: 10.1016/S0079-6123(05)50004-9. https://www.cs.helsinki.fi/u/ahyvarin/teaching/niseminar4/Baars2004.pdf
- Baars BJ. How does a serial, integrated and very limited stream of consciousness emerge from a nervous system that is mostly unconscious, distributed, parallel and of enormous capacity? Ciba Found Symp. 1993;174:282-290; discussion 291-303
- Baars BJ, Franklin S, Ramsoy TZ. Global workspace dynamics: Cortical “Binding and propagation” enables conscious contents. Front Psychol. 2013;4:200. DOI: 10.3389/fpsyg.2013.00200. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3664777/
- Newman J, Baars BJ, Cho SB. A neural global workspace model for conscious attention. Neural Networks. 1997;10(7):1195–1206. http://ccrg.cs.memphis.edu/assets/papers/1997/Newman%20Baars%201997.pdf
- Dehaene S, Changeux JP. Experimental and theoretical approaches to conscious processing. Neuron. 2011 Apr 28;70(2):200-227. DOI: 10.1016/j.neuron.2011.03.018. https://www.sciencedirect.com/science/article/pii/S0896627311002583
- Dehaene S, Changeux JP, Naccache L. The Global Neuronal Workspace Model of conscious access: From neuronal architectures to clinical applications. 2011. In: Dehaene S, Christen Y, editors. Characterizing consciousness: From cognition to the clinic? Research and Perspectives in Neurosciences. Berlin, Heidelberg: Springer-Verlag, 2011. https://doi.org/10.1007/978-3-642-18015-6_4. http://www.antoniocasella.eu/dnlaw/Dehaene_Changeaux_Naccache_2011.pdf
- Dehaene S, Charles L, King JR, Marti S. Toward a computational theory of conscious processing. Curr Opin Neurobiol. 2014 Apr;25:76–84. doi: 10.1016/j.conb.2013.12.005. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5635963/
- Dehaene S, Naccache L. Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition. 2001 Apr;79(1-2):1-37. https://www.jsmf.org/meetings/2003/nov/Dehaene_Cognition_2001.pdf
- Dehaene S, Sergent C, Changeux JP. A neuronal network model linking subjective reports and objective physiological data during conscious perception. Proc Natl Acad Sci U S A. 2003 Jul 8;100(14):8520–8525. DOI: 10.1073/pnas.1332574100. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC166261/
- Sergent C, Dehaene S. Neural processes underlying conscious perception: Experimental findings and a global neuronal workspace framework. J Physiol Paris. 2004 Jul-Nov;98(4-6):374-384. DOI: 10.1016/j.jphysparis.2005.09.006 https://pdfs.semanticscholar.org/ae61/178a998b4e08851af8ba80e7815fd2c9e6d9.pdf
- Song X, Tang X. An extended theory of global workspace of consciousness. Progress in Natural Science. 2008 Jul 10;18(7):789–793. DOI: https://doi.org/10.1016/j.pnsc.2008.02.003. https://www.sciencedirect.com/science/article/pii/S100200710800138X
- Velmans M. How to define consciousness – and how not to define consciousness. J Conscious Stud. 2009;16(5):139-156. http://cogprints.org/6453/1/How_to_define_consciousness.pdf
- Dehaene S. What is consciousness good for? In: Consciousness and the brain. Penguin Books. 2014. New York, New York, USA. ISBN 978-0-670-02543-5, 978-0-14-312626-3. p 89-114.
- Seth AK, Baars BJ. Neural Darwinism and consciousness. Conscious Cogn. 2005 Mar;14(1):140-168. http://ccrg.cs.memphis.edu/assets/papers/2004/Seth%20&%20Baars,%20Neural%20Darwinism-2004.pdf
- Bargh JA, Morsella E. The Unconscious Mind. Perspect Psychol Sci. 2008 Jan;3(1):73–79. DOI: 10.1111/j.1745-6916.2008.00064.x. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2440575/
- Bergström F, Eriksson J. Neural Evidence for Non-conscious Working Memory. Cerebral Cortex. 2018 Sep;28(9):3217–3228. https://doi.org/10.1093/cercor/bhx193. https://academic.oup.com/cercor/article/28/9/3217/4058206
- Dehaene S. Fathoming unconscious depths. In: Consciousness and the brain. Penguin Books. 2014. New York, New York, USA. ISBN 978-0-670-02543-5, 978-0-14-312626-3. p 47-88.
- Dehaene S, Changeux JP, Naccache L, Sackur J, Sergent C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn Sci. 2006 May;10(5):204-211. DOI: 10.1016/j.tics.2006.03.007. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.65.3821&rep=rep1&type=pdf
- Horga G, Maia TV. Conscious and unconscious processes in cognitive control: a theoretical perspective and a novel empirical approach. Front Hum Neurosci. 2012;6:199. doi: 10.3389/fnhum.2012.00199. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3458455/
- Kihlstrom JF. Cognition, unconscious process. Elsevier Ltd. 2007. Retrieved 2019 Nov 28 from http://www.baars-gage.com/furtherreadinginstructors/Chapter08/Chapter8_Cognitive_Unconscious.pdf
- Morsella E, Poehlman TA. The inevitable contrast: Conscious vs. unconscious processes in action control. Front Psychol. 2013;4:590. doi: 10.3389/fpsyg.2013.00590. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3767904/
- Salti M, Monto S, Charles L, King JR, Parkkonen L, Dehaene S. Distinct cortical codes and temporal dynamics for conscious and unconscious percepts. eLife. 2015;4:e05652. doi: 10.7554/eLife.05652. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4467230/
- Panksepp J. Cross-species affective neuroscience decoding of the primal affective experiences of humans and related animals. Sirigu A, editor. PLoS ONE. 2011 Sep 7;6(9):e21236. https://doi.org/10.1371/journal.pone.0021236. http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0021236
- Butler AB, Cotterill RMJ. Mammalian and Avian Neuroanatomy and the Question of Consciousness in Birds. The Biological Bulletin 2006;211(2):106-127. DOI: 10.2307/4134586. https://www.journals.uchicago.edu/doi/10.2307/4134586
- Pepperberg IM, Lynn SK. Possible Levels of Animal Consciousness with Reference to Grey Parrots. Amer Zool. 2000;40:893–901.
- Edelman DB, Baars BJ, Seth AK. Identifying hallmarks of consciousness in non-mammalian species. Conscious Cogn. 2005 Mar;14(1):169-187. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.83.4676&rep=rep1&type=pdf
- Feinberg TE, Mallatt J. The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago. Front Psychol. 2013;4:667. DOI: 10.3389/fpsyg.2013.00667. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3790330/
- Babiloni C, Marzano N, Soricelli A, Cordone S, Millán-Calenti JC, Percio CD, et al. Cortical neural synchronization underlies primary visual consciousness of qualia: Evidence from event-related potentials. Front Hum Neurosci. 2016;10:310. DOI: 10.3389/fnhum.2016.00310. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4927634/
- Fabbro F, Aglioti SM, Bergamasco M, Clarici A, Panksepp J. Evolutionary aspects of self- and world consciousness in vertebrates. Front Hum Neurosci. 2015 Mar 26;9:157. doi: 10.3389/fnhum.2015.00157. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4374625/
- Aru J, Bachmann T, Singer W, Melloni L. Distilling the neural correlates of consciousness. Neuroscience & Biobehavioral Reviews. 2012 Feb;36(2):737-746. https://doi.org/10.1016/j.neubiorev.2011.12.003. https://www.sciencedirect.com/science/article/pii/S0149763411002107
- Baars BJ, Laureys S. One, not two, neural correlates of consciousness. Trends in Cognitive Sciences. 2005 Jun;9(6). http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.367.9177&rep=rep1&type=pdf
- Block N. Two neural correlates of consciousness. Trends Cogn. Sci. 2005;9;46–52. http://www.nyu.edu/gsas/dept/philo/faculty/block/papers/final_revised_proof.pdf
- Boly M, Massimini M, Tsuchiya N, Postle BR, Koch C, Tononi G. Are the neural correlates of consciousness in the front or in the back of the cerebral cortex? Clinical and neuroimaging evidence. J Neurosci. 2017 Oct;37(40):9603–9613. doi: 10.1523/JNEUROSCI.3218-16.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5628406/
- Chalmers DJ. What is a neural correlate of consciousness? In: Metzinger T, editor. Neural Correlates of Consciousness: Empirical and Conceptual Questions. MIT Press, Cambridge, MA. 2000
- Crick F, Koch C. Consciousness and neuroscience. Cereb Cortex. 1998 Mar;8(2):97-107. https://authors.library.caltech.edu/40355/1/feature_article.pdf
- Crick F, Koch C. Some reflections on visual awareness. Cold Spring Harb Symp Quant Biol. 1990;55:953-962. https://authors.library.caltech.edu/40351/1/61.pdf
- Crick F, Koch C. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences. 1990;2:263-275. https://authors.library.caltech.edu/40352/1/148.pdf
- Fink SB. A deeper look at the “Neural correlate of consciousness”. Front Psychol. 2016; 7: 144. doi: 10.3389/fpsyg.2016.01044 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4960249/
- Koch C, Massimini M, Boly M, Tononi G. Neural correlates of consciousness: Progress and problems. Nature Reviews Neuroscience. 2016;17: 307-321. https://puredhamma.net/wp-content/uploads/Neural-correlates-of-consciousness-Koch-et-al-2016.pdf
- Morales J, Lau H. The neural correlates of consciousness. Penultimate draft: January 2018. Forthcoming in The Oxford Handbook of the Philosophy of Consciousness, Uriah Kriegel (ed.), Oxford University Press. https://philpapers.org/archive/MORTNC-7.pdf
- Nani A, Manuello J, Mancuso L, Liloia D, Costa T, Cauda F. The neural correlates of consciousness and attention: Two sister processes of the brain. Front Neurosci. 2019;13:1169. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6842945/
- Polák M, Marvan T. Neural correlates of consciousness meet the theory of identity. Front Psychol. 2018;9:1269. doi: 10.3389/fpsyg.2018.01269. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6066586/
- Rees G, Kreiman G, Koch C. Neural correlates of consciousness in humans. Nature Reviews Neuroscience 2002;3(4):261-70. https://www.researchgate.net/publication/11399830_Neural_correlates_of_consciousness_in_humans
- Tononi G, Koch C. The neural correlates of consciousness: An update. Annals of the New York Academy of Sciences. 2008;1124:239-61. 10.1196/annals.1440.004. https://authors.library.caltech.edu/40650/1/Tononi-Koch-08.pdf
- Alves PM, Foulon C, Karolis V, Bzdok D, Margulies DS, Volle E, et al. An improved neuroanatomical model of the default-mode network reconciles previous neuroimaging and neuropathological findings. Commun Biol. 2019;2:370. doi: 10.1038/s42003-019-0611-3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6787009/
- Andrews-Hanna JR. The brain’s default network and its adaptive role in internal mentation. Neuroscientist. 2012 Jun;18(3):251–270. DOI: 10.1177/1073858411403316. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3553600/
- Buckner RL, Andrews-Hanna JR, Schacter DL. The brain’s default network: Anatomy, function, and relevance to disease. Ann N Y Acad Sci. 2008 Mar;1124:1-38. DOI: 10.1196/annals.1440.011. https://www.researchgate.net/publication/5451668_The_Brain’s_Default_Network
- Calster LV, D’Argembeau A, Salmon E, Peters F, Majerus S. Fluctuations of attentional networks and default mode network during the resting state reflect variations in cognitive states: Evidence from a novel resting-state experience sampling method. Journal of Cognitive Neuroscience. 2017 Jan;29(1):95-113. DOI: 10.1162/jocn_a_01025. http://www.mitpressjournals.org/doi/full/10.1162/jocn_a_01025
- Hagmann P, Cammoun L, Gigandet X, Meuli R, Honey CJ, Wedeen VJ, et al. Mapping the structural core of human cerebral cortex. PLoS Biol. 2008;6(7):e159. DOI: https://doi.org/10.1371/journal.pbio.0060159. Retrieved 2017 Aug 18 from http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0060159
- Raichle ME, MacLeod AM, Snyder AZ, Powers WJ, Gusnard DA, Shulman GL. A default mode of brain function. Proc Natl Acad Sci U S A. 2001 Jan 16;98(2):676–682. DOI: 10.1073/pnas.98.2.676. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC14647/
- Sporns O. Structure and function of complex brain networks. Dialogues Clin Neurosci. 2013 Sep;15(3):247–262. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3811098/
- Fuller PM, Sherman D, Pedersen NP, Saper CB, Lu J. Reassessment of the structural basis of the ascending arousal system. J Comp Neurol. 2011 Apr 1; 519(5):933–956. DOI: 10.1002/cne.22559. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3119596/
- Reinoso-Suárez F, de Andrés I, Garzón M. Functional anatomy of the sleep-wakefulness cycle: Wakefulness. Adv Anat Embryol Cell Biol. 2011;208:1-128. https://www.ncbi.nlm.nih.gov/pubmed/21166301
- Yeo SS, Chang PH, Jang SH. The ascending reticular activating system from pontine reticular formation to the thalamus in the human brain. Front Hum Neurosci. 2013;7: 416. DOI: 10.3389/fnhum.2013.00416. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3722571/
- Boly M, Seth AK, Wilke M, Ingmundson P, Baars B, Laureys S, et al. Consciousness in humans and non-human animals: Recent advances and future directions. Front Psychol. 2013;4: 625. DOI: 10.3389/fpsyg.2013.00625. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3814086/
- Bender A, Jox RJ, Grill E, Straube A, Lulé D. Persistent vegetative state and minimally conscious state. A systematic review and meta-analysis of diagnostic procedures. Dtsch Arztebl Int. 2015 Apr; 112(14): 235–242. DOI: 10.3238/arztebl.2015.0235. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4413244/
- Bernat JL. Chronic consciousness disorders. Annu Rev Med. 2009;60:381-392.
- Bernat JL. Chronic disorders of consciousness. Lancet. 2006;367:1181–1192. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.330.3843&rep=rep1&type=pdf
- Bruno MA, Vanhaudenhuyse A, Thibaut A, Moonen G, Laureys S. From unresponsive wakefulness to minimally conscious PLUS and functional locked-in syndromes: Recent advances in our understanding of disorders of consciousness. J Neurol. 2011 Jul;258(7):1373-1384. DOI: 10.1007/s00415-011-6114-x. http://www.academia.edu/9792867/From_unresponsive_wakefulness_to_minimally_conscious_PLUS_and_functional_locked-in_syndromes_recent_advances_in_our_understanding_of_disorders_of_consciousness
- Giacino JT, Ashwal S, Childs N, Cranford R, Jennett B, Katz DI, et al. The minimally conscious state: Definition and diagnostic criteria. Neurology. 2002 Feb 12;58(3):349-353. http://n.neurology.org/content/58/3/349.long
- Hodelín-Tablada R. Minimally conscious state: Evolution of concept, diagnosis and treatment. MEDICC Review. 2016 Oct;18(4):43–46. https://www.scielosp.org/pdf/medicc/2016.v18n4/43-46
- Perri CD, Thibaut A, Heine L, Soddu A, Demertzi A, Laureys S. Measuring consciousness in coma and related states. World J Radiol. 2014 Aug 28;6(8):589–597. DOI: 10.4329/wjr.v6.i8.589. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4147439/
- Cavanna AE, Cavanna SL, Servo S, Monaco F. The neural correlates of impaired consciousness in coma and unresponsive states. Discov Med. 2010 May;9(48):431-438. http://www.discoverymedicine.com/Andrea-Eugenio-Cavanna/2010/05/09/the-neural-correlates-of-impaired-consciousness-in-coma-and-unresponsive-states/
- Giacino JT. The minimally conscious state: Defining the borders of consciousness. Prog Brain Res. 2005;150:381-395. DOI: 10.1016/S0079-6123(05)50027-X. https://www.ncbi.nlm.nih.gov/pubmed/16186037
- Gosseries O, Bruno MA, Chatelle C, Vanhaudenhuyse A, Schnakers C, Soddu A, et al. Disorders of consciousness: What’s in a name? NeuroRehabilitation. 2011;28(1):3-14. DOI: 10.3233/NRE-2011-0625. https://pdfs.semanticscholar.org/d8fd/17167b0fc2fb4ef7bb7b39828e1cfe1a8255.pdf
- Hirschberg R, Giacino JT. The vegetative and minimally conscious states: Diagnosis, prognosis and treatment. Neurol Clin. 2011 Nov;29(4):773-786. DOI: 10.1016/j.ncl.2011.07.009. https://www.researchgate.net/profile/Joseph_Giacino/publication/255971880_Hirschberg_Giacino_VS_MCS_Dx_Px_Tx_Neurol_Clinics_2011/links/0c9605212e924b0a19000000/Hirschberg-Giacino-VS-MCS-Dx-Px-Tx-Neurol-Clinics-2011.pdf
- Edelman GM, Gally JA. Reentry: A key mechanism for integration of brain function. Front Integr Neurosci. 2013;7:63. DOI: 10.3389/fnint.2013.00063. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3753453/
- Edelman GM. Neural Darwinism: Selection and reentrant signaling in higher brain function. Neuron. 1993 Feb;10:115-125. http://brainmaps.org/pdf/edelman1993.pdf
- Lamme VAF, Roelfsema PR. The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci. 2000 Nov;23(11):571-579. DOI: 10.1016/S0166-2236(00)01657-X. https://www.researchgate.net/publication/12253934_The_Distinct_Modes_of_Vision_Offered_by_Feedforward_and_Recurrent_Processing
- Keren H, Marom S. Long-range synchrony and emergence of neural reentry. Sci Rep. 2016;6:36837. DOI: 10.1038/srep36837. https://www.nature.com/articles/srep36837
- Sikkens T, Bosman CA, Olcese U. The role of top-down modulation in shaping sensory processing across brain states: Implications for consciousness. Front Syst Neurosci. 2019 Jul 24;13:31. DOI: 10.3389/fnsys.2019.00031. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6802962/pdf/fnsys-13-00031.pdf
- Edelman GM, Gally JA, Baars BJ. Biology of consciousness. Front Psychol. 2011;2:4. DOI: 10.3389/fpsyg.2011.00004. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3111444/
- Edelman GM. Naturalizing consciousness: A theoretical framework. PNAS. April 29, 2003;100(9):5520-5524. https://doi.org/10.1073/pnas.0931349100. https://www.pnas.org/content/100/9/5520
- Baars BJ. Subjective experience is probably not limited to humans: The evidence from neurobiology and behavior. Conscious Cogn. 2005 Mar;14(1):7-21.
- Butler AB, Manger P, Lindahl BIB, Århem P. Evolution of the neural basis of consciousness: A bird-mammal comparison. BioEssays. 2005 Sep;27(9):923-936. DOI: 10.1002/bies.20280. https://www.researchgate.net/publication/7652731_Evolution_of_the_neural_basis_of_consciousness_A_bird-mammal_comparison
- Butler AB. Evolution of brains, cognition, and consciousness. Brain Res Bull. 2008 Mar 18;75(2-4):442-449.
- Cabanac M, Cabanac AJ, Parent A. The emergence of consciousness in phylogeny. Behav Brain Res. 2009 Mar 17;198(2):267-272. DOI: 10.1016/j.bbr.2008.11.028. https://cogs.sitehost.iu.edu/spackled/2009readings/The%20emergence%20of%20consciousness%20in%20phylogeny.pdf
- Edelman DB, Seth AK. Animal consciousness: A synthetic approach. Trends Neurosci. 2009 Sep;32(9):476-484.
- Griffin DR, Speck GB. New evidence of animal consciousness. Anim Cogn. 2004 Jan;7(1):5-18. https://cogs.sitehost.iu.edu/spackled/2010readings/griffin-speck-consciousness-evidence.pdf
- Klein C, Barron AB. Insects have the capacity for subjective experience. Animal Sentience. 2016;100:1-52. https://animalstudiesrepository.org/cgi/viewcontent.cgi?article=1113&context=animsent
- Low P. The Cambridge declaration on consciousness. Panksepp J, Reiss D, Edelman D, Van Swinderen B, Low P, Koch C, editors. Publicly proclaimed in Cambridge, UK, on July 7, 2012, at the Francis Crick Memorial Conference on Consciousness in Human and non-Human Animals. http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf
- Mather JA. Cephalopod consciousness: Behavioural evidence. Conscious Cogn. 2008 Mar;17(1):37-48.
- Seth AK, Baars BJ, Edelman DB. Criteria for consciousness in humans and other mammals. Conscious Cogn. 2005 Mar;14(1):119-139. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.317.8149&rep=rep1&type=pdf
- Mudrik L, Faivre N, Koch C. Information integration without awareness. Trends Cogn Sci. 2014 Sep;18(9):488-496. DOI: 10.1016/j.tics.2014.04.009. https://www.ncbi.nlm.nih.gov/pubmed/24933626
- Kaas JH. The evolution of neocortex in primates. Prog Brain Res. 2012;195:91–102. DOI: 10.1016/B978-0-444-53860-4.00005-2. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3787901/
- Roth G, Dicke U. Evolution of the brain and intelligence. Trends in Cognitive Sciences. 2005 May;9(5):250–257. https://sites.oxy.edu/clint/physio/article/Evolutionofthebrainandintelligence.pdf
- Herculano-Houzel S. Coordinated scaling of cortical and cerebellar numbers of neurons. Front Neuroanat. 2010;4:12. DOI: 10.3389/fnana.2010.00012. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2839851/
- Hofman MA. Evolution of the human brain: When bigger is better. Front Neuroanat. 2014;8:15. DOI: 10.3389/fnana.2014.00015. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3973910/#B110
- Chalmers DJ. Absent qualia, fading qualia, dancing qualia. In Metzinger T, editor. Conscious Experience. Ferdinand Schoningh. 1995. p 309–328. http://consc.net/papers/qualia.html
- Lagercrantz H. The emergence of consciousness: Science and ethics. Semin Fetal Neonatal Med. 2014 Oct;19(5):300-305. DOI: 10.1016/j.siny.2014.08.003. https://www.ncbi.nlm.nih.gov/pubmed/25160864
- Lagercrantz H, Changeux JP. The emergence of human consciousness: From fetal to neonatal life. Pediatr Res. 2009 Mar;65(3):255-260. DOI: 10.1203/PDR.0b013e3181973b0d. https://www.nature.com/articles/pr200950.pdf
- Llinás R, Ribary U, Contreras D, Pedroarena C. The neuronal basis for consciousness. Philos Trans R Soc Lond B Biol Sci. 1998 Nov 29;353(1377):1841–1849. DOI: 10.1098/rstb.1998.0336. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1692417/pdf/9854256.pdf
- Grossberg S. Adaptive Resonance Theory: How a brain learns to consciously attend, learn, and recognize a changing world. Neural Netw. 2013 Jan;37:1-47. DOI: 10.1016/j.neunet.2012.09.017. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.294.4425&rep=rep1&type=pdf
- Grossberg S. Adaptive Resonance Theory. Technical Report CAS/CNS-2000-024. Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems. Sep 2009. Retrieved Jan 2, 2020 from https://open.bu.edu/bitstream/handle/2144/2272/00.024.pdf?sequence=1
- Grossberg S. Towards solving the hard problem of consciousness: The varieties of brain resonances and the conscious experiences that they support. Neural Netw. 2017 Mar;87:38-95. DOI: 10.1016/j.neunet.2016.11.003. https://linkinghub.elsevier.com/retrieve/pii/S0893-6080(16)30180-0
- Carpenter GA, Grossberg S. Adaptive Resonance Theory. CAS/CNS Technical Report 2009-008. Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems. May 2009. Retrieved Jan 2, 2020 from https://open.bu.edu/bitstream/handle/2144/1972/TR-09-008.pdf?sequence=1
- Lamme VAF. Can neuroscience reveal the true nature of consciousness? https://www.nyu.edu/gsas/dept/philo/courses/consciousness05/LammeNeuroscience.pdf
- van Gaal S, Lamme VA. Unconscious high-level information processing: Implication for neurobiological theories of consciousness. Neuroscientist. 2012 Jun;18(3):287-301. DOI: 10.1177/1073858411404079. https://pdfs.semanticscholar.org/b9af/d0e8d460ba73cf197a96e2dd8b524e05390c.pdf
- Engel AK, Singer W. Temporal binding and the neural correlates of sensory awareness. Trends in Cognitive Sciences. 2001 Jan;5(1):16-24. http://andreas-engel.com/engel_2001_tics.pdf
- Fisch L, Privman E, Ramot M, Harel M, Nir Y, Kipervasser S, et al. Neural “Ignition”: Enhanced activation linked to perceptual awareness in human ventral stream visual cortex. Neuron. 2009 Nov 25;64(4):562–574. DOI: 10.1016/j.neuron.2009.11.001. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2854160/
- Pollen DA. On the neural correlates of visual perception. Cereb Cortex. 1999;9(1):4-19. DOI: 10.1093/cercor/9.1.4. https://academic.oup.com/cercor/article/9/1/4/314915/On-the-Neural-Correlates-of-Visual-Perception
- Tononi G. An information integration theory of consciousness. BMC Neurosci. 2004;5:42. DOI: 10.1186/1471-2202-5-42. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC543470/pdf/1471-2202-5-42.pdf
- Tononi G. Consciousness, information integration, and the brain. Prog Brain Res. 2005;150:109-126. https://www.rotman-baycrest.on.ca/files/publicationmodule/@random4824abb32cfea/Tononi_2005_Progress_in_Brain_Research.pdf
- Tononi G. Integrated information theory of consciousness: An updated account. Arch Ital Biol. 2012 Jun-Sep;150(2-3):56-90. DOI: 10.4449/aib.v149i5.1388. http://architalbiol.org/index.php/aib/article/viewFile/15056/23165867
- Tononi G, Koch C. Consciousness: Here, there and everywhere? Philos Trans R Soc Lond B Biol Sci. 2015 May 19;370(1668):20140167. DOI: 10.1098/rstb.2014.0167. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4387509/
- Oizumi M, Albantakis L, Tononi G. From the phenomenology to the mechanisms of consciousness: Integrated Information Theory 3.0. PLoS Comput Biol. 2014 May;10(5):e1003588. DOI: 10.1371/journal.pcbi.1003588. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4014402/pdf/pcbi.1003588.pdf
- Hameroff S, Marcer P. Quantum computation in brain microtubules? The Penrose-Hameroff “Orch OR” model of consciousness. Philosophical Transactions of the Royal Society (London) Series A. 1998 Aug 15;356(1743):1869-1896. https://www.quantumconsciousness.org/sites/default/files/hameroff-1998.pdf
- Hameroff S, Penrose R. Consciousness in the universe: A review of the ‘Orch OR’ theory. Phys Life Rev. 2014 Mar;11(1):39-78. DOI: 10.1016/j.plrev.2013.08.002. https://www.galileocommission.org/wp-content/uploads/2019/02/Hameroff-Penrose-2016-Consciousness-In-The-Universe-An-Updated-Review-Of-The-Orch-Or-Theory.pdf
- Hameroff S, Penrose R. Orchestrated reduction of quantum coherence in brain microtubules: A model for consciousness. Mathematics and Computers in Simulation. 1996;40:453-480. http://www.alice.id.tue.nl/references/hameroff-penrose-1996.pdf
- Fingelkurts AA, Fingelkurts AA, Neves CFH. Phenomenological architecture of a mind and operational architectonics of the brain: The unified metastable continuum. New Mathematics and Natural Computation. 2009;5(1):221-244. https://www.researchgate.net/publication/24108915_Phenomenological_architecture_of_a_mind_and_operational_architectonics_of_the_brain_The_unified_metastable_continuum
- Fingelkurts AA, Fingelkurts AA. Operational architectonics of the human brain biopotential field – Towards solving the mind-brain problem. Brain and Mind. 2001;2(3):261-296. https://www.bm-science.com/images/bms/publ/art18.pdf
- McFadden J. The CEMI Field Theory: Gestalt information and the meaning of meaning. J Conscious Stud. 2013;20(3-4):152-182. https://philpapers.org/rec/MCFTCF-2
- McFadden J. The Conscious Electromagnetic Information (Cemi) Field Theory: The hard problem made easy? J Conscious Stud. 2002;9(8):45-60. https://philpapers.org/archive/MCFTCE.pdf
- Graziano MSA, Webb TW. The Attention Schema Theory: A mechanistic account of subjective awareness. Front Psychol. 2015;6:500. DOI: 10.3389/fpsyg.2015.00500. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4407481/
- Min BK. A thalamic reticular networking model of consciousness. Theor Biol Med Model. 2010;7:10. DOI: 10.1186/1742-4682-7-10. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2857829/
- Ward LM. The thalamic dynamic core theory of conscious experience. Conscious Cogn. 2011 Jun;20(2):464-486. DOI: 10.1016/j.concog.2011.01.007. http://ahuman.org/svn/ahengine/research/articles/Biological/2011-Consciousness-and-Cognition.pdf
- Sevush S. Single-neuron Theory of Consciousness. Journal of Theoretical Biology. 2005. http://cogprints.org/4432/1/single_neuron_theory.htm
- Merker B. Consciousness without a cerebral cortex: A challenge for neuroscience and medicine. Behav Brain Sci. 2007 Feb;30(1):63-81. https://pdfs.semanticscholar.org/3db9/c2f8ca0bc9d73532f2e950a1ecfe7063064d.pdf?_ga=2.95869660.1998888490.1578908270-1280018741.1573545843
- Block N. Comparing the major theories of consciousness. In: Gazzaniga MS, editor. The Cognitive Neurosciences (Chap 77). 4th ed., Cambridge, MA: MIT Press; 2009:1111–1122. https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/Theories_of_Consciousness.pdf
- Prakash R. The conscious access hypothesis: Explaining the consciousness. Indian J Psychiatry. 2008 Jan-Mar;50(1):10–15. DOI: 10.4103/0019-5545.39752. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2745874/
- Seager W. Theories of consciousness. An introduction and assessment. 2nd ed. Abingdon, Oxon: Routledge; 2016. ISBN 978-0-415-83409-4.
- Seth A. Models of consciousness. Scholarpedia. 2007;2(1):1328. http://www.scholarpedia.org/article/Models_of_consciousness
- Place UT. Is consciousness a brain process? Br J Psychol 1956;47(1):44-50. https://people.ucsc.edu/~jbowin/Ancient/place1956.pdf
- Feigl H. The ‘Mental’ and the ‘Physical’. In: Concepts, Theories, and the Mind-Body Problem. University of Minnesota Press. 1958. https://conservancy.umn.edu/handle/11299/184614
- Leisman G, Koch P. Networks of conscious experience: Computational neuroscience in understanding life, death, and consciousness. Rev Neurosci. 2009;20(3-4):151-176. https://www.researchgate.net/profile/Gerry_Leisman/publication/41449921_Networks_of_Conscious_Experience_Computational_Neuroscience_in_Understanding_Life_Death_and_Consciousness/links/0fcfd5023c1d28270d000000.pdf