Theorem V: Consciousness is a Signaling State
As discussed in the previous chapter, consciousness in this theory means conscious awareness and conscious experiences, that is, awareness and experiences of what qualia are like, such as what an image of a house, a sound of a song, and an odor of a flower are like. It is not the complex mental functions that are responsible for our intelligence and abilities, although it is frequently conflated with them. This conflation arises because consciousness never occurs in isolation but always together with these functions. Therefore, it is important to note from the beginning of this chapter what kind of consciousness this theory attempts to investigate.
As the basic components of consciousness in this theory are conscious awareness and experiences, it is necessary to determine the nature of conscious awareness and experiences first. This can be done as follows:
7.1 The Nature of Consciousness
Based on Chapters 3 to 5, we know that, after the consciousness neural process reads the signaling pattern of a neural process that has a quale, conscious awareness and a conscious experience of the quale occur. But what are they? Are they separate entities from the consciousness neural process or intrinsic entities within it, and what are they physically and ontologically?
Let us first investigate conscious awareness. Suppose conscious awareness is an entity C that functionally occurs separately from all neural processes (Figure 7.1A). As conscious awareness is a quale (because it is consciously experienceable), when we investigate entity C in the same manner as we did entity Q of Possibility I in Section 5.1.1, we will find that C faces problems similar to those of Q and that these problems lead to the same conclusions. It is therefore unnecessary to repeat the entire investigation; it suffices to review some of the important problems before drawing a conclusion. The major problems that C encounters are as follows:
Figure 7.1 Conscious awareness occurs separately (C) or intrinsically (C’). A visual perception neural process N produces a visual perception and a visual quale V and sends their information to the consciousness neural process.
#1. What is the nature of C? Based on observation, it seems to be non-material; however, what is the exact nature of this non-material entity? Currently, there is no answer to this question, and a new hypothesis regarding the nature of this entity is needed.
#2. How can C occur? The questions regarding its occurrence are like those regarding Q’s occurrence. For example, if C occurs from the consciousness neural process, how does the consciousness neural process produce it? A neural process does not have an apparatus to produce anything that can contain signals or information except its signaling pattern, signaling state, and electromagnetic activities, all of which occur intrinsically in the neural process, not separately from it as this hypothetical C does. However, if it does not occur from the consciousness neural process, what is the entity that creates it, and how can this entity know when and where to create C? If C occurs out of nothing by itself, how can it do that, and how can it know when and where to occur? New hypotheses are required to address these problems.
#3. To be conscious awareness of Quale V, C must first have information about Quale V so that it can be conscious awareness of Quale V. For example, to be conscious awareness of an image quale of a house, C must first have information about the image quale of the house. However, if C is functionally separate from all neural processes, it cannot get the information about Quale V through the consciousness neural process, Neural Process N, or any other neural processes. Therefore, a new hypothesis for how C can get the information about Quale V is needed.
#4. Because we are aware of the conscious awareness C that has occurred, the consciousness neural process must be able to read the signals from this functionally separate C so that it can be aware of C. However, if C is an entity that occurs separately from all neural processes, its signals cannot be E/EC signals and must be something else. What is the nature of these signals, and how can the consciousness neural process read signals that are not E/EC signals and are functionally separate from it? New hypotheses for these problems are needed.
Evidently, the possibility that conscious awareness is a functionally separate entity from all neural processes raises several critical questions that currently have no answers, and several new hypotheses are needed to support it. This is also true for the possibility that a conscious experience is a functionally separate entity from the consciousness neural process.
Now, let us examine the possibility that the conscious awareness of Quale V occurs intrinsically in the consciousness neural process as C’ in Figure 7.1B. Although consciousness has been proposed to have many functional properties such as allocentricity [1,2], complexity [3,4], dynamicity [1,5–7], global availability and effects [1,5,7], immediacy, independence, indescribability [2], integration [8,9], intentionality [3,4,6], perspectivalness [2,4–7], phenomenality [2,5–7,10], privacy [4,11,12], reportability [1,4], subjectivity [1,4,6,12,13], temporal continuity [13], transparency [5,7], and unity [4,6,9,10,11,13], the physical nature of the conscious awareness C’ can simply be found from the fact that we can be consciously aware of and consciously experience C’—we can be aware of and experience its occurrence and what it is like in our mind. This means that C’ must be consciously experienceable; hence, it must be a quale, as noted earlier. Because it is a quale, we can prove in the same way as we did for qualia in general in Section 5.1.2 that conscious awareness is the consciousness neural process’s special signaling pattern (SSP). This can answer all questions that the case of C cannot, as follows:
#1. The nature of C’ is that it is an SSP, which is a type of neural signaling pattern or neural information. Since a neural signaling pattern or information is a known entity, C’ is not a novel entity; thus, a new hypothesis to account for C’ is not needed.
#2. C’ occurs by the consciousness neural process creating an SSP with information that means conscious awareness of Quale V to the consciousness neural process itself as follows: After the consciousness neural process reads Neural Process N’s signaling pattern, which has information about Quale V, it processes this information and then, with its specialized ability, incorporates this information into a new signaling pattern—an SSP—that has the information that means conscious awareness of Quale V to the consciousness neural process itself. This new signaling pattern, the SSP, is C’.
#3. C’ contains information about Quale V as discussed in #2.
#4. The consciousness neural process can read the signals from C’, which is its own signaling pattern by the process of reentrant signaling, and thus can become aware of C’. In other words, the SSP of C’ is fed back to the consciousness neural process. This type of reentrant signaling is not exceptional; it is common in the nervous system and occurs in many other neural processes [14–17].
Therefore, the proposal that conscious awareness is the consciousness neural process’s SSP having information that means conscious awareness of Quale V to the consciousness neural process itself readily explains everything without requiring new hypotheses. This is similarly true for a conscious experience of Quale V. Accordingly, this theory proposes that conscious awareness and a conscious experience of a quale are the consciousness neural process’s SSPs that have information that means conscious awareness and a conscious experience of the quale, respectively, to the consciousness neural process itself.
The next question concerns the nature of consciousness. Because consciousness is defined as the composite of all conscious awareness and conscious experiences, it is the composite of all SSPs with information about conscious awareness and conscious experiences of all qualia present, including consciousness itself (because consciousness is one of the qualia that are present). Accordingly, the composite that constitutes consciousness must be a form of signaling pattern that makes this possible. Let us examine how this is possible as follows:
Again, for simplicity, let us analyze conscious awareness and a conscious experience separately. Consider the conscious awareness of Quale V (C’ in Figure 7.1B), which has now been shown to be the consciousness neural process’s signaling pattern that has information about conscious awareness of Quale V. Next, a new, secondary conscious awareness—conscious awareness of conscious awareness of Quale V—occurs after the consciousness neural process reads and processes the signaling pattern of conscious awareness of Quale V by reentrant signaling. Reentrant signaling may occur repeatedly, producing tertiary and higher-order conscious awareness. In any case, the consciousness neural process will finally reach a stable state with circulating signals whose information means the unified conscious awareness of both Quale V and conscious awareness of Quale V to the consciousness neural process itself, because we feel them as unified, not fragmented into multiple pieces of conscious awareness. In addition, in this stable signaling state, the signal that is read by the consciousness neural process, passes through it, comes out as its product, and is re-routed to be re-read must be, information-wise, the same signal (that is, no additional changes occur), because we feel the unified awareness as stable and unchanging. This is also true for the conscious experience of Quale V.
Therefore, in the final stable reentrant signaling state, the signals circulating in the consciousness neural process are stable and have information about the conscious awareness and experiences of both Quale V and conscious awareness and experience of Quale V. This is true for all other qualia that are occurring. Thus, normally, the signaling state of the consciousness neural process has information about conscious awareness and experiences of both all occurring qualia and conscious awareness and experiences of all occurring qualia. This theory asserts that, physically and ontologically, consciousness is this stable reentrant signaling state of the consciousness neural process. This concept is similar to those in many current theories of consciousness [1,14,18–29].
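The convergence to a stable reentrant signaling state described above can be illustrated with a toy fixed-point iteration. This is purely a conceptual analogy, not the theory's mechanism: the processing rule, the `aware_of:` labels, and the signal representation below are all invented for illustration.

```python
# Toy analogy of a reentrant loop: the process repeatedly re-reads its
# own output until the circulating signal stops changing, i.e., it
# reaches a stable (fixed-point) state. All labels here are hypothetical.

def reentrant_step(signal: frozenset) -> frozenset:
    # Invented processing rule: for every item already in the signal,
    # add an "aware_of" tag for it, but do not tag tags themselves,
    # so the circulating signal can stabilize instead of growing forever.
    new_items = set(signal)
    for item in signal:
        if not item.startswith("aware_of:"):
            new_items.add("aware_of:" + item)
    return frozenset(new_items)

def run_to_stability(initial: frozenset, max_iters: int = 10) -> frozenset:
    state = initial
    for _ in range(max_iters):
        nxt = reentrant_step(state)
        if nxt == state:   # the re-read signal equals the produced signal
            return state   # stable reentrant state reached
        state = nxt
    return state

stable = run_to_stability(frozenset({"quale_V"}))
print(sorted(stable))  # ['aware_of:quale_V', 'quale_V']
```

In this sketch the stable state contains both the original item and the awareness of it in one unchanging set, loosely mirroring the unified, stable awareness of both Quale V and the awareness of Quale V described above.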
7.2 Theorem V
Reentrant signaling states also occur in many other neural processes [14–17]; however, unlike in the consciousness neural process, conscious awareness and experiences do not occur in them. This means that reentrant signaling states in other neural processes do not have information that means conscious awareness and experiences to those neural processes, whereas some reentrant signaling states in the consciousness neural process have information with this meaning. Thus, this theory assigns the term “special signaling state” or “SSS” to this kind of reentrant signaling state, which has information that means “conscious awareness and a conscious experience” to the consciousness neural process.
Definition: A special signaling state (SSS) is a reentrant signaling state with information that means conscious awareness and a conscious experience to the consciousness neural process.
Because of its meaning, when an SSS is read by the consciousness neural process through reentrant signaling, it appears to be conscious awareness and a conscious experience to the consciousness neural process. Hence, conscious awareness and a conscious experience naturally and inevitably appear in the consciousness neural process and the brain.
Based on the above definition, a composite of all SSSs is a composite of all reentrant signaling states with information that means conscious awareness and conscious experiences to the consciousness neural process. Evidently, this composite has the overall information that means all conscious awareness and conscious experiences to the consciousness neural process. Now, according to the definition of consciousness (Section 6.1), consciousness is all conscious awareness and conscious experiences. Therefore, the above composite contains information that means consciousness to the consciousness neural process. Thus, when this composite is read by the consciousness neural process through reentrant signaling, it is interpreted as such, and consciousness naturally and inevitably appears in the consciousness neural process and the brain. Because this composite appears as consciousness in the brain, this theory asserts that it is consciousness, and vice versa. This is the assertion of Theorem V of this theory.
Theorem V: Consciousness is the composite of all special signaling states (SSSs).
The term special signaling state can be re-expanded to understand the true meaning of consciousness:
Consciousness is the composite of all reentrant signaling states with information that means conscious awareness and conscious experiences to the consciousness neural process.
The terms conscious awareness and conscious experiences can also be re-expanded to make it clearer:
Consciousness is the composite of all reentrant signaling states with information that means awareness and experiences of what qualia are like to the consciousness neural process.
However, the composite of all SSSs is just a special kind of signaling state, and a special kind of signaling state is just a signaling state. Therefore, Theorem V can be stated in a more basic form and in its most basic form, respectively, as:
Theorem V: Consciousness is a special kind of signaling state.
Theorem V: Consciousness is a signaling state.
In plain language, this theorem means that awareness and experiences of what qualia are like (what an image of a house, a sound of a song, an odor of a flower, a feeling of happiness, a thought of one’s self, and so on are like) in our minds are simply a special kind of signaling state or, most basically, a signaling state in our brains.
Because a neural process’s signaling state is the neural process’s information that exists in that neural process, consciousness is a special type of neural process information existing in the consciousness neural process. This special type of information means all awareness and experiences of what qualia are like to the consciousness neural process itself.
The fundamental question is: How is it possible that a special signaling state and its information can have such a meaning—the meaning of awareness and experiences of what qualia are like? The answer is that it is a brute fact. In this universe, it is basically possible for some signaling states and their information to have such a meaning. During evolution, countless types of signaling states have evolved in the nervous system, with uncountable kinds of information and meanings for neural processes to perform various functions. SSSs are just signaling states that have evolved to be a new kind of information that means awareness and experiences of what qualia are like for a new function—to be aware of and experience what qualia are like—to be possible in a new kind of neural process, the consciousness neural process. To an outside observer who sees an SSS from a third-person point of view, it does not and cannot appear as consciousness because its information is not read and interpreted—it is merely looked at. On the contrary, to the consciousness neural process, an SSS can and must appear as consciousness because its information is read and interpreted. This interesting and important matter will be revisited and discussed in detail in Chapter 8: The Explanatory Gap.
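The contrast drawn above between merely looking at a signal from a third-person point of view and reading it with an internal interpretation rule can be sketched as a toy analogy. The codebook, the signal pattern, and the meaning strings below are invented solely for illustration; the theory itself makes no claim about such a lookup mechanism.

```python
# Toy analogy (hypothetical throughout): the same signal pattern is just
# raw data to an outside observer, but carries a meaning to a process
# that possesses the matching internal interpretation rule.

SIGNAL = (1, 0, 1, 1)  # an arbitrary signaling pattern

def observe(pattern):
    # An "outside observer" can only look at the pattern's physical form.
    return f"a sequence of {len(pattern)} pulses"

class ReadingProcess:
    # A "reading process" carries an internal code that assigns meaning.
    def __init__(self, codebook):
        self.codebook = codebook  # pattern -> meaning, internal to the process

    def read(self, pattern):
        return self.codebook.get(pattern, "no meaning")

reader = ReadingProcess({(1, 0, 1, 1): "awareness of quale V"})

print(observe(SIGNAL))      # a sequence of 4 pulses
print(reader.read(SIGNAL))  # awareness of quale V
```

The point of the sketch is only this: the pattern's meaning is not visible in its physical form; it exists only relative to the process that reads and interprets it.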
Therefore, physically and ontologically, consciousness is neither a new physical entity nor a novel non-physical entity but just an underrecognized part of a well-recognized physical entity. It is the signaling state of a neural process—the inherent, informational part of a physical process. Because signaling states are inherent in neural processes, nothing in the brain emerges to be consciousness [10,30,31]. It seems as if something emerged to be consciousness only because SSSs have new information—the information that means awareness and experiences of what qualia are like—while other signaling states do not. Yet they also have their own information; it is only that their information means something else. To reiterate this crucial point, signaling states and information are inherent in neural processes; only the evolution of their meanings into “consciousness” makes it look as if something (consciousness) emerged. Since the nervous system appeared, new signaling states with new information, meanings, and functions have constantly been evolving from old signaling states with old information, meanings, and functions, with no novel entities emerging in the process. This fact has always been true and, according to all the evidence we have, will always be.
7.3 Physical Effects of Consciousness
Similar to qualia, which were shown in Section 5.3 to have physical effects, consciousness has physical effects. Because it is a signaling state, it can affect its own neural process, and it can convey its information to other neural processes, thereby affecting them. This matter can be investigated in more detail as follows:
Consider two integration-center neural processes as depicted in Figure 7.2. The integration-center neural process A (IC-A) receives information from various neural processes, analyzes and integrates that information, and distributes the processed information to appropriate neural processes. It may also have unconscious awareness of these functions, but it does not have conscious awareness and experiences; that is, awareness and experiences of what these functions, their products, or their other components are like do not occur in IC-A, just as they do not occur in the cerebellum or in present-day computers.
Figure 7.2 Effects of integration centers without and with consciousness
On the other hand, the integration-center neural process B (IC-B) performs the same functions as IC-A but, in addition, has conscious awareness and experiences. That is, awareness and experiences of what the products of the functions mentioned above are like occur in IC-B. Thus, IC-B has information about conscious awareness and experiences of these functions in its process, whereas IC-A does not. As IC-A and IC-B have different information, they have different signaling states. When they send their information to other neural processes, IC-B can send signaling patterns that contain information about conscious awareness and experiences, whereas IC-A cannot. Consequently, the physical effects on other neural processes must be different, at least when IC-B sends signaling patterns with such information, which IC-A cannot send.
Moreover, the physical impact of each signaling state on its own neural process is different because different signaling states entail different production, operation, and maintenance processes, which expend different physical resources. Thus, apart from the different effects of conscious awareness and experiences—i.e., consciousness—on other neural processes, having consciousness in a neural process also has specific physical effects on the neural process itself. Therefore, a system with consciousness is physically different from a system without it.
At present, we still cannot definitively conclude from existing research what the physical effects of consciousness are. However, evolving anything (such as the integration center) by adding a new function (such as consciousness) must cost resources (such as the material, energy, and time to build and maintain the new circuits, processes, and signals). If the new function does not yield enough beneficial effects to make its overall effect advantageous to the being, it will result in a disadvantage for that being and its species. This is especially true for a major function in a crucial organ, such as consciousness in the brain. The species will likely lose out from this disadvantage and, during evolution, become extinct through competition with similar species that lack the new function, such as the original species. However, this kind of extinction has not happened to humans, who possess consciousness, or to many high-level animal species that may have consciousness (because they have evolved brain structures similar to those supporting consciousness in humans [32–47]). Instead, these species appear to be thriving. Therefore, it can be deduced that consciousness must have overall effects that are beneficial to the beings that possess it. Similar to the effects of qualia on perception, it is probable that consciousness augments the functions of conscious mental processes so that they become more effective in certain ways.
At this point, it should be noted that the effects of consciousness have been widely studied and discussed in both cognitive neuroscience and philosophy. However, the definitions of consciousness in these studies are not always the same. Usually, in the literature on the effects of consciousness, the term consciousness has either the third or the fourth meaning discussed in Section 6.2. Consciousness in these two meanings has been found to have various beneficial effects, such as enabling us to solve problems that unconscious processes cannot, to behave non-reflexively and flexibly, and to gain insight into the minds of others by illuminating the workings of our own [1,6,10,21,31,41,44,48–60]. However, because of these differences in definitions, although beneficial, these reported effects cannot be claimed to be proven effects of consciousness as defined in this theory, that is, awareness and experiences of what qualia are like.
Lastly, it should also be remarked that consciousness certainly has not only beneficial but also deleterious effects. Some authors have pointed out that consciousness’s seemingly beneficial effects are probably not enduring—it may be beneficial in the present ecosystem but disadvantageous in another. Thus, it is possible that consciousness itself is not intrinsically valuable and that its overall effects depend on external factors. However, this matter is still debatable [see References 61–63].
7.4 Predictions
- Consciousness will be found to occur with a certain neural process—it will not be found to occur without this neural process. This means that, when and where there is consciousness, the neural process that the consciousness is completely associated with will be found then and there.
Note. A neural process can be identified as this particular neural process by investigations that observe and may also manipulate potential neural processes concurrently with observing the consciousness in question. In various investigations, the neural process that consistently changes concomitantly and correspondingly with the consciousness will be that particular neural process, named the consciousness neural process.
- This neural process will be found to have connections to all neural processes that have qualia, to have its signals fed back to itself, and to have, when it functions to be conscious of something, a signaling state that is categorically different from other signaling states of other neural processes. This signaling state is called a special signaling state (SSS).
- Consciousness can be identified, quantified, or monitored by identifying, quantifying, or monitoring, respectively, only the composite of all SSSs. These actions on the composite are both necessary and sufficient for the corresponding investigations of consciousness to succeed; the same actions on anything else, without involving the composite, will not yield the corresponding results.
- Consciousness can be created, modified, tested, or destroyed by creating, modifying, testing, or destroying, respectively, only the composite of all SSSs. These actions on the composite are both necessary and sufficient for the corresponding actions on the consciousness to occur, and these actions on anything else without involving the composite will not result in the corresponding actions on the consciousness.
- In an event or experiment, all predictions that are valid for the composite of all SSSs, such as whether the composite will occur, change, or disappear, will be identically valid for the consciousness. That is, the changes that occur in the composite and those in the consciousness will be identical in all aspects (quality, quantity, temporal pattern, etc.). For example, if it is predicted that the composite of all SSSs will change abruptly from the composite involving vision, hearing, and thinking to the composite involving only emotion, it will be found that the changes in the consciousness will be identical in all aspects, such as identical changes from consciousness of vision, hearing, and thinking to that of emotion (quality) and from consciousness of three things to that of only one thing (quantity), and identical abruptness in changes (temporal pattern).
All the above predictions can be verified by experiments in conscious, communicative human subjects. A typical experiment is to monitor consciousness by having the subject report what happens to his or her consciousness while concomitantly monitoring the composite of all SSSs of the subject’s consciousness neural process by methods such as MEG, ECoG, and intracortical recordings and while the composite is being manipulated such as by drug administration or transcranial magnetic or electrical stimulation.
7.5 Remarks
At present, there is a great deal of research and many theories about consciousness, but they use varied consciousness definitions, aim to explain different aspects (anatomical, functional, phenomenological, etc.), and employ different investigation methods (experimental, philosophical, psychological, etc.) [6,64–67]. One of the most active and interesting topics is the neural correlates of consciousness—the minimal neuronal mechanisms that are jointly sufficient for any one specific conscious percept [1,14,18–22,24–27,68–93]. This topic has attracted substantial research and discussion, and various theories have been generated. Many theories about consciousness and its neural correlates involve the concept that consciousness arises from the integrated activities of widespread cortical and subcortical areas with bottom-up and top-down signaling, reentrant information processing, synchronous electrical oscillatory activities, and resonant states in the gamma frequency range (30–70 Hz). Examples of these theories are as follows: A Neurobiological Theory of Consciousness [76,77], The Neuronal Basis for Consciousness [94], The Neural Darwinism hypothesis or Theory of Neuronal Group Selection (TNGS) [1,14,21,22], The Global Workspace Theory [95–100], The Global Neuronal Workspace Hypothesis [20,101–105], An Extended Theory of Global Workspace of Consciousness [106], The Adaptive Resonance Theory (ART) [107–110], and Recurrent Processing Theory [17,111,112]. Other theories encompass a range of concepts. 
For instance, The Integrated Information Theory (IIT) entails the integration of information [9,113–117], The Orchestrated Objective Reduction (Orch OR) Theory centers on the principle of quantum vibrations within microtubules in brain neurons [118–121], The Operational Architectonic embraces the notion of levels of brain operational organization [122–124], and The Conscious Electromagnetic Information (Cemi) Field Theory as well as the Electromagnetic Field Theory of Consciousness links consciousness to the brain’s electromagnetic information field [125–128]. Additionally, there are many theories and hypotheses that try to explain consciousness in various other aspects using various other concepts [such as References 129–144]. Some of the frequently cited theories have been reviewed by several authors [such as References 6,64,66,67,145–147]. As it is not an objective of this theorem to review and discuss existing theories of consciousness or its neural correlates in detail, the author would like to refer interested readers to the cited references.
Regarding the physical and ontological nature of consciousness, which this theorem asserts to be a special kind of signaling state or simply a signaling state, similar concepts existed before this theory was developed, namely, that consciousness is a kind of brain process or activity, like the signaling state in this theorem. For example, Place (1956) said that consciousness is a process in the brain and can be identified with a pattern of brain activity [148]; Feigl (1958) said that consciousness is identical with certain (presumably configurational) aspects of the neural processes [149]; Lamme (2006) said that we could even define consciousness as recurrent processing [24]; Thagard and Stewart (2014) said that consciousness results from three mechanisms, the first of which is representation by firing patterns in neural populations [141]; Dehaene (2014) said that consciousness lives in a loop and that reverberating neuronal activity, circulating in the web of our cortical connections, causes our conscious experiences [77]; lastly, Loorits (2014) said that consciousness as a whole can be seen as a complex neural pattern and that phenomenal consciousness simply is a certain complex pattern of neural activity, a pattern of patterns of some simple neural events, and he proposed that the entire structure of consciousness will be discovered in some patterns of neural activity [150]. For more specific discussion, let us consider reentrant signaling states. The present theorem asserts that a certain special kind of them, the SSS, is consciousness. Many theories (in the previous paragraph) have proposed very similar concepts. Notably, the concept in the present theorem is almost identical to that in The Adaptive Resonance Theory (ART) [107–110], which states that all conscious states are resonance states.
The ART provides comprehensive and strong evidence-based accounts and explanations regarding which resonant states can become conscious, why not all resonant states become conscious, and why all conscious states must be resonant. However, the ART and other theories that have the concept that reentrant signaling states or complex patterns of neural activities are consciousness or essential in consciousness occurrence do not provide a solid rationale for how and why phenomenality should occur in these resonant states or complex patterns of neural activities. In these theories, phenomenality and phenomenal consciousness “just occur” in these material neural networks, resonant states, or patterns of neural activities [151].
On the other hand, the present theorem presents the required rationale: Special signaling states (SSSs) are the information that means phenomenal consciousness to the consciousness neural process that is reading these signaling states, and, because of their meanings, they are interpreted as such—phenomenal consciousness thus naturally and inevitably occurs in the consciousness neural process and the brain. The answer to how it is possible that signaling states can be information having such a meaning is that it is fundamental in the nature of signaling states that some of them, the SSSs, can be such information. It is a brute fact. Some signaling states are information that means phenomenal consciousness to the consciousness neural process, and SSSs simply are such signaling states. Moreover, the reason why SSSs emerged or, to be precise, evolved in the nervous system is just because that was evolutionarily possible. When the nervous system had evolved to a sufficiently advanced stage, it was possible to evolve new neural circuits and processes that were able to produce SSSs. At present, the reason why SSSs persist is likely because they can yield advantages to the animals that have them.
As for the concept that consciousness is a kind of information, this is not a novel idea either; several researchers have proposed it before. For example, Earl (2014) said that “consciousness is solely information in various forms.” [51], and Dehaene et al. (2001) said that “this global availability of information through the workspace is what we subjectively experience as a conscious state.” [103]. Other theories, such as the Double-Aspect Theory of Information [152,153] or the Dual-Aspect Theory of Information [59,154], the Integrated Information Theory (IIT) [9,113–117], the Conscious Electromagnetic Information (Cemi) Field Theory [125–127], and a newer (2020) theory of consciousness, the Information Closure Theory of Consciousness (ICT) [155], have also proposed similar concepts.
In the Double-Aspect Theory of Information, Chalmers (1995) explained that “there is a direct isomorphism between certain physically embodied information spaces and certain phenomenal (or experiential) information spaces” and hypothesized that “Information (or at least some information) has two basic aspects, a physical aspect and a phenomenal aspect. … Experience arises by virtue of its status as one aspect of information, when the other aspect is found embodied in physical processing … We might say that phenomenal properties are the internal aspect of information.” [152]. Regarding the Integrated Information Theory (IIT), Tononi et al. (2004–2016) proposed that a conscious experience is a maximally irreducible conceptual structure (MICS), which corresponds to a local maximum of integrated conceptual information (a local maximum of Φ, or Φmax) [9,113,116,117]. The IIT uses these quantities to predict successfully which systems are or are not conscious in a variety of cases, such as the loss of consciousness during sleep and generalized seizures and the absence of consciousness in the cerebellum. Mathematical methods and formulas to calculate Φ have been proposed [113–116,156], and, although they are complex and difficult, actual calculations have been attempted in human subjects [157,158]. However, the IIT does not provide a rationale for why such a MICS, corresponding to a local maximum of Φ, should become a phenomenal conscious experience, or why phenomenality should occur in a MICS [159,160]; phenomenality and phenomenal consciousness simply occur in the MICS. Recently, Dubrovsky (2019) proposed that “… SR [subjective reality] phenomenon (for example, my sensory image in the form of visual perception of some object A, experienced at a given interval) can be considered as information (about this object) … it has its own definite carrier … Thus, the phenomenon of subjective reality is necessarily related to an appropriate brain process as information to its carrier.” [161].
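Because the quantities behind Φ can seem abstract, the following toy sketch may help. It computes not the IIT's actual Φmax but a much simpler precursor measure of integration from Tononi's earlier work, namely the sum of the parts' marginal entropies minus the whole system's joint entropy, for a hypothetical three-node network in which each node becomes the XOR of the other two (the network and all names here are illustrative assumptions, not taken from the IIT literature):

```python
from itertools import product
from collections import Counter
from math import log2

def entropy(counts: Counter) -> float:
    """Shannon entropy (in bits) of a distribution given as value counts."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def integration_bits() -> float:
    """Toy integration measure: sum of the three nodes' marginal entropies
    minus the joint entropy of the whole network's output states."""
    # Deterministic update rule: each node becomes the XOR of the other two.
    # Enumerate the next state for every one of the 8 equally likely inputs.
    nexts = [(b ^ c, a ^ c, a ^ b) for a, b, c in product((0, 1), repeat=3)]
    joint = Counter(nexts)                                    # whole-system output distribution
    marginals = [Counter(s[i] for s in nexts) for i in range(3)]  # per-node distributions
    return sum(entropy(m) for m in marginals) - entropy(joint)

print(integration_bits())  # 3 * 1 bit - 2 bits = 1.0 bit of integration
```

A positive value indicates that the parts' outputs are statistically bound together: the whole generates less entropy than its parts taken separately, which is the intuition that the IIT's far more elaborate Φ formalism develops.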
Similarly, Chang et al. (2020) theorized in the Information Closure Theory of Consciousness (ICT) that consciousness comprises processes that form non-trivial informational closure (NTIC), but they did not explain why such NTIC should have phenomenality [155].
On the other hand, this theorem asserts that information in specific forms, the SSSs (which are very likely to be highly complex, although the theorem per se does not require them to be), means phenomenal consciousness to the consciousness neural process. Thus, when the consciousness neural process reads this kind of information, it interprets it as phenomenal consciousness and nothing else, and phenomenal consciousness naturally and inevitably appears in the consciousness neural process and the brain. Therefore, the most crucial factor determining whether any information is or can be phenomenal consciousness is the meaning, not the complexity, of the information.
(Note. Information can be extremely complex yet unable to be consciousness, because it does not mean consciousness to the consciousness neural process and is thus not interpreted as such; consciousness hence does not and cannot appear in the brain. Remember, the word “pneumonoultramicroscopicsilicovolcanoconiosis” does not mean consciousness to English-speaking people, whereas, because of its meaning, the much shorter and much less complex word “consciousness” does. Also, from an evolutionary point of view, it seems more reasonable that the nervous system would evolve to respond to the meaning of information, which has definite survival value, rather than to its complexity, whose survival value is questionable.)
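The note's argument can be sketched as a toy lookup, in which a reader's interpretation of a pattern depends on a registered meaning rather than on the pattern's complexity (the dictionary and function names below are purely illustrative assumptions, not part of the theory's formalism):

```python
from typing import Optional

# Toy sketch: a "reader" interprets a pattern according to what the pattern
# MEANS to it, not according to how long or complex the pattern is.
MEANINGS = {
    "consciousness": "phenomenal consciousness",  # short, yet meaningful to this reader
}

def interpret(pattern: str) -> Optional[str]:
    """Return the reader's interpretation of a pattern, or None if the
    pattern means nothing to it, regardless of the pattern's complexity."""
    return MEANINGS.get(pattern)

# The 45-letter word is far more complex, but means nothing to this reader:
print(interpret("consciousness"))                                  # phenomenal consciousness
print(interpret("pneumonoultramicroscopicsilicovolcanoconiosis"))  # None
```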
Concerning predictions, the predictions of this theorem conflict with, or differ from, those of some important theories discussed above. For example, the Orchestrated Objective Reduction (Orch OR) Theory [119–122] proposes that consciousness arises at the quantum level from quantum vibrations of microtubules inside brain neurons; therefore, experiments that significantly affect the proposed quantum vibrations without significantly affecting anything else should significantly affect consciousness. In contrast, Theorem V predicts that a) experiments that significantly interfere with the signaling state of the consciousness neural process, without interfering (or without significantly interfering) with the quantum vibrations of the brain's microtubules, will significantly affect consciousness, and b) experiments that significantly interfere with the quantum vibrations of the microtubules, without interfering (or without significantly interfering) with the signaling state of the consciousness neural process, will not affect (or will not significantly affect) consciousness. Such experiments would therefore enable us to assess which theory is incorrect. At present, however, clinical and experimental evidence already makes it obvious that consciousness can be changed predictably by macroscopic interference with brain signaling through mechanical, electrical, magnetic, or pharmacological interventions, without microscopic interference with microtubules at the quantum level. Even evidence from daily life contradicts the Orch OR Theory.
For example, when one watches a movie and information in the form of images and sounds enters the nervous system, one's conscious visual and auditory content changes concurrently with, and correspondingly to, the movie's information; however, the quantum vibrations of the microtubules in the visual and auditory neurons cannot change concurrently with and correspondingly to the input information, because there is no mechanism to convey the outside information to the microtubules, which lie inside the neurons. Thus, this everyday evidence demands that the Orch OR Theory explain how the quantum vibrations of microtubules can correspond to consciousness in at least one of its most important aspects: its content.
Various differential experiments can be performed to test the predictions of this theory against those of other theories. For example, experiments that significantly affect the composite of SSSs in a brain area without significantly affecting Φmax there would distinguish this theorem from the IIT: according to this theorem, consciousness will be significantly affected by such experiments, whereas according to the IIT it will not. Differentiating this theorem from the ART [107–110] is the most difficult task because the two theories' concepts are so similar; the exact characteristics of the reentrant signaling states and the resonant states that are consciousness must be known to tell them apart. If these characteristics are as predicted by this theorem, namely that all SSSs share the same special form, which is categorically different from those of other signaling states, this theorem will be validated; if they are as predicted by the ART, the ART will be validated. Whatever the results of such differential experiments may be, they will undoubtedly help us understand consciousness more comprehensively and will assist us in refining existing consciousness theories to achieve the complete consciousness theory someday.
⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓
Like a spectrum of electromagnetic waves,
of which some frequencies do not elicit
color perception in the brain,
but some do,
In a spectrum of information,
some information does not elicit
qualia and consciousness in the brain,
but some does.
Qualia’s and consciousness’s information
are the parts of an information spectrum
that do.
⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓ ⁓
References
- Seth AK, Baars BJ. Neural Darwinism and consciousness. Conscious Cogn. 2005 Mar;14(1):140–168. http://ccrg.cs.memphis.edu/assets/papers/2004/Seth%20&%20Baars,%20Neural%20Darwinism-2004.pdf
- Weisberg J. The hard problem of consciousness. In: Internet Encyclopedia of Philosophy. https://www.iep.utm.edu/hard-con/
- Seth A. Explanatory correlates of consciousness: Theoretical and computational challenges. Cogn Comput. 2009;1:50–63. doi: 10.1007/s12559-009-9007-x. https://neuro.bstu.by/ai/To-dom/My_research/Papers-2.1-done/Cognitive-S/1/fulltext-3.pdf
- Seth AK, Izhikevich E, Reeke GN, Edelman GM. Theories and measures of consciousness: An extended framework. Proc Natl Acad Sci U S A. 2006 Jul 11;103(28):10799–10804. doi: 10.1073/pnas.0604347103. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1487169/
- Clowes RW, Seth AK. Axioms, properties and criteria: Roles for synthesis in the science of consciousness. Artif Intell Med. 2008;44:93–104. http://users.sussex.ac.uk/~anils/Papers/ClowesSethAIMedicine2008.pdf
- Van Gulick R. Consciousness. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2017 Edition). https://plato.stanford.edu/archives/sum2017/entries/consciousness
- Metzinger T. Précis: Being no one. Psyche 2005;11(5):1–35. https://www.researchgate.net/publication/228661607_Precis_Being_No_One
- Haun A, Tononi G. Why does space feel the way it does? Towards a principled account of spatial experience. Entropy. 2019 Nov 27;21(12):1160. doi: 10.3390/e21121160. https://www.mdpi.com/1099-4300/21/12/1160/htm
- Tononi G, Boly M, Massimini M, Koch C. Integrated Information Theory: From consciousness to its physical substrate. Nature Reviews Neuroscience. 2016;17:450–461. doi: 10.1038/nrn.2016.44. https://www.researchgate.net/publication/303551101_Integrated_information_theory_From_consciousness_to_its_physical_substrate
- Feinberg TE, Mallatt J. The nature of primary consciousness. A new synthesis. Conscious Cogn. 2016;43:113–127. doi: 10.1016/j.concog.2016.05.009. https://www.gwern.net/docs/psychology/2016-feinberg.pdf
- Tyler CW. Ten Testable Properties of Consciousness. Front Psychol. 2020;11:1144. doi: 10.3389/fpsyg.2020.01144. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7326790/?report=classic
- Zeman A. What do we mean by “conscious” and “aware”? Neuropsychol Rehabil. 2006 Aug;16(4):356–376. doi: 10.1080/09602010500484581. https://www.ncbi.nlm.nih.gov/pubmed/16864477
- Winters JJ. The Temporally-Integrated Causality Landscape: Reconciling neuroscientific theories with the phenomenology of consciousness. Front Hum Neurosci. 2021;15:768459. doi: 10.3389/fnhum.2021.768459. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8599361/
- Edelman GM. Neural Darwinism: Selection and reentrant signaling in higher brain function. Neuron. 1993 Feb;10:115–125. http://brainmaps.org/pdf/edelman1993.pdf http://www.acamedia.info/letters/an_Peter_von_Salis/references/neurosciences_institute/edelman1993.pdf
- Edelman GM, Gally JA. Reentry: A key mechanism for integration of brain function. Front Integr Neurosci. 2013;7:63. doi: 10.3389/fnint.2013.00063. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3753453/
- Keren H, Marom S. Long-range synchrony and emergence of neural reentry. Sci Rep. 2016 Nov 22;6:36837. doi: 10.1038/srep36837. https://www.nature.com/articles/srep36837
- Lamme VA, Roelfsema PR. The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci. 2000 Nov;23(11):571–579. doi: 10.1016/S0166-2236(00)01657-X. https://www.researchgate.net/publication/12253934_The_Distinct_Modes_of_Vision_Offered_by_Feedforward_and_Recurrent_Processing
- Chalmers DJ. What is a neural correlate of consciousness? In: Metzinger T, editor. Neural correlates of consciousness: Empirical and conceptual questions. Cambridge: The MIT Press; 2000. https://pdfs.semanticscholar.org/35c4/ecd86863e84d2b2b0a31294b7b0223d7204e.pdf
- De Sousa A. Towards an integrative theory of consciousness: Part 1 (neurobiological and cognitive models). Mens Sana Monogr. 2013 Jan–Dec;11(1):100–150. doi: 10.4103/0973-1229.109335. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3653219/
- Dehaene S, Sergent C, Changeux JP. A neuronal network model linking subjective reports and objective physiological data during conscious perception. Proc Natl Acad Sci U S A. 2003 Jul 8;100(14):8520–8525. doi: 10.1073/pnas.1332574100. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC166261/
- Edelman GM. Naturalizing consciousness: A theoretical framework. PNAS. 2003 Apr 29;100(9):5520–5524. https://doi.org/10.1073/pnas.0931349100 https://www.pnas.org/content/100/9/5520
- Edelman GM, Gally JA, Baars BJ. Biology of Consciousness. Front Psychol. 2011;2:4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3111444/
- Fisch L, Privman E, Ramot M, et al. Neural “Ignition”: Enhanced activation linked to perceptual awareness in human ventral stream visual cortex. Neuron. 2009 Nov 25;64(4):562–574. doi: 10.1016/j.neuron.2009.11.001. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2854160/
- Lamme VAF. Can neuroscience reveal the true nature of consciousness? citeseerx.ist.psu.edu. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.217.2789 https://www.nyu.edu/gsas/dept/philo/courses/consciousness05/LammeNeuroscience.pdf
- Morales J, Lau H. The neural correlates of consciousness. In: Kriegel U, editor. Penultimate draft: Forthcoming in The Oxford Handbook of the Philosophy of Consciousness. Oxford University Press; 2018 Jan. https://philpapers.org/archive/MORTNC-7.pdf
- Nani A, Manuello J, Mancuso L, Liloia D, Costa T, Cauda F. The neural correlates of consciousness and attention: Two sister processes of the brain. Front Neurosci. 2019;13:1169. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6842945/
- Pollen DA. On the neural correlates of visual perception. Cereb Cortex. 1999;9(1):4–19. https://doi.org/10.1093/cercor/9.1.4. https://academic.oup.com/cercor/article/9/1/4/314915/On-the-Neural-Correlates-of-Visual-Perception
- Sikkens T, Bosman CA, Olcese U. The role of top-down modulation in shaping sensory processing across brain states: Implications for consciousness. Front Syst Neurosci. 2019 Jul 24;13:31. doi: 10.3389/fnsys.2019.00031. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6802962/pdf/fnsys-13-00031.pdf
- van Gaal S, Lamme VA. Unconscious high-level information processing: Implication for neurobiological theories of consciousness. Neuroscientist. 2012 Jun;18(3):287–301. doi: 10.1177/1073858411404079. https://pdfs.semanticscholar.org/b9af/d0e8d460ba73cf197a96e2dd8b524e05390c.pdf
- Clark TW. Function and phenomenology: Closing the explanatory gap. J Conscious Stud. 1995;2(3):241–254. https://www.naturalism.org/philosophy/consciousness/the-explanatory-gap
- Feinberg TE, Mallatt J. Phenomenal consciousness and emergence: Eliminating the explanatory gap. Front Psychol. 2020;11:1041. doi: 10.3389/fpsyg.2020.01041. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7304239/
- Baars BJ. Subjective experience is probably not limited to humans: The evidence from neurobiology and behavior. Conscious Cogn. 2005 Mar;14(1):7–21. https://ccrg.cs.memphis.edu/assets/papers/2005/Baars-Subjective%20animals-2005.pdf
- Birch J, Schnell AK, Clayton NS. Dimensions of animal consciousness. Trends Cogn Sci. 2020;24(10):789–801. doi: 10.1016/j.tics.2020.07.007. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7116194/?report=classic
- Boly M, Seth AK, Wilke M, et al. Consciousness in humans and non-human animals: Recent advances and future directions. Front Psychol. 2013;4:625. doi: 10.3389/fpsyg.2013.00625. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3814086/
- Butler AB. Evolution of brains, cognition, and consciousness. Brain Res Bull. 2008 Mar 18;75(2–4):442–449.
- Butler A, Manger P, Lindahl BIB, Århem P. Evolution of the neural basis of consciousness: A bird-mammal comparison. BioEssays: News and reviews in molecular, cellular and developmental biology. 2005 Sep;27(9):923–936. doi: 10.1002/bies.20280. https://www.researchgate.net/publication/7652731_Evolution_of_the_neural_basis_of_consciousness_A_bird-mammal_comparison
- Cabanac M, Cabanac AJ, Parent A. The emergence of consciousness in phylogeny. Behav Brain Res. 2009 Mar 17;198(2):267–272. doi: 10.1016/j.bbr.2008.11.028. https://cogs.sitehost.iu.edu/spackled/2009readings/The%20emergence%20of%20consciousness%20in%20phylogeny.pdf
- Edelman DB, Baars BJ, Seth AK. Identifying hallmarks of consciousness in non-mammalian species. Conscious Cogn. 2005 Mar;14(1):169–187. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.83.4676&rep=rep1&type=pdf
- Edelman DB, Seth AK. Animal consciousness: A synthetic approach. Trends Neurosci. 2009 Sep;32(9):476–484. http://users.sussex.ac.uk/~anils/Papers/EdelmanSethTiNS2009Preprint.pdf
- Griffin DR, Speck GB. New evidence of animal consciousness. Anim Cogn. 2004 Jan;7(1):5–18. doi: 10.1007/s10071-003-0203-x. http://eebweb.arizona.edu/faculty/dornhaus/courses/materials/papers/Griffin%20Speck%20consciousness%20cognition.pdf
- Irwin LN. Renewed perspectives on the deep roots and broad distribution of animal consciousness. Front Syst Neurosci. 2020;14:57. doi: 10.3389/fnsys.2020.00057. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7438986/?report=classic
- Klein C, Barron AB. Insects have the capacity for subjective experience. Animal Sentience. 2016;100:1–52. https://animalstudiesrepository.org/cgi/viewcontent.cgi?article=1113&context=animsent
- Low P. The Cambridge declaration on consciousness. Panksepp J, Reiss D, Edelman D, Van Swinderen B, Low P, Koch C, editors. Publicly proclaimed in Cambridge, UK, on July 7, 2012, at the Francis Crick Memorial Conference on Consciousness in Human and non-Human Animals. http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf
- Mallatt J, Feinberg TE. Multiple routes to animal consciousness: Constrained multiple realizability rather than Modest Identity Theory. Front Psychol. 2021 Sep;12:732336. doi: 10.3389/fpsyg.2021.732336. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8497802/?report=classic
- Mashour GA, Alkire MT. Evolution of consciousness: Phylogeny, ontogeny, and emergence from general anesthesia. Proc Natl Acad Sci U S A. 2013 Jun 18;110(Suppl 2):10357–10364. doi: 10.1073/pnas.1301188110. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3690605/
- Mather JA. Cephalopod consciousness: Behavioural evidence. Conscious Cogn. 2008 Mar;17(1):37–48.
- Seth AK, Baars BJ, Edelman DB. Criteria for consciousness in humans and other mammals. Conscious Cogn. 2005 Mar;14(1):119–139. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.317.8149&rep=rep1&type=pdf
- Baars BJ. Chapter Ten. The functions of consciousness. In: A cognitive theory of consciousness. New York: Cambridge University Press; 1988. http://bernardbaars.pbworks.com/f/++++Functions+of+Consciousness.pdf
- Block N. On a confusion about a function of consciousness. Behav Brain Sci. 1995;18(2):227–287. https://www.academia.edu/download/49640917/On_A_Confusion_About_a_Function_of_Consc20161016-5993-740vtb.pdf
- Dehaene S. Chapter 3. What is consciousness good for? In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:89–114.
- Earl B. The biological function of consciousness. Front Psychol. 2014;5:697. doi: 10.3389/fpsyg.2014.00697. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4122207/
- Frigato G. The neural correlates of access consciousness and phenomenal consciousness seem to coincide and would correspond to a memory center, an activation center and eight parallel convergence centers. Front Psychol. 2021;12:749610. doi: 10.3389/fpsyg.2021.749610. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8511498/?report=classic
- Kanai R, Chang A, Yu Y, de Abril IM, Biehl M, Guttenberg N. Information generation as a functional basis of consciousness. Neurosci Conscious. 2019;2019(1):niz016. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6884095/
- Koch C. The quest for consciousness: A neurobiological approach. Roberts & Company Publishers; 2004. https://www.researchgate.net/publication/232296537_The_Quest_for_Consciousness_A_Neurobiological_Approach
- Lamme V. The crack of dawn: Perceptual functions and neural mechanisms that mark the transition from unconscious processing to conscious vision. In: Metzinger T, Windt JM, editors. Open mind: 22(T). Frankfurt am Main: MIND Group. 2015. doi: 10.15502/9783958570092. https://open-mind.net/DOI?isbn=9783958570092
- Mudrik L, Faivre N, Koch C. Information integration without awareness. Trends Cogn Sci. 2014 Sep;18(9):488–496. doi: 10.1016/j.tics.2014.04.009. https://www.ncbi.nlm.nih.gov/pubmed/24933626
- Pennartz CMA, Farisco M, Evers K. Indicators and criteria of consciousness in animals and intelligent machines: An inside-out approach. Front Syst Neurosci. 2019;13:25. doi: 10.3389/fnsys.2019.00025. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6660257/
- Pierson LM, Trout M. What is consciousness for? New Ideas Psychol. 2017;47:62–71. https://www.sciencedirect.com/science/article/pii/S0732118X15300039
- Velmans M. Understanding consciousness. 2nd ed. Hove, East Sussex: Routledge; 2009. https://dl.uswr.ac.ir/bitstream/Hannan/130278/1/0415425158.Routledge.Understanding.Consciousness.Second.Edition.Apr.2009.pdf
- Zeman A. Consciousness. Brain. 2001 Jul;124(Pt 7):1263–1289. https://academic.oup.com/brain/article-pdf/124/7/1263/802709/1241263.pdf
- Cleeremans A, Tallon-Baudry C. Consciousness matters: Phenomenal experience has functional value. Neurosci Conscious. 2022;2022(1):niac007. doi: 10.1093/nc/niac007. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9036654/?report=classic
- Lee AY. Is consciousness intrinsically valuable? Philos Stud. 2019;175:1–17. https://philarchive.org/archive/LEEICI
- Kriegel U. The value of consciousness. Analysis. 2019;79:503–520. https://philpapers.org/archive/KRITVO-6.pdf
- Doerig A, Schurger A, Herzog MH. Hard criteria for empirical theories of consciousness. Cogn Neurosci. 2021 Jan–Jan;12(2):41–62. doi: 10.1080/17588928.2020.1772214. https://www.tandfonline.com/doi/full/10.1080/17588928.2020.1772214
- Niikawa T. A map of consciousness studies: Questions and approaches. Front Psychol. 2020;11:530152. doi: 10.3389/fpsyg.2020.530152. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7578362/
- Sattin D, Magnani FG, Bartesaghi L, et al. Theoretical models of consciousness: A scoping review. Brain Sci. 2021;11(5):535. doi: 10.3390/brainsci11050535. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8146510/
- Signorelli CM, Szczotka J, Prentner R. Explanatory profiles of models of consciousness—towards a systematic classification. Neurosci Conscious. 2021;2021(2):niab021. doi: 10.1093/nc/niab021. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8396118/
- Aru J, Bachmann T, Singer W, Melloni L. Distilling the neural correlates of consciousness. Neurosci Biobehav Rev. 2012 Feb;36(2):737–746. https://doi.org/10.1016/j.neubiorev.2011.12.003. https://www.sciencedirect.com/science/article/pii/S0149763411002107
- Baars BJ, Laureys S. One, not two, neural correlates of consciousness. Trends Cogn Sci. 2005 Jun;9(6). http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.367.9177&rep=rep1&type=pdf
- Babiloni C, Marzano N, Soricelli A, et al. Cortical neural synchronization underlies primary visual consciousness of qualia: Evidence from event-related potentials. Front Hum Neurosci. 2016;10:310. doi: 10.3389/fnhum.2016.00310. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4927634/
- Block N. Two neural correlates of consciousness. Trends Cogn Sci. 2005;9:46–52. http://www.nyu.edu/gsas/dept/philo/faculty/block/papers/final_revised_proof.pdf
- Boly M, Massimini M, Tsuchiya N, Postle BR, Koch C, Tononi G. Are the neural correlates of consciousness in the front or in the back of the cerebral cortex? Clinical and neuroimaging evidence. J Neurosci. 2017 Oct;37(40):9603–9613. doi: 10.1523/JNEUROSCI.3218-16.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5628406/
- Cosmelli D, Lachaux JP, Thompson E. Chapter 26. Neurodynamical approaches to consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:731–772. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
- Crick F, Koch C. Consciousness and neuroscience. Cereb Cortex. 1998 Mar;8(2):97–107. https://authors.library.caltech.edu/40355/1/feature_article.pdf
- Crick F, Koch C. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences. 1990;2:263–275. https://authors.library.caltech.edu/40352/1/148.pdf
- Crick F, Koch C. Some reflections on visual awareness. Cold Spring Harb Symp Quant Biol. 1990;55:953–962. https://authors.library.caltech.edu/40351/1/61.pdf
- Dehaene S. Chapter 4. The signature of a conscious thought. In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:115–160.
- Dehaene S, Changeux JP. Experimental and theoretical approaches to conscious processing. Neuron. 2011 Apr 28;70(2):200–227. doi: 10.1016/j.neuron.2011.03.018. https://www.researchgate.net/publication/281109453_Experimental_and_Theoretical_Approaches_to_Conscious_Processing
- Dehaene S, Changeux JP, Naccache L, Sackur J, Sergent C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn Sci. 2006 May;10(5):204–211. doi: 10.1016/j.tics.2006.03.007. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.65.3821&rep=rep1&type=pdf
- Del Cul A, Baillet S, Dehaene S. Brain dynamics underlying the nonlinear threshold for access to consciousness. PLoS Biol. 2007 Oct;5(10):e260. https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0050260
- Engel AK, Singer W. Temporal binding and the neural correlates of sensory awareness. Trends Cogn Sci. 2001 Jan 1;5(1):16–25. doi: 10.1016/s1364-6613(00)01568-0. http://andreas-engel.com/engel_2001_tics.pdf
- Fink SB. A deeper look at the “Neural correlate of consciousness.” Front Psychol. 2016;7:1044. doi: 10.3389/fpsyg.2016.01044. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4960249/
- Frith C, Perry R, Lumer E. The neural correlates of conscious experience: An experimental framework. Trends Cogn Sci. 1999 Mar;3(3):105–114. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.575.6134&rep=rep1&type=pdf
- Koch C, Massimini M, Boly M, Tononi G. Neural correlates of consciousness: Progress and problems. Nat Rev Neurosci. 2016;17:307–321. https://puredhamma.net/wp-content/uploads/Neural-correlates-of-consciousness-Koch-et-al-2016.pdf
- Lamme VAF. Visual functions generating conscious seeing. Front Psychol. 2020 Feb 14;11:83. doi: 10.3389/fpsyg.2020.00083. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7034432/
- Lamy D, Salti M, Bar-Haim Y. Neural correlates of subjective awareness and unconscious processing: An ERP study. J Cogn Neurosci. 2009 Jul;21(7):1435–1446. doi: 10.1162/jocn.2009.21064. https://www.researchgate.net/publication/23170315_Neural_Correlates_of_Subjective_Awareness_and_Unconscious_Processing_An_ERP_Study
- Melloni L, Molina C, Pena M, Torres D, Singer W, Rodriguez E. Synchronization of neural activity across cortical areas correlates with conscious perception. J Neurosci. 2007 Mar 14;27(11):2858–2865. https://doi.org/10.1523/JNEUROSCI.4623-06.2007 http://www.jneurosci.org/content/27/11/2858.long
- Modolo J, Hassan M, Wendling F, Benquet P. Decoding the circuitry of consciousness: From local microcircuits to brain-scale networks. Netw Neurosci. 2020;4(2):315–337. doi: 10.1162/netn_a_00119. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7286300/?report=classic
- Owen M. Neural correlates of consciousness and the nature of the mind. In: Guta MP, editor. Consciousness and the Ontology of Properties. 1st ed. New York: Routledge; 2018. https://www.newdualism.org/papers-Jul2020/Owen-OWENCOv3.pdf
- Owen M, Guta MP. Physically sufficient neural mechanisms of consciousness. Front Syst Neurosci. 2019 Jul 4;13:24. doi: 10.3389/fnsys.2019.00024. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6622321/
- Polák M, Marvan T. Neural correlates of consciousness meet the theory of identity. Front Psychol. 2018;9:1269. doi: 10.3389/fpsyg.2018.01269. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6066586/
- Rees G, Kreiman G, Koch C. Neural correlates of consciousness in humans. Nat Rev Neurosci. 2002;3(4):261–270. https://www.researchgate.net/publication/11399830_Neural_correlates_of_consciousness_in_humans
- Tononi G, Koch C. The neural correlates of consciousness: An update. Ann N Y Acad Sci. 2008;1124:239–261. doi: 10.1196/annals.1440.004. https://authors.library.caltech.edu/40650/1/Tononi-Koch-08.pdf
- Llinás R, Ribary U, Contreras D, Pedroarena C. The neuronal basis for consciousness. Philos Trans R Soc Lond B Biol Sci. 1998 Nov 29;353(1377):1841–1849. doi: 10.1098/rstb.1998.0336. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1692417/pdf/9854256.pdf
- Baars BJ. Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Prog Brain Res. 2005;150:45–53. doi: 10.1016/S0079-6123(05)50004-9. https://www.cs.helsinki.fi/u/ahyvarin/teaching/niseminar4/Baars2004.pdf
- Baars BJ. How does a serial, integrated and very limited stream of consciousness emerge from a nervous system that is mostly unconscious, distributed, parallel and of enormous capacity? Ciba Found Symp. 1993;174:282–290; discussion 291–303.
- Baars BJ, Franklin S, Ramsoy TZ. Global workspace dynamics: Cortical “binding and propagation” enables conscious contents. Front Psychol. 2013;4:200. doi: 10.3389/fpsyg.2013.00200. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3664777/
- Baars BJ, Geld N, Kozma R. Global Workspace Theory (GWT) and prefrontal cortex: Recent developments. Front Psychol. 2021;12:749868. doi: 10.3389/fpsyg.2021.749868. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8660103/
- Dehaene S, Kerszberg M, Changeux JP. A neuronal model of a global workspace in effortful cognitive tasks. Proc Natl Acad Sci U S A. 1998;95(24):14529–14534. doi: 10.1073/pnas.95.24.14529. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC24407/
- Newman J, Baars BJ, Cho SB. A Neural Global Workspace Model for Conscious Attention. Neural Netw. 1997 Oct 1;10(7):1195–1206. doi: 10.1016/s0893-6080(97)00060-9. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.453.6016&rep=rep1&type=pdf
- Dehaene S, Changeux JP, Naccache L. The Global Neuronal Workspace Model of conscious access: From neuronal architectures to clinical applications. 2011. In: Dehaene S, Christen Y, editors. Characterizing consciousness: From cognition to the clinic? Research and Perspectives in Neurosciences. Berlin, Heidelberg: Springer-Verlag; 2011. https://doi.org/10.1007/978-3-642-18015-6_4 http://www.antoniocasella.eu/dnlaw/Dehaene_Changeaux_Naccache_2011.pdf
- Dehaene S, Charles L, King JR, Marti S. Toward a computational theory of conscious processing. Curr Opin Neurobiol. 2014 Apr;25:76–84. doi: 10.1016/j.conb.2013.12.005. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5635963/
- Dehaene S, Naccache L. Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition. 2001 Apr;79(1–2):1–37. https://www.jsmf.org/meetings/2003/nov/Dehaene_Cognition_2001.pdf
- Mashour GA, Roelfsema P, Changeux JP, Dehaene S. Conscious processing and the global neuronal workspace hypothesis. Neuron. 2020;105(5):776–798. doi: 10.1016/j.neuron.2020.01.026. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8770991/
- Sergent C, Dehaene S. Neural processes underlying conscious perception: Experimental findings and a global neuronal workspace framework. J Physiol Paris. 2004 Jul–Nov;98(4–6):374–384. doi: 10.1016/j.jphysparis.2005.09.006. https://pdfs.semanticscholar.org/ae61/178a998b4e08851af8ba80e7815fd2c9e6d9.pdf
- Song X, Tang X. An extended theory of global workspace of consciousness. Prog Nat Sci. 2008 Jul 10;18(7):789–793. doi: 10.1016/j.pnsc.2008.02.003. https://www.sciencedirect.com/science/article/pii/S100200710800138X
- Grossberg S. Adaptive Resonance Theory: How a brain learns to consciously attend, learn, and recognize a changing world. Neural Netw. 2013 Jan;37:1–47. doi: 10.1016/j.neunet.2012.09.017. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.294.4425&rep=rep1&type=pdf
- Grossberg S. Adaptive Resonance Theory. Technical Report CAS/CNS-2000-024. Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems. Sep 2009. https://open.bu.edu/bitstream/handle/2144/2272/00.024.pdf?sequence=1
- Grossberg S. Towards solving the hard problem of consciousness: The varieties of brain resonances and the conscious experiences that they support. Neural Netw. 2017 Mar;87:38–95. doi: 10.1016/j.neunet.2016.11.003. https://linkinghub.elsevier.com/retrieve/pii/S0893-6080(16)30180-0
- Carpenter GA, Grossberg S. Adaptive Resonance Theory. CAS/CNS Technical Report 2009-008. Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems. May 2009. https://open.bu.edu/bitstream/handle/2144/1972/TR-09-008.pdf?sequence=1
- Lamme VA. How neuroscience will change our view on consciousness. Cogn Neurosci. 2010 Sep;1(3):204–220. doi: 10.1080/17588921003731586. https://www.ncbi.nlm.nih.gov/pubmed/24168336
- Lamme VAF. Challenges for theories of consciousness: Seeing or knowing, the missing ingredient and how to deal with panpsychism. Philos Trans R Soc Lond B Biol Sci. 2018 Sep;373(1755):20170344. doi: 10.1098/rstb.2017.0344. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074090/
- Oizumi M, Albantakis L, Tononi G. From the phenomenology to the mechanisms of consciousness: Integrated Information Theory 3.0. PLoS Comput Biol. 2014 May;10(5):e1003588. doi: 10.1371/journal.pcbi.1003588. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4014402/pdf/pcbi.1003588.pdf
- Tononi G. Consciousness, information integration, and the brain. Prog Brain Res. 2005;150:109–126. http://www.brainnrg.org/files/publicationmodule/@random4824abb32cfea/Tononi_2005_Progress_in_Brain_Research.pdf
- Tononi G. An information integration theory of consciousness. BMC Neurosci. 2004;5:42. doi: 10.1186/1471-2202-5-42. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC543470/pdf/1471-2202-5-42.pdf
- Tononi G. Integrated information theory of consciousness: An updated account. Arch Ital Biol. 2012 Jun–Sep;150(2–3):56–90. doi: 10.4449/aib.v149i5.1388. http://www.architalbiol.org/aib/article/view/15056/23165867
- Tononi G, Koch C. Consciousness: Here, there and everywhere? Philos Trans R Soc Lond B Biol Sci. 2015 May 19;370(1668):20140167. doi: 10.1098/rstb.2014.0167. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4387509/
- Hameroff S. ‘Orch OR’ is the most complete, and most easily falsifiable theory of consciousness. Cogn Neurosci. 2021;12(2):74–76. doi: 10.1080/17588928.2020.1839037. https://www.tandfonline.com/doi/full/10.1080/17588928.2020.1839037
- Hameroff S, Marcer P. Quantum computation in brain microtubules? The Penrose–Hameroff “Orch OR” model of consciousness. Philos Trans R Soc Lond A. 1998 Aug 15;356(1743):1869–1896. https://www.quantumconsciousness.org/sites/default/files/hameroff-1998.pdf
- Hameroff S, Penrose R. Consciousness in the universe: A review of the ‘Orch OR’ theory. Phys Life Rev. 2014 Mar;11(1):39–78. doi: 10.1016/j.plrev.2013.08.002. https://www.galileocommission.org/wp-content/uploads/2019/02/Hameroff-Penrose-2016-Consciousness-In-The-Universe-An-Updated-Review-Of-The-Orch-Or-Theory.pdf
- Hameroff S, Penrose R. Orchestrated reduction of quantum coherence in brain microtubules: A model for consciousness. Math Comput Simul. 1996;40:453–480. http://www.alice.id.tue.nl/references/hameroff-penrose-1996.pdf
- Fingelkurts AA, Fingelkurts AA, Neves CFH. Consciousness as a phenomenon in the operational architectonics of brain organization: Criticality and self-organization considerations. Chaos Solitons Fractals. 2013;55:13–31. doi: 10.1016/j.chaos.2013.02.007. https://www.researchgate.net/publication/276169154_Consciousness_as_a_phenomenon_in_the_operational_architectonics_of_brain_organization_Criticality_and_self-organization_considerations
- Fingelkurts AA, Fingelkurts AA, Neves CFH. Phenomenological architecture of a mind and operational architectonics of the brain: The unified metastable continuum. New Mathematics and Natural Computation (NMNC). 2009;05:221–244. doi: 10.1142/S1793005709001258. https://www.researchgate.net/publication/24108915_Phenomenological_architecture_of_a_mind_and_operational_architectonics_of_the_brain_The_unified_metastable_continuum
- Fingelkurts AA, Fingelkurts AA. Operational architectonics of the human brain biopotential field: Towards solving the mind-brain problem. Brain and Mind. 2001;2:261–296. https://www.researchgate.net/publication/227287480_Operational_Architectonics_of_the_Human_Brain_Biopotential_Field_Towards_Solving_the_Mind-Brain_Problem
- McFadden J. Integrating information in the brain’s EM field: The cemi field theory of consciousness. Neurosci Conscious. 2020. doi: 10.1093/nc/niaa016. https://www.researchgate.net/publication/345370853_Integrating_information_in_the_brain’s_EM_field_the_cemi_field_theory_of_consciousness
- McFadden J. The CEMI Field Theory gestalt information and the meaning of meaning. J Conscious Stud. 2013;20(3–4):3–4. https://philpapers.org/archive/MCFTCF-2.pdf
- McFadden J. The Conscious Electromagnetic Information (Cemi) Field Theory. The hard problem made easy? J Conscious Stud. 2002;9(8):45–60. https://philpapers.org/archive/MCFTCE.pdf
- Pockett S. Difficulties with the electromagnetic field theory of consciousness. J Conscious Stud. 2002;9:51–56. https://newdualism.org/papers/S.Pockett/Pockett-JCS2002.pdf
- Brown R, Lau H, LeDoux JE. Understanding the higher-order approach to consciousness. Trends Cogn Sci. 2019;23:754–768. https://www.sciencedirect.com/science/article/pii/S1364661319301615
- Cleeremans A, Achoui D, Beauny A, et al. Learning to be conscious. Trends Cogn Sci. 2020;24:112–123. https://axc.ulb.be/uploads/2019/12/2020ticsaxcetal-preprint-1577628822.pdf
- Dennett DC. Illusionism as the obvious default theory of consciousness. J Conscious Stud. 2016;23:65–72. https://www.researchgate.net/profile/Daniel-Dennett/publication/316513753_Illusionism_as_the_obvious_default_theory_of_consciousness/links/6087858c907dcf667bc70df1/Illusionism-as-the-obvious-default-theory-of-consciousness.pdf
- Graziano MSA, Webb TW. The attention schema theory: A mechanistic account of subjective awareness. Front Psychol. 2015 Apr 23;6:500. doi: 10.3389/fpsyg.2015.00500. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4407481/
- Guevara R, Mateos DM, Pérez Velázquez JL. Consciousness as an emergent phenomenon: A tale of different levels of description. Entropy (Basel). 2020;22(9):921. doi: 10.3390/e22090921. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7597170/
- Hunt T, Schooler JW. The easy part of the hard problem: A resonance theory of consciousness. Front Hum Neurosci. 2019;13:378. doi: 10.3389/fnhum.2019.00378. Erratum in: Front Hum Neurosci. 2020 Sep 04;14:596409. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6834646/?report=classic
- Min BK. A thalamic reticular networking model of consciousness. Theor Biol Med Model. 2010;7:10. doi: 10.1186/1742-4682-7-10. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2857829/
- Northoff G, Huang Z. How do the brain’s time and space mediate consciousness and its different dimensions? Temporo-spatial Theory of Consciousness (TTC). Neurosci Biobehav Rev. 2017 Sep;80:630–645. doi: 10.1016/j.neubiorev.2017.07.013. https://static1.squarespace.com/static/528facb6e4b0a18b7e9cde91/t/59ba9b73d2b85759243b987b/1505401733286/2017+-+How+do+the+brain%E2%80%99s+time+and+space+mediate+consciousness+and+its+different+dimensions+Temporo-spatial+theory+of+consciousness+%28T.pdf
- Northoff G, Zilio F. Temporo-spatial Theory of Consciousness (TTC)—Bridging the gap of neuronal activity and phenomenal states. Behav Brain Res. 2022 Apr;424:113788. doi: 10.1016/j.bbr.2022.113788. https://static1.squarespace.com/static/528facb6e4b0a18b7e9cde91/t/6209095ede99f90f5935f940/1644759400210/northoff+zilio.pdf
- Rolls ET. Neural computations underlying phenomenal consciousness: A Higher Order Syntactic Thought Theory. Front Psychol. 2020 Apr 7;11:655. doi: 10.3389/fpsyg.2020.00655. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7154119/
- Ruffini G. An algorithmic information theory of consciousness. Neurosci Conscious. 2017;2017(1):nix019. doi: 10.1093/nc/nix019. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6007168/?report=classic
- Sevush S. Single-neuron Theory of Consciousness. J Theor Biol. 2006;238(3):704–725. doi: 10.1016/j.jtbi.2005.06.018. http://cogprints.org/4432/1/single_neuron_theory.htm
- Thagard P, Stewart TC. Two theories of consciousness: Semantic pointer competition vs. information integration. Conscious Cogn. 2014;30:73–90. doi: 10.1016/j.concog.2014.07.001. http://cogsci.uwaterloo.ca/Articles/thagard.two-theories.consc&cog.2014.pdf
- Ward LM. The thalamic dynamic core theory of conscious experience. Conscious Cogn. 2011 Jun;20(2):464–486. doi: 10.1016/j.concog.2011.01.007. http://ahuman.org/svn/ahengine/research/articles/Biological/2011-Consciousness-and-Cognition.pdf
- Jerath R, Beveridge C. Top mysteries of the mind: Insights from the default space model of consciousness. Front Hum Neurosci. 2018 Apr 24;12:162. doi: 10.3389/fnhum.2018.00162. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5932384/
- Kesserwani H. The clinical, philosophical, evolutionary and mathematical machinery of consciousness: An analytic dissection of the field theories and a consilience of ideas. Cureus. 2020 Dec 18;12(12):e12139. doi: 10.7759/cureus.12139. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7813534/
- Mallatt J. A traditional scientific perspective on the integrated information theory of consciousness. Entropy (Basel). 2021;23(6):650. doi: 10.3390/e23060650. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8224652/?report=classic
- McGovern K, Baars BJ. Chapter 8. Cognitive theories of consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:177–205. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
- Rorot W. Bayesian theories of consciousness: A review in search for a minimal unifying model. Neurosci Conscious. 2021;2021(2):niab038. doi: 10.1093/nc/niab038. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8512254/?report=classic
- Place UT. Is consciousness a brain process? Br J Psychol. 1956;47(1):44–50. https://people.ucsc.edu/~jbowin/Ancient/place1956.pdf
- Feigl H. The “mental” and the “physical.” In: Concepts, theories, and the mind-body problem. Minneapolis: University of Minnesota Press; 1958. https://conservancy.umn.edu/handle/11299/184614
- Loorits K. Structural qualia: A solution to the hard problem of consciousness. Front Psychol. 2014;5:237. doi: 10.3389/fpsyg.2014.00237. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3957492/
- Leisman G, Koch P. Networks of conscious experience: Computational neuroscience in understanding life, death, and consciousness. Rev Neurosci. 2009;20(3–4):151–176. https://www.researchgate.net/profile/Gerry_Leisman/publication/41449921_Networks_of_Conscious_Experience_Computational_Neuroscience_in_Understanding_Life_Death_and_Consciousness/links/0fcfd5023c1d28270d000000.pdf
- Chalmers DJ. Facing up to the problem of consciousness. J Conscious Stud. 1995;2(3):200–219. http://consc.net/papers/facing.html
- Chalmers DJ. Moving forward on the problem of consciousness. J Conscious Stud. 1997;4(1):3–46. http://consc.net/papers/moving.html
- Velmans M. How could conscious experiences affect brains? J Conscious Stud. 2002;9(11):3–29. http://cogprints.org/2750/1/JCSVelmans2001.final.htm
- Chang AYC, Biehl M, Yu Y, Kanai R. Information Closure Theory of Consciousness. Front Psychol. 2020;11:1504. doi: 10.3389/fpsyg.2020.01504. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7374725/?report=classic
- Oizumi M, Amari S, Yanagawa T, Fujii N, Tsuchiya N. Measuring integrated information from the decoding perspective. PLoS Comput Biol. 2016 Jan 21;12(1):e1004654. doi: 10.1371/journal.pcbi.1004654. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4721632/
- Boly M, Sasai S, Gosseries O, et al. Stimulus set meaningfulness and neurophysiological differentiation: a functional magnetic resonance imaging study. PLoS One. 2015 May 13;10(5):e0125337. doi: 10.1371/journal.pone.0125337. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4430458/
- Haun AM, Oizumi M, Kovach CK, et al. Conscious perception as integrated information patterns in human electrocorticography. eNeuro. 2017;4(5):ENEURO.0085-17.2017. doi: 10.1523/ENEURO.0085-17.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5659238/
- Mindt G. Integrated Information Theory and the hard problem of consciousness. https://www.academia.edu/12294774/Integrated_Information_Theory_and_the_Hard_Problem_of_Consciousness
- Mindt G. The problem with the ‘information’ in Integrated Information Theory. J Conscious Stud. 2017;24:130–154. https://newdualism.org/papers/G.Mindt/Mindt-JCS2017.pdf
- Dubrovsky DI. “The Hard Problem of Consciousness.” Theoretical solution of its main questions. AIMS Neurosci. 2019;6(2):85–103. doi: 10.3934/Neuroscience.2019.2.85. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7179338/?report=classic