Introduction and Definitions (2022)

One thing that has always fascinated us is the mind. It is the only thing whose existence we can be certain of, yet we do not know what it is. This contrasts with things outside it. We cannot be certain that they really exist—they may be just illusions. Still, we apparently know what they are, and we have ample information about them. For example, we know that an apple is a material object composed of fruit cells, with about 85% water content, rich in vitamin C and fiber, and almost free of fat. Even in the case of something non-material, such as an electromagnetic wave, we can tell that it has zero rest mass, travels at speed c in a vacuum regardless of the measuring frame, and possesses the dual nature of a wave and a particle; we can even write formulas to describe its properties. Moreover, for both, we can answer the questions of what they are basically composed of (the apple, a mixture of several kinds of elementary particles; the electromagnetic wave, only one kind—the photon) and how and why they occur. We can do none of these things in the case of the mind. What we know about the mind is only that it is non-material, can perform various mental activities, such as sensing stimuli, thinking, and executing motor commands, and has observable functional properties, such as being private, subjective, and intentional [1–9]. However, we do not know, among other things, what it is physically and ontologically (i.e., what it is composed of), how it occurs in the material world, and why it occurs in this universe.

In the past, these conundrums persisted probably because there was little objective, scientific evidence for thinkers to resolve them; there were only subjective observations of their minds and others’ behaviors. However, the situation has changed over the past century. Objective methods of observation using various types of scientific instruments, such as electroencephalography, magnetoencephalography, and functional magnetic resonance imaging, have been developed and employed in investigations of the mind, resulting in a large number of empirical results. Accordingly, we now have a wealth of both subjective and objective evidence as a basis for building theories that can unravel the above puzzles. Owing to this, both physical and philosophical theories about the mind and its phenomena, especially qualia and consciousness, have been increasingly published over the past few decades (for example, References 8,10–23). Yet significant progress has been made mainly in the area of the easy problem [24–30], or the problem of what the neural circuits and processes for the mind and its functions (such as perception, cognition, and motor execution) are; little advance has been made regarding the ontological problem (or the problem of what the mind and its phenomena fundamentally are), the hard problem [9,11,24–38], and the explanatory gap [9,25–30,33,37–48] (or the problems of why and how they occur). Hence, this theory will strive to answer these last three problems and other associated issues using current scientific evidence.

However, before the attempt begins, it is imperative to note that one of the factors that hinder progress in attacking these three problems is that definite definitions of the mind, qualia, consciousness, and several related terms have not been established. This lack of definite definitions has persisted even up to now despite extensive debates on these matters for a long time. Without standard definitions, confusion and misunderstanding of these terms can occur, especially when they are used without specifying the definitions being used. Such confusion and misunderstanding limit the interpretation and utilization of those works and impede progress in solving the puzzles.

Because many special terms will be used in this theory, to avoid such setbacks, these terms will be defined specifically at appropriate points in the theory. Among the various possible definitions, the theory chooses the ones that not only seem to be commonly used by the general public and academia but also define entities that are scientifically testable. The aim of this theory is not to assert the best definitions but to find what the entities defined by these definitions physically and ontologically are, why and how they occur, and the answers to other associated questions. Thus, all the definitions set up here are working definitions for use in this theory. Hence, caution should be exercised when comparing literature that uses these terms, as they may have different meanings.

Some terms are used throughout the theory from the beginning and are defined in this chapter as follows:

D1. Mind

“Mind” is the principal term in this theory. Therefore, it will be analyzed in detail before it is defined.

D1.1 Meanings of the word “mind”

To properly define mind for scientific investigations is not a straightforward task. We have to find something that is both (I) acceptable to be mind in the usual sense and (II) amenable to scientific investigations. Nevertheless, this can be achieved as follows: 

First, let us examine what the word “mind” can mean. We will find that it, like many other words that have no definite definitions, has various meanings depending on many factors, such as personal factors (e.g., the person’s belief, culture, and religion), the aspect that is being focused on (e.g., functional, anatomical, or ontological), and the context of usage (e.g., psychiatric, physical, or philosophical). For example, the words “mind” in the following sentences have at least slightly different meanings: “She tries to keep her mind on what she’s doing”; “Albert Einstein had a truly beautiful mind”; “Children usually develop theory of mind during preschool years”; “Universal Mind is the formless energy that animates all of life”; and “One of the most difficult philosophical problems is the mind-brain problem.” Moreover, it can sometimes be used interchangeably with closely related words, such as soul, spirit, psyche, and consciousness [1,9,26,36,49–70], which can cause even more confusion.

In the ontological sense, which is an important sense and the sense in which this theory examines the mind, the word “mind” can be used to mean an alive, non-material entity that exists in

     i) every physical object, which comprises both living and non-living objects (such as rocks, rivers, clouds, computers, and robots; and even up to stars, galaxies, and the universe; and down to molecules, atoms, and elementary particles)—the belief of panpsychism [66,71–78], or

     ii) only every living thing, which comprises animals and non-animal organisms (such as plants, fungi, algae, amoebae, and bacteria)—the belief that may involve the concepts of vitalism, life force, and the three forms of soul, that is, vegetative, sensitive, and rational souls [79–86], or

     iii) only every animal, which comprises humans and non-human animals (such as dogs, birds, fish, insects, and sponges) [87–89], or 

     iv) only every human (e.g., Descartes and people in some religions and cultures believe that only humans have minds, but animals—or mindless automata, as Descartes referred to them—and everything else do not [90–92]).

In the above description, that the entity is non-material means it is not composed of material substance, and that it is alive means it performs activities—not simply existing without doing anything, which would make it just an abstract, lifeless entity, such as beauty, goodness, roundness, equality, and singularity. If we look into the activities that the mind is generally supposed to perform, we find that they are

     a) sensing signals from the environment, its physical body, and its non-material self, such as seeing things in the outside world, feeling its body parts, and experiencing emotions in itself, 

     b) acting on signals, such as analyzing, integrating, synthesizing, storing, and retrieving signals, to control body functions and respond to stimuli, such as thinking, planning, making decisions, memorizing things, and recalling events, and

     c) sending signals to its body parts to control their functions and respond, such as commanding the mouth to speak, hands to work, and legs to run.

Therefore, to be alive in the usual sense means to perform these three kinds of signal-processing activities. An alive, non-material entity is thus a non-material entity that performs these activities, which will hereafter be referred to as the three activities. Because such an entity—a non-material entity that performs the three activities—can be acceptable to be mind in the usual sense, it fulfills our first criterion (I).
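
For readers who find a schematic helpful, the three activities can be summarized as a minimal signal-processing loop. The following Python sketch is purely illustrative—all class and method names here are hypothetical labels for the three activities, and the sketch carries no claim about how any mind actually works:

```python
# Illustrative sketch only: the three activities (sensing, acting on,
# and sending signals) as one minimal processing loop.
# All names are hypothetical labels, not a model of any real mind.

class ThreeActivities:
    """An entity that senses, acts on, and sends signals."""

    def __init__(self):
        self.memory = []  # stored signals (part of activity b: storing)

    def sense(self, signal):
        """Activity (a): receive a signal from the environment,
        the body, or the entity's own internal state."""
        return signal

    def act_on(self, signal):
        """Activity (b): analyze, integrate, and store the signal,
        and produce a decision or response."""
        self.memory.append(signal)       # storing
        return f"response to {signal}"   # analyzing / deciding

    def send(self, response):
        """Activity (c): send the resulting signal to effectors."""
        return f"effector executes: {response}"

    def process(self, signal):
        """One full cycle of the three activities: a -> b -> c."""
        return self.send(self.act_on(self.sense(signal)))

mind = ThreeActivities()
print(mind.process("light stimulus"))
# prints: effector executes: response to light stimulus
```

The point of the sketch is only that the three activities form a connected cycle—an entity that performed none of them would be, in the terms above, an abstract, lifeless entity.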

Next, let us consider the second criterion (II)—being amenable to scientific investigations. To be amenable to scientific investigation, an entity must be observable and testable using our usual senses or scientific instruments. However, for an entity to qualify as such, it must evidently execute its activities so that it can be observed and tested. On the other hand, if an entity cannot evidently execute its activities but can just hypothetically or imaginarily perform them, we will not be able to observe and test it. It will be a hypothetical or imaginary entity, not a scientific one, and will not be amenable to scientific investigation. Thus, to be amenable to scientific investigation and fulfill the second criterion (II), an entity must execute the three activities evidently.

Altogether, for an entity to be (I) acceptable to be mind in the usual sense and (II) amenable to scientific investigations, it must be a non-material entity that evidently executes the three activities. This is the type of mind that this theory investigates.

D1.2 Where can we find this kind of mind?

If we want to scientifically investigate this kind of mind to know what it is, why it occurs, how it occurs, and so on, we must first find it in the physical world. The question is where we can find this kind of mind. Can we find it in all entities in the four groups listed in D1.1 or just in some groups?

The answer is that we can find it in things that evidently exhibit the three activities. For things that do not evidently exhibit the three activities, any non-material entities that may exist in those things will naturally not be able to execute the three activities evidently; therefore, it will not be the kind of mind that this theory selects to investigate. However, to find things in which this kind of mind can exist, the previous grouping of entities in D1.1 will not be suitable. Instead, it would be more apposite to regroup entities from the previous four groups into six new groups according to not only what kind of entity each of them is but also what physical system is associated with the three activities in each of them, as follows:

     Group I: non-living things that are not computerized (rocks, clouds, stars, galaxies, molecules, atoms, etc.)

     Group II: unicellular organisms (amoebae, bacteria, some algae, etc.)

     Group III: multicellular organisms without a nervous system (both non-animals [some algae, most fungi, all plants] and animals without a nervous system [Trichoplax, sponges, and mesozoans])

     Group IV: parts of multicellular organisms (cells, tissues, and organs of multicellular organisms)

     Group V: multicellular organisms with a nervous system or, equivalently, animals with a nervous system (humans and other animals with a nervous system)

     Group VI: non-living things that are computerized (computers, robots, and other computerized machines)

Now, let us investigate them in detail.

     Group I: non-living things that are not computerized (rocks, clouds, stars, galaxies, molecules, atoms, etc.)

On the surface, it does not seem that entities in this group evidently exhibit any of the three activities, at least not in the usual forms that the general public thinks of. Thus, it may seem impossible for the mind, as stated at the end of Section D1.1, to exist in them. Nevertheless, if we examine these non-living things closely, we will find that they can also be considered to exhibit all the three activities evidently, at least in a technical sense. This is because their parts or components, both macroscopic (such as portions of a rock, cloud, or a star) and microscopic (such as constituent molecules, atoms, or elementary particles of those things), can be considered to a) sense signals, which in these cases are forces (any of the four fundamental forces) from other things, b) act on these signals, which means dealing with these forces by taking several parameters (directions, strength, patterns, etc.) of the forces and several parameters (positions, mass/energy, electricity, etc.) of all the interacting things into the involved physical rules for results to occur, and c) send signals (the resulting or responding forces) to other parts and interacting things as responses. Therefore, at least in the technical sense, these three kinds of signal processing can be considered the three activities although, manifestation-wise, these activities are very hard to recognize and are not familiar to us. 

Because things in this group evidently exhibit activities that can be considered as the three activities (sensing, acting on, and sending signals), it is possible that there exists a non-material entity that evidently executes the three activities, or a mind, in each of them. Still, the kind of mind in this group will not be the mind in the ordinary sense that general people think of because the manifestations of its three activities are very difficult to discern, are of the technical kind, and are not those that general people are familiar with or recognize in everyday life. 

Another point to be noted is that, in this group, the signal processing of the three activities is basically the processing of fundamental forces, which is the activity of the elementary physical system. Thus, the three activities and the mind in this group are associated with the elementary physical system. We will see that this kind of association differs from those in the other groups, making this kind of mind different from those in other groups.

     Group II: unicellular organisms (amoebae, bacteria, some algae, etc.)

     Group III: multicellular organisms without a nervous system (both non-animals [some algae, most fungi, all plants] and animals without a nervous system [Trichoplax, sponges, and mesozoans])

     Group IV: parts of multicellular organisms (cells, tissues, and organs of multicellular organisms)

Similar to Group I, superficially, it may not seem that entities in these three groups evidently exhibit the three activities—at least not in the usual forms that the general public thinks of. Thus, it may seem impossible for the mind, as stated at the end of Section D1.1, to exist in them. Nevertheless, after scientists have scrutinized these entities, they find that they can also be considered to exhibit all the three activities evidently, but in a rather technical sense. This is because, as is now scientifically established, cells, tissues, and organs can a) sense signals, which are mostly in the form of biological molecules (but can also be in other forms such as light, electricity, or mechanical force), from both the outside and the inside of their physical extent, b) act on those signals (integrate, store, retrieve, etc.), mostly by biochemical reactions, and c) send signals, again mostly in the form of biological molecules, between their parts, and to their effectors to control the entity’s functions and respond to stimuli [93–117]. Therefore, at least in the technical sense, these three kinds of signal processing in these three groups can be considered the three activities although, manifestation-wise, these activities are not easily recognizable and are not familiar to us.

Because entities in these groups evidently exhibit activities that can be considered as the three activities, it is possible that there exists a non-material entity that evidently executes the three activities, or a mind, in each of them. Still, like that in Group I, the kind of mind in these groups will not be the mind in the ordinary sense that general people think of because the manifestations of its three activities are not obvious, are of the technical kind, and are not those that general people are familiar with and recognize in everyday life. 

In these groups, as discussed above, the signal processing of the three activities is basically biochemical reactions (intracellular and intercellular) of biological molecules and is principally for each individual cell’s, tissue’s, or organ’s survival, with some intercellular signal processing (in the case of multicellular organisms, tissues, and organs) for each organism’s survival as a whole to a lesser extent. Importantly, each entity in these groups does not have a dedicated system to connect all its parts into a functional network that can perform the three activities for the entity as a whole. We will see that this deficit makes these groups essentially different from the next group, animals with a nervous system. Lastly, since the signal processing of the three activities in these groups is basically carried out through biochemical reactions—a function of the biochemical system—the three activities and the mind in these groups are associated with the biochemical system. Thus, the kind of mind in these groups differs from that in Group I, and we will see that it also differs from those in the following two groups. 

The final point to be noted for these groups is that the three activities performed by the elementary physical system also exist in them because all entities in these groups, like all other physical entities, are composed of elementary particles. Thus, the kind of mind associated with the elementary physical system also exists in these groups. Hence, two kinds of minds coexist in these groups. This will be discussed in general later.

     Group V: animals with a nervous system (humans and other animals with a nervous system)

Unlike the entities in Groups I–IV, humans distinctly exhibit all the three activities. So do our close relatives, other animals with a nervous system (mammals, birds, reptiles, fish, insects, etc.), and the more advanced the animal is on the evolutionary tree, the more distinct the activities are. For animals in this group, in addition to having a biochemical system like those in Groups II–IV performing the three activities at the cellular, tissue, and organ levels for the survival of their individual cells, tissues, and organs, respectively, each animal has a new dedicated system—the nervous system—overlaying the biochemical system to connect all parts of the animal into a functional network that performs a new set of the three activities at a higher level for the survival of the animal as a whole. This new system can a) sense (see, hear, smell, etc.) signals via its various specialized sensors (eyes, ears, nose, etc.), b) act on (analyze, integrate, store, retrieve, synthesize, etc.) signals, resulting in various mental activities (evaluating, planning, memorizing, recalling, decision-making, etc.), and c) send signals to its various parts and effectors (muscles, glands, and other specialized cells) to control body functions and respond to stimuli (by moving, making sounds, secreting something, etc.). Obviously, these three kinds of signal processing can readily be considered the three activities because, manifestation-wise, they are distinct and familiar to us.

Because humans and other animals with a nervous system evidently exhibit all the three activities, it is likely that there exists a non-material entity that evidently executes the three activities, or a mind, in each of them. The kind of mind in this group is the mind in the ordinary sense that the general public thinks of because the manifestations of its three activities are obvious and are those that people are familiar with and recognize in everyday life.

In this group, as discussed above, the signal processing of the three distinct activities is performed by the nervous system, which utilizes a new type of specialized cell—the neuron—and a new type of signaling that employs electrical and electrochemical signals and reactions. Thus, the physical machinery that operates the three activities in this group is different from that in Groups II–IV. In addition, functionally, the three activities of the nervous system are mainly for the survival of the whole animal, not just of each individual cell, tissue, or organ, as is the case for the biochemical system. Moreover, compared with those of the biochemical system, the three activities of the nervous system have more pronounced and coordinated effects on the whole animal and are more obvious, manifestation-wise. Hence, the three activities of the nervous system in this group are not only structurally but also functionally and manifestation-wise different from those of the biochemical system in Groups II–IV. Lastly, since the nervous system is responsible for the obvious three activities in animals in this group, the obvious three activities and the mind of the obvious three activities in this group are associated with the nervous system. Hence, this kind of mind differs from those in the four groups discussed earlier and, as we will see, from that in the next group. 

The final point to be noted for this group is that the three activities performed by the elementary physical system and those by the biochemical system also exist in this group because entities in this group, like all other physical entities, are basically composed of elementary particles and because cells, tissues, and organs compose entities in this group. Thus, the kind of mind associated with the elementary physical system and that associated with the biochemical system also exist in this group. Hence, three kinds of minds coexist in this group. This will be discussed in general again later.

     Group VI: non-living things that are computerized (computers, robots, and other computerized machines)

Because entities in this group are non-living things, it may seem impossible for minds, like the ones we think exist in people, to exist in them. Nevertheless, if we use the word mind in a broader sense as stated at the end of Section D1.1, a non-material entity that evidently executes the three activities, we will see that they can have some kind of mind existing in them. This is because they do exhibit all the three activities evidently, at least in a behavioral sense. That is, they can a) sense (see, hear, or receive in other ways) signals via their various sensors (camera, microphone, mouse/keyboard, radiation sensor, chemical sensor, etc.), b) act on (analyze, integrate, store, retrieve, synthesize, etc.) signals resulting in complex activities that can be considered to be evaluating, planning, memorizing, recalling, decision making, etc., and c) send signals to their various parts (speaker, monitor, printer, hard drive, robotic arms/legs, etc.) to control the functions of those parts to respond to stimuli (such as by sending out sounds, displaying pictures, printing out texts, storing information, working, etc.). Obviously, these three kinds of signal processing can readily be considered the three activities because, manifestation-wise, these activities are obvious and familiar to us.

Because things in this group evidently exhibit activities that can be considered as the three activities, it is very possible that there exists a non-material entity that evidently executes the three activities, or a mind, in each of them. The kind of mind in this group can be the mind in the ordinary sense that general people think of because the manifestations of its three activities are obvious and are those that general people are familiar with and recognize in everyday life. 

In this group, the signal processing of the obvious three activities is operated by the electronic system, which utilizes electronic circuits and signaling. Obviously, this kind of signaling is different from that of the nervous system in Group V. Operationally, the electronic system uses digital electronic signals of 0 and 1 bits (in the form of bit coding) and primarily processes data sequentially, with a limited number of parallel processing elements, while the nervous system uses analog signals of action potentials, quantal neurotransmitter releases, and postsynaptic potentials [118–123] (in the form of frequency, temporal, population, and other types of coding) [124–145] and processes information both sequentially and in parallel, with a vast number of parallel processing components. Structurally, the electronic system is silicon-based, whereas the nervous system is carbon-based. Thus, the physical machinery that executes the obvious three activities in this group is different from that in Group V. Clearly, it is also different from that in Groups I–IV. Lastly, since the electronic system is responsible for the obvious three activities in this group, the obvious three activities and the mind of the obvious three activities in this group are associated with the electronic system. Therefore, this kind of mind differs from those in all other groups.
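
The contrast between bit coding and frequency (rate) coding can be made concrete with a toy example. The following sketch is purely illustrative—the specific mappings and parameter values are hypothetical and greatly simplify both systems—but it shows the basic difference in representational style: a fixed-width binary word versus a continuously variable rate:

```python
# Toy contrast between electronic bit coding and neural-style
# frequency (rate) coding. The mappings below are hypothetical
# illustrations, not models of any real device or neuron.

def bit_coding(intensity, bits=8):
    """Electronic-style coding: represent a stimulus intensity
    in the range 0.0-1.0 as a fixed-width binary word."""
    level = round(intensity * (2 ** bits - 1))
    return format(level, f"0{bits}b")

def rate_coding(intensity, max_rate_hz=100):
    """Neural-style frequency coding: represent the same intensity
    as a firing rate (spikes per second)."""
    return intensity * max_rate_hz

print(bit_coding(1.0))   # '11111111' (255 of 255)
print(rate_coding(0.5))  # 50.0 spikes per second
```

The digital code is discrete and exact within its word width; the rate code is graded and continuous—one small reflection of the operational differences described above.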

Finally, it is important to note that the three activities performed by the elementary physical system also exist in this group because the entities in this group, like all other physical entities, are basically composed of elementary particles. Thus, the kind of mind associated with the elementary physical system also exists in this group. Hence, two kinds of minds coexist in this group. This will be discussed in general below.

In conclusion, because all things in this universe can be considered to exhibit the three activities evidently, it is possible that there exists a non-material entity that evidently executes the three activities—the mind—in all of them. However, because the three activities in these various groups differ in at least three major aspects (please see Table 1), the minds of these various groups may be of different kinds, not just one single kind.

Table 1. The three activities in various groups

Group | Manifestations        | Mainly for                  | Most evident system
I     | in a technical sense  | each component              | the basic physical system
II    | not obvious           | each individual cell        | the biochemical system
III   | not obvious           | each cell, tissue, or organ | the biochemical system
IV    | not obvious           | each cell, tissue, or organ | the biochemical system
V     | obvious               | the whole entity            | the nervous system
VI    | obvious               | the whole entity            | the electronic system
Groups I—non-computerized, non-living things, II—unicellular organisms, III—multicellular organisms without a nervous system, IV—parts of multicellular organisms, V—animals with a nervous system, and VI—computerized non-living things

First, except for Groups V and VI, the three activities are not obvious and are not activities of the mind in the ordinary sense. Second, except for Groups V and VI and for each elementary particle in Group I, these activities are mainly for individual parts (cells, tissues, organs, or portions of the entity), not for the whole entity. Third, the most evident three activities in each group are associated with different physical systems: the elementary or basic physical system in Group I, the biochemical system in Groups II–IV, the nervous or neural system in Group V, and the electronic system in Group VI. Therefore, based on these characteristic differences, it can be logically argued that there exist not one but several kinds of minds—elementary, biochemical, neural, and electronic. 

Moreover, different kinds of minds can coexist in a single entity in layers (please see Table 2). It seems that the kind of mind in Group I, the elementary mind, associated with the elementary or basic physical system, is the most basic and exists in the first layer of all things because all things are composed of elementary particles and the basic physical system operates in every physical entity.

Table 2. The mind in each layer

Group | The 1st layer       | The 2nd layer        | The 3rd layer
I     | The elementary mind | –                    | –
II    | The elementary mind | The biochemical mind | –
III   | The elementary mind | The biochemical mind | –
IV    | The elementary mind | The biochemical mind | –
V     | The elementary mind | The biochemical mind | The neural mind
VI    | The elementary mind | The electronic mind  | –
Groups I—non-computerized, non-living things, II—unicellular organisms, III—multicellular organisms without a nervous system, IV—parts of multicellular organisms, V—animals with a nervous system, and VI—computerized non-living things

Later, another kind of mind, the biochemical mind, associated with the biochemical system, evolved in Group II and subsequently in Groups III and IV, overlaying the elementary mind. Afterward, yet another kind of mind, the neural mind, associated with the nervous system, evolved in Group V to exist in the highest layer, overlaying the elementary and biochemical minds. The latest development is an artificial kind of mind, the electronic mind in Group VI, associated with the electronic system, which was invented by humans and overlays the elementary mind.
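
The layering just described can also be written out as a plain mapping. The sketch below merely restates Table 2 in schematic form—the group labels and layer names come directly from the table, and the representation itself is an illustrative convenience, not part of the theory:

```python
# Table 2 restated as a mapping from group to its mind layers,
# listed from the innermost (first) layer outward.
MIND_LAYERS = {
    "I":   ["elementary"],
    "II":  ["elementary", "biochemical"],
    "III": ["elementary", "biochemical"],
    "IV":  ["elementary", "biochemical"],
    "V":   ["elementary", "biochemical", "neural"],
    "VI":  ["elementary", "electronic"],
}

# Every group shares the elementary mind as its first layer,
# because every physical entity is composed of elementary particles.
assert all(layers[0] == "elementary" for layers in MIND_LAYERS.values())

print(MIND_LAYERS["V"])  # ['elementary', 'biochemical', 'neural']
```

Reading the mapping this way makes the two regularities of the table easy to see: the elementary layer is universal, and only Group V carries three layers.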

It is important to note that the above classification is a simplified one. Its purpose is to categorize things into easily understandable groups. In reality, similar to the classification of other complex systems, such as living things, no absolute boundaries exist between systems. For example, no definite lines separate the basic physical, basic chemical, complex chemical, biochemical (which supports life in general), pre-nervous (which supports pre-nervous system animals: Trichoplax, sponges, and mesozoans), and nervous systems. Nevertheless, the classification helps us appreciate the possible existence of a continuous mind spectrum as well as its several distinct mind forms, associated with different physical systems, along the spectrum and guides us in the attempt to solve the questions about the mind as follows:

D1.3 The definition of the mind

Before we go on to set up the definition of the mind, let us take note of one fact. From the preceding discussions, we can see that a non-material entity that evidently executes the three activities, the mind, possibly exists in all physical entities. However, we cannot be certain that this is true. Except for one’s self, one cannot be certain that there is the mind in other things.* Nevertheless, it is scientifically better to accept the possibility that there is the mind in other physical entities and investigate this possibility rather than deny it and not investigate it at all. Therefore, the author presumes that this possibility is viable, and this theory is based on the premise that there is the mind in physical entities other than humans.

(* However, if one believes that nature is regular and consistent … and elegant, one should believe that, at least, there are minds in other people and animals with a nervous system. The reason: If the reader or author, which is an animal with a nervous system and exhibits the three activities distinctly, has a non-material entity that performs the three activities—a mind—inside, other people and other animals with the same characteristics should have minds within them as well.)

Granted that there is a mind in each physical entity, we still have to face the problem that their minds may not be the same because of the differences discussed in the previous section. Questions about the mind, such as what the mind is, why it occurs, how it occurs, what its important phenomena—qualia and consciousness—are, and why and how they occur, are some of the most difficult. As there may be different answers for different kinds of minds, investigating a large, heterogeneous group of minds to find definitive answers for all of them in one attempt is likely to be complicated and unsuccessful. Hence, this theory, which is a basic theory, chooses to investigate only one homogeneous group. The group chosen is the kind of mind in Group V—animals with a nervous system (humans and other animals with a nervous system)—because it is the kind of mind that we think of most when the word “mind” comes up, that we are most familiar with, and that we have been wondering about for a very long time. This theory will attempt to answer what this kind of mind is physically and ontologically, why and how it occurs, and other related questions. Hopefully, the answers found by this theory for this kind of mind will provide a basis for more advanced theories to cover other kinds of minds and, ultimately, for a general theory of the mind that will cover all kinds of minds completely.

Therefore, in accordance with the reasons discussed, this theory will set up a working definition for the kind of mind that this theory will investigate as follows: 

Definition: The mind is a non-material entity that exists in an animal with a nervous system and that can evidently execute the three activities for the whole animal.

Specifically, the three activities are

     i. sensing signals from outside the body (such as light, sound, and tactile stimuli), from its own body parts (such as proprioceptive stimuli from joints, vestibular stimuli from vestibular organs, and pain from internal organs), and from within its mental self (such as emotions, thoughts, and memories),

     ii. acting on signals (such as integrating, storing, and retrieving), resulting in various mental processes, including the highest-level mental processes that that animal can have, and

     iii. sending signals between its mental parts (such as between the sensory perception, consciousness, and emotion parts) and to the effectors of its body (striated muscles and smooth muscles, glands, and other specialized cells) to control its own body functions and respond to stimuli.

If the mind can execute all three activities completely, it is considered a complete mind. If the mind can execute them only partially because the animal is still in a developing stage, such as the fetal stage, it is considered an immature mind. If the mind can do so only partially because the animal is sleeping, suppressed by a pharmacologic or toxic agent, or affected by a pathologic condition, such as a cerebral concussion, brain tumor, or stroke, it is considered a partially functioning mind. Although the mind can exist in various forms with varying degrees of completeness in executing the three activities, all forms are basically the same. That is, all are non-material entities that exist in an animal with a nervous system and can evidently execute the three activities for the whole animal (in its present condition). Thus, they are the same kind of mind.

It should also be noted that when an animal’s brain is cut off from the rest of its body, the body, limbs, and other parts can still perform the three activities (sensing, acting on, and sending signals) for some time before all those parts die. In this case, the non-material entity that is left in these parts and can still perform the three activities is not the mind because it cannot act on signals at the highest level that that animal can (see ii above) and thus cannot generate the highest-level mental processes that the animal can have; it is considered to be only part of a complete mind.

Finally, to recap in plain language, the mind that this theory will investigate is the non-material entity that exists in us—the reader, the author, other people, and other animals with a nervous system—and evidently executes the three activities (sensing, acting on, and sending signals) for each of us as a whole. This theory attempts to determine what this entity physically and ontologically is and why and how it occurs, together with answers to other associated questions.

D2. Brain

In this theory, the term “brain” refers to a material organ that executes the three activities, including the highest-level ones, for the whole animal. Structurally, it is composed of material components (mainly neural, but also vascular, connective, and other tissues) and all their physical activities (chemical, electrical, magnetic, etc.). Any non-material entity that may coexist with this organ, such as, theoretically, the mind or soul, is not part of the brain by this definition. 

D3. Mental Process and Mental Phenomenon

In this theory, a mental process is a mind’s component that performs any of the three activities, and its product is called a mental process phenomenon or, in short, a mental phenomenon. For example, the visual-perception mental process is a mental process that perceives an image from visual stimuli, and its product—an image in the mind or a mental image—is its mental phenomenon; the emotion-generation mental process is a mental process that generates an emotion, and its product—an emotion—is its mental phenomenon; and the thought-formation mental process is a mental process that forms a thought, and its product—a thought—is its mental phenomenon.

A mental process can be a conscious mental process or an unconscious mental process [7,70,146–154], depending on whether the mind can experience what the mental process’s mental phenomenon is like [9,17,27–29,155]: if the mind can, it is a conscious mental process; if not, it is an unconscious mental process. For example, the final-stage visual-perception mental process is a conscious mental process because the mind can experience what its mental phenomenon (e.g., red in the mind) is like (e.g., we can experience what the red in the mind is like), whereas all early-stage visual-perception mental processes are unconscious mental processes because the mind cannot experience what their mental phenomena are like (e.g., we cannot experience what the mental phenomena of red perception in those early stages are like). This will be discussed in detail in Chapters 3 to 7.

D4. Neural Circuit

A neural circuit is a group of neurons connected in a specific pattern whose principal function is to process signals for a certain operation [156–164], such as perceiving visual signals to form an image, integrating various signals to form a decision, or synthesizing signals to control a motor movement. Anatomically, a normal neural circuit is usually a complex, 3-dimensional circuit and always has connections to other neural circuits and sometimes to its sensor or effector so that it can receive signals from and send signals to them. A neural circuit is not a multifunctional circuit that performs various neural functions alternately. Instead, each neural circuit typically performs only one function [156,160,163,165], such as creating a visual image, forming a decision, or generating a motor command. These function-specific neural circuits reside separately in various specific brain areas, such as visual-perception neural circuits in the visual and visual association cortices, emotion-generation neural circuits in the amygdala, hippocampal formation, limbic cortex, and some neighboring areas, and thought-formation neural circuits in the frontal and nearby cortices [165–180]. Currently, more than one hundred distinct functional brain areas have been identified using various methods [176,178,179,181].

However, a neural circuit may not be just a single group of locally connected neurons but may be a network of scattered, connected groups of neurons [160,164]. This is especially true for complex neural circuits that perform complicated functions, such as the neural circuits of the central-executive network (which operates attention- and cognition-demanding tasks) [161,182,183], the salience network (which recognizes the most relevant stimuli among various stimuli) [161,182–186], and the default mode network (which functions as the baseline state when the mind is not engaging in externally oriented tasks or as the internal mentation state when the mind is engaging in internally oriented tasks, such as retrieval of autobiographical memory, envisioning the future, or self-reflection) [161,183–205]. All these complex neural circuits consist of several neuron groups in various separate areas linked together. One of the most important neural circuits is the consciousness neural circuit, which is also a complex neural circuit; it functions for consciousness and will be discussed in detail in Chapter 7.

D5. Neural Process

A neural process is the signal-processing process of a neural circuit. A neural process is the principal process of a neural circuit because it processes the signals in the neural circuit to perform a specific operation. However, the neural process is not the only process in a neural circuit. A neural circuit has two other major processes: the metabolic process and the structural maintenance process (of membranes, organelles, cytoskeletons, etc.).

Two types of signal-processing processes occur in neural circuits. The first is a fast signal-processing process, which processes electrical or chemical signals that are transmitted through electrical or chemical synapses and then transfers the processed signals to other synapses on other neurons for further processing. This signal-processing process proceeds rapidly on a timescale of milliseconds (ms). The second is a slow signal-processing process, which modifies the neural circuit’s structure and performance (e.g., by modifying the structures of neural membranes, receptors, and synapses to change their sensitivity and responses) to store information from the passing signals for the formation of long-term memory, skill, personality, etc. [206–211]. This signal-processing process may take much longer than the first type; it can last from milliseconds to years [118,208].

The neural process is not instantaneous. Even a fast signal-processing process takes some milliseconds to complete in each neural circuit [118,212–215]. For example, after the signals of an image reach the primary visual cortex, they are processed through successive visual-perception processing areas, each taking about 10 ms, so that the visual perception of the image is formed in about 100–120 ms [215–220]; the whole brain is updated about the new image within 100–200 ms (depending on the species and brain size) [76]; and a conscious visual percept is formed in about 150–400 ms from the signal onset [213,221,222].
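The latency figures above follow from simple serial accumulation. A minimal sketch of this arithmetic, assuming (for illustration only) a fixed per-area delay and a stage count of 10–12, neither of which is a measured value:

```python
# Toy latency arithmetic for serial processing stages.
# Assumption (illustrative): signals traverse ~10-12 successive
# visual-perception areas, each adding roughly 10 ms, as in the text.

def cumulative_latency_ms(num_areas: int, per_area_ms: float = 10.0) -> float:
    """Total latency after signals pass through num_areas areas in series."""
    return num_areas * per_area_ms

# 10-12 areas at ~10 ms each yield the ~100-120 ms range quoted for
# forming the visual perception of an image.
print(cumulative_latency_ms(10))  # 100.0
print(cumulative_latency_ms(12))  # 120.0
```

Real cortical latencies vary per area and stages partly overlap in time, so this is only the back-of-the-envelope version of the estimate.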

The result of the processing is the product that the neural circuit and process function for. For example, the result of visual-perception processing—an image—is the product that the visual-perception neural circuit and process function for. One neural process that is of paramount importance is the neural process whose result is consciousness. This neural process is the signal-processing process of the consciousness neural circuit and is called the consciousness neural process. As with the consciousness neural circuit, a more detailed discussion is provided in Chapter 7.

D6. Signaling Pattern

A signaling pattern (SP) is the pattern of signaling that a neural circuit sends to another neural circuit to convey information.

An SP is a dynamic, 3-dimensional pattern of neural signaling. Neural circuits can encode their information in SPs by encoding electrical spikes that are sent along the axons [118,119,129,206] (or the dendrites in some cases [223–228]) to synaptic or neuromuscular junctions in the form of frequency, temporal, population, latency, rank, and other types of coding [124–145]. Since a neural circuit communicates its information to others via electrical signaling, electrochemical signaling, or both in the form of SPs, an SP that the neural circuit sends to another circuit must be the information that is to be sent [129,132,141,206]. However, for a receiving neural circuit to be able to distinguish a particular piece of information, the SP for that particular piece of information must be unique—different from all other SPs for other information pieces [132]. For example, when the visual-perception neural circuit sends an SP to a certain neural circuit, the SP for the visual perception of the letter “A” must be unique—different from the SPs for that of the letter “B,” a red color, a house image, and so on.
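The uniqueness requirement can be sketched as a toy mapping; the binary "spike trains" below are invented placeholders (real SPs are dynamic, 3-dimensional, and far richer), but they show why distinct information pieces need distinct patterns:

```python
# Toy illustration: each piece of information must map to a unique
# signaling pattern (SP), or a receiving circuit cannot tell them apart.
# The tuple "spike trains" are invented for illustration only.

sp_table = {
    "letter A": (1, 0, 1, 1, 0),
    "letter B": (1, 1, 0, 0, 1),
    "red color": (0, 1, 1, 0, 1),
    "house image": (0, 0, 1, 1, 1),
}

# Decodability check: the mapping is invertible only if all SPs are distinct.
assert len(set(sp_table.values())) == len(sp_table)

# A receiver can then recover the information from the SP alone.
decode = {sp: info for info, sp in sp_table.items()}
print(decode[(0, 1, 1, 0, 1)])  # red color
```

If two entries shared a pattern, the `decode` dictionary would collapse them, which is exactly the failure the text describes: the receiving circuit could no longer distinguish the two pieces of information.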

SPs are very important because every neural circuit receives information from other neural circuits and its sensors and sends information to other neural circuits and its effectors in the form of SPs. Thus, every neural circuit is affected by other neural circuits’ and its sensors’ SPs, and it affects other neural circuits and its effectors by its SPs. The fact that SPs are the only signals that neural processes can read helps us answer the problems of what qualia and consciousness are. This matter will be discussed in detail in Chapters 3 to 7.

D7. Signaling State 

A signaling state (SS) is the pattern of signaling of a whole neural circuit, with signals circulating in the circuit in a particular pattern at a certain moment. 

In the same way that an SP is the information that a neural circuit sends to other neural circuits, an SS is the information that is in the neural circuit at that moment. For example, after the primary visual-perception neural circuit has received early-stage visual signals of a house from the lateral geniculate nuclei, it will have the SS—signals circulating in its circuit in a particular pattern—that is the information about the early-stage visual perception of the house; then, after the final visual-perception neural circuit has finished the process of perceiving the vision of the house, it will have the SS—signals circulating in its circuit in another pattern—that is the information about the final visual perception of the house (i.e., the visual image that we see in our mind). 

Similar to SPs, SSs are important because they contain information that exists in the neural circuits at a particular moment. Most importantly, we will see that the SS of a certain neural circuit, the consciousness neural circuit, is deeply connected to the occurrence and function of consciousness. This connection will be discussed in detail in Chapter 7.

D8. Information

Information is an important entity; however, at present, there is no established standard definition for it. Therefore, when people talk about information, they may refer to different things and talk past each other [229–237]. Thus, this theory sets up a working definition for this term to avoid misunderstandings. The definition that this theory uses is applicable to situations in ordinary daily life; that is, information is a non-material thing that is transferable.** For example, the information on this page is a non-material thing that is transferable from the author to this page and from this page to the reader. 

(** The detailed derivation of this definition, together with related discussions, is in Extra Chapter I: Information)

Other important characteristics of this type of information are that a) it consists of content, b) it is embodied in a physical object, which is called the carrier of the information, and c) ontologically, it is the pattern of its carrier [234,237,238]. For example, the information on this page consists of content that is about the definition of the term information; it is embodied in dots on a sheet of paper or pixels on an electronic screen; ontologically, it is the pattern of the dots or pixels that appears as the text that the reader is reading. Here, it is important to note that it is the non-material pattern, not the material dots or pixels per se, that is the information. If the material dots or pixels change (such as their brightness, colors, sizes, or their composing materials are changed) but the pattern remains the same, the information remains the same; on the contrary, if the material dots or pixels remain the same but the pattern changes (such as changes into patterns of other texts), the information changes.
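The carrier/pattern distinction can be sketched as a toy model; the `Carrier` class and its attributes are invented for illustration, with information identity reduced to pattern equality as the text defines it:

```python
# Toy model: information is identified with the pattern, not with the
# material carrier. The Carrier class and its fields are illustrative.
from dataclasses import dataclass

@dataclass
class Carrier:
    material: str   # ink dots, screen pixels, etc.
    color: str      # a property of the carrier, not of the pattern
    pattern: str    # the arrangement that constitutes the information

def same_information(a: Carrier, b: Carrier) -> bool:
    """Two carriers bear the same information iff their patterns match."""
    return a.pattern == b.pattern

page = Carrier("ink dots", "black", "consciousness")
screen = Carrier("pixels", "white", "consciousness")
edited = Carrier("ink dots", "black", "mind")

print(same_information(page, screen))  # True: carriers differ, pattern same
print(same_information(page, edited))  # False: carrier same, pattern differs
```

Changing the carrier's material or color leaves `same_information` unaffected, while changing the pattern flips it, mirroring the dots-versus-pixels example above.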

Regarding the nervous system, which is the principal area investigated by this theory, typical examples are visual and auditory information about the outside world: 

     – Visual information about the outside world is non-material and transferable from outside-world objects to the light waves reflected from the objects and then to an animal’s visual and other processing systems. It consists of content: visual characteristics—color, brightness, shape, dimension, movement, etc.—of the objects. It is first embodied in the objects’ surfaces, then in the light waves, and then in the neural signals of the animal’s visual and other processing systems. This visual information is, first, the pattern of the object’s surface, then, the pattern of the light waves, and finally, the patterns of neural signaling in the animal’s visual and other processing systems. 

     – Sound information about the outside world is non-material and transferable from outside-world objects to the air surrounding the objects and then to an animal’s auditory and other processing systems. It consists of content: sound characteristics—pitch, timbre, loudness, formant, attack, etc.—of the vibrations of the objects. It is first embodied in the vibrations of objects, then in the air waves (sound waves) from the objects, and then in the neural signals of the animal’s auditory and other processing systems. This auditory information is, first, the pattern of the object’s vibrations, then, the pattern of the air waves, and finally, the patterns of neural signaling in the animal’s auditory and other processing systems. 

Because information, by this definition, consists of content for information receivers to interpret, it has meaning and is semantic information [164,229–231,239,240]. It is essential to note that the meaning of a piece of information is relative, not absolute. That is, the meaning of a piece of information varies, depending not only on the pattern of the physical carrier but also on the receiver and context in which the information is transferred [233,241]. A particular piece of information usually means a specific thing to a certain receiver in a certain context but means different things, incomprehensible things, or nothing to other receivers or in different contexts. For example, the information embodied in the dot/pixel pattern “consciousness” means consciousness to those who know English but means gibberish or practically nothing to those who do not; the information embodied in the pattern “头脑” means mind to those who understand Chinese but means gibberish or practically nothing to those who do not. Regarding the context of information transfer, an example is that, to people who know the convention of clock chiming, the information embodied in the three chimes of a striking clock’s bell means, in the morning, that the time is 03:00, but means, in the afternoon, that it is 15:00. Likewise, the information of SPs in the human optic nerve means images in the outside world to the human visual nervous system but means gibberish or nothing to other parts of the nervous system and everything else (e.g., a television set, a DVD player, and a current computer—all of which cannot decode information in the human optic nerve’s SPs, even if we could somehow feed the SPs to them).
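The dependence of meaning on context can be sketched as a toy interpreter, following the clock-chime example above; the function name and the `period` parameter are invented for illustration:

```python
# Toy illustration: the same physical signal (three chimes) carries a
# different meaning depending on the context in which it is received.

def interpret_chimes(chimes: int, period: str) -> str:
    """Map a chime count to a clock time, given the context ('AM' or 'PM')."""
    hour = chimes if period == "AM" else chimes + 12
    return f"{hour:02d}:00"

print(interpret_chimes(3, "AM"))  # 03:00 in the morning
print(interpret_chimes(3, "PM"))  # 15:00 in the afternoon

# A receiver that lacks the chiming convention (or the context) cannot
# assign either meaning to the identical three chimes.
```

The signal itself (the integer `3`) is unchanged between the two calls; only the context argument differs, which is what makes the meaning relative rather than absolute.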

The paramount importance of the meaning of information can be briefly glimpsed from the following fact: If the information of a certain neural SP means “what it is like to see a red color” to some neural process, then when that neural process reads the information in the SP, “what it is like to see a red color” will naturally and inevitably appear in that neural process, and thus in the brain. This is how phenomenal qualia and consciousness can occur in physical systems. This important matter is discussed in detail in Chapters 3 to 7.




References

  1. De Sousa A. Towards an integrative theory of consciousness: Part 1 (neurobiological and cognitive models). Mens Sana Monogr. 2013 Jan–Dec;11(1):100–150. doi: 10.4103/0973-1229.109335. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3653219/
  2. Fieser J. Chapter 3: Mind. Great Issues in Philosophy. 2021 Jan 1. http://edduarddo1000.freehostia.com/filosofiapdf/what%20is%20a%20mind.pdf
  3. Jacob P. Intentionality. In: Zalta EN, editor. The Stanford encyclopedia of philosophy (Winter 2019 edition). https://plato.stanford.edu/archives/win2019/entries/intentionality/
  4. Moutoussis K. The machine behind the stage: A neurobiological approach toward theoretical issues of sensory perception. Front Psychol. 2016;7:1357. doi: 10.3389/fpsyg.2016.01357. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5020606/
  5. O’Madagain C. Intentionality. In: Internet Encyclopedia of Philosophy. http://www.iep.utm.edu/intentio/
  6. Pernu TK. The five marks of the mental. Front Psychol. 2017;8:1084. doi: 10.3389/fpsyg.2017.01084. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5500963/
  7. Rosenthal D. Concepts and definitions of consciousness. In: Banks WP, editor. Encyclopedia of Consciousness. Amsterdam: Elsevier; 2009:157–169. https://www.davidrosenthal.org/DR-Concepts-Dfns.pdf
  8. Seth AK, Baars BJ. Neural Darwinism and consciousness. Conscious Cogn. 2005 Mar;14(1):140–168. http://ccrg.cs.memphis.edu/assets/papers/2004/Seth%20&%20Baars,%20Neural%20Darwinism-2004.pdf
  9. Van Gulick R. Consciousness. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2017 edition). https://plato.stanford.edu/archives/sum2017/entries/consciousness
  10. Baars BJ, Geld N, Kozma R. Global Workspace Theory (GWT) and prefrontal cortex: Recent developments. Front Psychol. 2021;12:749868. doi: 10.3389/fpsyg.2021.749868. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8660103/
  11. Crick F, Koch C. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences. 1990;2:263–275. https://authors.library.caltech.edu/40352/1/148.pdf
  12. Dehaene S, Charles L, King JR, Marti S. Toward a computational theory of conscious processing. Curr Opin Neurobiol. 2014 Apr;25:76–84. doi: 10.1016/j.conb.2013.12.005. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5635963/
  13. Edelman GM, Gally JA, Baars BJ. Biology of Consciousness. Front Psychol. 2011;2:4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3111444/
  14. Fingelkurts AA, Fingelkurts AA, Neves CFH. Consciousness as a phenomenon in the operational architectonics of brain organization: Criticality and self-organization considerations. Chaos Solitons Fractals. 2013;55:13–31. doi: 10.1016/j.chaos.2013.02.007. https://www.researchgate.net/publication/276169154_Consciousness_as_a_phenomenon_in_the_operational_architectonics_of_brain_organization_Criticality_and_self-organization_considerations
  15. Grossberg S. Adaptive Resonance Theory: How a brain learns to consciously attend, learn, and recognize a changing world. Neural Netw. 2013 Jan;37:1–47. doi: 10.1016/j.neunet.2012.09.017. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.294.4425&rep=rep1&type=pdf
  16. Hameroff S. ‘Orch OR’ is the most complete, and most easily falsifiable theory of consciousness. Cogn Neurosci. 2021;12(2):74–76. doi: 10.1080/17588928.2020.1839037. https://www.tandfonline.com/doi/full/10.1080/17588928.2020.1839037
  17. Llinás R, Ribary U, Contreras D, Pedroarena C. The neuronal basis for consciousness. Philos Trans R Soc Lond B Biol Sci. 1998 Nov 29;353(1377):1841–1849. doi: 10.1098/rstb.1998.0336. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1692417/pdf/9854256.pdf
  18. McFadden J. Integrating information in the brain’s EM field: The cemi field theory of consciousness. Neurosci Conscious. 2020. doi: 10.1093/nc/niaa016. https://www.researchgate.net/publication/345370853_Integrating_information_in_the_brain’s_EM_field_the_cemi_field_theory_of_consciousness
  19. Northoff G, Zilio F. Temporo-spatial Theory of Consciousness (TTC)—Bridging the gap of neuronal activity and phenomenal states. Behav Brain Res. 2022 Apr;424:113788. doi: 10.1016/j.bbr.2022.113788. https://static1.squarespace.com/static/528facb6e4b0a18b7e9cde91/t/6209095ede99f90f5935f940/1644759400210/northoff+zilio.pdf
  20. Orpwood R. Information and the origin of qualia. Front Syst Neurosci. 2017 Apr 21;11(Article 22):1–16. doi: 10.3389/fnsys.2017.00022  https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5399078/pdf/fnsys-11-00022.pdf
  21. Sevush S. Single-neuron Theory of Consciousness. Journal of Theoretical Biology. 2006;238(3):704–725. https://doi.org/10.1016/j.jtbi.2005.06.018 http://cogprints.org/4432/1/single_neuron_theory.htm
  22. Song X, Tang X. An extended theory of global workspace of consciousness. Prog Nat Sci. 2008 Jul 10;18(7):789–793. https://doi.org/10.1016/j.pnsc.2008.02.003. https://www.sciencedirect.com/science/article/pii/S100200710800138X
  23. Tononi G, Boly M, Massimini M, Koch C. Integrated Information Theory: From consciousness to its physical substrate. Nature Reviews Neuroscience. 2016;17:450–461. https://doi.org/10.1038/nrn.2016.44 https://www.researchgate.net/publication/303551101_Integrated_information_theory_From_consciousness_to_its_physical_substrate
  24. Dennett DC. Facing up to the hard question of consciousness. Philos Trans R Soc Lond B Biol Sci. 2018;373(1755):20170342. doi: 10.1098/rstb.2017.0342. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074080/?report=classic
  25. Cohen MA, Dennett DC. Consciousness cannot be separated from function. Trends Cogn Sci. 2011 Aug;15(8):358–364. doi: 10.1016/j.tics.2011.06.008. http://www.michaelacohen.net/uploads/5/9/0/7/59073133/1-s2-0-s1364661311001252-main.pdf
  26. Gennaro RJ. Consciousness. In: Internet Encyclopedia of Philosophy. http://www.iep.utm.edu/consciou/
  27. Chalmers DJ. Consciousness and its place in nature. In: Chalmers DJ, editor. Philosophy of mind: Classical and contemporary readings. Oxford: Oxford University Press; 2002. ISBN-13: 978-0195145816 ISBN-10: 019514581X. http://consc.net/papers/nature.html
  28. Chalmers DJ. Facing up to the problem of consciousness. J Conscious Stud. 1995;2(3):200–219. http://consc.net/papers/facing.html
  29. Chalmers DJ. Moving forward on the problem of consciousness. J Conscious Stud. 1997;4(1):3–46. http://consc.net/papers/moving.html
  30. Chalmers DJ. The puzzle of conscious experience. Sci Am. 1995 Dec;273(6):80–86. http://s3.amazonaws.com/arena-attachments/2382142/9247d5f1a845e5482b1bd66d82c3a9bf.pdf?1530582615
  31. Brogaard B, Electra Gatzia DE. What can neuroscience tell us about the hard problem of consciousness? Front Neurosci. 2016;10:395. doi: 10.3389/fnins.2016.00395. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5013033/
  32. Chis-Ciure R, Ellia F. Facing up to the hard problem of consciousness as an integrated information theorist. Found. Sci. 2021:1–17. doi: 10.1007/s10699-020-09724-7. http://philsci-archive.pitt.edu/20787/1/Chis-Ciure%20%26%20Ellia%20-%20Facing%20up%20to%20the%20Hard%20Problem%20as%20an%20Integrated%20Information%20Theoriest%20%28final%20preprint%29%20%282021%29.pdf
  33. Feinberg TE, Mallatt J. The nature of primary consciousness. A new synthesis. Conscious Cogn. 2016;43:113–127. doi: 10.1016/j.concog.2016.05.009. https://www.gwern.net/docs/psychology/2016-feinberg.pdf
  34. Loorits K. Structural qualia: A solution to the hard problem of consciousness. Front Psychol. 2014;5:237. doi: 10.3389/fpsyg.2014.00237. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3957492/
  35. Syamala H. How vedanta explains conscious subjective experience. Scientific GOD Journal. 2015;6:194–207. https://www.researchgate.net/publication/313870517_How_Vedanta_Explains_Conscious_Subjective_Experience
  36. Velmans M. Chapter 1. How to separate conceptual issues from empirical ones in the study of consciousness. In: Banerjee R, Chakrabarti BK, editors. Progress in brain research. Vol. 168. Models of brain and mind physical, computational and psychological approaches. Amsterdam: Elsevier B.V.; 2008:1–10. https://www.researchgate.net/publication/28764945_HOW_TO_SEPARATE_CONCEPTUAL_ISSUES_FROM_EMPIRICAL_ONES_IN_THE_STUDY_OF_CONSCIOUSNESS  
  37. Velmans M. Understanding consciousness. 2nd ed. Hove, East Sussex: Routledge; 2009. https://dl.uswr.ac.ir/bitstream/Hannan/130278/1/0415425158.Routledge.Understanding.Consciousness.Second.Edition.Apr.2009.pdf
  38. Weisberg J. The hard problem of consciousness. In: Internet Encyclopedia of Philosophy. https://www.iep.utm.edu/hard-con/
  39. Block N. Chapter 77. Comparing the major theories of consciousness. In: Gazzaniga MS, editor. The Cognitive Neurosciences. 4th ed. Cambridge, MA: MIT Press; 2009:1111–1122. https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/Theories_of_Consciousness.pdf
  40. Block N. Consciousness, accessibility, and the mesh between psychology and neuroscience. Behavioral and Brain Sciences. 2007;30(5–6):481–499. https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/Block_BBS.pdf
  41. Byrne A. Inverted qualia. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Winter 2016 Edition). https://plato.stanford.edu/archives/win2016/entries/qualia-inverted/
  42. Chalmers DJ. Phenomenal concepts and the explanatory gap. In: Alter T, Walter S, editors. Phenomenal concepts and phenomenal knowledge: New essays on consciousness and physicalism. Oxford University Press; 2006. https://www.sciencedharma.com/uploads/7/6/8/0/76803975/pceg.pdf
  43. Chalmers DJ. The conscious mind: In search of a fundamental theory. Oxford: Oxford University Press;1996. https://personal.lse.ac.uk/ROBERT49/teaching/ph103/pdf/Chalmers_The_Conscious_Mind.pdf
  44. Feinberg TE, Mallatt J. Phenomenal consciousness and emergence: Eliminating the explanatory gap. Front Psychol. 2020;11:1041. doi: 10.3389/fpsyg.2020.01041. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7304239/
  45. Levine J. Materialism and qualia: The explanatory gap. Pacific Philosophical Quarterly. 1983;64:354–361. https://hope.simons-rock.edu/~pshields/cs/cmpt265/levine.pdf
  46. Papineau D. Mind the gap. Philosophical Perspectives. 1998;12:373–389. https://sas-space.sas.ac.uk/878/1/D_Papineau_Gap..pdf  http://www.davidpapineau.co.uk/uploads/1/8/5/5/18551740/mind_the_gap.pdf
  47. Sturm T. Consciousness regained? Philosophical arguments for and against reductive physicalism. Dialogues Clin Neurosci. 2012 Mar;14(1):55–63. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341650/
  48. Tye M. Qualia. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Winter 2017 edition). https://plato.stanford.edu/archives/win2017/entries/qualia/
  49. Antonakou EI, Triarhou LC. Soul, butterfly, mythological nymph: Psyche in philosophy and neuroscience. Arq. Neuro-Psiquiatr. 2017 Mar;75(3) São Paulo. https://doi.org/10.1590/0004-282x20170012. http://www.scielo.br/scielo.php?script=sci_arttext&pid=S0004-282X2017000300176&lng=en&nrm=iso&tlng=en
  50. Bennett MR. Development of the concept of mind. Aust N Z J Psychiatry. 2007 Dec;41(12):943–56.
  51. Coseru C. Mind in Indian Buddhist Philosophy. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Spring 2017 Edition). https://plato.stanford.edu/archives/spr2017/entries/mind-indian-buddhism
  52. Chalmers DJ. The phenomenal and the psychological concepts of mind. In: The conscious mind: In search of a fundamental theory. Oxford: Oxford University Press; 1996:11–16. https://personal.lse.ac.uk/ROBERT49/teaching/ph103/pdf/Chalmers_The_Conscious_Mind.pdf
  53. Crivellato E, Ribatti D. Soul, mind, brain: Greek philosophy and the birth of neuroscience. Brain Res Bull. 2007 Jan 9;71(4):327–336. doi: 10.1016/j.brainresbull.2006.09.020. https://pdfslide.net/documents/soul-mind-and-brain-brb-2007.html
  54. Dolan B. Soul searching: A brief history of the mind/body debate in the neurosciences. Neurosurg Focus. 2007 Jul;23(1):E2. https://pdfs.semanticscholar.org/0ee7/daa68d6c67564e9778c9c4426988a7e6a0b9.pdf?_ga=2.218379610.273739394.1573212485-1576401793.1567926358  
  55. Dreyfus G, Thompson E. Asian perspectives: Indian theories of mind. In: Zelazo PD, Moscovitch, M, Thompson E, editors. The Cambridge Handbook of Consciousness (Chapter 5). Cambridge: Cambridge University Press; 2007:89–114. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  56. Hansotia P. A neurologist looks at mind and brain: “The enchanted loom.” Clin Med Res. 2003 Oct;1(4):327–332. doi: 10.3121/cmr.1.4.327. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1069062/
  57. Ivry A. Arabic and Islamic psychology and philosophy of mind. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2012 Edition). https://plato.stanford.edu/archives/sum2012/entries/arabic-islamic-mind/
  58. Pransky J, Kelley TM. Three Principles for realizing mental health: A new psycho-spiritual view. Journal of Creativity in Mental Health. 2014;9:53–68. https://www.researchgate.net/publication/265684685_2Pransky_J_and_Kelley_T_M_2014_Three_Principles_for_realizing_mental_health_A_new_psycho-spiritual_view
  59. McLear C. Kant: Philosophy of mind. Internet Encyclopedia of Philosophy. https://www.iep.utm.edu/kandmind/
  60. Pandya SK. Understanding brain, mind and soul: Contributions from neurology and neurosurgery. Mens Sana Monogr. 2011 Jan;9(1):129–149. doi: 10.4103/0973-1229.77431. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3115284/  
  61. Prabhu HR, Bhat PS. Mind and consciousness in yoga—Vedanta: A comparative analysis with western psychological concepts. Indian J Psychiatry. 2013;55(Suppl 2):S182–S186. doi: 10.4103/0019-5545.105524. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3705680/?report=classic
  62. Richert R, Harris P. Dualism revisited: Body vs. mind vs. soul. J Cogn Cult. 2008;8:99–115. doi: 10.1163/156770908X289224. https://pdfs.semanticscholar.org/a1cb/e5345428f011a8b02e638cdbd2395c381559.pdf
  63. Riekki T, Lindeman M, Lipsanen J. Conceptions about the mind-body problem and their relations to afterlife beliefs, paranormal beliefs, religiosity, and ontological confusions. Adv Cogn Psychol. 2013;9(3):112–120. doi: 10.2478/v10053-008-0138-5. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4158462/
  64. Roazzi M, Nyhof M, Johnson C. Mind, Soul and Spirit: A cross-cultural study of conceptions of immaterial identity. Int J Psychol Relig. 2013;23(1):75–86. https://www.researchgate.net/publication/210273911_Mind_Soul_and_Spirit_A_cross-cultural_study_of_conceptions_of_immaterial_identity
  65. Santoro G, Wood MD, Merlo L, Anastasi GP, Tomasello F, Germanò A. The anatomic location of the soul from the heart, through the brain, to the whole body, and beyond: A journey through western history, science, and philosophy. Neurosurgery. 2009 Oct;65(4):633–643. https://www.academia.edu/13337265/THE_ANATOMIC_LOCATION_OF_THE_SOUL_FROM_THE_HEART_THROUGH_THE_BRAIN_TO_THE_WHOLE_BODY_AND_BEYOND
  66. Skrbina D. Panpsychism. Internet Encyclopedia of Philosophy. https://iep.utm.edu/panpsych/
  67. Sreeja Gangadharan P, Jena SPK. Understanding Mind through Indian Psychology. The International Journal of Indian Psychology. 2016 Apr–Jun;3(3). http://oaji.net/articles/2016/1170-1467035825.pdf
  68. Winnicott DW. Mind and its relation to the psyche-soma. Br J Med Psychol. 1954;XXVII:200–209. https://web.english.upenn.edu/~cavitch/pdf-library/Winnicott_PsycheSoma.pdf
  69. Tan U. The psychomotor theory of human mind. Int J Neurosci. 2007 Aug;117(8):1109–1148. http://cogprints.org/5607/1/PSYCHOMOT._THEORY_TAN.pdf
  70. Velmans M. How to define consciousness—and how not to define consciousness. J Conscious Stud. 2009;16(5):139–156. http://cogprints.org/6453/1/How_to_define_consciousness.pdf
  71. Chalmers D. Panpsychism and panprotopsychism. In: Bruntrup G, Jaskolla L, editors. Panpsychism: Contemporary Perspectives. Oxford University Press; 2016. http://consc.net/papers/panpsychism.pdf
  72. Frankish K. Panpsychism and the depsychologization of consciousness. Aristotelian Soc Suppl Vol. 2021;95:51–70. https://academic.oup.com/aristoteliansupp/article-pdf/95/1/51/38850561/akab012.pdf
  73. Goff P, Seager W, Allen-Hermanson S. Panpsychism. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2020 Edition). https://plato.stanford.edu/entries/panpsychism/
  74. Hunt T, Schooler JW. The easy part of the hard problem: A resonance theory of consciousness. (Erratum in: Front Hum Neurosci. 2020 Sep 04;14:596409) Front Hum Neurosci. 2019;13:378. doi: 10.3389/fnhum.2019.00378. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6834646/?report=classic
  75. Koch C. Is consciousness universal? Sci Am Mind. 2014;25:26–29. https://www.scientificamerican.com/article/is-consciousness-universal/
  76. Lamme VAF. Challenges for theories of consciousness: Seeing or knowing, the missing ingredient and how to deal with panpsychism. Philos Trans R Soc Lond B Biol Sci. 2018 Sep;373(1755):20170344. doi: 10.1098/rstb.2017.0344. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6074090/
  77. Mathews F. Living cosmos panpsychism. In: Seager W, editor. The Routledge Handbook of Panpsychism. London: Routledge; 2019:1–13. https://www.researchgate.net/publication/334586627_Living_Cosmos_Panpsychism
  78. Mørch HH. The argument for panpsychism from experience of causation. In: Seager WE, editor. The Routledge Handbook of Panpsychism. New York: Routledge; 2020. https://philarchive.org/archive/MRCTAFv2
  79. Bechtel W, Bollhagen A. Philosophy of cell biology. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Winter 2019 Edition). https://plato.stanford.edu/entries/cell-biology/
  80. Brigandt I, Love A. Reductionism in biology. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Spring 2017 Edition). https://plato.stanford.edu/entries/reduction-biology/
  81. Coulter I, Snider P, Neil A. Vitalism – a worldview revisited: A critique of vitalism and its implications for naturopathic medicine. Integr Med (Encinitas). 2019 Jun;18(3):60–73. https://www.researchgate.net/publication/335258887_Vitalism-A_Worldview_Revisited_A_Critique_Of_Vitalism_And_Its_Implications_For_Naturopathic_Medicine
  82. Greco M. On the vitality of vitalism. Theory Cult Soc. 2005 Feb;22(1):15–27. doi: 10.1177/0263276405048432. https://www.researchgate.net/publication/43251499_On_the_Vitality_of_Vitalism
  83. Kyle K. The animal soul and the problem of animal suffering. Christian Apologetics Journal. 2015;13:5–32. https://www.researchgate.net/publication/308699820_The_Animal_Soul_and_the_Problem_of_Animal_Suffering
  84. Ng O. Qing Philosophy. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Summer 2019 Edition). https://plato.stanford.edu/entries/qing-philosophy/
  85. Oelze A. Chapter 5. Animal souls and sensory cognition. In: Animal Rationality. Brill; 2018:28–35. doi: 10.1163/9789004363779_008. https://brill.com/view/book/9789004363779/BP000008.xml
  86. Weber B. Life. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Spring 2017 Edition). https://plato.stanford.edu/entries/life/
  87. Bates AWH. Chapter 3. Have animals souls? the late-nineteenth century spiritual revival and animal welfare. In: Anti-Vivisection and the Profession of Medicine in Britain: A Social History. London: Palgrave Macmillan; 2017. https://www.ncbi.nlm.nih.gov/books/NBK513717/
  88. Gallup GG. Do minds exist in species other than our own? Neurosci Biobehav Rev. 1985;9(4):631–641. doi: 10.1016/0149-7634(85)90010-7. https://www.sciencedirect.com/science/article/pii/0149763485900107/pdf?md5=aab64ea61a4e1ba8f7816c8d1fcb5fb5&pid=1-s2.0-0149763485900107-main.pdf
  89. Tehrani AMH. Question 44: Existence of spirits within animals and its difference from that of the Human Being. In: Faith and Reason. The Porch of Wisdom Cultural Institution. The Islamic Education Board of the World Federation of KSIMC; 2006. https://www.al-islam.org/faith-and-reason/question-44-existence-spirits-within-animals-and-its-differnce-human-being  https://www.al-islam.org/printpdf/book/export/html/9466
  90. Kekedi B. Descartes’ wondering automata. https://www.academia.edu/22937144/DESCARTES_WONDERING_AUTOMATA
  91. Huxley TH. On the hypothesis that animals are automata, and its history. In: Huxley’s Collected Essays. Vol. I: Method and Results; 1874:199–250. http://aleph0.clarku.edu/huxley/CE1/AnAuto.html. http://aleph0.clarku.edu/huxley/CE1/index.html
  92. Riskin J. Machines in the garden. Republics of Letters: A Journal for the Study of Knowledge, Politics, and the Arts. 2010 Apr 30;1(2). http://rofl.stanford.edu/node/59. https://arcade.stanford.edu/rofl/machines-garden
  93. Baluška F, Levin M. On having no head: Cognition throughout Biological Systems. Front Psychol. 2016;7:902. doi: 10.3389/fpsyg.2016.00902. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4914563/
  94. Jorgensen EM. Animal evolution: Looking for the first nervous system. Curr Biol. 2014 Jul;24(14):R655–R658. https://doi.org/10.1016/j.cub.2014.06.036 http://www.cell.com/current-biology/fulltext/S0960-9822(14)00752-0
  95. Roth G, Dicke U. Evolution of Nervous Systems and Brains. In: Galizia CG, Lledo P-M (eds.). Neurosciences—From Molecule to Behavior: A University Textbook. Berlin Heidelberg: Springer-Verlag; 2013:19–45. ISBN: 978-3-642-10768-9. doi: 10.1007/978-3-642-10769-6_2. https://www.google.co.th/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwi8mLLz0KTWAhXJOY8KHTNcD7MQFggnMAA&url=http%3A%2F%2Fwww.springer.com%2Fcda%2Fcontent%2Fdocument%2Fcda_downloaddocument%2F9783642107689-c1.pdf%3FSGWID%3D0-0-45-1404406-p174537886&usg=AFQjCNEP_Y0Wv_J0fF_haYNFzYbTn95w2A
  96. Ben-Jacob E. Learning from bacteria about natural information processing. Ann N Y Acad Sci. 2009 Oct;1178:78–90. doi: 10.1111/j.1749-6632.2009.05022.x. http://files.campus.edublogs.org/micropopbio.org/dist/8/471/files/2012/04/Learning-from-Bacteria-about-Natural-Information-Processing-1dhem65.pdf
  97. Ben-Jacob E, Shapira Y, Tauber A. Seeking the foundations of cognition in bacteria: From Schrödinger’s negative entropy to latent information. Physica A: Statistical Mechanics and its Applications. 2006;369:495–524. doi: 10.1016/j.physa.2005.05.096. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.566.6992&rep=rep1&type=pdf
  98. Camilli A, Bassler BL. Bacterial small-molecule signaling pathways. Science. 2006 Feb 24;311(5764):1113–1116. doi: 10.1126/science.1121357. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2776824/
  99. Lan G, Tu Y. Information processing in bacteria: Memory, computation, and statistical physics: A key issues review. Rep Prog Phys. 2016 May;79(5):052601. doi: 10.1088/0034-4885/79/5/052601. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4955840/
  100. Lyon P. The cognitive cell: Bacterial behavior reconsidered. Front Microbiol. 2015 Apr;6:264. https://doi.org/10.3389/fmicb.2015.00264  https://www.frontiersin.org/articles/10.3389/fmicb.2015.00264/full
  101. Xavier RS, Omar N, De Castro L. Bacterial colony: Information processing and computational behavior. Proceedings of the 2011 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC). 2011 Oct:439–443. doi: 10.1109/NaBIC.2011.6089627. https://www.researchgate.net/publication/261311628_Bacterial_colony_Information_processing_and_computational_behavior
  102. Dexter JP, Prabakaran S, Gunawardena J. A complex hierarchy of avoidance behaviors in a single-cell eukaryote. Curr Biol. 2019 Dec 16;29(24):4323–4329.e2. https://doi.org/10.1016/j.cub.2019.10.059 https://www.sciencedirect.com/science/article/pii/S0960982219314319
  103. Leys SP. Elements of a ‘nervous system’ in sponges. J Exp Biol. 2015;218:581–591. doi: 10.1242/jeb.110817. http://jeb.biologists.org/content/218/4/581.long
  104. Renard E, Vacelet J, Gazave E, Lapébie P, Borchiellini C, Ereskovsky AV. Origin of the neuro-sensory system: New and expected insights from sponges. Integr Zool. 2009 Sep;4(3):294–308. doi: 10.1111/j.1749-4877.2009.00167.x. https://bio.spbu.ru/staff/pdf/Renard%20et_2009-NervSpon.pdf
  105. Smith CL, Pivovarova N, Reese TS. Coordinated feeding behavior in Trichoplax, an animal without synapses. PLoS One. 2015;10(9):e0136098. doi: 10.1371/journal.pone.0136098. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4558020/
  106. Smith CL, Reese TS, Govezensky T, Barrioc RA. Coherent directed movement toward food modeled in Trichoplax, a ciliated animal lacking a nervous system. Proc Natl Acad Sci U S A. 2019 Apr 30;116(18):8901–8908. doi: 10.1073/pnas.1815655116. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6500112/
  107. Lu TM, Furuya H, Satoh N. Gene expression profiles of dicyemid life-cycle stages may explain how dispersing larvae locate new hosts. Zoological Lett. 2019 Nov 13;5:32. doi: 10.1186/s40851-019-0146-y. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6854800/
  108. Lu TM, Kanda M, Furuya H, Satoh N. Dicyemid Mesozoans: A unique parasitic lifestyle and a reduced genome. Genome Biol Evol. 2019 Aug 1;11(8):2232–2243. doi: 10.1093/gbe/evz157. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6736024/
  109. Barlow PW. The natural history of consciousness, and the question of whether plants are conscious, in relation to the Hameroff-Penrose quantum-physical ‘Orch OR’ theory of universal consciousness. Commun Integr Biol. 2015 Jul–Aug;8(4):e1041696. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4594572/
  110. Bassel GW. Information processing and distributed computation in plant organs. Trends Plant Sci. 2018 Nov;23(11):994–1005. https://doi.org/10.1016/j.tplants.2018.08.006  https://research.birmingham.ac.uk/portal/files/54870081/Distributed_computation_in_plants_August_13.pdf
  111. Brenner ED, Stahlberg R, Mancuso S, Vivanco J, Baluska F, Van Volkenburgh E. Plant neurobiology: An integrated view of plant signaling. Trends Plant Sci. 2006 Aug;11(8):413–419. doi: 10.1016/j.tplants.2006.06.009 https://www.howplantswork.com/wp-content/uploads/2018/03/Plant_Neurobiology.pdf
  112. Duran-Nebreda S, Bassel GW. Plant behaviour in response to the environment: Information processing in the solid state. Philos Trans R Soc Lond B Biol Sci. 2019 Apr;374:20180370. http://doi.org/10.1098/rstb.2018.0370  https://royalsocietypublishing.org/doi/10.1098/rstb.2018.0370
  113. Fromm J, Lautner S. Electrical signals and their physiological significance in plants. Plant, Cell & Environment. 2007 Mar;30(3):249–257. https://doi.org/10.1111/j.1365-3040.2006.01614.x https://onlinelibrary.wiley.com/doi/full/10.1111/j.1365-3040.2006.01614.x
  114. Garzón FC. The quest for cognition in plant neurobiology. Plant Signal Behav. 2007 Jul–Aug;2(4):208–211. doi: 10.4161/psb.2.4.4470. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2634130/
  115. Mallatt J, Blatt MR, Draguhn A, Robinson DG, Taiz L. Debunking a myth: Plant consciousness. Protoplasma. 2021;258(3):459–476. doi: 10.1007/s00709-020-01579-w. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8052213/?report=classic
  116. Stahlberg R. Historical overview on plant neurobiology. Plant Signal Behav. 2006 Jan–Feb;1(1):6–8. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2633693/
  117. Trewavas A. Aspects of plant intelligence. Ann Bot. 2003 Jul;92(1):1–20. doi: 10.1093/aob/mcg101. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4243628/
  118. Augustine GJ. Unit I. Neural signaling. In: Purves D, Augustine GJ, Fitzpatrick D, Hall WC, LaMantia AS, Mooney RD, Platt ML, White LE, editors. Neuroscience. 6th ed. New York: Oxford University Press; 2018:31–190.
  119. Bean BP, Koester JD. Propagated signaling: The action potential. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  120. Fischbach GD, Siegelbaum SA. Directly gated transmission: The nerve-muscle synapse. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  121. Siegelbaum SA, Fischbach GD. Overview of synaptic transmission. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  122. Siegelbaum SA, Südhof TC, Tsien RW. Transmitter release. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  123. Yuste R, Siegelbaum SA. Synaptic integration in the central nervous system. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  124. Ainsworth M, Lee S, Cunningham MO, Traub RD, Kopell NJ, Whittington MA. Rates and rhythms: A synergistic view of frequency and temporal coding in neuronal networks. Neuron. 2012 Aug 23;75(4):572–583. doi: 10.1016/j.neuron.2012.08.004. http://www.cell.com/neuron/fulltext/S0896-6273(12)00709-X
  125. Avitan L, Goodhill GJ. Code under construction: Neural coding over development. Trends Neurosci. 2018 Sep;41(9):599–609. https://goodhill.org/pub/avitan18.pdf
  126. Azarfar A, Calcini N, Huang C, Zeldenrust F, Celikel T. Neural coding: A single neuron’s perspective. Neurosci Biobehav Rev. 2018 Nov;94:238–247. doi: 10.1016/j.neubiorev.2018.09.007. https://linkinghub.elsevier.com/retrieve/pii/S0149-7634(17)30894-1  https://www.sciencedirect.com/science/article/pii/S0149763417308941?via%3Dihub
  127. Barbieri R, Frank LM, Nguyen DP, et al. Dynamic analyses of information encoding in neural ensembles. Neural Comput. 2004 Feb;16(2):277–307. doi: 10.1162/089976604322742038. https://www.researchgate.net/publication/8678252_Dynamic_Analyses_of_Information_Encoding_in_Neural_Ensembles
  128. Bohte SM. The evidence for neural information processing with precise spike-times: A survey. Nat Comput. 2004 Jun;3(2):195–206. https://homepages.cwi.nl/~sbohte/publication/spikeNeuronsNC.pdf
  129. deCharms RC, Zador A. Neural representation and the cortical code. Annu Rev Neurosci. 2000;23:613–647. doi: 10.1146/annurev.neuro.23.1.613. http://www.cnbc.cmu.edu/~tai/readings/nature/zador_code.pdf
  130. Doetsch GS. Patterns in the brain. Neuronal population coding in the somatosensory system. Physiol Behav. 2000 Apr 1;69(1–2):187–201.
  131. Florian RV. The Chronotron: A neuron that learns to fire temporally precise spike patterns. PLoS One. 2012;7(8): e40233. https://doi.org/10.1371/journal.pone.0040233  https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0040233
  132. Földiák P. Chapter 19. Sparse and explicit neural coding. In: Quian Quiroga R, Panzeri S, editors. Principles of Neural Coding. Kindle ed. Boca Raton, FL: CRC Press, Taylor and Francis; 2013:379–389.
  133. Gardner B, Sporea I, Grüning A. Encoding spike patterns in multilayer spiking neural networks. arXiv.org. 2015. https://arxiv.org/pdf/1503.09129.pdf
  134. Gardner B, Sporea I, Grüning A. Learning spatiotemporally encoded pattern transformations in structured spiking neural networks. Neural Comput 2015;27(12):2548–2586.
  135. Gütig R, Sompolinsky H. The tempotron: A neuron that learns spike timing-based decisions. Nat Neurosci. 2006 Mar;9(3):420–428. doi: 10.1038/nn1643. http://mcn2016public.pbworks.com/w/file/fetch/137818197/Gutig_R_The%20tempotron_Nature%20Neuroscience.pdf
  136. Izhikevich EM, Desai NS, Walcott EC, Hoppensteadt FC. Bursts as a unit of neural information: Selective communication via resonance. Trends Neurosci. 2003 Mar;26(3):161–167. doi: 10.1016/S0166-2236(03)00034-1. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.85.6965&rep=rep1&type=pdf
  137. Jermakowicz WJ, Casagrande VA. Neural networks a century after Cajal. Brain Res Rev. 2007 Oct;55(2):264–284. doi: 10.1016/j.brainresrev.2007.06.003. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2101763/
  138. Johnson KO. Neural coding. Neuron. 2000 Jun;26(3):563–566. https://doi.org/10.1016/S0896-6273(00)81193-9 https://linkinghub.elsevier.com/retrieve/pii/S0896-6273(00)81193-9
  139. Lankarany M, Al-Basha D, Ratté S, Prescott SA. Differentially synchronized spiking enables multiplexed neural coding. Proc Natl Acad Sci U S A. 2019 May 14;116(20):10097–10102. doi: 10.1073/pnas.1812171116. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6525513/
  140. Ponulak F, Kasinski A. Introduction to spiking neural networks: Information processing, learning and applications. Acta Neurobiol Exp (Wars). 2011;71(4):409–433. http://www.ane.pl/linkout.php?pii=7146
  141. Quian Quiroga R, Panzeri S, editors. Principles of Neural Coding. Kindle ed. Boca Raton, FL: CRC Press, Taylor and Francis; 2013.
  142. Recce M. Chapter 4. Encoding information in neuronal activity. In: Maass W, Bishop CM, editors. Pulsed Neural Networks. Cambridge, MA: MIT Press; 1999. https://www.cs.stir.ac.uk/~bpg/Teaching/31YF/Resources/PNN/chap4.pdf
  143. Rolls ET. The neuronal representation of information in the human brain. Brain. 2015 Nov;138(11):3459–3462. https://academic.oup.com/brain/article/138/11/3459/2095610
  144. VanRullen R, Guyonneau R, Thorpe SJ. Spike times make sense. Trends Neurosci. 2005 Jan;28(1):1–4. doi: 10.1016/j.tins.2004.10.010. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.72.8923&rep=rep1&type=pdf
  145. Zeldenrust F, Wadman WJ, Englitz B. Neural coding with bursts—Current state and future perspectives. Front Comput Neurosci. 2018;12:48. doi: 10.3389/fncom.2018.00048. https://pure.uva.nl/ws/files/30096303/fncom_12_00048.pdf
  146. Baars BJ, Gage NM. Chapter 8. Consciousness and attention. In: Baars BJ, Gage NM, editors. Cognition, Brain, and Consciousness. 2nd ed. Academic Press (Elsevier); 2010:239–304. https://social.hse.ru/data/2013/12/21/1338659679/Baars%20Gage%202010%20Cognition,%20Brain%20and%20Consciousness%20(2nd%20edition).pdf
  147. Dehaene S. Chapter 2. Fathoming unconscious depths; Chapter 3. What is consciousness good for?; and Chapter 4. The signature of a conscious thought. In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York: The Penguin Group; 2014:47–160.
  148. Dehaene S, Changeux JP, Naccache L, Sackur J, Sergent C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn Sci. 2006 May;10(5):204–211. doi: 10.1016/j.tics.2006.03.007. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.65.3821&rep=rep1&type=pdf
  149. Dehaene S, Changeux JP, Naccache L. The Global Neuronal Workspace Model of conscious access: From neuronal architectures to clinical applications. In: Dehaene S, Christen Y, editors. Characterizing consciousness: From cognition to the clinic? Research and Perspectives in Neurosciences. Berlin, Heidelberg: Springer-Verlag; 2011. https://doi.org/10.1007/978-3-642-18015-6_4 http://www.antoniocasella.eu/dnlaw/Dehaene_Changeaux_Naccache_2011.pdf
  150. Evans JSBT. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278. doi: 10.1146/annurev.psych.59.103006.093629. https://sites.ualberta.ca/~francisp/Phil488/EvansDualProcessing2008.pdf
  151. Kihlstrom JF. The cognitive unconscious. Science. 1987 Sep 18;237(4821):1445–1452. doi: 10.1126/science.3629249. https://www.ocf.berkeley.edu/~jfkihlstrom/PDFs/1980s/1987/ScienceCogUncog.pdf
  152. McGovern K, Baars BJ. Chapter 8. Cognitive theories of consciousness. In: Zelazo PD, Moscovitch M, Thompson E, editors. The Cambridge handbook of consciousness. Cambridge University Press; 2007:177–205. http://perpus.univpancasila.ac.id/repository/EBUPT181231.pdf
  153. Nisbett RE, Wilson TD. Telling more than we can know: Verbal reports on mental processes. Psychol Rev. 1977 May;84(3):231–259. https://doi.org/10.1037/0033-295X.84.3.231  https://deepblue.lib.umich.edu/bitstream/handle/2027.42/92167/TellingMoreThanWeCanKnow.pdf
  154. Velmans M. Is human information processing conscious? Behav Brain Sci. 1991;14:651–726. https://pdfs.semanticscholar.org/1bca/4e316885e05bda693868c7ce49cfbf206dba.pdf
  155. Nagel T. What is it like to be a bat? Philos Rev. 1974;83(4):435–450. https://www.sas.upenn.edu/~cavitch/pdf-library/Nagel_Bat.pdf
  156. Amaral DG. The neuroanatomical bases by which neural circuits mediate behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  157. Augustine GJ. Chapter 1. Studying the nervous system. In: Purves D, Augustine GJ, Fitzpatrick D, Hall WC, LaMantia AS, Mooney RD, Platt ML, White LE, editors. Neuroscience. 6th ed. New York: Oxford University Press; 2018:1–29.
  158. Buzsáki G. Neural syntax: Cell assemblies, synapsembles and readers. Neuron. 2010 Nov 4;68(3):362–385. doi: 10.1016/j.neuron.2010.09.023. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3005627/
  159. Harris KD, Shepherd GMG. The neocortical circuit: Themes and variations. Nat Neurosci. 2015 Feb;18(2):170–181. doi: 10.1038/nn.3917. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4889215/
  160. Martinez P, Sprecher SG. Of circuits and brains: The origin and diversification of neural architectures. Front Ecol Evol. 2020 Mar. https://doi.org/10.3389/fevo.2020.00082  https://www.frontiersin.org/articles/10.3389/fevo.2020.00082/full
  161. Menon V, Uddin LQ. Saliency, switching, attention and control: A network model of insula function. Brain Struct Funct. 2010;214(5–6):655–667. doi: 10.1007/s00429-010-0262-0. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2899886/
  162. Pulvermüller F, Garagnani M, Wennekers T. Thinking in circuits: Toward neurobiological explanation in cognitive neuroscience. Biol Cybern. 2014;108(5):573–593. doi: 10.1007/s00422-014-0603-9. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4228116/
  163. Shadlen MN, Kandel ER. Nerve cells, neural circuitry, and behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  164. Tau G, Peterson B. Normal development of brain circuits. Neuropsychopharmacol. 2010;35:147–168. https://doi.org/10.1038/npp.2009.115  https://www.nature.com/articles/npp2009115#citeas
  165. Kanwisher N. Functional specificity in the human brain: A window into the functional architecture of the mind. Proc Natl Acad Sci U S A. 2010 Jun 22;107(25):11163–11170. doi: 10.1073/pnas.1005062107. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2895137/
  166. Amunts K, Zilles K. Architectonic mapping of the human brain beyond Brodmann. Neuron. 2015 Dec 16;88:1086–1113. https://doi.org/10.1016/j.neuron.2015.12.001 http://www.cell.com/neuron/fulltext/S0896-6273(15)01072-7
  167. Bartels A, Zeki S. The chronoarchitecture of the cerebral cortex. Philos Trans R Soc Lond B Biol Sci. 2005 Apr 29;360(1456):733–750. doi: 10.1098/rstb.2005.1627. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1569482/
  168. Cohen AL, Fair DA, Dosenbach NUF, et al. Defining functional areas in individual human brains using resting functional connectivity MRI. Neuroimage. 2008 May 15;41(1):45–57. doi: 10.1016/j.neuroimage.2008.01.066. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2705206/
  169. Geyer S, Weiss M, Reimann K, Lohmann G, Turner R. Microstructural parcellation of the human cerebral cortex – from Brodmann’s post-mortem map to in vivo mapping with high-field magnetic resonance imaging. Front Hum Neurosci. 2011;5:19. doi: 10.3389/fnhum.2011.00019. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3044325/
  170. Glasser MF, Coalson TS, Robinson EC, et al. A multi-modal parcellation of human cerebral cortex. Nature. 2016 Aug 11;536(7615):171–178. doi: 10.1038/nature18933. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4990127/
  171. James GA, Hazaroglu O, Bush KA. A human brain atlas derived via n-cut parcellation of resting-state and task-based fMRI data. Magn Reson Imaging. 2016 Feb;34(2):209–218. doi: 10.1016/j.mri.2015.10.036. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4837649/
  172. Passingham RE, Stephan KE, Kötter R. The anatomical basis of functional localization in the cortex. Nat Rev Neurosci. 2002 Aug;3(8):606–616. http://library.ibp.ac.cn/html/cogsci/NRN-2002-606.pdf
  173. Rakic P. Evolution of the neocortex: Perspective from developmental biology. Nat Rev Neurosci. 2009 Oct;10(10):724–735. doi: 10.1038/nrn2719. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2913577/
  174. Zilles K, Palomero-Gallagher N, Schleicher A. Transmitter receptors and functional anatomy of the cerebral cortex. J Anat. 2004 Dec;205(6):417–432. doi: 10.1111/j.0021-8782.2004.00357.x. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1571403/
  175. Sporns O. Cerebral cartography and connectomics. Philos Trans R Soc Lond B Biol Sci. 2015 May 19;370(1668):20140173. doi: 10.1098/rstb.2014.0173. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4387514/
  176. Sporns O, Tononi G, Kötter R. The human connectome: A structural description of the human brain. PLoS Comput Biol. 2005 Sep;1(4):e42. doi: 10.1371/journal.pcbi.0010042. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1239902/
  177. Tungaraza RL, Mehta SH, Haynor DR, Grabowski TJ. Anatomically informed metrics for connectivity-based cortical parcellation from diffusion MRI. IEEE J Biomed Health Inform. 2015 Jul;19(4):1375–1383. doi: 10.1109/JBHI.2015.2444917. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4561620/
  178. Van Essen DC, Glasser MF. The Human Connectome Project: Progress and prospects. Cerebrum. 2016 Sep–Oct;2016:cer-10-16. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5198757/
  179. Van Essen DC, Glasser MF, Dierker DL, Harwell J, Coalson T. Parcellations and hemispheric asymmetries of human cerebral cortex analyzed on surface-based atlases. Cereb Cortex. 2012 Oct;22(10):2241–2262. doi: 10.1093/cercor/bhr291. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3432236/
  180. Van Essen DC, Glasser MF. In vivo architectonics: A cortico-centric perspective. Neuroimage. 2014 Jun;93 Pt 2:157–164. doi: 10.1016/j.neuroimage.2013.04.095. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3767769/
  181. Kandel ER, Shadlen MN. The brain and behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  182. Seeley WW, Menon V, Schatzberg AF, et al. Dissociable intrinsic connectivity networks for salience processing and executive control. J Neurosci. 2007;27(9):2349–2356. doi: 10.1523/JNEUROSCI.5587-06.2007. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2680293/
  183. Sridharan D, Levitin DJ, Menon V. A critical role for the right fronto-insular cortex in switching between central-executive and default-mode networks. Proc Natl Acad Sci U S A. 2008;105(34):12569–12574. doi: 10.1073/pnas.0800005105. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2527952/
  184. Andrews-Hanna JR, Smallwood J, Spreng RN. The default network and self-generated thought: component processes, dynamic control, and clinical relevance. Ann N Y Acad Sci. 2014;1316(1):29–52. doi: 10.1111/nyas.12360. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4039623/
  185. Bonnelle V, Ham TE, Leech R, et al. Salience network integrity predicts default mode network function after traumatic brain injury. Proc Natl Acad Sci U S A. 2012;109(12):4690–4695. doi: 10.1073/pnas.1113455109. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3311356/
  186. Vanhaudenhuyse A, Demertzi A, Schabus M, et al. Two distinct neuronal networks mediate the awareness of environment and of self. J Cogn Neurosci. 2011 Mar;23(3):570–578. doi: 10.1162/jocn.2010.21488. https://www.researchgate.net/publication/44642456_Two_Distinct_Neuronal_Networks_Mediate_the_Awareness_of_Environment_and_of_Self/link/02faf4f86c485956a0000000/download
  187. Alves PM, Foulon C, Karolis V, et al. An improved neuroanatomical model of the default-mode network reconciles previous neuroimaging and neuropathological findings. Commun Biol. 2019 Oct 10;2:370. doi: 10.1038/s42003-019-0611-3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6787009/
  188. Andrews-Hanna JR. The brain’s default network and its adaptive role in internal mentation. Neuroscientist. 2012 Jun;18(3):251–270. doi: 10.1177/1073858411403316. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3553600/
  189. Buckner RL, Andrews-Hanna JR, Schacter DL. The brain’s default network: Anatomy, function, and relevance to disease. Ann N Y Acad Sci. 2008 Mar;1124:1–38. doi: 10.1196/annals.1440.011. https://www.researchgate.net/publication/5451668_The_Brain’s_Default_Network
  190. Van Calster L, D’Argembeau A, Salmon E, Peters F, Majerus S. Fluctuations of attentional networks and default mode network during the resting state reflect variations in cognitive states: Evidence from a novel resting-state experience sampling method. J Cogn Neurosci. 2017 Jan;29(1):95–113.
  191. Christoff K, Gordon AM, Smallwood J, Smith R, Schooler JW. Experience sampling during fMRI reveals default network and executive system contributions to mind wandering. Proc Natl Acad Sci USA. 2009 May;106(21):8719–8724. doi: 10.1073/pnas.0900234106. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2689035/  https://www.pnas.org/doi/10.1073/pnas.0900234106
  192. Fox MD, Snyder AZ, Vincent JL, Corbetta M, Van Essen DC, Raichle ME. The human brain is intrinsically organized into dynamic, anticorrelated functional networks. Proc Natl Acad Sci U S A. 2005 Jul 5;102(27):9673–9678. doi: 10.1073/pnas.0504136102. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1157105/
  193. Greicius MD, Krasnow B, Reiss AL, Menon V. Functional connectivity in the resting brain: A network analysis of the default mode hypothesis. Proc Natl Acad Sci U S A. 2003;100(1):253–258. doi: 10.1073/pnas.0135058100. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC140943/
  194. Greicius MD, Supekar K, Menon V, Dougherty RF. Resting-state functional connectivity reflects structural connectivity in the default mode network. Cereb Cortex. 2009;19(1):72–78. doi: 10.1093/cercor/bhn059. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2605172/
  195. Hagmann P, Cammoun L, Gigandet X, et al. Mapping the structural core of human cerebral cortex. PLoS Biol. 2008 Jul;6(7):e159. doi: 10.1371/journal.pbio.0060159. http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.0060159
  196. Han ME, Park SY, Oh SO. Large-scale functional brain networks for consciousness. Anat Cell Biol. 2021;54(2):152–164. doi: 10.5115/acb.20.305. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8225483/?report=classic
  197. Leech R, Kamourieh S, Beckmann CF, Sharp DJ. Fractionating the default mode network: Distinct contributions of the ventral and dorsal posterior cingulate cortex to cognitive control. J Neurosci. 2011;31(9):3217–3224. doi: 10.1523/JNEUROSCI.5626-10.2011. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6623935/
  198. Mason MF, Norton MI, Van Horn JD, Wegner DM, Grafton ST, Macrae CN. Wandering minds: The default network and stimulus-independent thought. Science. 2007;315(5810):393–395. doi: 10.1126/science.1131295. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1821121/
  199. Raichle ME, MacLeod AM, Snyder AZ, Powers WJ, Gusnard DA, Shulman GL. A default mode of brain function. Proc Natl Acad Sci U S A. 2001 Jan 16;98(2):676–682. doi: 10.1073/pnas.98.2.676. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC14647/
  200. Salomon R, Levy DR, Malach R. Deconstructing the default: Cortical subdivision of the default mode/intrinsic system during self-related processing. Hum Brain Mapp. 2014;35(4):1491–1502. doi: 10.1002/hbm.22268. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6869590/
  201. Sporns O. Structure and function of complex brain networks. Dialogues Clin Neurosci. 2013 Sep;15(3):247–262. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3811098/
  202. Toro R, Fox PT, Paus T. Functional coactivation map of the human brain. Cereb Cortex. 2008;18(11):2553–2559. doi: 10.1093/cercor/bhn014. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2567424/
  203. Uddin LQ, Kelly AM, Biswal BB, Castellanos FX, Milham MP. Functional connectivity of default mode network components: Correlation, anticorrelation, and causality. Hum Brain Mapp. 2009;30(2):625–637. doi: 10.1002/hbm.20531. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3654104/
  204. Vanhaudenhuyse A, Noirhomme Q, Tshibanda L, et al. Default network connectivity reflects the level of consciousness in non-communicative brain-damaged patients. Brain. 2010;133:161–171. doi: 10.1093/brain/awp313. https://www.researchgate.net/publication/40774069_Default_network_connectivity_reflects_the_level_of_consciousness_in_non-communicative_brain-damaged_patients/link/02e7e521eff1173292000000/download
  205. Zhang M, Bernhardt BC, Wang X, et al. Perceptual coupling and decoupling of the default mode network during mind-wandering and reading. Elife. 2022 Mar;11:e74011. doi: 10.7554/eLife.74011. https://elifesciences.org/articles/74011
  206. Abbott LF, Losonczy A, Sawtell NB. The computational bases of neural circuits that mediate behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  207. Kennedy MB. Synaptic signaling in learning and memory. Cold Spring Harb Perspect Biol. 2016 Feb;8(2):a016824. doi: 10.1101/cshperspect.a016824. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4743082/
  208. LaMantia AS. Unit IV. The changing brain. In: Purves D, Augustine GJ, Fitzpatrick D, Hall WC, LaMantia AS, Mooney RD, Platt ML, White LE, editors. Neuroscience. 6th ed. New York: Oxford University Press; 2018:489–623.
  209. Langille JJ, Brown RE. The synaptic theory of memory: A historical survey and reconciliation of recent opposition. Front Syst Neurosci. 2018 Oct;12:52. doi: 10.3389/fnsys.2018.00052. https://www.frontiersin.org/articles/10.3389/fnsys.2018.00052/full
  210. Mayford M, Siegelbaum SA, Kandel ER. Synapses and memory storage. Cold Spring Harb Perspect Biol. 2012 Jun;4(6):a005751. doi: 10.1101/cshperspect.a005751. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3367555/
  211. Siegelbaum SA, Clapham DE, Marder E. Modulation of synaptic transmission and neuronal excitability: Second messengers. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA, editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  212. Baars BJ, Edelman DB. Consciousness, biology and quantum hypotheses. Phys Life Rev. 2012 Sep;9(3):285–294. doi: 10.1016/j.plrev.2012.07.001. http://www.esalq.usp.br/lepse/imgs/conteudo_thumb/Consciousness–biology-and-quantum-hypotheses.pdf
  213. Babiloni C, Marzano N, Soricelli A, et al. Cortical neural synchronization underlies primary visual consciousness of qualia: Evidence from event-related potentials. Front Hum Neurosci. 2016;10:310. doi: 10.3389/fnhum.2016.00310. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4927634/
  214. Byrne JH. Introduction to neurons and neuronal networks. Neuroscience Online. The University of Texas Health Science Center at Houston (UTHealth). http://nba.uth.tmc.edu/neuroscience/s1/introduction.html
  215. Monk T, Paulin MG. Predation and the origin of neurones. Brain Behav Evol. 2014;84:246–261. https://doi.org/10.1159/000368177. https://www.karger.com/Article/FullText/368177
  216. Bacon-Macé N, Macé MJM, Fabre-Thorpe M, Thorpe SJ. The time course of visual processing: Backward masking and natural scene categorization. Vision Res. 2005 May;45(11):1459–1469. https://doi.org/10.1016/j.visres.2005.01.004 http://www.sciencedirect.com/science/article/pii/S0042698905000027?via%3Dihub
  217. Carbon CC. The first 100 milliseconds of a face: On the microgenesis of early face processing. Percept Mot Skills. 2011 Dec;113(3):859–874. doi: 10.2466/07.17.22.PMS.113.6.859-874. http://journals.sagepub.com/doi/pdf/10.2466/07.17.22.PMS.113.6.859-874
  218. Lamme VAF. Can neuroscience reveal the true nature of consciousness? https://www.nyu.edu/gsas/dept/philo/courses/consciousness05/LammeNeuroscience.pdf
  219. Lamme VA, Roelfsema PR. The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci. 2000 Nov;23(11):571–579. doi: 10.1016/S0166-2236(00)01657-X. https://www.researchgate.net/publication/12253934_The_Distinct_Modes_of_Vision_Offered_by_Feedforward_and_Recurrent_Processing
  220. Masquelier T, Albantakis L, Deco G. The timing of vision – how neural processing links to different temporal dynamics. Front Psychol. 2011 Jun;2:151. doi: 10.3389/fpsyg.2011.00151. https://www.frontiersin.org/articles/10.3389/fpsyg.2011.00151/full
  221. Dehaene S. Chapter 4. The signature of a conscious thought. In: Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, New York: The Penguin Group; 2014:115–160.
  222. Fisch L, Privman E, Ramot M, et al. Neural “Ignition”: Enhanced activation linked to perceptual awareness in human ventral stream visual cortex. Neuron. 2009 Nov 25;64(4):562–574. doi: 10.1016/j.neuron.2009.11.001. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2854160/
  223. Aghvami SS, Kubota Y, Egger V. Anatomical and functional connectivity at the dendrodendritic reciprocal mitral cell-granule cell synapse: Impact on recurrent and lateral inhibition. Front Neural Circuits. 2022 Jul 22;16:933201. doi: 10.3389/fncir.2022.933201. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9355734/?report=classic
  224. Ludwig M, Apps D, Menzies J, Patel JC, Rice ME. Dendritic release of neurotransmitters. Compr Physiol. 2016 Dec 6;7(1):235–252. doi: 10.1002/cphy.c160007. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5381730/
  225. Martin EA, Lasseigne AM, Miller AC. Understanding the molecular and cell biological mechanisms of electrical synapse formation. Front Neuroanat. 2020 Apr 15;14:12. doi: 10.3389/fnana.2020.00012. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7179694/
  226. Pressler RT, Strowbridge BW. Direct recording of dendrodendritic excitation in the olfactory bulb: Divergent properties of local and external glutamatergic inputs govern synaptic integration in granule cells. J Neurosci. 2017 Dec 6;37(49):11774–11788. doi: 10.1523/JNEUROSCI.2033-17.2017. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5719967/
  227. Shepherd GM. Symposium overview and historical perspective: Dendrodendritic synapses: past, present, and future. Ann N Y Acad Sci. 2009 Jul;1170:215–223. doi: 10.1111/j.1749-6632.2009.03937.x. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3819211/
  228. Sotelo C. The history of the synapse. Anat Rec (Hoboken). 2020 May;303(5):1252–1279. doi: 10.1002/ar.24392. https://anatomypubs.onlinelibrary.wiley.com/doi/10.1002/ar.24392
  229. Adriaans P. Information. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Fall 2013 edition). https://plato.stanford.edu/archives/fall2013/entries/information
  230. Floridi L. Is semantic information meaningful data? Philos Phenomenol Res. 2005 Mar;70(2):351–370. http://www.philosophyofinformation.net/wp-content/uploads/sites/67/2014/05/iimd.pdf
  231. Floridi L. Semantic conceptions of information. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Spring 2017 edition). https://plato.stanford.edu/archives/spr2017/entries/information-semantic
  232. Lombardi O, Holik F, Vanni L. What is Shannon information? Synthese. 2015 Jul. doi: 10.1007/s11229-015-0824-z. https://www.researchgate.net/publication/279780496_What_is_Shannon_information
  233. Madden A. A definition of information. Aslib Proceedings. 2000 Nov;52:343–349. doi: 10.1108/EUM0000000007027. https://www.researchgate.net/publication/241708484_A_definition_of_information
  234. Roederer JG. Pragmatic information in biology and physics. Philos Trans A Math Phys Eng Sci. 2016 Mar 13;374(2063):20150152. doi: 10.1098/rsta.2015.0152. http://rsta.royalsocietypublishing.org/content/374/2063/20150152.long
  235. Shannon CE. A mathematical theory of communication. The Bell System Technical Journal. 1948 Jul;27(3):379–423. https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
  236. Timpson C. Philosophical aspects of quantum information theory. In: Rickles D, editor. The Ashgate companion to the new philosophy of physics. Aldershot: Ashgate Publishing; 2008:197–261. https://arxiv.org/pdf/quant-ph/0611187.pdf
  237. Timpson C. Chapter 2. What is information? In: Avramides A, Child W, Eagle A, Mulhall S, editors. Quantum information theory and the foundations of quantum mechanics. Oxford, United Kingdom: Oxford University Press; 2013:10–42. ISBN 978-0-19-929646-0, 978-0-19-874813-7.
  238. Walker SI, Kim H, Davies PCW. The informational architecture of the cell. Philos Trans A Math Phys Eng Sci. 2016 Mar;374(2063):20150057. doi: 10.1098/rsta.2015.0057. https://royalsocietypublishing.org/doi/10.1098/rsta.2015.0057
  239. Earl B. The biological function of consciousness. Front Psychol. 2014;5:697. doi: 10.3389/fpsyg.2014.00697. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4122207/
  240. Zhong Y. A theory of semantic information. Proceedings. 2017;1(3):129. doi: 10.3390/IS4SI-2017-04000. http://www.mdpi.com/2504-3900/1/3/129/pdf
  241. Ernesto J, Barrios R. Information, genetics and entropy. Principia. 2015;19(1):121–146. doi: 10.5007/1808-1711.2015v19n1p12. https://www.researchgate.net/publication/286403979_Information_Genetics_and_Entropy
