Information

Contents:

I.1 The Meaning and Definition of Information

I.2 Information Carrier

I.3 Information Content and Meaning

I.4 The Nature of Information: What is information physically and ontologically?

I.5 Information’s Effects and Semantic Information

I.6 The Amount of Information

I.7 How Information can Exert Effects

I.8 The Mechanism for Information to Mean a Certain Thing

I.9 Information: A Basic Feature of the Universe

I.10 Remarks

I.1 The Meaning and Definition of Information

Figure I.1 What does “information” mean?

Information is a term that has various meanings depending on several factors, such as who is using the term (layman, engineer, philosopher, etc.) and the context in which the term is being used (non-academic, engineering, philosophical, etc.) [1–9]. Some uses of this term, although frequently referred to in the literature, are technical and not in the ordinary, daily-life sense. For example, “information” in “A Mathematical Theory of Communication” [7], despite not being definitively defined in the theory, is widely cited and used to denote something that, as an element in a set, carries more weight, or a greater amount, the less likely it is to occur in that set [7,10]; information is what is produced by its source and must be reproducible at the destination if the transmission is to count as a success [9]; and information is a difference that makes a difference [11]. To the general public, these descriptions can be confusing and can lead to misinterpretation. Unfortunately, there is no consensus on its definition, even within a single discipline. Nevertheless, a definite definition of information is essential in a scientific work to avoid inconsistency in the work and confusion and misunderstanding among those who study it. Therefore, a working definition will be established for this study. This theory aims to use “information” in its ordinary, daily-life sense. To seek an appropriate definition for such a term, let us spend some time examining the uses of “information” in some ordinary, daily-life contexts, as follows:

  1. The couple gets information from the salesperson that the house was built with high-grade materials and by a reputable construction company, so they decide to buy it.
  2. Both factual and false information from various sources circulate in social media and can affect the opinions and behaviors of those who believe them.
  3. This science textbook, written by a renowned author, contains important scientific information, which can greatly enhance readers’ knowledge.
  4. The electromagnetic (EM) waves broadcast from a TV station comprise visual and audio information, which will be converted into images and sounds on TVs.
  5. The computer received the information (visual and audio) about the incident from the Internet, processed it, and then displayed the video on its monitor while playing the audio through its speaker and stored the information on its hard drive.

From these common usages, we can see that the shared essential characteristics of “information” are that it refers to a non-material thing that is transferable from one physical entity to another, consists of content, and can have physical effects on the receivers. The examples can be rewritten to demonstrate these essential characteristics explicitly as follows:

  1. Information from the salesperson is non-material. It is transferable from the salesperson to the couple. It consists of content—an account of what the house was built with and by whom it was built—and can have physical effects on the couple’s decision.
  2. Information in social media is non-material. It is transferable from the publishers to the social media to the social media consumers. It consists of content—news, stories, comments, predictions, lessons, etc.—and can have physical effects on the consumers’ opinions and behaviors.
  3. Information in the textbook is non-material. It is transferable from the author to the textbook to the readers. It consists of content—descriptions and discussions of scientific discoveries, experiments, research, theories, formulas, and other scientific matters—and can have physical effects on the readers’ knowledge.
  4. Information in the EM waves is non-material. It is transferable from the TV station to the EM waves to the TVs. It consists of content—virtual images and sounds—and can have physical effects on the TVs’ video and audio systems.
  5. Information about the incident is non-material. It is transferable from the incident to the Internet to the computer and then to the computer’s monitor, speaker, and hard drive. It consists of content—the visual and audio of the incident—and can have physical effects on the computer’s processing system, monitor, speaker, and hard drive.

Before we set up the definition, let us examine three more examples to see whether this plausible characterization is also valid in complex, biological systems at the nervous system, neuronal, and intracellular levels:

  1. The information in the optic nerve when one looks at the outside world tells the brain what the outside world looks like.
    Information in the optic nerve is non-material. It is transferable from the objects in the outside world to the light they reflect, to the retina, to the optic nerve, and finally to other parts of the brain; it consists of content—visual characteristics of the outside world; lastly, it can have physical effects on the brain (by enabling the brain to know what the outside world looks like).
  2. Information of past events is embedded in the neural synaptic structures and can be retrieved to be used at present by the neural processing system to make appropriate decisions and responses.
    Information of past events is non-material. It is transferable from the past events to the neural synaptic structures to the current neural processing system; it consists of content—an account of past events; lastly, it can have physical effects on the processing system (in making decisions and responses).
  3. The biological information that determines the anatomical, physiological, biochemical, and other biological characteristics of an organism is stored in the genes and exerts its effects via the mRNAs and the protein synthesis system.
    The biological information in the genes is non-material. It is transferable from the genes to the mRNAs to the protein synthesis system; it consists of content—anatomical, physiological, biochemical, and other biological characteristics of the organism; lastly, it can have physical effects on the organism (by determining all the biological characteristics of that organism).

Clearly, that information is a non-material thing that is transferable from one physical entity to another, consists of content, and can have physical effects on the receiver is also true in these realms. Therefore, as this essence seems to be generally applicable, it is reasonable to use it as the basis for our working definition.

However, several issues must be considered prior to establishing a proper definition. First, the “non-material thing” in this usage means something that is not composed of the elementary particles of the Standard Model. Therefore, light and other forms of EM waves, although “non-material” in general usage because they are composed of massless photons, are not “non-material things” in this definition because they are composed of photons, which are elementary particles. They thus cannot be this theory’s “information.” At this point, it should be noted that, although information can exist in light and other EM waves, such as in Example 4 above, the light and EM waves themselves are not information, but some non-material thing in them is. What this non-material thing is will be identified later. Second, because the word “transferable” in this context already means “able to be moved from one entity to another,” the phrase “from one physical entity to another” is unnecessary. Third, because information is a thing (a non-material thing) and a thing always consists of something (otherwise, it would be nothing), information always consists of something, which this theory simply calls content. Therefore, stating that information is a non-material thing already means that it consists of something; thus, the phrase “consists of content” is not necessary either. Fourth, when information is transferred to or read by a receiver, the receiver will always have to process the information. Thus, the receiver will always be changed physically, at least temporarily, because of the entailed processing of the transferred information. The final change can range from trivial to significant, but the extent to which the receiver will finally change is another matter. The matter right now is that, if information is transferred to or read by a receiver, the information will always have some physical effects on the receiver. Hence, the phrase “can have physical effects on the receiver” is not necessary in the definition either.

Based on the above considerations, the definition of information in this theory is as follows:

Definition: Information is a non-material thing that is transferable.

This is the definition of information used in this theory. Please be reminded that it is a concise version of the longer, detailed definition: Information is a non-material thing that is transferable from one physical entity to another, consists of content, and can have physical effects on the receiver. We will see that this kind of information is a basic component of this universe and that various important entities, such as the mind, consciousness, and qualia, will be found to be composed of it in various forms. For now, let us examine this “information” in more detail first.

I.2 Information Carrier

Because information is transferable between physical entities, it must exist in one physical entity before and in another after the transfer. Therefore, information must always exist or be embodied in some physical entity [12,13], which will be called an information carrier or, in short, a carrier.* For example, in the first five examples in I.1, the information carriers are 1. speech sound, 2. electronic media, 3. printed texts, 4. EM waves, and 5. electrical signals in wired internet connections, light waves in fiber-optic cables, and radio waves in wireless internet connections.

(*Although some authors believe that information could exist alone by itself (without a physical object), this issue is still controversial [2]. Such information, if existing, is not information that is discussed in this theory because it is not transferable from one physical entity to another, as required by the definition. Thus, it will not be discussed further.)

It can be noted that information carriers can be either static or dynamic. Static carriers do not change their structures during information transfer, whereas dynamic carriers do. Examples of static carriers include paper and other stable materials that can bear visible or tactile signs (plastic, wood, metal, etc.), magnetic tape, hard disks, optical discs, solid-state drives, and various biological molecules/structures (e.g., blood group antigens, human leukocyte antigens, DNA, etc.). Examples of dynamic carriers include sound, electrical, and EM waves. In fact, all physical objects, including elementary and quantum particles, carry information [13,14]. The reason for this will be clear after we understand the nature of information in Section I.4.

Regarding the nervous system, which is the principal system that this theory studies, the major carriers of information that is sent between its basic components, neurons, are a) electrical signals in the form of propagating action potentials at the membranes of axons, dendrites, and somas; synaptic potentials at the synapses; and end-plate potentials at the neuromuscular junctions, and b) chemical signals at synapses and neuromuscular junctions using various neurotransmitters. The major carriers of information that is stored in this system are the structures of synapses and neural circuits [15–23].

I.3 Information Content and Meaning

As discussed in Section I.1, information consists of content. Because information is non-material, its content must also be non-material. For example, in the first five examples in Section I.1, the information content is 1. an account of what the house was built with and so forth, 2. news, stories, comments, and so on, 3. descriptions and discussions of various scientific matters, 4. virtual images and sounds, and 5. the visual and audio of the incident; clearly, all these kinds of content are non-material.

Regarding its function, information content is a representation of something, or means something, to a receiver. That something can be anything: it can be material or non-material and range from very simple to very complex and from virtually meaningless to extremely meaningful. For instance, in Examples 1, 2, and 3 in Section I.1, the information content represents something non-material (1. an account of …, 2. news, stories, comments, … 3. descriptions and discussions of …), while in Examples 4 and 5, it represents both non-material and material things (e.g., the non-material story of the movie and account of the incident, and the material objects in the broadcast movie and incident). Some examples concerning information complexity: the information content of a head signal, a nod or a shake, means yes or no to ordinary people, which is very simple and requires just 1 bit to represent in the electronic binary system, whereas the information content of a radio wave streaming from a digital TV station that broadcasts a movie means, to digital TV processing systems, the movie, which is very complex and requires gigabytes or more to be represented in the electronic binary system.

The meaning of information content depends not only on the content itself but also on the receivers and the context of the information transfer. For example, the information content in the last two examples above is meaningful to its receivers. However, if the receivers in the former case were animals and those in the latter case were analog TV processing systems, the content of the same information would be practically meaningless to the receivers in both cases. Some examples regarding the context are: If the head nodding or shaking is done in some countries, such as Bulgaria, it will represent no or yes, respectively, instead of yes or no, as in most countries; or if it is done while a person is testing something, such as tasting a food or drink, it will represent approval or disapproval, respectively, instead of yes or no. This matter will be analyzed in more detail in Section I.5. However, the important point at this juncture is that the content of information represents something or means something, which can be anything, to the receivers.

Theoretically, because information content can mean anything, some information content can mean qualia (phenomena that manifest what they are like in the mind, such as visions, sounds, smells, emotions, and thoughts) and consciousness (awareness and experiences of such qualia) to some receivers. Therefore, if the nervous system can somehow create information content that means qualia or consciousness and send it to some neural process that can interpret this information, then qualia or consciousness will appear in that neural process, and thus in the brain. This is the principle underlying how phenomenal qualia and phenomenal consciousness can occur in the physical brain; it is discussed in detail in Chapters 3 to 7.

I.4 The Nature of Information: What is information, physically and ontologically?

Now, let us find the solution to a fundamental question about information: What is information’s physical and ontological nature?

Similar to the mind, qualia, and consciousness, the answer can be found by considering its physical properties as follows:

As discussed in the previous sections, information a) is a non-material thing that is not composed of elementary particles, b) is transferable between entities, c) always exists in a carrier, d) consists of content that can mean anything (from very simple to very complex), and e) can have physical effects on the receiver. The entity that is information must also have these properties. Now, if we examine the information and its carriers in all the examples discussed so far, we find that the only entity with all these properties is the pattern of the carrier. This is because a) the pattern of any carrier is a non-material thing that is not composed of elementary particles (although its carrier is), b) the pattern is transferable between entities because a receiver can read, that is, receive, it into its system for subsequent processing, c) the pattern of a carrier always exists in the carrier (a simple fact), d) the pattern of a carrier can consist of content that can mean anything because patterns, in general, can vary infinitely (from arbitrarily simple to arbitrarily complex), and the pattern of a carrier can be any of these infinite patterns, and e) the pattern can have physical effects on the receiver because of the entailed pattern processing in the receiver. Therefore, it can be concluded that information is the pattern of its carrier.

It is important to note that the pattern of a carrier refers to the arrangement of the carrier’s components in space and/or time, so a pattern can be spatial, temporal, or spatiotemporal. An obvious example is the pattern of dots or pixels appearing as the text the reader is reading. The pattern is a non-material thing; it is transferable from the paper or electronic screen to the reader (i.e., the reader can read the pattern and take it into the reader’s own processing); it always exists in the ink dots on the paper or the electronic pixels on the computer screen (without the dots or pixels, the pattern cannot exist); it consists of content, which, in this case, is a discussion about the nature of information, but can vary because the dot or pixel pattern can change to any pattern to represent anything; lastly, it can have effects on the reader (e.g., alter the reader’s knowledge, concepts, and activities). An example of a temporal pattern that is information is Morse code, which utilizes patterns of signals with different time durations. The temporal pattern of Morse code is a non-material thing; it is transferable from the sender to the receiver; it always exists in the time durations of the signals (sounds, flashes of light, pulsations of something, etc.); it consists of content, which in this case is a letter to be sent, but which can vary because the patterns of signals with different time durations can change to represent or mean anything; lastly, it can have effects on the receiver (such as changing the receiver’s knowledge).
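
To make the temporal-pattern example concrete, the following is a minimal, illustrative Python sketch (ours, not part of the theory itself) that renders a few International Morse Code letters as on/off durations; the small letter subset and the function name to_durations are chosen purely for illustration.

```python
# A minimal, illustrative sketch (not part of the theory): a few International
# Morse Code letters rendered as on/off durations. The signal (sound, light,
# etc.) is the carrier; the temporal pattern of its durations is the information.

MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}  # small illustrative subset

def to_durations(text):
    """Return (state, duration) pairs in Morse time units:
    dot = 1 on, dash = 3 on, 1 off between elements, 3 off between letters."""
    pattern = []
    for i, letter in enumerate(text.upper()):
        if i > 0:
            pattern.append(("off", 3))      # gap between letters
        for j, element in enumerate(MORSE[letter]):
            if j > 0:
                pattern.append(("off", 1))  # gap between elements of a letter
            pattern.append(("on", 1 if element == "." else 3))
    return pattern

# Changing the temporal pattern changes the information carried by the same kind of signal.
print(to_durations("SOS"))
```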

That information is the pattern of its carrier is also true in the examples in Section I.1. For example,

  • The information that the salesperson tells the couple is his sound pattern. If his sound pattern changes, the information will change; that is, he tells the couple something else.
  • The visual and audio information broadcast from the TV station is the patterns of EM wave frequency and amplitude. If these patterns change, the visual and audio information will change, and the resulting images and sounds on the TV will change.
  • The information of an image in the optic nerve is the pattern of electrical signaling in the nerve. If this signaling pattern changes, the information will change, and the brain will receive a different image.

Information can be either static or dynamic [6]. During information transfer, static information does not change its content (such as information on paper, on a solid-state drive, and in a DNA base sequence), whereas dynamic information does (such as information in live reports, movies, and analyzing electronic circuits). Both types of information are embodied in carriers by being the spatiotemporal patterns of the carriers (such as the patterns of dots on a paper, bits 0 and 1 in a solid-state drive, base sequences in a DNA strand, frequency and amplitude of sound waves, and duration of sounds [in Morse code]).** In the nervous system, dynamic information is carried principally by electrical or electrochemical signals in neural circuits in the form of signaling patterns (information sent to other neural circuits) and signaling states (information circulating in the entire neural circuits). For later use, both kinds of information are stored in the patterns of synaptic or circuit structures [6,15–26]. For example, when the visual perception neural circuit has finished the process of perceiving the vision of a house, it will have signals circulating in its circuit in a particular pattern—it will be in a certain signaling state—that is the information about the visual perception of the house. When it communicates this information with other neural circuits, it will send signaling patterns that are this information via its axons to the other neural circuits. When the information about this visual perception is stored for later usage, it will be embodied in certain synaptic or circuit structures in the brain.

(** It should be noted that static information [such as a still image] can be carried by both static carriers [such as a paper or a solid-state drive] and dynamic carriers [such as an EM wave] and that, similarly, dynamic information [such as a movie] can be carried by both static carriers [such as a solid-state drive] and dynamic carriers [such as an EM wave].)

Thus, information is the spatiotemporal pattern of something—specifically, of its carrier. This accords with other authors who have recognized this point before. For example, Roederer (2016) said that “Pragmatic information is encoded in the brain dynamically in short-term patterns of neural impulses and statically in the long-term patterns of synaptic architecture.” [6]; Timpson (2013) said that “The basic idea is of a pattern or structure: something which can be repeatedly realized in different instances; and perhaps realized in different media.” [9]; Walker et al. (2016) said that “The signature of biological structure uncovered by our analysis therefore lies not with the underlying causal structure (the network topology) but with the informational architecture via specific patterns in the distribution of correlations unique to the biological network.” [27]; Földiák (2013) said that “Such symbols are not limited to words in a human language; they can be arbitrary patterns on a wide variety of possible physical carriers. They can be sent in a letter, or over a fiber optic cable with minimal cost in energy and time. For example, information can be encoded by ink marks on paper, a lit candle in a window, flags on a ship,” [28]; and an entry in The Oxford Pocket Dictionary of Current English states that “(Information is) what is conveyed or represented by a particular arrangement or sequence of things” [29].

At the most basic level, because every physical object has components and each component can exist in various states, every physical object always has a pattern—a pattern composed of its components in particular states. For example, object O (Figure I.2) comprises two lines, thus having the two lines as its components, and its components (the lines) can exist in various states (such as in various directions, as shown in the figure); hence, object O always has a pattern composed of these two lines, each in a particular state (a particular direction, as illustrated in the figure). Some patterns, among the infinitely many patterns that object O can have, are shown below:

Figure I.2 Patterns of an object O comprising two lines

To repeat this important point, because object O consists of two lines and the two lines always exist in some states, composing some patterns, it is inevitable that object O always has a pattern in some form—it cannot exist without a pattern.

The last statement above applies to all objects, including a single elementary particle. Every elementary particle has several characteristics, such as spatial position, velocity, mass/energy, electrical charge, and spin [30–33], all of which can be considered components of the particle. Accordingly, we can think of these characteristics as the above lines, which are components of object O. Because each of these characteristics can exist in various states (in the same way that each of the two lines above can exist in various directions), every elementary particle always has a pattern composed of its component characteristics in particular states (in the same way that object O always has a pattern composed of its component lines in particular directions).
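
As a rough illustration of this point, the sketch below (a hypothetical representation of ours, not the theory’s formalism) encodes an object’s pattern simply as the particular states of its components; the class names and values are arbitrary.

```python
# A minimal, hypothetical sketch (not the theory's formalism): an object's
# pattern represented simply as the particular states of its components.

from dataclasses import dataclass

@dataclass(frozen=True)
class ObjectO:
    line1_angle_deg: float  # state (direction) of the first line
    line2_angle_deg: float  # state (direction) of the second line

@dataclass(frozen=True)
class ParticleState:
    position: tuple
    velocity: tuple
    mass_energy: float
    charge: float
    spin: float

# Any instance necessarily has *some* pattern: its components in particular states.
o1 = ObjectO(0.0, 90.0)    # one arrangement of the two lines
o2 = ObjectO(45.0, 135.0)  # a different arrangement, hence a different pattern
print(o1 == o2)            # False: same components, different states, different patterns

# A particle, too, always has a pattern: its characteristics in particular states
# (the values below are merely illustrative).
p = ParticleState(position=(0.0, 0.0, 0.0), velocity=(1.0, 0.0, 0.0),
                  mass_energy=0.511, charge=-1.0, spin=0.5)
print(p)
```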

Because every physical object always has a pattern and information is the pattern of a physical object, every physical object always has information, and information is inherent in every physical object. This fact includes every single elementary particle because every single elementary particle always has a pattern, as just discussed.

Finally, it should be noted that, although an information carrier is a physical object composed of elementary particles, the information, that is, the pattern of the carrier, is not composed of them. Hence, information basically differs from its carrier and from all other entities that are composed of elementary particles, which are called conventional physical or mechanical entities (see Section 2.4, Chapter 2), such as all the material things around us, light, and other EM waves. This is why information appears very different from these conventional physical entities.

I.5 Information’s Effects and Semantic Information

As discussed in the previous section, the pattern of a carrier is the information that is embodied in that carrier. However, it is essential to note that, although the pattern of a particular carrier or the information embodied in that carrier is fixed, the interpretations of the pattern or information by different receivers vary. As a result, the meaning to and the effects on the receivers vary. Theoretically, for a certain pattern read by different receivers, an infinite number of interpretations with different meanings and effects exist. This is because, for a certain pattern and information, the interpretations, meanings, and effects depend not only on the pattern or information itself but also on the receivers and contexts of information transfer, the latter two of which can vary without limits.

For example, for a hand raising two fingers, as shown in Figure I.3, the pattern of the fingers, or the information embodied in them, can mean two, victory, or the letter “V” to most people, but it means nothing specific to animals, babies, and people who do not use these conventions.

Figure I.3 Two-finger Pattern

Also, whether it means two, victory, or the letter “V,” what its effects on the receivers are, and what the receivers’ final states are depend on, for instance, whether the receiving person is waiting to know how many concert seats are available, whether his or her soccer team has won or lost, or what letter is missing in the word “li_e.” Because the variability of the receivers and contexts is limitless, there can be infinitely different meanings and effects for this two-finger pattern.

Regarding the final effects of information, we can conclude from the previous discussions that a) if information is meaningful to the receiver, that is, if it means something specific to the receiver (such as those meanings in Examples 1 to 8 in Section I.1; or two, victory, or the letter “V” in the above example; or, in general, an answer to a question of whether, what, who, whom, which, when, where, why, or how [34]), it will have specific effects that finally change the receiver into a significantly different physical state (such as those in the eight examples in Section I.1), and b) those final effects and the resulting states are different for different meanings (such as the different resulting states of people interpreting different meanings of the two-finger pattern in the example above). On the other hand, if information is meaningless (that is, if it does not mean something specific but is just a nonspecific input to a receiver), it will have only nonspecific effects that will not change the receiver into a significantly different physical state. These nonspecific effects result from the receiver’s regular physical processing of any received information. They finally dissipate and, ultimately, do not change the receiver into a new physical state (such as in the case of animals or babies seeing the two fingers in the example above). Hence, it can be concluded that the meaning of information to a certain receiver is the effect of the information on that receiver.

As already discussed, the meaning and effect of information depend not only on the pattern of the carrier but also on the receiver and context. Interestingly, in some cases, different patterns (different information) can yield the same effects on the same receiver in the same context. For example, after Alice asks Bob what number she has to jot down on the paper, Bob can say three, raise three fingers, or show a card with “3” on it, and normally, Alice would write three on the paper for each signal. This means that the different patterns of Bob’s signaling, which are different information, have the same effects on and meanings to Alice. Such different information with the same effects is said to be semantically the same. That is, when different patterns (different information) have the same effects on the receiver, they have the same meaning to the receiver and are thus semantically the same. Vice versa, different patterns (different information) have the same meanings, are semantically the same, and are the same semantic information to the receiver if they have the same effects on the receiver.
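
The Alice-and-Bob example can be sketched schematically as follows; the code is a hypothetical illustration, and the interpretation table inside it is an assumed stand-in for Alice’s far more complex neural processing.

```python
# A minimal, hypothetical sketch of "semantically the same" information: three
# physically different signals (patterns) that have the same effect on the same
# receiver in the same context. The interpretation table is an assumed stand-in
# for Alice's far more complex neural processing.

def alice_interprets(signal):
    """Alice (the receiver), in the context 'what number should I write down?'."""
    interpretation = {
        "spoken word 'three'": 3,
        "three raised fingers": 3,
        "card showing the digit 3": 3,
    }
    return interpretation.get(signal)  # None for signals meaningless to Alice

signals = ["spoken word 'three'", "three raised fingers", "card showing the digit 3"]
print({alice_interprets(s) for s in signals})  # {3}: different patterns, one effect,
                                               # hence the same semantic information
```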

This concept has been expressed before: Casagrande (1999) said that “Marking [of information] can occur by one person rearranging patterns of meaning (e.g., color and form in the case of modern art) so that they are only recognizable at a specific epistemological level. Marking also relies on the ability of the receiver to recognize meaning in patterns.” [35], and Dubrovsky (2019) said that “When I say to a student: ‘Come up to me!’ and he performs this action, it is caused and determined not by the physical properties of the words I uttered but namely by the information expressed with their help, its semantic and pragmatic features. … I can cause exactly the same effect to be brought about by other words and, in general, by signals very different in their physical properties (by virtue of the principle of invariance of information with respect to the physical properties of its carrier…)” [36].

I.6 The Amount of Information

The next interesting question is: What is the amount of information (AOI) of a piece of information, or information piece? The answer depends on what we mean by AOI. If we define it as something that quantitatively correlates with an information piece’s effect, then the AOI of an information piece depends on what the receiver and the context of the information transfer are because information’s effects vary with these two factors, as discussed above. Therefore, because its effects are variable, an information piece does not have a fixed AOI associated with it.

However, AOI can be defined in other ways. If we define the AOI of an information piece as something that quantitatively correlates with the number of characters, electronic bits, or other physical items employed to represent that information piece, then the AOI of that information piece will be fixed (not dependent on the receiver and the context of the information transfer) but will depend on how we represent it. For example, the AOIs of consciousness and 意识 (the Chinese word for consciousness) are fixed at 13 and 2 characters, respectively, regardless of the receiver and the context of information transfer. Notably, the AOI of these two information pieces is different even though they represent the same thing (consciousness). Also, if we choose other representations, such as different characters, for consciousness, the AOI of these information pieces will also differ even though they are information pieces of the same thing (consciousness).

The AOI of an information piece, henceforth abbreviated as IP for conciseness, may also be defined in yet other ways. One definition that is important in communication engineering can be inferred from Shannon’s theory (1948) [7]. In this case, the AOI of an IP in a set of IPs is defined as something that quantitatively correlates with the improbability of its occurrence in that set—the greater the improbability, the more AOI. For example, in the letter set {m,i,n,d}, every letter has equal AOI because each item occurs with equal probability (hence, equal improbability), while in {d,a,t,a}, the letters “d” and “t” have more AOI than “a” because they occur less often in the set than “a” does (this is the case even though each character is equally important to us in the usual sense!). It should be noted that, similar to the second definition above, this kind of AOI is fixed for an IP (not dependent on the receiver and the context of the information transfer). For instance, the characters in the two sets exemplified have fixed AOI no matter who receives them and what the situations of sending them are.
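
For readers who want to check these two letter-set examples numerically, here is a short sketch that uses −log2 of the occurrence probability (Shannon’s self-information) as one common monotonically increasing measure of improbability; this particular choice of measure is an assumption of the sketch, not something the definition above prescribes.

```python
# A short numerical check of the two letter-set examples, using -log2 of the
# occurrence probability (Shannon's self-information) as one common monotonically
# increasing measure of improbability.

from collections import Counter
from math import log2

def self_information(items):
    counts = Counter(items)
    total = len(items)
    return {x: -log2(counts[x] / total) for x in counts}

print(self_information("mind"))  # every letter: 2.0 bits (probability 1/4 each)
print(self_information("data"))  # 'd' and 't': 2.0 bits; 'a': 1.0 bit (it occurs twice)
```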

The receiver- and effect-independent characteristic of this type of AOI can be seen more clearly in the following example. Please consider a set of characters representing two equations:

{a. XX = 4, b. X + YY/X = 4}.

In this set, a and b each occur once among all the 26 items (including spaces, periods, and a comma but not the curly brackets, which are not content in the set but markers of the set). Because the probability of a’s or b’s occurrence (or the chance that a or b will occur) in this set is 1/26, their occurrence improbabilities (or the chance that a or b will not occur) are 25/26 each in this set. Since, by this definition, their AOI quantitatively correlates with the improbability of their occurrences in the set, their AOI is some monotonically increasing function, f( ), of this occurrence improbability. Thus, for a and b, their AOI is f(25/26). Because the occurrence probabilities of X and Y in this set are 4/26 and 2/26, respectively, their occurrence improbabilities are 22/26 and 24/26, respectively, and their AOI is f(22/26) and f(24/26), respectively—both of which are less than f(25/26).

Hence, because they occur least often in the set, a and b have the most AOI, even though they are mere item symbols, do not have significant effects on the usual receivers, who deal with (e.g., solve) these equations, and can even be left out. In contrast, X and Y, even though they are essential IPs in the set, have less AOI because they occur more often. Also, even though X and Y are of equal importance, being similar unknown variables to be solved for, they have different AOI. Therefore, the reader should be mindful of what AOI under this definition actually means.
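
The counts and improbabilities used above can be verified with a short script; the string literal below is simply the 26 items of the set written out in order.

```python
# A short check of the counts used above (spaces, periods, and the comma
# included; the curly brackets excluded).

from collections import Counter

items = "a. XX = 4, b. X + YY/X = 4"
counts = Counter(items)
total = len(items)
print(total)  # 26
for symbol in ["a", "b", "X", "Y"]:
    p = counts[symbol] / total               # occurrence probability
    print(symbol, counts[symbol], p, 1 - p)  # a, b: 1/26 and 25/26; X: 4/26 and 22/26; Y: 2/26 and 24/26
```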

(N.B. This rather surprising meaning of AOI originated from the attempt to communicate information through an electronic channel in the most efficient way, i.e., with the fewest electronic bits possible. For a set consisting of IPs, this can be achieved if the number of bits representing an IP increases with that IP’s occurrence improbability in the set. That is, the more frequently occurring IPs are represented with fewer bits (such as 00, 01, 10, 11, 100, 101, and so on), while the less frequently occurring IPs are represented with more bits (such as 111, 1110, 1111, and so on). For instance, in the example above, the equation set can be sent with the fewest bits if IPs such as X and Y, which occur more frequently, are sent with fewer bits (such as two or three bits) while the IPs a and b, which occur less frequently, are sent with more bits (such as four bits). However, because an IP with a lesser occurrence probability (such as a and b in the example) has more bits representing it, some hold that it has more AOI. Certainly, this AOI is not AOI in the ordinary, daily-life sense, and confusion and misunderstanding can occur if one does not observe what this AOI really refers to.)
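
As a concrete, illustrative sketch of this variable-length-coding idea, the following code builds a Huffman code (one standard scheme of this kind, chosen here as an example; the note itself does not name a specific algorithm) for the equation-set string used earlier, so that rarely occurring items end up represented with more bits.

```python
# An illustrative sketch of variable-length coding using Huffman coding:
# frequent items receive fewer bits; rare items receive more.

import heapq
from collections import Counter

def huffman_codes(items):
    counts = Counter(items)
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-built-so-far}).
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("a. XX = 4, b. X + YY/X = 4")
for sym in ["a", "b", "X", " "]:
    # Rarer symbols never get shorter codes than more frequent ones; here the
    # rare a and b end up with longer codes than X and the space.
    print(repr(sym), codes[sym])
```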

Also, it should be noted that if we represent XX and YY in other forms, such as X^2, Y^2, X2, or Y2, the occurrence probabilities of both X and Y in this set will change. Consequently, their AOI will change even though they are still the same variables.

In summary, the amount of information (AOI) of a piece of information (IP) depends on its definition. In some definitions, such as the first definition above, the AOI is variable and depends on the IP’s effects on the receivers, which in turn depend on who the receivers are and what the context of the information transfer is. Thus, under this definition, an IP does not have a fixed AOI associated with it. In other definitions, such as the last two, the AOI of an IP depends on how we represent it; moreover, it is fixed and always the same for all receivers, regardless of its meanings and effects, which may differ among those receivers. In short, the AOI of an IP in these last two definitions does not correlate with the IP’s effects on the receivers but with the number of characters, electronic bits, or other physical items we use to represent the IP.

I.7 How Information Can Exert Effects

The next crucial question is how information, which is a non-material, non-conventional physical entity, can affect a conventional physical system.

This answer can be found by examining what happens during information transfer. First, consider the case of a woman waiting for information from a man regarding the number she must say. When the man gives the information by raising two fingers and the woman sees them, she receives the required information and says the correct number. Superficially, it seems that the information leaped from the signaling man to the woman and somehow made the woman say the number correctly (as illustrated in Figure I.4).

Figure I.4 Information leaping from the man?

However, what actually happens is that a) streams of photons reflected from the information source (the fingers) hit the woman’s retinae, creating electrical signals there, b) the created signals then run along the optic nerves and the optic tracts to the primary visual processing circuits and on to many subsequent visual processing and other related circuits in the woman’s brain, including those for language, cognition, and motor command, and c) after the signals are processed by those complex processing circuits, final signals are created that make the woman’s vocalizing organs produce the sound “two” correctly. All these events are predictable physical cascades of electromagnetic, electrochemical, biochemical, and, finally, mechanical interactions involving only conventional physical components of the information source, the intermediate information carrier (the light waves), and the woman’s processing systems. Nowhere in the chain of these cascades does anything other than two fundamental physical forces (mainly the electromagnetic force, with some gravitational force later) exert effects on these conventional physical components. Thus, this event is just an ordinary, although very complex, physical phenomenon, with only conventional physical entities and laws involved—neither a new physical entity nor a new physical law is involved in these physical interaction cascades. Hence, it may seem that the entire process occurs without any effects from the information and that information is merely an invented, redundant concept.

Nevertheless, despite this seeming redundancy, information is not a dispensable or imaginary entity—information does exist as an entity in the universe, has physical effects, and cannot be done away with. This can be demonstrated as follows:

Figure I.5 How information exerts its effects

Let us reconsider what happens in the information transfer in the previous example (please see Figure I.5). We will see that the entire process and final result of the information transfer depend, in fact, on the initial characteristics of the interacting systems, which include the information source, receiver, and surrounding systems. In each system, the initial characteristic is the initial pattern of all the physical characteristics (the spatial position, velocity, mass/energy, etc.) of the constituent particles. These initial patterns of the three systems are the determinants of their initial interactions, which determine subsequent interactions and, ultimately, the final result. Therefore, because the source’s information is the initial pattern of the source system, it is one of the three factors that determine how the entire process initiates and proceeds and what the final result is. If the initial patterns of the other two systems remain the same, the whole process’s development and final result will vary with the initial pattern or information of the source system only. In the above example, the pattern of the man’s gesture (the two fingers) is the initial pattern of the source system. This initial pattern determines what will happen and what the final result will be for the woman. If this pattern changes, such as if the man shows three fingers, even though the woman and the surroundings do not change, the initial characteristics of the three systems will change, and the entire process and final result in the woman will change. On the contrary, if the pattern (the two-finger pattern) does not change, even if the man is replaced by another man, a woman, a child, a robot, etc., the overall initial characteristics of the three systems will essentially not change, and the entire process will practically be the same, with the same final result in the woman: She will say “two.”

Hence, although the source’s information does not affect each of the innumerable physical interactions in these complex interaction cascades directly by being the forces or physical components involved, it affects the entire interactions and final result from the very beginning by being the initial pattern of the physical components of the information source. Because such effects from information cannot be removed, information does have physical effects, exists as an entity in the universe, and cannot be eliminated.
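
The argument can be caricatured in a few lines of code; the function below is a hypothetical, deterministic stand-in for the whole physical cascade, not a simulation of it, and its mappings are assumptions made only to show that the outcome varies with the source’s pattern, the receiver, and the surroundings.

```python
# A minimal, hypothetical caricature of the argument above. The function is a
# deterministic stand-in for the entire physical cascade (not a simulation); its
# mappings are assumptions made only for illustration.

def physical_cascade(source_pattern, receiver, surroundings):
    if receiver == "woman expecting a number" and surroundings == "ordinary room":
        return {"two fingers": 'says "two"',
                "three fingers": 'says "three"'}.get(source_pattern, "no specific response")
    return "no specific response"

# Same receiver and surroundings, different source pattern -> different outcome:
print(physical_cascade("two fingers", "woman expecting a number", "ordinary room"))
print(physical_cascade("three fingers", "woman expecting a number", "ordinary room"))
# The same pattern shown by a different sender (man, woman, robot, ...) -> the same
# outcome, because only the source's pattern enters the cascade here.
```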

It should be noted that this concept is similar to previous notions by other authors. For example, Madden (2000) said that “information should be defined as: a stimulus originating in one system that affects the interpretation by another system of either the second system’s relationship to the first or of the relationship the two systems share with a given environment…” [5]; Lombardi et al. (2015) said that “Peter Kosso states that ‘information is transferred between states through interaction.’ (1989, p. 37). The need of a carrier signal sounds natural in the light of the generic idea that physical influences can only be transferred through interactions.” [4]; and Roederer (2016) said that “Information-driven interactions all involve complex systems in the classical domain, with time sequences which fulfil the dictates of causality, locality, special relativity and thermodynamics. It is important to emphasize that information-driven interactions do function on the basis of force-driven interactions between their components—what counts is how these purely physical components are put together in the interaction mechanism (the ‘informational architecture’ of Walker et al.).” [6].

I.8 The Mechanism for Information to Mean a Certain Thing

One of the most interesting and important questions regarding information is: What is the mechanism for a piece of information to mean a certain thing to a receiver? Importantly, what is the mechanism for a piece of information to mean phenomenal qualia or phenomenal consciousness to a receiver? If we know this mechanism, we can apply it to make a machine or robot have phenomenal qualia and phenomenal consciousness in its system, and it will become conscious like us.

The answer is that, for a piece of information I to mean a certain thing M to a receiver R, I must have physical effects on R such that R changes into R + M; that is, R changes into a new physical entity that has the meaning M in its system. This will make all of R’s subsequent activities be based on having M in its system instead of on lacking it. The other requirement is that these effects must be caused purely by a physical interaction cascade initiated by the interaction between I, R, and the surrounding system.

For example, please see Figure I.6. If the initial state R consists of a group of particles [a,b,c,d,e,f,g], with specified characteristics—such as spatial position, velocity, mass/energy, electrical charge, and spin—for [a] = a1,a2,a3,a4,a5, respectively, for [b] = b1,b2,b3,b4,b5, respectively, and so on, and if the final state R + M consists of a group of particles [a′,b′,c′,d′,e′,f′,g′], with similar specified characteristics for [a′] = a′1,a′2,a′3,a′4,a′5, [b′] = b′1,b′2,b′3,b′4,b′5, and so on,

S = a signal with information I, S′ = the signal after the information transfer
R = a receiver, R + M = the receiver after the information transfer

Figure I.6 The mechanism for signal S with information I to mean M to receiver R

then for I to mean M to R, I’s particles (the particles of I’s carrier, to be exact) must interact with R’s particles such that a physical interaction cascade occurs in R that finally changes R to R + M, that is, changes

[a]:(a1,a2,a3,a4,a5) to [a′]:(a′1,a′2,a′3,a′4,a′5),

[b]:(b1,b2,b3,b4,b5) to [b′]:(b′1,b′2,b′3,b′4,b′5),

and so on.
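
The R-to-R + M notation above can be made executable in a schematic way; in the sketch below, the particle labels and characteristic values are placeholders, and interaction_cascade is a hypothetical stand-in for the physical cascade, not a physical model.

```python
# A schematic, executable rendering of the R -> R + M notation above.

# Receiver R: each particle with its characteristics (position, velocity,
# mass/energy, charge, spin), here just numbered placeholders.
R = {
    "a": ("a1", "a2", "a3", "a4", "a5"),
    "b": ("b1", "b2", "b3", "b4", "b5"),
    # ... and so on for c, d, e, f, g
}

def interaction_cascade(receiver, signal_I):
    """Stand-in for the cascade initiated by a signal S carrying information I:
    it returns the receiver in its new state, R + M. (signal_I is unused here;
    the stand-in simply marks every characteristic as changed.)"""
    return {p: tuple(c + "'" for c in chars) for p, chars in receiver.items()}

R_plus_M = interaction_cascade(R, signal_I="SP meaning M")
print(R_plus_M["a"])  # ("a1'", "a2'", "a3'", "a4'", "a5'"): particle a in its new state
```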

Similarly, for an SP (signaling pattern) of the visual perception neural process, which is the information (I) of this neural process, to mean “a visual quale of a house” or “an image of a house that manifests what it looks like” (M) to the consciousness neural process (R), which supposedly consists of a group of particles [a,b,c,d,e,f,g], the following must be the case:

  • The final state of the consciousness neural process with information that means “a visual quale of a house” in its system (R + M) must be able to exist. That is, the state of [a′,b′,c′,d′,e′,f′,g′] with specified characteristics for [a′] = a′1,a′2,a′3,a′4,a′5, [b′] = b′1,b′2,b′3,b′4,b′5, and so on must be possible, which indeed is possible because we really can be aware of a visual quale of a house, meaning M can exist in R.
  • The SP (I) must have physical effects on the consciousness neural process (R), which consists of [a,b,c,d,e,f,g], such that a physical interaction cascade occurs (in the consciousness neural process [R]) that finally changes the consciousness neural process (R) into the consciousness neural process with information that means “a visual quale of a house” (M) in its system, i.e., (R + M). That is, in detail, the physical cascade that occurs must finally change [a]:(a1,a2,a3,a4,a5) to [a′]:(a′1,a′2,a′3,a′4,a′5), [b]:(b1,b2,b3,b4,b5) to [b′]:(b′1,b′2,b′3,b′4,b′5), and so on. This process will make subsequent activities of the consciousness neural process and its connected neural processes consist of information with this meaning (M), rendering them able to report that the quale has occurred, remember it, create an emotion towards it, and so on.

To reiterate, as we know that qualia occur in us and that we have various activities concerning them (such as thinking, talking, and writing about them), the state of consciousness with qualia information in its system (R + M) must exist. This, in turn, means that a signaling pattern SP with information I that has the meaning M of a quale can occur and be transferred, processed, and interpreted as a quale by the consciousness process (R). Because the consciousness neural process interprets the information as a quale, a quale occurs in the consciousness neural process, and thus in the physical brain. This is how information can mean a quale and how a quale can occur in the physical system, the consciousness neural process of the brain. The matter is the same for a signal with information I meaning consciousness. In this case, the information I has consciousness, instead of a quale, as its meaning (M), but the whole interaction process proceeds similarly.

Therefore, for a machine or robot to have phenomenal qualia and phenomenal consciousness in its system or for a machine or robot to experience qualia and be conscious, a state of R + M or a state of the machine’s processing system with information that means phenomenal qualia and phenomenal consciousness in its system must exist. Although, at present, we do not know whether this state is possible, if we want to make a machine or robot with phenomenal qualia and consciousness, we must assume that it is possible and strive to realize it. However, because of the requirement above, we should remember that trying to create conscious machines or robots will never succeed if we aim only for the complexity of their electronic circuits and processes but neglect this elementary fact.

Figure I.7 An unconscious robot (A) vs a conscious robot (B)

Essentially, the principal target must be generating information that means phenomenal qualia and consciousness in their processing systems. This can be achieved by imitating how the nervous system produces and then interprets information with such meanings.

I.9 Information: A Basic Feature of the Universe

Hence, physically and ontologically, information is not a new physical entity governed by new physical rules but the patterns of conventional physical entities. Because conventional physical entities are governed by, and predictable from, conventional physical laws, so is information, which is just their patterns. Thus, information is a physical entity as well, but an unconventional one. Unlike conventional physical entities, it is not composed of elementary particles—it is a pattern of them—which is why it appears so different from conventional physical entities.

Because information is the pattern of a physical object and a physical object always exists in some pattern, a physical object always has information embodied in it, and this fact includes every single elementary particle, as discussed in Section I.4.

Regarding physical interactions, because how all physical interactions transpire depends not only on the physical rules that govern the interactions but also on the interacting agents’ characteristics, which always exist in patterns, and because patterns are information, interacting agents’ information is the basis of their interactions. This is true for all physical interactions, from those between elementary particles to those between galactic clusters. Thus, fundamentally, physical interactions are information processing, and information determines all events in this universe. This has been the case since the inception of everything, and based on existing evidence, it will always be. Therefore, information is a fundamental feature of the universe.

Again, the concept that information is a basic feature of the universe is not novel; several authors have previously expressed this idea. For example, Wheeler (1990) pointed out that “In short, that all things physical are information-theoretic in origin and that this is a participatory universe” [37], and there is a concept that endorses an information-theoretic, metaphysical monism: The universe’s essential nature is digital, being fundamentally composed of information as data/dedomena instead of matter or energy, with material objects as a complex secondary manifestation [3].

I.10 Remarks

It is important to bear in mind that “information” in this theory is a non-material thing that is transferable. In detail, this means that it is a non-material thing that is transferable from one physical entity to another, consists of content, and can have physical effects on the receiver. This type of information does not exist by itself but always in a physical entity, called a carrier. Physically and ontologically, it is the pattern of its carrier. According to this theory, the mind, qualia, and consciousness are entities that are composed of information (in certain forms). That is, the reader’s mental images, sounds, smells, etc., awareness of them, feelings, emotions, memories, and all other mental phenomena, including the reader’s thought right now, are information in some forms.

The answer to how information can manifest itself as these mental phenomena can be found in the chapter “The Explanatory Gap,” and the answer to why such information occurs in us can be found in the chapter “The Hard Problem.”



References

  1. Adriaans P. Information. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Fall 2013 edition). https://plato.stanford.edu/archives/fall2013/entries/information
  2. Floridi L. Is semantic information meaningful data? Philos Phenomenol Res. 2005 Mar;LXX(2):351–370. http://www.philosophyofinformation.net/wp-content/uploads/sites/67/2014/05/iimd.pdf
  3. Floridi L. Semantic conceptions of information. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy (Spring 2017 edition). https://plato.stanford.edu/archives/spr2017/entries/information-semantic
  4. Lombardi O, Holik F, Vanni L. What is Shannon information? Synthese. 2015 Jul. doi: 10.1007/s11229-015-0824-z. https://www.researchgate.net/publication/279780496_What_is_Shannon_information
  5. Madden A. A definition of information. Aslib Proceedings. 2000 Nov;52:343–349. doi: 10.1108/EUM0000000007027. https://www.researchgate.net/publication/241708484_A_definition_of_information
  6. Roederer JG. Pragmatic information in biology and physics. Philos Trans A Math Phys Eng Sci. 2016 Mar 13;374(2063). PII:20150152. doi: 10.1098/rsta.2015.0152. http://rsta.royalsocietypublishing.org/content/374/2063/20150152.long
  7. Shannon CE. A mathematical theory of communication. The Bell System Technical Journal. 1948 Jul;27(3):379–423. https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
  8. Timpson C. Philosophical aspects of quantum information theory. In: Rickles D, editor. The Ashgate companion to the new philosophy of physics. Aldershot: Ashgate Publishing; 2008.197–261. https://arxiv.org/pdf/quant-ph/0611187.pdf
  9. Timpson C. Chapter 2. What is information? In: Avramides A, Child W, Eagle A, Mulhall S, editors. Quantum information theory and the foundations of quantum mechanics. Oxford, United Kingdom: Oxford University Press; 2013:10–42. ISBN 978-0-19-929646-0, 978-0-19-874813-7.
  10. Ernesto J, Barrios R. Information, genetics and entropy. Principia. 2015;19(1):121–146. doi: 10.5007/1808-1711.2015v19n1p12. https://www.researchgate.net/publication/286403979_Information_Genetics_and_Entropy
  11. Bateson G. Form, substance, and difference. In: Steps to an Ecology of Mind. Northvale, New Jersey, London: Jason Aronson Inc.;1972,1987:455–457. ISBN 0-87668-950-0.
  12. Doyle B. What is information? The Information Philosopher. https://www.informationphilosopher.com/index.29.en.html/fqs/article/view/1450/2946  https://www.informationphilosopher.com/
  13. Bub J. Quantum entanglement and information. In: Zalta EN, Nodelman U, editors. The Stanford Encyclopedia of Philosophy (Summer 2023 Edition). https://plato.stanford.edu/entries/qt-entangle/#QuanInfo https://plato.stanford.edu/archives/sum2023/entries/qt-entangle/
  14. Vopson MM. Estimation of the information contained in the visible matter of the universe. AIP Advances. 2021;11(10):105317. doi: 10.1063/5.0064475. https://pubs.aip.org/aip/adv/article/11/10/105317/661214/Estimation-of-the-information-contained-in-the
  15. Abbott LF, Losonczy A, Sawtell NB. The computational bases of neural circuits that mediate behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  16. Amaral DG. The neuroanatomical bases by which neural circuits mediate behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  17. Augustine GJ. Unit I. Neural signaling. In: Purves D,‎ Augustine GJ,‎ Fitzpatrick D,‎ Hall WC,‎ LaMantia AS,‎ Mooney RD, Platt ML, White LE, editors. Neuroscience. 6th ed. New York: Oxford University Press; 2018:31–190.
  18. Bean BP, Koester JD. Propagated signaling: The action potential. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  19. LaMantia AS. Unit IV. The changing brain. In: Purves D,‎ Augustine GJ,‎ Fitzpatrick D,‎ Hall WC,‎ LaMantia AS,‎ Mooney RD, Platt ML, White LE, editors. Neuroscience. 6th ed. New York: Oxford University Press; 2018:489–623.
  20. Shadlen MN, Kandel ER. Nerve cells, neural circuitry, and behavior. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  21. Siegelbaum SA, Clapham DE, Marder E. Modulation of synaptic transmission and neuronal excitability: Second messengers. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  22. Siegelbaum SA, Fischbach GD. Overview of synaptic transmission. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  23. Yuste R, Siegelbaum SA. Synaptic integration in the central nervous system. In: Kandel ER, Koester JD, Mack SH, Siegelbaum SA. editors. Principles of Neural Science. 6th ed., Kindle ed. McGraw Hill; 2021.
  24. Langille JJ, Brown RE. The synaptic theory of memory: A historical survey and reconciliation of recent opposition. Front Syst Neurosci. 2018 Oct. https://doi.org/10.3389/fnsys.2018.00052  https://www.frontiersin.org/articles/10.3389/fnsys.2018.00052/full
  25. Kennedy MB. Synaptic signaling in learning and memory. Cold Spring Harb Perspect Biol. 2016 Feb;8(2):a016824. doi: 10.1101/cshperspect.a016824. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4743082/
  26. Mayford M, Siegelbaum SA, Kandel ER. Synapses and memory storage. Cold Spring Harb Perspect Biol. 2012 Jun;4(6): a005751. doi: 10.1101/cshperspect.a005751. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3367555/
  27. Walker SI, Kim H, Davies PCW. The informational architecture of the cell. Phil Trans A Math Phys Eng Sci. 2016 Mar;374(2063):20150057. doi: 10.1098/rsta.2015.0057. https://royalsocietypublishing.org/doi/10.1098/rsta.2015.0057?keytype2=tf_ipsecsha&ijkey=b7acd66b689ed88cb012ba067bccefdf38e6c93b
  28. Földiák P. Chapter 19 – Sparse and explicit neural coding. In: Quiroga RQ, Panzeri S, editors. Principles of Neural Coding. Kindle Edition. Taylor and Francis CRC ebook account; 2013:379–389.
  29. The Oxford Pocket Dictionary of Current English. “information.” Encyclopedia.com. https://www.encyclopedia.com/social-sciences-and-law/law/law/information  
  30. Martin SP, Wells JD. Introduction. In: Elementary particles and their interactions. Graduate texts in physics. Springer, Cham; 2022
  31. Gasiorowicz SG, Langacker P. Elementary particles in physics. https://www.physics.upenn.edu/~pgl/e27/E27.pdf
  32. Elert G. The Standard Model. In: The Physics Hypertextbook. https://physics.info/standard/
  33. The Open University. Particle physics: Understanding the basics. The Open University. Kindle Edition. Edition 1.1; 2023.
  34. Moses OE. A review of the role of modern ICT tools in economic development. www.academia.edu. 2019 Mar. https://www.academia.edu/38641211/A_REVIEW_OF_THE_ROLE_OF_MODERN_ICT_TOOLS_IN_ECONOMIC_DEVELOPMENT
  35. Casagrande D. Information as verb: Re-conceptualizing information for cognitive and ecological models. Journal of Ecological Anthropology. 1999;3(1):4–13. doi: 10.5038/2162-4593.3.1.1. https://digitalcommons.usf.edu/cgi/viewcontent.cgi?article=1093&context=jea
  36. Dubrovsky DI. “The Hard Problem of Consciousness.” Theoretical solution of its main questions. AIMS Neurosci. 2019;6(2):85–103. doi: 10.3934/Neuroscience.2019.2.85. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7179338/?report=classic
  37. Wheeler JA. Information, physics, quantum: The search for links. In: Zureck WH, editor. Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison Wesley; 1990:309–336. https://cqi.inf.usi.ch/qic/wheeler.pdf

Keywords: Information, Information nature, Information effects, Semantic information, Neural information, Information theory, what is information, types of information