Computational Psychiatry: A Systems Biology Approach to the Epigenetics of Mental Disorders 1st ed.

7. Tools for the Future: Hidden Symmetries

Rodrick Wallace1

(1)

New York State Psychiatric Institute, New York, NY, USA

Summary

Adopting Maturana’s and Varela’s perspective on the necessity of cognition—or Rosen’s view of “anticipation”—at every scale and level of organization of the living state, it is possible to extend Yeung’s relation between information inequalities and finite group structure to a fundamental duality between an information-theoretic characterization of cognition in gene expression and the groupoid extension of simple symmetries. It appears that gene expression, responding to and hence “anticipating” environmental cues, in a large sense, may sometimes be characterized by groupoids of increasing complexity constructed from underlying finite groups. While higher organisms at later developmental stages may not be described so simply, Yeung’s powerful results suggest the existence of regularities built from finite groups during certain developmental periods, via an extension of the spontaneous symmetry breaking/making formalism familiar from physical theory. Essentially, the S-shaped developmental curve serves as a temperature analog driving increasing levels of deeply underlying topological symmetries. This suggests, as in the characterization of complex geometric objects by “simpler” structures in algebraic topology, that a shift of perspective from gene expression networks and their dynamics to their underlying symmetries may provide deeper insight into ontogeny and its dysfunctions.

The inference of gene regulatory networks…from experimental observations is at the heart of systems biology. This includes the inference of both network topology and its dynamics. (Vera-Licona et al. 2014)

…[P]eriodic solutions…can occur in a network with no symmetry…generated by symmetry on a quotient network. (Golubitsky et al. 2012)

…[I] was astonished to find this profusion of anticipatory behavior at all levels of biological organization. (Rosen 2012)

7.1 Introduction

Recent work by Hendriks et al. (2014) on gene expression in the relatively simple reference organism C. elegans uncovered unexpected periodicities. Their genome-wide, temporally highly resolved gene expression studies revealed extensive periodic gene expression during C. elegans larval development, affecting a fifth of expressed genes: robust, transcriptionally driven oscillations across a continuum of phases result in periodic translation, promoting periodic developmental processes. These unanticipated results highlight the dynamics and complexity of gene expression patterns during C. elegans development. Hendriks et al. propose that a unique combination of features makes these oscillations a powerful model for studying coordinated gene expression in an animal, i.e., that insights into the mechanisms achieving robust phase locking and a broad distribution of phases may illuminate coordinated gene expression more generally. They find it striking that oscillations are robustly detectable in RNA from whole animals, suggesting the presence of effective mechanisms, yet to be uncovered, that coordinate oscillations spatially and temporally.

The associated commentary by Laxman et al. (2014) explores numerous similar examples, finding that such transcriptional clocks allow organisms to temporally compartmentalize biological processes in synchrony with environmental cues. As they note, in photosynthetic cyanobacteria, the periodic expression of clusters of circadian clock-controlled genes enables the organism to generate energy and grow while temporally separating incompatible chemical processes, a mechanism also observed in plants. Such cycles, they assert, can be considered as being polytopic, regulating multiple processes through temporal compartmentalization of metabolism, suggesting that cyclic changes in a cell’s metabolic state can drive such biological oscillations. Their reference list is striking.

Theoretical work on symmetries associated with periodicities and oscillations in network systems lays a foundation for understanding the yet-to-be-uncovered mechanisms. Following closely the arguments of Golubitsky and Stewart (2006), a formal theory of symmetries of networks of coupled dynamical systems, using the permutation group of the nodes that preserves the network topology, has existed for some time. Global network symmetries impose strong constraints on their associated dynamical systems, characterizing equilibria, periodic states, heteroclinic cycles, and chaotic states. The group symmetries of the network can lead to synchrony, phase relations, resonances, and synchronous or cycling chaos. However, group symmetry is, they argue, too restrictive an assumption, and network theory should be more comprehensive. They use an extension of the group-theoretic notion of symmetry to replace global symmetries with bijections between appropriate input subsets of the directed edges of the network. The symmetry structure is then a groupoid, an algebraic structure that resembles a group but differs in that the product of two elements is not necessarily defined. Golubitsky and Stewart argue that use of groupoids makes it possible to extend group-theoretic methods to more general networks, permitting full classification of dominant patterns of synchrony in terms of the combinatorial structure of the network. Golubitsky et al. (2012) find that periodicities can be driven on a network without symmetries by those characterizing a quotient network.

Adapting an information theory treatment of cognition and Yeung’s (2008) analysis of the consonance between information theory inequalities and results from the theory of finite groups, it becomes possible to expand the underlying arguments to more complex systems than C. elegans and yeasts.

7.2 The Cognitive Paradigm for Gene Expression

A direct approach to the dynamics of the regulation of gene expression invokes an extension of the “cognitive paradigm” of Atlan and Cohen (1998), who recognized that the immune response is not merely an automatic reflex, but involves active choice of a particular response to insult from a larger repertoire of possible responses, according to comparison with an internal learned or inherited picture of the world. Choice reduces uncertainty and implies the existence of an underlying information source (Wallace 2012a). Similar perspectives apply to gene expression (Wallace and Wallace 2010). That is, gene expression is also a cognitive phenomenon, analogous to, if different from, the immune response since, at developmental branch points, choice must be made regarding which genes to activate and which to suppress.

The Atlan/Cohen perspective is recognizably similar to Rosen’s (2012, p. 313) characterization of an “anticipatory system” as one

…containing a predictive model of itself and/or its environment, which allows it to change state in an instant in accord with the model’s predictions…

Following Chap. 1, given an information source associated with a cognitive gene expression system—called “dual” to it—an equivalence class algebra can be constructed by choosing different system origin states a0 and defining the equivalence of two subsequent states at times m, n > 0, written as am, an, by the existence of high probability “meaningful” paths connecting them to the same origin point. Disjoint partition by equivalence class, analogous to orbit equivalence classes in dynamical systems, defines a symmetry groupoid associated with the cognitive process. Again, groupoids are generalizations of the group concept in which there is not necessarily a product defined for each possible element pair (Weinstein 1996).

The equivalence classes define a set of cognitive dual information sources available to the gene regulation system, creating a large groupoid, with each orbit corresponding to a transitive groupoid whose disjoint union is the full groupoid. Each subgroupoid has its own dual information source, and larger groupoids will have richer dual information sources than smaller.
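The defining feature of a groupoid—a product that exists only for compatible element pairs—can be made concrete with a minimal sketch. The class and state names below are hypothetical illustrations, not part of the formal development:

```python
# Minimal illustration of a groupoid's partially defined product: morphisms
# between states compose only when the first one's target matches the second
# one's source -- unlike a group, where every product is defined.

class Morphism:
    def __init__(self, source, target):
        self.source, self.target = source, target

    def compose(self, other):
        """Return the composite morphism if defined, else None."""
        if self.target != other.source:
            return None  # the groupoid product is only partially defined
        return Morphism(self.source, other.target)

    def inverse(self):
        # every groupoid element is invertible
        return Morphism(self.target, self.source)

a = Morphism("s0", "s1")
b = Morphism("s1", "s2")
c = Morphism("s3", "s4")

ab = a.compose(b)         # defined: a path s0 -> s2
undefined = a.compose(c)  # not defined: endpoints do not match
```

Disjoint unions of such transitive pieces—one per orbit—assemble into the larger groupoid described above.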

Let  $$X_{G_{i}}$$  be the gene expression system’s dual information source associated with the groupoid element Gi , and take Y as the information source associated with an embedding environment that, in a large sense, signals the developmental process. Chapter 1 details how environmental regularities imply the existence of an environmental information source.

A “free energy” Morse Function (Pettini 2007) can now be constructed using a pseudoprobability argument.

Let  $$H(X_{G_{i}},Y ) \equiv H_{G_{i}}$$  be the joint uncertainty of the two information sources. A Boltzmann-like pseudoprobability is written as

 $$\displaystyle{ P[H_{G_{i}}] = \frac{\exp [-H_{G_{i}}/T(t)]} {\sum _{j}\exp [-H_{G_{j}}/T(t)]} }$$

(7.1)

where T(t) is a dimensionless “developmental state” characteristic of the organism, t is the real time, and the sum is over the different possible cognitive modes of the full system. T(t) would be expected to have the usual S-shaped form, for example,

 $$\displaystyle{ T(t) = \frac{\kappa _{1}} {1 +\kappa _{2}\exp [-\alpha t]} }$$

(7.2)

with α > 0,  κ2 ≫ κ1 > 0, leading to a dynamic like that of Fig. 7.1. Here α has the dimension of a rate and the κi are numerical constants.


Fig. 7.1

Dimensionless “developmental state” T(t) as a function of real time t

A Morse Function  $$\mathcal{F}$$ —analogous to free energy in a physical system—can be defined in terms of the “partition function” in the denominator of Eq. (7.1) as

 $$\displaystyle{ \exp [-\mathcal{F}/T(t)] \equiv \sum _{j}\exp [-H_{G_{j}}/T(t)] }$$

(7.3)

Given a groupoid structure characteristic of gene expression networks as a generalization of the simple symmetry group, it becomes possible to apply an extension of Landau’s picture of phase transition in a physical system (Pettini 2007). In Landau’s “spontaneous symmetry breaking,” phase transitions driven by temperature changes occur as alteration of system symmetry, with higher energies at higher temperatures being more symmetric. The proposed shift between symmetries is highly punctuated in the temperature. Typically, there are only a very limited number of possible phases. Here, the developmental index plays the role of temperature. That is, increasing T(t) leads to staged increase in the complexity of cognitive developmental process—as affected by signals from the embedding environment determining norms of reaction—although development itself is, of course, uniform. Some further effort, however, suggests the possibility of a more specific approach.
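A minimal numerical sketch of Eqs. (7.1)–(7.3) illustrates the point: at small T(t) the lowest-uncertainty mode dominates, and as T(t) rises along its S-shaped curve, probability weight spreads to richer modes. All H values and parameters below are hypothetical, chosen only for illustration:

```python
import math

# Hypothetical joint uncertainties H_{G_i} for three cognitive modes.
H = [1.0, 2.0, 4.0]

def T(t, k1=1.0, k2=100.0, alpha=1.0):
    """S-shaped developmental index of Eq. (7.2)."""
    return k1 / (1.0 + k2 * math.exp(-alpha * t))

def pseudoprobs(t):
    """Boltzmann-like pseudoprobabilities of Eq. (7.1)."""
    w = [math.exp(-h / T(t)) for h in H]
    z = sum(w)                     # partition function, denominator of Eq. (7.1)
    return [x / z for x in w], z

def free_energy(t):
    """Morse function F of Eq. (7.3): exp[-F/T] = sum_j exp[-H_j/T]."""
    _, z = pseudoprobs(t)
    return -T(t) * math.log(z)

p_early, _ = pseudoprobs(1.0)   # small T(t): lowest-H mode dominates
p_late, _ = pseudoprobs(10.0)   # larger T(t): weight spreads to richer modes
```

The shift of weight toward higher-H modes with increasing T(t) is the temperature-analog mechanism invoked in the Landau-style argument above.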

It is worth reiterating that incorporation of the environmental information source Y into the calculation of  $$\mathcal{F}$$  implies that norms of reaction will be incorporated into the developmental trajectory of the organism. Different norms would be associated with different sequences of groupoids observable during development, particularly at critical ontological branch points.

7.3 The Group Structures of Information Processes

There is some theoretical support for this approach. Yeung (2008) has explored the deep relationship between information theory and the theory of finite groups, mirroring the relation between synchronicity in networks and their permutation symmetries as discussed at length in Golubitsky and Stewart (2006). For example, given two random variables X1 and X2 with Shannon uncertainties H(X1) and H(X2), the information theory chain rule (Cover and Thomas 2006) states that, for the joint uncertainty H(X1, X2),

 $$\displaystyle{ H(X_{1}) + H(X_{2}) \geq H(X_{1},X_{2}) }$$

(7.4)

Similarly, let G be any finite group, and G1, G2 be subgroups of G. Let | G | represent the order of a group—the number of its elements. Then the intersection G1 ∩ G2 is also a subgroup, and a “group inequality” can be derived that is the precise analog of Eq. (7.4):

 $$\displaystyle{ \log \left [ \frac{\vert G\vert } {\vert G_{1}\vert }\right ] +\log \left [ \frac{\vert G\vert } {\vert G_{2}\vert }\right ] \geq \log \left [ \frac{\vert G\vert } {\vert G_{1} \cap G_{2}\vert }\right ] }$$

(7.5)

Yeung defines a probability for a pseudorandom variate associated with a group G as Pr{X = a} = 1∕ | G | . This allows construction of a group-characterized information source, noting that, in general, the joint uncertainty of a set of random variables is not necessarily the logarithm of a rational number. The surprising result Yeung (2008) ultimately establishes is a one-to-one correspondence between unconstrained information inequalities—generalizations of Eq. (7.4)—and finite group inequalities: unconstrained inequalities can be proved by techniques in group theory, and many group-theoretic inequalities can be proven by techniques of information theory. Yeung uses an obscure unconstrained information inequality to derive, in his Eq. (16.116), a complex group inequality for which, as he puts it, the “…implications in group theory are yet to be understood.”
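Both sides of the correspondence can be checked numerically on small examples. The sketch below verifies the chain rule of Eq. (7.4) for an arbitrary toy joint distribution, and the group inequality of Eq. (7.5) for the cyclic group Z12 with subgroups generated by 2 and 3 (chosen only for illustration; in this particular case the group inequality holds with equality):

```python
import math

# Shannon side of Eq. (7.4): H(X1) + H(X2) >= H(X1, X2) for a toy joint pmf.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

p1 = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
p2 = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# Group side of Eq. (7.5): G = Z_12, G1 = <2> (order 6), G2 = <3> (order 4);
# the intersection is <6> (order 2).
G = set(range(12))
G1 = {x for x in G if x % 2 == 0}
G2 = {x for x in G if x % 3 == 0}
G12 = G1 & G2

lhs = math.log(len(G) / len(G1)) + math.log(len(G) / len(G2))
rhs = math.log(len(G) / len(G12))
```

Here `lhs >= rhs` is the precise group-theoretic analog of the subadditivity of joint uncertainty.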

An intuitive argument in this direction actually follows from the essential mantra of algebraic topology (Hatcher 2001): forming algebraic images of topological spaces. The most basic of these images is the fundamental group, leading to Van Kampen’s theorem allowing the computation of the fundamental group of spaces that can be decomposed into simpler spaces whose fundamental group is already known. As Hatcher (2001, p. 40) puts it, “By systematic use of this theorem one can compute the fundamental groups of a very large number of spaces…[F]or every group G there is a space X G whose fundamental group is isomorphic to G”. As Golubitsky and Stewart forcefully argue, network structures and dynamics are imaged by fundamental groupoids, for which there also exists a version of the Seifert–Van Kampen theorem (Brown et al. 2011). Yeung’s (2008) results suggest information theory-based “cognitive” generalizations that may include essential dynamics of gene expression and its regulation.

7.4 The Topology of “Code Networks”

Curiously parallel results emerge from Tlusty’s (2007, 2008) error-minimization analysis of the genetic code, in which a “network” of codons is embedded in a larger topological structure having a fundamental group. Tlusty’s discussion of the topology of errors portrays the codon space as a graph whose vertices are the codons. In his analysis, two codons are linked by an edge if they are likely to be confused by misreading. He assumes that two codons are most likely to be confused if all their letters except one agree, and therefore draws an edge between them. The resulting graph is, in his approach, “natural” for considering the impact of translation errors on mutations because such errors almost always involve a single letter difference, that is, a movement along an edge of the graph to a neighboring vertex.

The topology of a graph is characterized by its genus γ, the minimal number of holes in a surface on which the graph can be embedded with no two edges crossing. The more connected a graph is, the more holes its minimal embedding requires. The highly interconnected 64-codon graph embeds in a surface with 41 holes, γ = 41. The genus is reduced to γ = 25 if only 48 effective codons are considered.
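The basic combinatorics of this codon graph are easy to reproduce; the sketch below builds the 64 vertices and the single-letter-difference edges (the genus itself depends on an embedding and is not computed here):

```python
from itertools import product

# Tlusty's codon graph: vertices are the 64 RNA codons; an edge links two
# codons that differ in exactly one letter (a single misreading step).
BASES = "ACGU"
codons = ["".join(c) for c in product(BASES, repeat=3)]

def one_letter_apart(c1, c2):
    return sum(a != b for a, b in zip(c1, c2)) == 1

edges = [(c1, c2) for i, c1 in enumerate(codons)
         for c2 in codons[i + 1:] if one_letter_apart(c1, c2)]

V, E = len(codons), len(edges)
# Each codon has 3 positions x 3 alternative letters = 9 neighbors,
# so E = 64 * 9 / 2 = 288.
degree = sum(one_letter_apart(codons[0], c) for c in codons)
```

These counts (V = 64, E = 288, regular of degree 9) are the inputs to the genus computation via the Euler-characteristic relation quoted below Eq. (7.6).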

Tlusty uses the extremum of an information-theoretic Morse Function to determine single contiguous domains in which particular amino acids are encoded. Thus every mode corresponds to an amino acid, and the number of modes is the number of amino acids. This compact organization is advantageous, he claims, because misreading one codon as another within the same domain has no deleterious impact: if the code has two amino acids, the error load of an arrangement with two large contiguous regions, each coding for a different amino acid, is much smaller than that of a checkerboard arrangement of the amino acids.

This result is analogous to—but significantly different from—the topological coloring problem. In the coding problem one desires maximal similarity in the colors of neighboring “countries,” while the coloring problem requires that neighboring countries receive different colors. The number of possible amino acids in this scheme is determined by Heawood’s formula (Ringel and Youngs 1968),

 $$\displaystyle{ \mathrm{chr}(\gamma ) = \mathrm{Int}\left [\frac{1} {2}(7 + \sqrt{1 + 48\gamma })\right ] }$$

(7.6)

chr(γ) is the number of “colored” regions, Int is the integer value of the enclosed expression, and γ is the genus of the surface, roughly speaking, the number of “holes.” In general, γ = 1 − (1∕2)(V − E + F), where V is the number of code network vertices, E the number of network edges, and F the number of enclosed faces.

The text table below gives the first few steps in the calculation.

γ (# surface holes)    chr(γ) (# error classes)
0                      4
1                      7
2                      8
3                      9
4                      10
5                      11
6, 7                   12
8, 9                   13

Tlusty (2007) then models the emergence of the genetic code as a phase transition in a noisy information channel, using the Rate Distortion Theorem: the optimal code is described by the minimum of a “free energy”-like functional, as above, characterizing the code’s emergence as a transition akin to a phase transition in statistical physics, since a supercritical phase transition is known to take place in noisy information channels. The noisy channel is controlled by a temperature-like parameter that determines the balance between the information rate and the distortion, in the same way that physical temperature controls the balance between energy and entropy in a physical system. Following Tlusty’s (2007) equation (2), the free energy functional has the form  $$D -\hat{T } S$$  where D is the average error load, equivalent to average distortion in a rate distortion problem, S is the “entropy due to random drift,” and  $$\hat{T }$$ measures the strength of random drift relative to the selection force that pushes toward fitness maximization. This is essentially a Morse Function (Pettini 2007). At high  $$\hat{T }$$ , in this model, the channel is totally random and it conveys zero information. At a certain critical temperature  $$\hat{T } _{c}$$  the information rate starts to increase continuously.
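Tlusty’s actual computation is carried out on the codon graph; the toy below (a binary channel with assumed distortion D(q) = q and drift entropy S(q) equal to the binary entropy of the error rate q) only illustrates the qualitative mechanism: minimizing a D − T̂S functional drives the channel from uninformative at high T̂ toward informative at low T̂:

```python
import math

def binary_entropy(q):
    """Entropy (bits) of a binary choice with probability q."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def optimal_error(T_hat, grid=10_000):
    """Error rate q in (0, 1/2) minimizing the toy functional F = D - T_hat * S."""
    qs = [i / (2 * grid) for i in range(1, grid)]
    return min(qs, key=lambda q: q - T_hat * binary_entropy(q))

def info_rate(T_hat):
    """Information conveyed by the optimized binary channel, in bits."""
    return 1.0 - binary_entropy(optimal_error(T_hat))
```

At large T̂ drift dominates, the optimal q sits near 1/2, and the channel conveys almost nothing; as T̂ falls, q drops toward 0 and the information rate rises, mimicking the ordering transition described above.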

This is not the only such example. Hecht et al. (2004) found that protein α-helices have an inherent “code” 101100100110…, where 1 indicates a polar and 0 a non-polar amino acid. Protein β-sheets have the simpler coding 10101010…. Wallace (2010) extends Tlusty’s topological analysis, via Heawood’s graph genus formula—the “magic numbers” above—to the more complicated protein folding classifications uncovered by Chou and Maggiora (1998), who go beyond the four classes of Levitt and Chothia (1976)—α helices, β sheets, α + β and α∕β structures—to three more minor classifications, and possibly another three subminor classes. The globular “protein folding error code network” becomes a large connected “sphere” producing the four fundamental classes, with one minor, and possibly as many as three more subminor, attachment handles, in the Morse Theory sense (Pettini 2007).

Wallace (2012b) finds that similar arguments apply to the twelve monosaccharides associated with the mammalian glycan “kelp frond” production at the surface of the cell, suggesting an underlying code network having genus 6 or 7, according to the table above.

Applying a Morse Function approach to error-limiting code networks, Wallace (2015) notes that, as codes become more complex, the appropriate quotient groups must become richer, consonant with “spontaneous symmetry” arguments.

A similar approach will apply to other kinds of networks—direct biochemical or indirect nets of interacting information sources—that are associated with gene expression and its control.

In sum, the developmental state index T(t) from Eqs. (7.1)–(7.3) will serve as a temperature analog driving increased levels of deep network topological symmetry.

7.5 Expanding the Model

Equation (7.4)—and its symmetry expression in Eq. (7.5)—has profound implications for development. In essence, one information source can act as an “information catalyst” to direct, or at least influence, the trajectory of another.

At some developmental time T(t), a Boltzmann probability density can be constructed in terms of the rate of metabolic free energy consumed by development. For an information source X, having source uncertainty H(X) and metabolic free energy (MFE) available at some rate M,

 $$\displaystyle{ \dot{P}[X] \equiv \frac{\exp [-H/\kappa M]} {\int \exp [-H/\kappa M]dH} }$$

(7.7)

Assuming the translation rate of MFE into information is very small, to first order  $$\hat{H}\equiv \int H\dot{P}[H]dH \approx \kappa M$$ , and, using Eq. (7.4),

 $$\displaystyle\begin{array}{rcl} \hat{H}(X,Y )& \leq & \hat{H}(X) + \hat{H} (Y ) \\ M_{X,Y }& \leq & M_{X} + M_{Y } {}\end{array}$$

(7.8)

A more comprehensive argument might involve the maximum capacities for developmental channels under the Rate Distortion Theorem, in which case H in Eq. (7.7) is replaced by a Rate Distortion Function R integrated over the range  $$0 \rightarrow \infty $$ , so that  $$\hat{R}=\kappa M$$ .

As a consequence of the information inequality of Eq. (7.4)—inherent in the symmetry relation of Eq. (7.5)—allowing crosstalk between information sources consumes less MFE than isolating information sources, something that confounds electrical engineers attempting to isolate individual signals. For developing organisms, however, information catalysis becomes a central tool for gene expression and its control. It seems likely that other expressions of the relation between information inequalities and group inequalities will sometimes have similar importance.
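A toy version of Eq. (7.8) makes the catalysis argument concrete. Taking H ≈ κM, the subadditivity of joint uncertainty translates directly into a metabolic free energy saving whenever two sources share crosstalk; the value of κ and the joint distribution below are hypothetical, for illustration only:

```python
import math

kappa = 0.1  # hypothetical translation constant between uncertainty and MFE rate

# A strongly correlated pair of binary information sources ("crosstalk").
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# Under H ~ kappa * M: isolated costs M_X, M_Y versus joint cost M_{X,Y}.
M_x, M_y = entropy(px) / kappa, entropy(py) / kappa
M_joint = entropy(joint) / kappa
savings = (M_x + M_y) - M_joint   # MFE freed by allowing crosstalk
```

The positive `savings` term is the quantitative content of the claim that crosstalk is metabolically cheaper than isolation.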

We can now iterate the argument across developmental time T(t), in terms of a network of interacting information sources rather than a network of biochemical reactions, as is the current practice. We thus, in a sense, extend the idea of a “quotient network” from Golubitsky and Stewart (2006), involving equivalence classes of network nodes. Golubitsky and Stewart demonstrate that dynamics of the quotient network impose themselves onto the original network, although there can be lower level network dynamics that are not reflected in the quotient. This is, perhaps, analogous to “unconscious” vs “conscious” processes in the brains of higher animals: consciousness is taken as a function of “global workspace” dynamics at higher levels than individual cognitive modules. Again, see Chap. 1.

For a given underlying network topology of n interacting information sources representing cognitive submodules of a developing organism at time T(t), there will be some average crosstalk between them, say εn. For that given network topology—remember, a network of interacting information sources—there will be some critical value εC at which a giant component (GC) emerges. That is, at εC, a very large number of cognitive developmental processes become linked across the developing organism into a joint GC information source. For random networks this phenomenon has been well studied and the conditions under which it occurs are well understood (Wallace 2012a). While other topologies impose different detailed conditions, the punctuated emergence of a GC is almost universal across a very large class of networks. If εn > εC at some T(t), then individual cognitive developmental modules become coordinated across the organism. Conversely, different network topologies can impose different critical crosstalk values for onset of the GC. Topology can drive crosstalk, crosstalk can drive topology, and topologies are inherently defined by various symmetry indices, both groups and groupoids.
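For the well-studied random-network case, the punctuated emergence of the giant component is easy to demonstrate. The sketch below treats average crosstalk as the edge probability of an Erdős–Rényi graph (a simplifying assumption; other topologies shift the threshold), with the GC appearing once the mean degree crosses 1:

```python
import random
from collections import Counter

def largest_component_fraction(n, p, seed=0):
    """Fraction of an n-node Erdos-Renyi(p) graph in its largest component."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:       # crosstalk link present
                parent[find(i)] = find(j)

    sizes = Counter(find(i) for i in range(n))
    return max(sizes.values()) / n

n = 400
below = largest_component_fraction(n, 0.5 / n)  # subcritical: fragments only
above = largest_component_fraction(n, 3.0 / n)  # supercritical: giant component
```

The jump from `below` (a few percent) to `above` (most of the network) as mean degree passes 1 is the punctuated linkage of cognitive modules described above.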

7.6 Developmental Canalization and Directed Homotopy

Equation (7.8) suggests the possibility of an “information catalysis” in developmental trajectory via an imposed signal from embedding regulatory information sources. The analytic tool for this is an extension—or weakening—of the fundamental group associated with homotopy loops to that of a fundamental groupoid associated with directed homotopy , in the sense of Goubault and Raussen (2002) and Goubault (2003). More complete discussions can be found in Grandis (2009) and Fajstrup et al. (2016).

Directed homotopy is different from simple homotopy in that inherently one-way paths from one point to another are the fundamental objects, rather than loops beginning and ending at a point of origin. Continuous deformation of directed paths between the same beginning and end points defines equivalence classes of dihomotopy paths constituting a groupoid. In Fig. 7.2, C represents a developmental branch point leading from an initial phenotype So to two different possible phenotypes, S1 and S2. C casts a developmental shadow, and two equivalence classes of dihomotopy paths are possible. The embedding regulatory information sources, via Eq. (7.8), define one of these phenotypes as a relative energy minimum. Thus, for gene expression and its regulation, the fundamental symmetries are likely to be those of groupoids, built up via the groupoid version of the Seifert–Van Kampen theorem and by the occurrence of repeated critical points C1, C2, … as the developmental state index T(t) increases.


Fig. 7.2

Starting at an initial developmental phenotype So, at a critical period C that casts a developmental shadow, there are two directed homotopy equivalence classes of deformable paths leading to phenotypes S1 and S2 that define a groupoid. The canalizing “choice” between them is driven by embedding regulatory information sources via the catalysis of Eq. (7.8). Repeated critical points C1, C2, … over the developmental state T(t) systematically enlarge the fundamental groupoid
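The combinatorial skeleton of Fig. 7.2 can be sketched as directed-path enumeration in a small DAG. Here the intermediate nodes are hypothetical, and paths are grouped by terminal phenotype as a stand-in for dihomotopy classes (a crude proxy: in this toy geometry, directed paths with the same endpoints are taken as mutually deformable):

```python
# Toy branch-point geometry: S0 -> C, then two deformable routes toward each
# of the two possible phenotypes S1 and S2.
dag = {
    "S0": ["C"],
    "C": ["a1", "b1", "a2", "b2"],   # hypothetical intermediate states
    "a1": ["S1"], "b1": ["S1"],
    "a2": ["S2"], "b2": ["S2"],
    "S1": [], "S2": [],
}

def directed_paths(graph, start, end, path=None):
    """All one-way paths from start to end (no backtracking, as in dihomotopy)."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    return [p for nxt in graph[start]
            for p in directed_paths(graph, nxt, end, path)]

# Group directed paths by terminal phenotype: two classes, one per phenotype,
# playing the role of the two dihomotopy classes of Fig. 7.2.
classes = {end: directed_paths(dag, "S0", end) for end in ("S1", "S2")}
```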

7.7 Discussion

An essential inference from Yeung’s results and the cognitive paradigm for gene expression is that, for many organisms, development might be associated with increasingly complex groupoid symmetries—which may sometimes be limited to simpler symmetry groups—some of which, at least, may be constructed by appropriate combinations of finite groups. In addition, it may be possible to characterize developmental trajectories by their associated groupoid sequences, keyed to different norms of reaction in which gene expression responds to environmental cues, in a large sense. Thus different “environmental” effects—normal or pathological—might well be associated with markedly different symmetry pathways. An interesting research question, then, is whether normal and pathological development have different characteristic network groupoid symmetry dynamics, and whether these can be easily identified and perhaps modified.

An analog to this general approach can be found in the theory of error-correcting codes (as opposed to Tlusty’s error-limiting codes), otherwise known as algebraic coding theory, which looks for redundancies in message coding over noisy channels enabling efficient reconstruction of lost or distorted information. Mathematical techniques involve groups, ideals, rings, algebras, and finite fields. These formalisms have generated many different codes with different capabilities and complexities, for example, BCH, Goppa, Hamming, Linear, Reed–Muller, Reed–Solomon, and so on. It may well be that the relations between groups, groupoids, and a broad spectrum of information related phenomena of interest in biology are similarly intimate.

Indeed, parallel arguments can be found in computational neuroscience. For example, Ashwin et al. (2016) infer, regarding the dynamics of neural networks, that it may be necessary to

…further tap into recent ideas for classifying emergent dynamics based upon the group of structural symmetries of the network…For many real-world networks, this can be decomposed into direct and wreath products of symmetric groups…

Adopting the perspective of Maturana and Varela (1980) on the necessity of cognition at every scale and level of organization of the living state, it seems possible to extend the relation between information inequalities and finite group structure to a fundamental duality between an information-theoretic characterization of cognition in gene expression and groupoids. For some, or even many, organisms, development might be characterized by groupoids of increasing complexity constructed from underlying finite groups. Higher organisms at later developmental stages may not be so simply described, but Yeung’s (2008) powerful results suggest some regularities involving finite groups during certain developmental stages for some organisms.

This conjecture is a matter for empirical study, which will almost certainly find matters significantly more complicated than a simple progression to more complicated groupoids constructed from underlying finite groups: biology is inherently messy. The statistical models proposed here can, however, as with the more familiar regression models, provide benchmarks against which to compare observations on different systems under similar conditions, or the same or similar systems under different conditions.

In sum, there may be complex interactions between group and groupoid symmetries at different levels of organization and across multiple temporal and geometric scales in the living state. This may even involve the kinds of semidirect and wreath products of groups and groupoids found in the study of intrinsically disordered proteins, other nonrigid molecular structures, and symmetric networks (Wallace 2012c; MacArthur et al. 2008; Houghton 1975).

Regarding inherent symmetry, Noether’s first theorem (Wikipedia 2016), as it applies in classical mechanics, states that every differentiable symmetry of the action of a physical system has a corresponding conservation law. The action of a physical system is the integral over time of a Lagrangian function, from which system behavior can be determined by the variational principle of least action. A generalization of the formulations on constants of motion in Lagrangian and Hamiltonian mechanics, the theorem does not apply to systems that cannot be modeled with a Lagrangian alone: dissipative systems with continuous symmetries need not have a corresponding conservation law, although Onsager-like linearization methods similar to those of nonequilibrium thermodynamics may sometimes be used as a first approximation (Kontogeorgaki et al. 2016).

For non-dissipative systems, rotational symmetry implies conservation of angular momentum, symmetry in time implies conservation of energy, displacement symmetry implies conservation of momentum, and so on into quantum mechanics, where matters become far more subtle. However, network and other such cognitive systems always act under dissipative circumstances, i.e., metabolic or other free energy is constantly converted to heat. Then the results of Golubitsky and Stewart and of Yeung imply—modulo the inherent dimensional collapse necessarily associated with an information source—that deep, if more complicated, symmetries may still lie behind networks of gene expression and their control.

References

Ashwin, P., S. Coombes, and R. Nicks. 2016. Mathematical frameworks for oscillatory network dynamics in neuroscience. Journal of Mathematical Neuroscience 6: 2. doi:10.1186/s13408-015-0033-6.

Atlan, H., and I. Cohen. 1998. Immune information, self-organization, and meaning. International Immunology 10: 711–717.

Brown, R., P. Higgins, and R. Sivera. 2011. Nonabelian Algebraic Topology: Filtered Spaces, Crossed Complexes, Cubical Homotopy Groupoids. EMS Tracts in Mathematics, vol. 15. Zürich: European Mathematical Society. www.bangor.ac.uk/r.brown/nonab-a-t.html.

Chou, K., and G. Maggiora. 1998. Domain structural class prediction. Protein Engineering 11: 523–528.

Cover, T., and J. Thomas. 2006. Elements of Information Theory, 2nd ed. New York: Wiley.

Fajstrup, L., E. Goubault, A. Mourgues, S. Mimram, and M. Raussen. 2016. Directed Algebraic Topology and Concurrency. New York: Springer.

Golubitsky, M., and I. Stewart. 2006. Nonlinear dynamics of networks: The groupoid formalism. Bulletin of the American Mathematical Society 43: 305–364.

Golubitsky, M., D. Romano, and Y. Wang. 2012. Network periodic solutions: Patterns of phase-shift synchrony. Nonlinearity 25: 1045–1074.

Goubault, E. 2003. Some geometric perspectives on concurrency theory. Homology, Homotopy, and Applications 5: 95–136.

Goubault, E., and M. Raussen. 2002. Dihomotopy as a tool in state space analysis. In Lecture Notes in Computer Science, vol. 2286, 16–37. New York: Springer.

Grandis, M. 2009. Directed Algebraic Topology: Models of Non-reversible Worlds. New York: Cambridge University Press.

Hatcher, A. 2001. Algebraic Topology. New York: Cambridge University Press.

Hecht, M., A. Das, A. Go, L. Bradley, and Y. Wei. 2004. De novo proteins from designed combinatorial libraries. Protein Science 13: 1711–1723.

Hendriks, G., D. Gaidatzis, F. Aeschimann, and H. Grosshans. 2014. Extensive oscillatory gene expression during C. elegans larval development. Molecular Cell 53: 380–392.

Houghton, C. 1975. Wreath products of groupoids. Journal of the London Mathematical Society 10: 179–188.

Kontogeorgaki, S., R. Sanchez-Garcia, R. Ewing, K. Zygalakis, and B. MacArthur. 2016. Noise-processing by signalling networks. bioRxiv preprint. http://dx.doi.org/10.1101/075366.

Laxman, S., B. Tu, and S. McKnight. 2014. Concerted effort: Oscillations in global gene expression during nematode development. Molecular Cell 53: 363–364.

Lee, J. 2000. Introduction to Topological Manifolds. Graduate Texts in Mathematics. New York: Springer.

Levitt, M., and C. Chothia. 1976. Structural patterns in globular proteins. Nature 261: 552–557.

MacArthur, B., R. Sanchez-Garcia, and J. Anderson. 2008. Symmetry in complex networks. Discrete and Applied Mathematics 156: 3525–3531.

Maturana, H., and F. Varela. 1980. Autopoiesis and Cognition. Dordrecht: Reidel.

Pettini, M. 2007. Geometry and Topology in Hamiltonian Dynamics. New York: Springer.

Ringel, G., and J. Youngs. 1968. Solution of the Heawood map-coloring problem. PNAS 60: 438–445.

Rosen, R. 2012. Anticipatory Systems: Philosophical, Mathematical, and Methodological Foundations, 2nd ed. New York: Springer.

Tlusty, T. 2007. A model for the emergence of the genetic code as a transition in a noisy information channel. Journal of Theoretical Biology 249: 331–342.

Tlusty, T. 2008. A simple model for the evolution of molecular codes driven by the interplay of accuracy, diversity and cost. Physical Biology 5: 016001.

Vera-Licona, P., A. Jarrah, L. Garcia-Puente, J. McGee, and R. Laubenbacher. 2014. An algebra-based method of inferring gene regulatory networks. BMC Systems Biology 8: 37.

Wallace, R. 2010. A scientific open season. Physics of Life Reviews 7: 377–378.

Wallace, R. 2012a. Consciousness, crosstalk, and the mereological fallacy: An evolutionary perspective. Physics of Life Reviews 9: 426–453.

Wallace, R. 2012b. Extending Tlusty’s rate distortion index theorem method to the glycome: Do even ‘low level’ biochemical phenomena require sophisticated cognitive paradigms? BioSystems 107: 145–152.

Wallace, R. 2012c. Spontaneous symmetry breaking in a non-rigid molecule approach to intrinsically disordered proteins. Molecular Biosystems 8: 374–377.

Wallace, R. 2015. Metabolic free energy and biological codes: A ‘data rate theorem’ aging model. Bulletin of Mathematical Biology 77: 879–903.

Wallace, R., and D. Wallace. 2010. Gene Expression and its Discontents: The Social Production of Chronic Disease. New York: Springer.

Weinstein, A. 1996. Groupoids: Unifying internal and external symmetry. Notices of the American Mathematical Society 43: 744–752.

Wikipedia. 2016. https://en.wikipedia.org/wiki/Noether’s_theorem.

Yeung, R. 2008. Information Theory and Network Coding. New York: Springer.


