BIOSYSTEMS

Cholesterol-ester prevents lipoprotein core from solidifying: Molecular dynamics simulation
Knyazeva OS, Oreshkin AA, Kisil SI, Samarina EA, Mikheeva KN, Tsukanov AA, Gosteva IV and Grachev EA
As an important part of lipid metabolism, the liver produces large particles called very low density lipoproteins, filled mostly with a mixture of triglycerides and cholesterol esters. A large fraction of the mixture's components have melting points above physiological temperature, so solid cluster formation or a phase transition could be expected. Although various single-component triglyceride systems are well researched both experimentally and by various simulation techniques, to the best of our knowledge the tripalmitin/cholesteryl-palmitate binary mixture has not yet been studied. We study a single-component tripalmitin system, as well as 20%-80% and 50%-50% binary mixtures of cholesteryl-palmitate and tripalmitin, using a molecular dynamics approach. All systems are studied at a pressure of 1 atm and the physiological temperature of 310 K, which is below the melting points of both tripalmitin and cholesteryl-palmitate. Our results show that after 1000 ns there is still no phase transition, but there is a noticeable tendency toward intermolecular organization and early signs of clustering. We examine the fatty-acid arrangements of tripalmitin molecules in the single-component system and in binary mixtures with two different percentages of cholesteryl-palmitate mixed in. Our results show that the more cholesteryl-palmitate molecules are in the mixture, the fewer tripalmitin molecules transition to a 'fork/chair' configuration during the same simulation time. Calculated angle distributions between the fatty-acid chains of tripalmitin molecules confirm this. Thus, our simulation results suggest a slowing or interfering effect of cholesteryl-palmitate on the crystallization of the binary mixture.
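The angle-distribution analysis mentioned in the abstract can be sketched in a few lines. The following is a minimal illustrative example (our own simplification, not the authors' analysis pipeline): it takes backbone coordinates of a molecule's three fatty-acid chains and computes the pairwise angles between their end-to-end vectors; histogramming these angles over all molecules and frames yields the kind of distribution described above.

```python
import numpy as np

def chain_vectors(coords):
    """End-to-end vector of each fatty-acid chain.

    coords: array of shape (n_chains, n_atoms, 3) holding backbone
    carbon positions for one molecule.
    """
    return coords[:, -1, :] - coords[:, 0, :]

def interchain_angles(vectors):
    """Pairwise angles (degrees) between the chain vectors of one molecule."""
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    angles = []
    for i in range(len(v)):
        for j in range(i + 1, len(v)):
            cosang = np.clip(np.dot(v[i], v[j]), -1.0, 1.0)
            angles.append(float(np.degrees(np.arccos(cosang))))
    return angles
```

In a 'fork/chair'-like geometry, two chains run roughly parallel to each other and antiparallel to the third, so as ordering progresses the histogram should develop peaks near 0° and 180° (a geometric assumption used here only for illustration).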
On the infodynamics of ramifications in constructal design
Panão MRO
Infodynamics is the study of how information behaves and changes within a system during its development. This study investigates the insights that informational analysis can provide regarding the ramifications predicted by constructal design. First, two infodynamic neologisms are introduced: informature, defined as a measure of the amount of information in indeterminate physical systems, and infotropy, a contextualized informature representing the degree of transformation of indeterminate physical systems. Flow architectures can be designed using either symmetric or asymmetric branching. The infodynamic analysis of symmetric branching reveals diminishing returns in information content, demonstrating that informature serves as a measure of diversity. These findings align with the principle of "few large and many small, but not too many," which is consistent with higher thermofluid performance. The Performance-Scaled Svelteness Ψ expresses the ability of the flow architecture to promote thermofluid performance. By contextualizing the informature with Ψ, a performance infotropy is obtained that quantifies the degree of transformation associated with the link between thermofluid performance and diversity in the ramified flow structure. A predicted growth-and-decay effect with increasing branching levels leads to a local maximum, highlighting that the evolutionary direction of the ramifications is inversely proportional to the scale of the environment in which the flow structure develops. Assuming an evolutionary trend toward maximum infodynamic complexity, a pattern of asymmetric ramifications emerges, similar to the sap distribution in leaves or the branching of trees.
The concepts of code biology
Barbieri M
Today there are two dominant paradigms in biology: the idea that 'Life is Chemistry' and the idea that 'Life is Chemistry plus Information'. There is also a third paradigm, the idea that 'Life is Chemistry, Information and Meaning', but today this is a minority view, despite the fact that meaning is produced by codes and there is ample experimental evidence that hundreds of codes exist in living systems. This is because that evidence has not yet reached the university textbooks, but what exists in nature is bound to exist, one day, also in our books, and at that point the codes will become an integral part of biology. This paper is a brief description of the key concepts of this third paradigm, which has become known as Code Biology.
The regulatory network that controls lymphopoiesis
Mendoza L, Vázquez-Ramírez R and Tzompantzi-de-Ita JM
Lymphopoiesis is the generation of the T, B and NK cell lineages from a common lymphoid-biased haematopoietic stem cell. The experimental study of this process has generated a large amount of cellular and molecular data. As a result, there is a considerable number of mathematical and computational models of different aspects of lymphopoiesis. Here we present a regulatory network consisting of 95 nodes and 202 regulatory interactions among them. The network is studied as a qualitative dynamical system whose stationary states correspond to the molecular patterns reported for CLP, pre-B, B naive, PC, pNK, iNK, NK, DP, CD8 naive, CTL, CD4 naive, Th1, Th2, Th17 and Treg cells. We also show that the system is able to respond to specific stimuli to reproduce the ontogeny of the T, B and NK cell lineages.
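The qualitative-dynamics approach described above can be illustrated with a toy Boolean network. The sketch below uses a hypothetical 3-gene circuit (far smaller than the paper's 95-node network, and with invented interactions): it enumerates all states and returns the fixed points of a synchronous update, which play the role of stable cell types.

```python
from itertools import product

# Hypothetical toy circuit (not the authors' model): A and B activate each
# other, C represses both, and C stays on only while A is off.
def update(state):
    a, b, c = state
    return (b and not c,   # A turns on if B is active and C is absent
            a and not c,   # B turns on if A is active and C is absent
            not a)         # C stays on only when A is off

def fixed_points(update_fn, n=3):
    """Exhaustively enumerate all 2**n states; keep those mapped to themselves."""
    return [s for s in product((False, True), repeat=n) if update_fn(s) == s]
```

For this toy circuit there are two fixed points, (A, B on / C off) and (C on / A, B off), mimicking two mutually exclusive lineages; attractor analysis of the full 95-node network follows the same principle at a much larger scale, where exhaustive enumeration is replaced by more efficient methods.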
Neural networks through the lens of evolutionary dynamics
Baciu DC
This article revisits Artificial Neural Networks (NNs) through the lens of Evolutionary Dynamics. The two most important features of NNs are shown to reflect the two most general processes of Evolutionary Dynamics. This overlap may serve as a new and powerful connection between NNs and Evolutionary Dynamics, which encompasses a body of knowledge built over multiple centuries and expanded to inspire applications across a vast range of disciplines. Consequently, NNs should also be applicable across the same range of disciplines, that is, much more broadly than initially envisioned. The article concludes by posing open questions about NN dynamics, based on the new connection to Evolutionary Dynamics.
A possible origin of life in nonpolar environments
Vitas M and Dobovišek A
Explaining the emergence of life is perhaps the central and most challenging question in modern science. We are proposing a new hypothesis concerning the origins of life. The new hypothesis is based on the assumption that during the emergence of life, evolution first had to involve autocatalytic systems, which only subsequently acquired the capacity for genetic heredity. Additionally, the key abiotic and early biotic molecules required in the formation of early life, such as cofactors, coenzymes, nucleic bases, prosthetic groups, polycyclic aromatic hydrocarbons (PAHs), some pigments, etc., are poorly soluble in aqueous media. To avoid this concentration problem, the new hypothesis assumes that life could have emerged in nonpolar environments or low-water systems, or at the interface between the nonpolar phase and the polar water phase, from where it was subsequently transferred to the aqueous environment. To support our hypothesis, we assume that hydrocarbons and oil on the Earth have abiotic origins.
The existence of the two domains of life, Bacteria and Archaea, would in itself imply that LUCA and the ancestors of these domains were progenotes
Di Giulio M
The lengths of the deepest branches of the tree of life tend to support the hypothesis that the branch separating the sequences of archaea from those of bacteria, i.e. the interdomain branch, is longer than the intradomain ones, i.e. those separating the sequences of archaea, and of bacteria, within their respective domains. Why should the interdomain distance be larger than the intradomain distances? The fact that the rate of amino acid substitution slowed as the domains of life appeared would seem to imply an evolutionary transition. This slowdown in the speed of evolution during the formation of the two domains of life would be a consequence of the progenote → cell evolutionary transition. Indeed, since the progenote stage is characterized by an accelerated tempo and mode of evolution, it might explain the considerable interdomain distance: the accumulation of many amino acid substitutions on this branch would indicate the progenote stage, which is also characterized by a high rate of amino acid substitution. Furthermore, the fact that intradomain distances are smaller than interdomain distances would corroborate the hypothesis that cellularity was achieved at the appearance of the main phyletic lineages. Indeed, the cell stage, unlike the progenotic one, definitively establishes the relationship between genotype and phenotype, lowering the rate of evolution. Therefore, the arguments presented lead to the conclusion that LUCA was a progenote.
Constructal Thermodynamics and its Semantic Ontology in Autopoetic, Digital, and Computational Architectural and Urban Space Open Systems
Mavromatidis L
This paper explores the intersections of constructal thermodynamics and its semantic ontology within the context of autopoetic, digital and computational design, in protocell-inspired numerical architectural and urban narratives that are examined here as open systems. Constructal law is the thermodynamic theory based on the analysis of fluxes across the border of an open system. Protocells, as dynamic and adaptive open finite-size systems, serve in this paper as a compelling metaphor and design model for responsive and sustainable man-made architectural and urban environments. The ability of protocells to harness energy, minimize entropy, and adapt to environmental changes mirrors the principles of constructal thermodynamics, which govern the flow and distribution of resources in complex self-organizing informational open systems in nature. By applying these principles to digital architecture, this study investigates how relational dynamics between spaces, materials, and functions can create adaptive designs that "go with the flow" of ecological and cultural systems. The research demonstrates, using the Gouy-Stodola theorem as a variational principle, how protocell-inspired processes facilitate exergy-efficient designs, minimizing waste while maximizing resilience and flexibility. Through an applied case study, the paper argues for a paradigm in which protocell digital architecture serves not only as an ecological and material model but also as a spatial narrative driver, blending constructal and digital tools with cultural mythos. Finally, by simultaneously exploring the semantic complexity of such systems, this paper connects these constructal-driven digital designs to broader meta-narratives, embedding cultural, symbolic, philosophical and functional predicates into architectural forms.
Understanding cancer from a biophysical, developmental and systems biology perspective using the landscapes-attractor model
Grunt TW
Biophysical, developmental and systems-biology considerations enable a deeper understanding of why cancer remains life-threatening despite intensive research. Here we use two metaphors. Both conceive the cell genome and the encoded molecular system as an interacting gene regulatory network (GRN). According to Waddington's epigenetic (quasi-potential) landscape, an instrumental tool in ontogenetics, individual interaction patterns ( = expression profiles) within this GRN represent possible cell states with different stabilities. Network interactions with low stability are represented on peaks. Unstable interactions strive towards regions of higher stability located at lower altitude, in valleys termed attractors, which correspond to stable cell phenotypes. Cancer cells are seen as GRNs adopting aberrant semi-stable attractor states (cancer attractors). In the second metaphor, Wright's phylogenetic fitness (adaptive) landscape, each genome ( = GRN) is assigned a specific position in the landscape according to its structure and reproductive fitness in the specific environment. High elevation signifies high fitness and low altitude low fitness. Selection ensures that mutant GRNs evolve and move from valleys to peaks. Genetic flexibility is highlighted in the fitness landscape, while non-genetic flexibility is captured in the quasi-potential landscape. These models resolve several inconsistencies that have puzzled cancer researchers, such as the fact that phenotypes generated by non-genetic mechanisms coexist in a single tumor with phenotypes caused by mutations, and they mitigate conflicts between cancer theories that claim cancer is caused by mutation (somatic mutation theory) or by disruption of tissue architecture (tissue organization field theory). Nevertheless, spontaneous mutations play key roles in cancer. Remarkably, fundamental natural laws such as the second law of thermodynamics and quantum mechanics state that mutations are inevitable events. The good side of this is that without mutational variability in DNA, evolutionary development would not have occurred; the bad side is that the occurrence of cancer is essentially inevitable. In summary, both landscapes together describe the behavior of cancer under normal and stressful conditions such as chemotherapy. Thus, the landscapes-attractor model fully describes cancer cell behavior and offers new perspectives for future treatment.
A workflow for the hybrid modelling and simulation of multi-timescale biological systems
Herajy M, Liu F and Heiner M
With the steady advance of in-silico biological experimentation, model construction and simulation have become ubiquitous tools to understand and predict the behaviour of many biological systems. However, biological processes may contain components from different types of reaction networks, resulting in models with different (e.g., slow and fast) timescales. Hybrid simulation is one approach that can be employed to efficiently execute multi-timescale models. In this paper, we present a methodology and workflow utilizing (coloured) hybrid Petri nets to construct more compact and yet more complicated hybrid models. The presented workflow integrates algorithms and ideas from the hybrid simulation of biochemical reaction networks as well as from Petri nets. We also construct multi-timescale hybrid models and then show how these models can be efficiently executed using three different advanced hybrid simulation algorithms.
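The core idea of hybrid multi-timescale simulation can be sketched in a toy example (our own illustration, not the paper's Petri-net workflow): slow reactions are fired stochastically, while the fast part of the state is integrated deterministically between firings. Here the slow reaction is a gene toggling on/off, and the fast variable is a protein level following a simple production/decay rate law; all rates are invented for illustration.

```python
import math
import random

def hybrid_simulate(t_end, dt=0.01, k_on=0.5, k_off=0.5,
                    beta=10.0, delta=1.0, seed=1):
    """Toy hybrid simulation.

    slow (stochastic): gene switches on/off with rates k_on / k_off
    fast (deterministic): dp/dt = beta*gene - delta*p, forward-Euler steps
    """
    rng = random.Random(seed)
    t, gene, p = 0.0, 1, 0.0

    def next_fire(rate):
        # exponential waiting time for the next slow-reaction firing
        return -math.log(1.0 - rng.random()) / rate

    t_fire = next_fire(k_off if gene else k_on)
    while t < t_end:
        if t_fire <= dt:
            # slow event occurs within this step: integrate fast part up
            # to the jump, then fire the slow reaction
            p += (beta * gene - delta * p) * t_fire
            t += t_fire
            gene = 1 - gene
            t_fire = next_fire(k_off if gene else k_on)
        else:
            p += (beta * gene - delta * p) * dt
            t += dt
            t_fire -= dt
    return p
```

The same partitioning logic, generalized to many reactions and driven by a Petri-net structure with adaptive repartitioning, underlies the algorithms discussed in the paper.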
Ecological drivers for the absence of task shifting in termite-tunneling activity: A simulation study
Lee SH and Park CM
Subterranean termites build complex underground tunnel networks to efficiently gather food. Empirical observations indicate that specific individuals are dedicated to tunneling and rarely interchange tasks. However, considering the limited tunneling energy of termite populations, it is reasonable to expect regular task shifts between fatigued and rested individuals to maintain continuous tunneling and optimize foraging. To explore this disparity, we developed a sophisticated individual-based model simulating the termite tunneling process in two scenarios: one with task shifting and one without. In the task-shift scenario, the initial group of termites excavates the tunnel, expends all its energy, and returns to the nest. A new group is then deployed to the tunnel tip to continue the excavation, collectively creating the final tunnel pattern. In the no-task-shift scenario, the initial group completes the tunneling without transitioning to subsequent groups. We compared the tunnel patterns of these two scenarios, focusing on tunnel directionality and size. The comparison revealed no statistically significant difference in tunnel directionality between the scenarios. However, the tunnel size was notably larger in the absence of task shifting, suggesting that continuous tunneling without task shifts may enhance food-searching efficiency. In the discussion section, we briefly address the limitations of the model arising from differences between the simulations and actual termite systems. Additionally, we touch on an idea to explain the fact that only a fixed proportion of workers in a termite colony participate in tunneling activities.
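One intuition behind the size difference can be captured in a deterministic toy energy-budget model (our own illustrative assumption, not the authors' individual-based model): under task shifting, every fresh group must first travel to the current tunnel tip, paying an energy cost that grows with tunnel length, before it can dig; without task shifting the same total energy goes entirely into excavation.

```python
def tunnel_length(groups, energy_per_group, dig_cost=1.0, travel_cost=0.2,
                  task_shift=True):
    """Total tunnel length dug under a fixed overall energy budget.

    With task_shift, each successive group spends travel_cost * length
    walking to the tip before digging; without it, a single continuous
    effort uses the whole budget for excavation.
    """
    length = 0.0
    for _ in range(groups if task_shift else 1):
        energy = energy_per_group if task_shift else energy_per_group * groups
        if task_shift:
            energy -= travel_cost * length  # walk to the tip first
        if energy > 0:
            length += energy / dig_cost
    return length
```

In this toy model the no-shift scenario always yields a tunnel at least as long as the task-shift scenario, qualitatively matching the simulation result reported above; the actual model additionally tracks directionality and two-dimensional tunnel geometry.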
Modeling the origin, evolution, and functioning of the genetic code
Dragovich B, Fimmel E, Khrennikov A and Mišić NŽ
Cancer memory as a mechanism to establish malignancy
Lissek T
During oncogenic progression, cancers hold information in epigenetic memory, which allows flexible encoding of malignant phenotypes and more rapid reaction to the environment than purely mutation-based clonal evolution mechanisms. Cancer memory describes a proposed mechanism by which complex information, such as metastasis phenotypes, therapy resistance and interaction patterns with the tumor environment, might be encoded at multiple levels via mechanisms used in memory formation in the brain and immune system (e.g. single-cell epigenetic changes and distributed state modifications in cellular ensembles). Carcinogenesis might hence be the result of physiological multi-level learning mechanisms unleashed by defined heritable oncogenic changes, which lead to tumor-specific loss of goal-state integration into the whole organism. The formation of cancer memories would create and bind new levels of individuality within the host organism into the entity we call cancer. Translational implications of cancer memory are that cancers could be engaged at higher organizational levels (e.g. be "trained" for memory extinction) and that compounds known to interfere with memory processes could be investigated for their potential to block cancer memory formation or recall. It also suggests that diagnostic measures should extend beyond sequencing approaches to the functional diagnosis of cancer physiology.
The genetic code is not universal
Di Giulio M
Recently, a new genetic code with 62 sense codons coding for 21 amino acids, and only 2 termination codons, has been identified in archaea. The authors argue that the appearance of this variant of the genetic code is due to the relatively recent and complete recoding of all UAG stop codons to codons encoding pyrrolysine. I re-evaluate this discovery by presenting arguments that favour the early, i.e. ancestral, appearance of this variant during the origin of the genetic code itself. These arguments support the view that, during the origin of the organization of the genetic code, at least two versions of the genetic code evolved in the domain Archaea. Thus, the genetic code would not be absolutely universal.
"Assembly Theory" in life-origin models: A critical review
Abel DL
Any homeostatic protometabolism would have required the orchestration of disparate biochemical pathways into integrated circuits. Extraordinarily specific molecular assemblies were also required at the right time and place. Assembly Theory (AT), conflated with its cousins (Complexity Theory, Chaos Theory, Quantum Mechanics, Irreversible Nonequilibrium Thermodynamics and Molecular Evolution theory), has great naturalistic appeal in the hope that these theories provide the needed exquisite steering and controls. They collectively offer the best hope of circumventing the need for the active selection required to formally orchestrate bona fide formal organization (as opposed to the mere self-ordering of chaos theory) (Abel and Trevors, 2006b). This paper focuses specifically on AT's contribution to naturalistic life-origin models.
The problem of evolutionary directionality 50 years following the works of Sergei Meyen
Melkikh AV
An irreversible thermodynamic model of prebiological dissipative molecular structures inside vacuoles at the surface of the Archean Ocean
Montemayor-Aldrete JA, Nieto-Villar JM, Villagómez CJ and Márquez-Caballé RF
A prebiotic model, based on the framework of thermodynamic efficiency loss in small dissipative eukaryote organisms, is developed to describe the maximum possible concentration of solar power that could be dissipated by topologically circular molecular structures encapsulated in lipid-walled vacuoles floating in the Archean oceans. A previous analysis of 71 species covering 18 orders of magnitude in mass, from Megaptera novaeangliae to Saccharomyces cerevisiae, suggests that in molecular structures of smaller mass than any living being known today, the power dissipation must be directly proportional to the power of the impinging photons of solar origin, which give rise to the formation of more complex self-assembled molecular structures at the prebiotic stage through a quantum-mechanical model of resonant photon-wavelength excitation. An analysis of 12 circular molecules (encapsulated in lipid-walled vacuoles) relevant to the evolution of life on Earth, such as the five nucleobases and aromatic molecules such as pyrimidine, porphyrin, chlorin, coumarin, xanthine, etc., was carried out. Considering one vacuole of each type of molecule per square meter of the ocean's surface of planet Earth (1.8∗10 vacuoles), their dissipative operation would require only 10 times the matter used by the biomass currently existing on Earth. Relevant numbers (10-10) for the annual dissipative cycles corresponding to high-energy photochemical events, which in principle allow the assembly of more complex polymers, were obtained. These figures are compatible with some results obtained by followers of the primordial-soup theory, in which, under certain suppositions about Archean chemical kinetic changes in the precursors of RNA and DNA, the formation rate of RNA and DNA components and the emergence of life within a 10-million-year window, 3.5 billion years ago, are justified. The physical-foundation perspective and the simplicity of the proposed approach suggest that it can serve as a possible template both for the development of new kinds of experiments and for prebiotic theories that address self-organization occurring inside such vacuoles. Our model provides a new way to conceptualize the self-production of simple cyclic dissipative molecular structures in the Archean period of planet Earth.
Anti-wetting wing surface characteristics of a water bug, Diplonychus annulatus
Sharma S, Shaanker RU and Subramaniyan AK
Diplonychus annulatus (family Belostomatidae, order Hemiptera) is an aquatic water bug adapted to ponds and wetlands. Commonly referred to as toe-biters or electric-light bugs, both the nymphs and the adults prey on other invertebrates in the water. In search of both food and mates, the adults frequently fly between water bodies, leading to an amphibious lifestyle. It is likely that, because of such a lifestyle, they have evolved structures on their wings that enable them to stay dry and able to fly. In this paper, we report the anti-wetting property of the fore and hind wings. We show that the wings have intricately designed hierarchical structures of setae, microtrichia, and a "micro-architectured well" interspersed with club-like projections. The wings were extremely superhydrophobic, with water contact angles ranging from 160° to 170°. FTIR analysis of the wings indicated the presence of hydrophobic groups. Thus, owing both to the intricate surface features and, possibly, to the low surface energy imparted by the hydrophobic groups on the wings, the water bug can maintain a high degree of dryness in its wings. We discuss these findings in the context of how wing adaptations contribute to the insect's ability to thrive in its amphibious lifestyle.
Biological mechanisms contradict AI consciousness: The spaces between the notes
Miller WB, Baluška F, Reber AS and Slijepčević P
The presumption that experiential consciousness requires a nervous system and brain has been central to the debate on the possibility of developing a conscious form of artificial intelligence (AI). The likelihood of future AI consciousness, or of devising tools to assess its presence, has focused on how AI might mimic brain-centered activities. Currently, two general assumptions prevail: that AI consciousness is primarily an issue of functional information density and integration, and that no substantive technical barriers exist to prevent its achievement. When the cognitive process that underpins consciousness is stipulated to be a cellular attribute, these premises are directly contradicted. The innate characteristics of biological information, and how that information is managed by individual cells, have no parallels within machine-based AI systems. Any assertion of computer-based AI consciousness represents a fundamental misapprehension of these crucial differences.
A mechanistic approach to optimize combination antibiotic therapy
Clarelli F, Ankomah PO, Weiss H, Conway JM, Forsdahl G and Abel Zur Wiesch P
Antimicrobial resistance is one of the most significant healthcare challenges of our times. Multidrug or combination therapies are sometimes required to treat severe infections; for example, the current protocols to treat pulmonary tuberculosis combine several antibiotics. However, combination therapy is usually based on lengthy empirical trials, and it is difficult to predict its efficacy. We propose a new tool to identify antibiotic synergy or antagonism and to optimize combination therapies. Our model explicitly incorporates the mechanisms of individual drug action and estimates their combined effect using a mechanistic approach. By quantifying the impact on the growth and death of a bacterial population, we can identify optimal combinations of multiple drugs. Our approach also allows for the investigation of the drugs' actions and the testing of theoretical hypotheses. We demonstrate the utility of this tool with in vitro Escherichia coli data using a combination of ampicillin and ciprofloxacin. In contrast to previous interpretations, our model finds a slight synergy between the antibiotics. Our mechanistic model allows the possible causes of this synergy to be investigated.
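The synergy question can be made concrete with a simple null model. The sketch below (with invented Hill-type parameters, not the paper's mechanistic binding model) compares an observed combined kill effect against the Bliss-independence expectation: an observed effect above the expectation indicates synergy, below it antagonism.

```python
def hill_kill(c, emax, ec50, h=1.0):
    """Fractional kill effect (0..1) of a single drug at concentration c,
    using a standard Hill dose-response curve."""
    return emax * c**h / (ec50**h + c**h)

def bliss_expected(e_a, e_b):
    """Combined fractional effect expected if the two drugs act
    independently (Bliss independence)."""
    return e_a + e_b - e_a * e_b

def interaction_score(e_observed, e_a, e_b):
    """Positive -> synergy, negative -> antagonism, ~0 -> independence."""
    return e_observed - bliss_expected(e_a, e_b)
```

For instance, two drugs each producing a 50% effect at their respective concentrations give a Bliss expectation of 75%; an observed combined effect of 80% would then yield a small positive score, i.e. the kind of slight synergy the abstract reports. The mechanistic model goes further by deriving the combined effect from drug-target binding rather than from fitted effect fractions.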
Stages and causes of the evolution of language and consciousness: A theoretical reconstruction
Rozov NS
This article presents a refinement of theoretical explanations of the main stages of linguistic and cognitive evolution in anthropogenesis. The concepts of language, consciousness, self-consciousness, the self, the unconscious, the subconscious, and the relation between free will and determinism remain at the center of active and complex debates in philosophy and neuroscience. A basic theoretical apparatus comprising the central concepts of "concern" and "providing structure" (an extension of the biological concept of "adaptation") develops the paradigm of the extended evolutionary synthesis. Challenge-threats and challenge-opportunities are invariably associated with concerns pertaining to sustenance, safety, sexuality, parenthood, status, and emotional support. The consolidation of successful behavioral tries in response to these challenges occurs through the formation of a variety of providing structures, including practices, abilities, and attitudes. These structures are formed through mechanisms of interactive rituals and internalization. The novel practices facilitate the transformation of both techno-natural environmental niches and group niches. The emergence of new structures gives rise to new challenges and concerns, which in turn necessitate new tries. In the context of African multiregionalism, hominin groups and populations that experienced favorable periods of demographic growth, active migration, and genetic, technological, and skill exchange also underwent significant demographic disasters. During the most unfavorable bottleneck periods, only the most advanced groups, populations and species survived. The achieved potential for these abilities was consolidated as complexes of innate assignments in gene pools through the Baldwin effect and multilevel selection. This logic provides an explanation for the main stages of the increasing complexity of language and speech (from holophrases and articulation to complex syntax), as well as for the emergence of new abilities of consciousness (from the expansion of the attention field to self-consciousness and the "I"-structure).