Wednesday, June 30, 2021

#63. How Noncoding RNA May Work [chemistry]

 CH

Red, theory; black, fact.

Black, DNA; red, long noncoding RNA; green, transcription complex. A loop closes through an RNA running from bottom to top (not shown).

No junk DNA

The junk-DNA concept is quite dead, killed by the finding that the noncoding sections (sections that do not specify functional proteins) have base-pair sequences that are highly conserved in evolution and are therefore doing something useful.

Role of long non-coding RNA

So-called junk DNA is useful because the RNA transcripts made from it are useful, serving as controllers of the transcription process itself and thus, indirectly, of protein expression. Changes in protein expression may be considered the immediate precursor of a cell's response to its environment, analogous to muscle contractions in an intact human. Small noncoding RNAs seem to be repressors of transcription and long noncoding RNAs (lncRNA) may either repress or promote. Despite the accumulation of much biochemical information, summaries of what lncRNA does seem to me unfocussed and unsatisfactory.

Background on control of gene transcription 

The classical scheme of protein expression, due to Jacob and Monod, was discovered in bacteria, in which a signal molecule from the environment (lactose in the original discovery) acts by binding to a protein to change its conformation (folding pattern). The changed protein loses the ability to bind to DNA upstream from the sequence that specifies the lactase enzyme, where it normally acts to block transcription. The changed protein then desorbs from DNA, which triggers transcription of lactase messenger RNA, which is then translated into lactase enzyme, which confers on the bacterium the ability to digest lactose. Thus, the bacterium adapts to the availability of this food source.

All this can be modelled in neurobiological terms. Clearly, it's a reflex, comparable to the spinal reflexes in vertebrates. An elementary sensation goes in, and an elementary response comes out. But vertebrates also have something higher than spinal reflexes: operations by the brain.

A neuron-inspired theory of long non-coding RNA

Noncoding RNAs may have a coordinating role: rather than relying on a set of independently acting "reflexes," eukaryotic cells can sense many promoter signals at once, as a gestalt, and respond with the expression of many proteins at once, as another gestalt. An entire brain is not needed to model this process, just one neuron. The synaptic inputs to the dendrites of the neuron can model the multiple promoter activations, and the eventual output of a nerve impulse (action potential) can represent the signal to co-express a certain set of proteins, which is hard-wired to that metaphorical neuron by axon collaterals. In real neurons, action potentials are generated by a positive feedback between membrane depolarization and activation of the voltage-gated sodium channel. This positive feedback can be translated into molecular biology as a cyclic, autocatalytic pattern of lncRNA transcription, in which each lncRNA transcript in the cycle activates the enhancer (which is like a promoter) of the DNA of the next lncRNA in the cycle. The neuron model suggests that the entire cycle has a low level of baseline activity (is "constitutively active" to some extent) but the inhibitory effect of the small noncoding RNAs (analogous to what is called the rheobase current in neurons) suppresses explosive activation. However, when substantially all the promoters in the cycle are activated simultaneously, such explosive transcription occurs. The messenger RNA of the proteins to be co-expressed as the coordinated response is generated as a co-product of lncRNA hyper-transcription, and the various DNA coding regions involved do not have to be on the same chromosome.
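The cyclic, autocatalytic scheme above can be sketched as a toy difference-equation model. Everything here (locus count, gains, the strength of the small-RNA "brake") is invented for illustration, not measured biochemistry; the point is only that a tonically inhibited positive-feedback loop idles near zero until all its promoters are hit at once, then fires explosively, like a neuron crossing threshold.

```python
# Toy model of the proposed lncRNA cycle. Each transcript activates the
# enhancer of the next locus in the cycle (positive feedback); a constant
# small-noncoding-RNA "brake" (the analogue of rheobase current) holds
# the loop near zero. All parameter values are invented for illustration.

def simulate(n_loci=4, steps=60, gain=1.5, brake=1.0, decay=0.5, pulse=None):
    """Return the peak total transcript level over the run."""
    x = [0.1] * n_loci                 # low baseline ("constitutive") activity
    peak = sum(x)
    for t in range(steps):
        new = []
        for i in range(n_loci):
            drive = gain * x[(i - 1) % n_loci]  # activation by previous lncRNA
            if pulse and t == 5:
                drive += pulse                  # all promoters hit simultaneously
            dx = drive - brake - decay * x[i]
            new.append(max(0.0, min(x[i] + dx, 1e6)))  # no negative levels; cap blow-up
        x = new
        peak = max(peak, sum(x))
    return peak

quiet = simulate()           # brake dominates: the cycle idles near zero
burst = simulate(pulse=5.0)  # simultaneous promoter activation: explosive transcription
print(quiet, burst)
```

With the brake in place the loop stays quiet; drive every promoter at once and the same wiring detonates, which is the threshold behavior the neuron analogy requires.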


Sunday, May 23, 2021

#62. Storming South [evolution, evolutionary psychology]

EV  EP

Red, theory; black, fact.

This is a theory of the final stages of human evolution, when the large brain expansion occurred.

H. sapiens appears to have arisen from Homo erectus over the last 0.8 million years due to climate instability in the apparent origin area, namely East Africa. During this time, Europe was glaciated every 0.1 million years because of the astrophysical Milankovitch cycle, a rhythm in the eccentricity of the Earth's orbit caused by the influence of the planet Jupiter.
However, consider the hominins who had settled in Europe (or Asia; it doesn't matter for this argument) during the interglacial periods (remember that H. erectus was a great disperser). When the ice began advancing again, they faced much worse cooling and drying than in Africa, and thus much greater selection pressures. At least during the last continental glaciation, the ice cap extended only to the Baltic Sea at its maximum, but long before the ice arrives, the land turns to tundra, which can support only a very thin human population. In any given glaciation, the number of souls per hectare the land could support was relentlessly declining in northern Europe/Asia, and eventually the residents had to get out of Dodge City and settle on land further south, almost certainly over the dead bodies of the former owners. This would have selected Europeans or Asians for warlike tendencies and warfaring skills, which explains a lot of human history.

Our large brains

However, our large brains seem to be great at something else besides warfaring: that is, environment modification. It's a no-brainer that the first thing someone living in the path of a 2-km wall of ice needs is to keep from freezing to death, and this would have been the first really good reason to modify environments. Unlike chipping a stone axe, environment modification involves fabricating something bigger than the fabricator. Even a parka has to be bigger than you or you can't get into it. This plausibly would have required a larger brain to enable a qualitatively new ability: making something you can't see all at once when it is at working distance.

Our rhythmic evolution

After parkas, early northerners might have evolved enough association cortex on the next glaciation cycle to build something a little bigger, like a tent or a lean-to. On the next cycle, they might have been able to pull off a decent longhouse made of wattle. On the next, a jolly good castle surrounded by cultivated lands and drainage ditches. These structures would have delayed the moment of decision when you have to go and take on the Pleistocene-era Low-brows to the south. This will buy you time to build up your numbers, and I understand that winning battles is very much a numbers game. Therefore, environment modification skill would have been selected for in tandem with making like army ants.

The fossil evidence for this theory

Fossil evidence of all this in Europe or Asia may exist in the form of Neanderthal and Denisovan discoveries, hominins who have been difficult to account for in terms of previous theories of human origins. My scenario can be defended against the fossil evidence for a human origin in East Africa in general terms by citing the well-known incompleteness of the fossil record and its many biases. Moreover, a detailed explanation begins by citing what else is in East Africa: the Isthmus of Suez, a land bridge to both Europe and Asia via the Arabian tectonic block, which was created by plate tectonics near the end of the Miocene, thus antedating both H. sapiens and H. erectus. Not only can hominins disperse through it to other continents during interglacials, but they can come back in, fiercer and brainier than before, when the ice is advancing again, to then deposit their fossil evidence in the Rift Valley region of East Africa. The Eurasian backflow event of 3,000 years ago may be a relatively recent example of this. The Isthmus of Suez is low-lying and thus easily drowned by the sea, but the probability of this was minimal at times of continental glaciation, when sea levels are lowest. This argument has similarities with the Beringia theory of how North America was populated. Early hominins expanded like a gas into whatever continent they could access. Increasing glaciation/tundrafication of that continent would have recompressed the "gas" southward, causing it to retrace its path, partly back into Africa.

Pleistocene selection pressures

This process would have been accompanied by great mortality and therefore, potentially, much selection. Moreover, during the period we are considering, temperatures were declining most of the time; the plot of temperature versus time has a saw-tooth pattern, with long declines alternating with short warming events, and it is the declines that would have been the times of natural selection of hominins living at high latitudes.


A limestone block in Canada showing scratches left by stones
embedded in the underside of a continental glacier.
The rock has also been ground nearly flat by the same process.



Sunday, December 6, 2020

#61. Consciousness is Google Searches Within Your Brain [neuroscience]

NE

Red, theory; black, fact.

The brain is like this because the long connections define the computations.

The Google search is too good a trick for Nature to miss, and she didn't miss it: the result is called consciousness.


Brain mechanism of consciousness

I conjecture that the human brain launches something like a Google search each time an attentional focus develops. This is not necessarily a literal focus of activity on the cortex; it is almost certainly a sub-network activation. The sub-net activity relays through the prefrontal cortex and then back to sensory cortex, where it activates several more sub-nets; each of these, in turn, activates further sub-nets via the prefrontal relay, and so on, exponentially. At each stage, however, the degree of activation declines, thereby keeping the total cortical activation limited.
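The claim that declining per-stage activation keeps total cortical activation limited is just the arithmetic of a geometric series: if each relay generation activates some number of new sub-nets, each at a fraction of the previous strength, the total stays bounded exactly when branching × decay < 1. A minimal sketch with invented numbers:

```python
# Arithmetic of the cascade: each prefrontal relay activates `branch`
# new sub-nets, each at `decay` times the previous activation level.
# Total activation over all generations is a geometric series that
# converges exactly when branch * decay < 1. Numbers are illustrative.

def total_activation(branch=3, decay=0.25, generations=20):
    level = 1.0   # activation of the initial attentional focus
    count = 1     # active sub-nets in the current generation
    total = 0.0
    for _ in range(generations):
        total += count * level
        count *= branch
        level *= decay
    return total

bounded = total_activation(branch=3, decay=0.25)  # 3 * 0.25 = 0.75 < 1: converges toward 4.0
runaway = total_activation(branch=3, decay=0.5)   # 3 * 0.5 = 1.5 > 1: grows without bound
print(bounded, runaway)
```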


Accounting for subjective experience

The first-generation associations are likely to be high in the search rankings, and thus subjectively "close" to the triggering attentional focus and relatively strongly in consciousness, although still in the penumbra that is subjectively "around" the attentional focus. Lower-ranking search results would form a vast crowd of associations only dimly in consciousness, but would give conscious experience its richness. Occasionally, an association far out in the penumbra will be just what you are looking for, and will therefore be promoted to the next attentional focus: you get an idea.


The role of emotions

The evaluation process responsible for this may involve the mediolateral connections of the cortex, which lead back to the limbic system, where emotions are thought to be mediated, at the cingulate gyrus. Some kind of pattern recognition seems necessary, whereby a representation of what you desire, itself a sub-network activation elaborated by the mediolateral system, is matched to retrieved associations. Your target may be only a part of the retrieved association, but will suffice to pull the association into the attentional focus.

This system would allow a mammal to converge everything it knows on every task, rather than having to perform as a blinkered if-then machine.


Brain mechanisms and our evolutionary history

Why should we have this back-and-forthing between the prefrontal cortex and the sensory association cortex? Two possible explanations are: 1) The backward projections serve a priming function, getting certain if-then rules closer to firing threshold in a context-sensitive manner; 2) This action is a uniquely human adaptation for our ecological niche as environment modifiers. 

In ordinary tool use and manufacturing dating back to Homo habilis, the built thing is smaller than the builder's body, but in environment modification, the built thing is larger than the builder's body. Thus, the builder can only see one part of it at a time. Viewings must therefore be interleaved with reorientations involving the eyes, neck, trunk, and feet. These reorientations, being motoric in nature, will be represented frontally, and I place these representations in the prefrontal cortex. The mental representation of the built thing therefore ends up being an interleaved collection of views and reorientations, in other words, a simulation. The reorientations would have to be calibrated by the vestibular system to allow the various views to be assembled into a coherent whole. By this theory, consciousness is associated with environment modification.

Consistent with this theory, the cortical representation of vestibular sense data is atypical. There is no "primary vestibular area." Rather, islands of vestibular-responsive neurons are scattered over the sensory cortex, distributing across the other senses. This seems analogous to a little annotation for xyz coordinates, etc., automatically inserted in a picture, as seen in computer-generated medical diagnostic images.

Saturday, October 31, 2020

#60. The Trembling-Network Theory of Everything [physics]

PH

Red, theory; black, fact. 



The world of appearances is simulation-like, in that how we perceive it is strongly affected by the fact that our point of view is inside it, and illusions are rampant.


The slate-of-givens approach is intended to exploit consilience to arrive at a simplified physics that attributes as many phenomena as possible to historical factors and the observer's point of view. Simplified physics is viewed as a stepping-stone to the true theory of everything. The existence of widespread consilience implies that such a theory exists.

The basic theory

The underlying reality is proposed to be a small-world network, whose nodes are our elementary particles and whose links ("edges" in graph theory) are seen collectively as the fields around those particles.

This network is a crude approximation to a scale-free network (fractal network), but it is a recursion of only three generations (with a fourth in the process of forming), each comprising two sub-generations, not an infinite regress. The first generation to form after the big bang consisted of the triangular networks that we call baryons. In the next generation, these linked up to form the networks underlying light atomic nuclei. These, and individual protons, were big enough to stably bond to single nodes (electrons) to form the network version of atoms. Above the atomic/molecular/electromagnetic level, further super-clustering took on the characteristics of gravitation. At the grandest cosmological scales, we may be seeing a fourth "force" that produces the foamy structure of galaxy distribution. The observations attributed to the presence of dark matter may be a sign that, at the intra-galactic scale, the nature of the "fields" is beginning to shift again.

I conjecture that throughout this clustering process, a continuous thermal-like agitation was running through all the links, and especially violent spikes in the agitation pattern could rupture links not sufficiently braced by other, parallel links. This would have been the basis of a trial-and-error process of creation of small-world characteristics. The nature of the different "forces" we seem to see at different scales would be entirely conditioned by the type of clusters the links join at that scale, because cluster type would condition the opportunities for network stabilization by cooperative bracing.

Mapping to known science

Formation and rupture of links would correspond to the quantum-mechanical phenomenon of wave-function collapse, and the endless converging, mixing, and re-diverging of the heat signals carried by the network would correspond to the smooth, reversible time-evolution of the wave-function between collapses. The experience of periodic motions would arise from resonances in closed paths embedded in the network. 

The photoelectric effect that Einstein made famous can be given a network interpretation: the work function is the energy needed to break, simultaneously, all the links holding the electron to the cluster that is the electrode. The observation of an electron that then seems to fly away from the electrode happens by calculation in the remaining network, after that network has been energized by the energy in excess of what was needed to break the links, reflected back into the network from the broken ends.

Distance

All of the ineffably many nodes in the universe would be equidistant from one another, which is possible if they exist in a space with no distance measure. Distance would then be the number of nodes that an observer cluster contains divided by the number of links connecting it with the observed cluster.

The finite speed of light

The time-delay effect of distance can be described by a hose-and-bucket model if we assume that all measurements require link breaking in the observer network. The energy received by the measuring system from the measured system is like water from a hose progressively filling a bucket. The delayed overflow of the bucket would correspond to the received energy reaching threshold for breaking a link in the observer network. The fewer the links connecting observer to observed relative to the observer size (i.e., the greater the distance), the slower the bucket fills and the longer signal transmission is observed to take.
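The hose-and-bucket model can be checked numerically. The threshold, time step, and cluster sizes below are arbitrary assumptions; the point is that with fill rate = connecting links / observer nodes, the delay to threshold scales linearly with distance as defined earlier (observer nodes divided by connecting links):

```python
# Hose-and-bucket sketch: energy from the observed cluster fills the
# observer's "bucket" at a rate set by connecting links per observer
# node; the signal "arrives" when the bucket reaches the link-breaking
# threshold. Threshold, time step, and cluster sizes are arbitrary.

def arrival_time(observer_nodes, connecting_links, threshold=10.0, dt=0.1):
    fill_rate = connecting_links / observer_nodes  # energy per unit time
    level, t = 0.0, 0.0
    while level < threshold:
        level += fill_rate * dt
        t += dt
    return t

near = arrival_time(observer_nodes=1000, connecting_links=10)  # distance = 100
far = arrival_time(observer_nodes=1000, connecting_links=5)    # distance = 200
print(near, far)  # far is about twice near: delay scales with distance
```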

The above mechanism cannot transmit a pulsatile event such as a supernova explosion. It takes not one, but two integrations to convert an impulse into a ramp function suitable for implementing a precise delay. Signal theory tells us that if you can transmit an impulse, you can transmit anything. The second integration has already been located in the observer cluster, so the obvious place in which to locate the first integration is in the observed cluster. Then when the link in the observer cluster breaks, which is an endothermic event, energy is sucked out of both integrators at once, resetting them to zero. That would describe an observer located in the near field of the observed cluster. In the far field, the endothermic rupture would cool only the observer cluster; most of the radiative cooling of the observed cluster would come from the rupture of inter-cluster links, not intra-cluster links. Thus, hot clusters such as stars are becoming increasingly disconnected from the rest of the universe. This can account for the apparent recessional velocity of the galaxies, since distance is inversely proportional to numbers of inter-cluster links.

Oscillations

Oscillating systems feature 4 clusters and thus 4 integrators connected in a loop to form a phase-shift oscillator. These integrators could be modeled as a pair of masses connected by a spring (= 2 integrators) in each of the observer and observed systems (2 × 2 = 4 integrators).
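One mass-spring pair as a chain of two integrators can be sketched as follows; the four-integrator loop of the full scheme would be two such pairs, one each in the observer and observed systems. All parameters here are arbitrary:

```python
# One mass-spring pair as a chain of two integrators: force integrates
# to velocity, velocity integrates to position. Two unit masses joined
# by a spring of rest length 1; parameters are arbitrary.

import math

def simulate_pair(k=1.0, m=1.0, dt=0.001, steps=None):
    """Return (initial, final) spring extension after `steps` steps
    (default: one full period of the relative oscillation)."""
    period = 2 * math.pi / math.sqrt(2 * k / m)  # relative coordinate feels stiffness 2k
    if steps is None:
        steps = int(period / dt)
    x1, x2 = 0.0, 1.5   # rest length 1.0, so initial extension is 0.5
    v1, v2 = 0.0, 0.0
    ext0 = (x2 - x1) - 1.0
    for _ in range(steps):
        f = k * ((x2 - x1) - 1.0)  # spring pulls the masses together when stretched
        v1 += (f / m) * dt         # integrator 1: force -> velocity
        v2 -= (f / m) * dt
        x1 += v1 * dt              # integrator 2: velocity -> position
        x2 += v2 * dt
    return ext0, (x2 - x1) - 1.0

start, end = simulate_pair()
print(start, end)  # after one full period the extension returns near its start
```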

Motion and gravity

Motion would be an energetically balanced breaking of links on one side of a cluster and making of links on the other side. This could happen on a hypothetical background of spontaneous, random link making and breaking. Acceleration in a gravitational field would happen if more links are coming in from one side than the opposite side. More links will correspond to a stronger mutual bracing effect, preferentially inhibiting link breaking on that side. This will shift the making/breaking equilibrium toward making on that side, resulting in an acceleration. The universal gravitational constant G could be interpreted as expressing the probability of a link spontaneously forming between any two nodes per unit of time.
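Reading G as a per-pair, per-tick link-formation probability has a checkable consequence: the expected number of new links between two clusters grows as G × N1 × N2, proportional to the product of the cluster sizes, echoing the mass product in Newton's law. A small Monte Carlo sketch with invented values for G, the cluster sizes, and the trial count (the shared random seed makes the proportionality exact in this toy):

```python
# If G is read as the probability per unit time that a link forms
# between any given pair of nodes, the expected number of new links
# between clusters of N1 and N2 nodes is G * N1 * N2 per tick:
# proportional to the product of the "masses", as in Newton's law.
# G, cluster sizes, and trial count are invented for illustration.

import random

def new_links_per_tick(n1, n2, g=1e-4, trials=200_000, seed=42):
    rng = random.Random(seed)
    # estimate the per-pair formation probability by sampling,
    # then scale up to all n1 * n2 node pairs
    hits = sum(1 for _ in range(trials) if rng.random() < g)
    return n1 * n2 * hits / trials

a = new_links_per_tick(100, 50)   # roughly 1e-4 * 5000 = 0.5 links per tick
b = new_links_per_tick(200, 50)   # doubling one cluster doubles the rate
print(a, b)  # with the shared seed, b is exactly 2 * a
```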

Dimension

That the universe is spatially at least three-dimensional can be reduced to a rule that links do not cross. Why the minimum dimensionality permitted by this rule is the one we observe remains to be explained. 

A universal law 

Heat of link formation = heat of link rupture + the increase in network heat content that comes from the rise in network melting point produced by improved mutual bracing efficiency. Melting point measures the density of triangles in the network.

Repulsive forces

Repulsive forces are only seen with electromagnetism and then only after ionization. When particles said to be oppositely charged recombine, neutral atoms are re-formed, which creates new triangles and thus increases melting point. The recombination of particles said to be of like charge creates relatively few triangles and is therefore disfavored, creating the impression of mutual repulsion.
 

Momentum

Links are directional in their heat conduction. A direction imbalance in the interior of a cluster causes motion by spontaneously transporting heat from front to back. Front and back are defined by differences in numbers of links to an arbitrary external cluster between front and back sub-clusters.

Case study of a rocket motor

A directional link can only be burned out by heat applied to its inlet end. During liftoff, the intense heat down in the thruster chambers burns out links extending up into the remainder of the craft. This leaves an imbalanced excess of links within the rocket body going the other way, leading to a persistent flow of heat downward from the nose cone. This cooling stabilizes links from overhead gravitationally sized clusters ending in the nose cone, causing them to accumulate, thereby shortening the "distance" from the nose cone to those clusters. Meanwhile, the heat deposited at the bottom of the rocket progressively burns out links from the rocket to the Earth, thereby increasing the "distance" between the rocket and the Earth. The exhaust gases have an imbalanced excess of upward-directed asymmetric links due to the temperature gradient along the exhaust plume that serves to break their connection to the rocket and create the kind of highly asymmetrical cluster required for space travel. Link stabilization is likewise only responsive to what happens at the inlet end.

Future directions

Links in the universal network are the real things and the nodes are just their meeting places, which only appear to be real things because this is where the angle of the flow of energy changes. All links are directional and pairing of oppositely-directed links was an early step in the evolution of the universe. Directional links are representable as an inlet part joined to an outlet part. With this decomposition, a link pair looks like this:
⚪⚫
⚫⚪
A purely directional link recalls the one-way nature of time and may represent undifferentiated space and time. 

Tuesday, June 16, 2020

#59. Neuromodulators as Peril Specialists [neuroscience, evolution]


Red, theory; black, fact.


Solanum dulcamara, a plant with anticholinesterase activity.



“Life is Difficulty”


My PhD thesis was about a neuromodulator (acetylcholine) acting on mammalian brain. It was tough to decapitate all those rats; I never got used to it. But if you can’t stand the formaldehyde, get out of the lab.


The basic theory

I conjecture that the primordial function of any type of transmitter substance acting on the G-protein-coupled cell-surface receptors or nuclear receptors of neurons was to coordinate the whole-organism response to some class of perils.

Table 1.

Peril                       Substance            Failure mode
Extremes of heat and cold   glutamate and GABA   ?
Predator                    serotonin            depression
Parasite                    histamine            phobia
Rival conspecific           noradrenaline        paranoia
Social isolation

Wednesday, March 25, 2020

#57. The Drill Sergeants of the Apocalypse [evolutionary psychology, population]

EP     PO     

Red, theory; black, fact.

Seen in a hospital ward.


The trickster type may really be Nature's penetration tester. The type probably emerges in contexts of unequal power (Elmer Fudd has the shotgun; Bugs Bunny doesn't). Thus, an abiding fear is the soil out of which tricksterism grows, by the following positive feedback: a successful trick shows up the Fudds, casting them in a feckless light, which reduces the fear level of the trickster, which reinforces trick-playing. This is a short-term high that comes at the expense of worse relations with the Fudds and thus eventually even greater fear levels for the tricksters, which they try to remedy with still more tricks. An example of an unequal power relationship is that between a foreign invader and the defenders. Invasion is such a common event in history that by now, countermeasures will have evolved. Tricksterism is likely to be a tile in the mosaic of any such adaptation. Tricksterism also seems to be a form of play. A result from animal ethology is that the play of young animals is a form of learning. The thing learned in playing tricks may be how to manage power inequalities.

A biological precedent for penetration testing?

Evidence for a biological precedent may be the many retroviruses integrated into the human genome. One of these becomes active now and then at random and kills the host cell if the anti-viral defenses of the latter have become weak due to some somatic mutation. The red team-blue team strategy seems to be too good a trick for nature to miss.

Evolution of the trickster

Modern human populations may have two independent axes of political polarization: oppressor-oppressed and trickster-control freak. The first may subserve dispersal by generating refugee groups and the second may subserve building. Any built thing must serve in a complex world in which many constraints must be simultaneously observed. Thus, after the initial build, a long period of tweaking must typically follow. The role of the tricksters is to powerfully motivate this tweaking, for example, by cleverly making someone’s shelter fall down, before the complacency of the control-freak builders leads to disaster. This may have been how engineering was done by an archaic version of Homo sapiens. Tricksterism may have evolved out of a previously evolved capacity for military strategy, which involves essentially putting one over on the enemy. The tricksters can also make mistakes, causing damage that cannot have a silver lining in any possible world, and moving to correct this is a natural role of the builders. If you are a builder, ask this: “What is the best use of my indignation?” It is to keep to a strict harm-reduction approach. Tricksterism can intensify into sadism, in which the protagonist takes pleasure in the victim’s torment and wants to make it last. Well, boy, if you make it last, you are giving the victim plenty of time and motivation to figure out solutions, like a patient old instructor giving his pupil his lessons one at a time, as he is ready for them, and this is how the wise victim will construct the situation. Such a victim will end up with information and know-how others will pay for. Selection for building skill and thus tricksterism may have occurred in multiple successive episodes over the course of the Pleistocene due to periodic continental glaciations.

Before the trickster

Our evolutionary forebears may have been champion dispersers for a long time before the ice age forced some of them to become champion builders, initially of shelters and warm clothing. (Champion environment modifiers may be closer to the mark.) It is an interesting fact that, physically, humans exceed all other animals only in long-distance running, which can be read as dispersal ability. Our carelessness about preserving local environments and our propensity for overpopulation can be read as typical r-selected disperser behavior. The r-selecting niche was big-game hunting. H. erectus sites indicate consumption of medium and large meat animals. Overhunting would have occurred routinely, due to the slow reproduction rates of large animals and the high hunting efficiency of H. erectus conferred by tool use, so dispersal of the hunters to new habitats would likewise have been routine.