Each post presents a science theory I thought of. I am 71 years old and have two science degrees and six peer-reviewed publications.
Wednesday, March 25, 2020
#57. The Drill Sergeants of the Apocalypse [evolutionary psychology, population]
The trickster type may really be Nature's penetration tester. The type probably emerges in contexts of unequal power (Elmer Fudd has the shotgun; Bugs Bunny doesn't). Thus, an abiding fear is the soil out of which tricksterism grows, through the following positive feedback: a successful trick shows up the Fudds, casting them in a feckless light, which reduces the fear level of the trickster, which in turn reinforces trick-playing. This is a short-term high that comes at the expense of worse relations with the Fudds and thus eventually even greater fear levels for the tricksters, which they try to remedy with still more tricks. An example of an unequal power relationship is that between a foreign invader and the defenders. Invasion is so common an event in history that by now, countermeasures will have evolved, and tricksterism is likely to be a tile in the mosaic of any such adaptation. Tricksterism also seems to be a form of play. A result from animal ethology is that the play of young animals is a form of learning. The thing learned in playing tricks may be how to manage power inequalities.
A biological precedent for penetration testing?
Evidence for a biological precedent may be the many retroviruses integrated into the human genome. One of these becomes active now and then at random and kills the host cell if the latter's antiviral defenses have been weakened by some somatic mutation. The red-team/blue-team strategy seems to be too good a trick for Nature to miss.
Evolution of the trickster
Modern human populations may have two independent axes of political polarization: oppressor-oppressed and trickster-control freak. The first may subserve dispersal by generating refugee groups, and the second may subserve building. Any built thing must serve in a complex world in which many constraints must be simultaneously observed. Thus, after the initial build, a long period of tweaking must typically follow. The role of the tricksters is to powerfully motivate this tweaking, for example, by cleverly making someone's shelter fall down before the complacency of the control-freak builders leads to disaster. This may have been how engineering was done by an archaic version of Homo sapiens. Tricksterism may have evolved out of a previously evolved capacity for military strategy, which essentially involves putting one over on the enemy. The tricksters can also make mistakes, causing damage that cannot have a silver lining in any possible world, and moving to correct this is a natural role of the builders. If you are a builder, ask this: "What is the best use of my indignation?" It is to keep to a strict harm-reduction approach. Tricksterism can intensify into sadism, in which the protagonist takes pleasure in the victim's torment and wants to make it last. Well, boy, if you make it last, you are giving the victim plenty of time and motivation to figure out solutions, like a patient old instructor giving his pupil his lessons one at a time, as the pupil is ready for them, and this is how the wise victim will construct the situation. Such a victim will end up with information and know-how others will pay for. Selection for building skill, and thus for tricksterism, may have occurred in multiple successive episodes over the course of the Pleistocene, owing to periodic continental glaciations.
Before the trickster
Our evolutionary forebears may have been champion dispersers for a long time before the ice age forced some of them to become champion builders, initially of shelters and warm clothing. ("Champion environment modifiers" may be closer to the mark.) It is an interesting fact that, physically, humans exceed all other animals only in long-distance running, which can be read as dispersal ability. Our carelessness in preserving local environments and our propensity for overpopulation can be read as typical r-selected disperser behavior. The r-selecting niche was big-game hunting. H. erectus sites indicate consumption of medium and large meat animals. Overhunting would have occurred routinely, owing to the slow reproduction rates of large animals and the high hunting efficiency conferred on H. erectus by tool use, so that dispersal of the hunters to new habitats would likewise have been routine.
Thursday, December 19, 2019
#56. Stress and Schizophrenia [neuroscience]
Introduction
The main positive symptoms of schizophrenia, namely hallucinations, word salad, and loosening of associations, all seem to be variations of the last, so loosening of associations will here be taken as the primary disorder. Stress and the brain's dopaminergic system are strongly implicated in the causation of schizophrenia. In connection with stress, psychologists speak of "the affective [emotional] pathway to schizophrenia."
Organismal responses to stress
Stress is known to increase genetic variability in bacteria by inducing competence for transformation, the uptake of foreign DNA. Stress is likewise known to increase the meiotic recombination rate in sexually reproducing organisms such as fruit flies (Zhong W, Priest NK. Stress-induced recombination and the mechanism of evolvability. Behavioral Ecology and Sociobiology. 2011;65:493–502). It seems that when an organism is in trouble, it begins casting about ever more widely for solutions. If evolution is the only mode of adaptation available, this casting about will take the form of an increase in the size and frequency of mutations. In conscious humans, however, this casting about in search of solutions in the face of stress may well take the form of a loosening of associations during thought. Should the person find the solution he or she needs, then presumably stress levels go down and the thought process tightens up again, giving a negative feedback that eventually renormalizes the thought process, and all is well. In optimization theory, an analogous process is called "simulated annealing."
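The annealing analogy can be sketched in code. Below is a minimal, generic simulated-annealing loop; the objective function, starting point, temperature schedule, and all constants are my own illustrative choices, not anything from the post. The point of contact with the text is only this: "temperature" sets how widely the search casts about, and the cooling schedule is the negative feedback that tightens it up again.

```python
import math
import random

# Minimal simulated annealing. High "temperature" = wide, loose search;
# cooling = the negative feedback that renormalizes it. All constants
# are illustrative assumptions.
def anneal(f, x0, temp=5.0, cooling=0.95, steps=400, seed=0):
    rng = random.Random(seed)
    x = best = x0
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, temp)  # looser search = bigger jumps
        delta = f(candidate) - f(x)
        # Always accept improvements; sometimes accept worsenings,
        # less and less often as the temperature falls.
        if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-12)):
            x = candidate
        if f(x) < f(best):
            best = x
        temp *= cooling  # the thought process tightens up again
    return best

f = lambda x: (x - 2.0) ** 2 + math.sin(5.0 * x)  # bumpy toy objective
solution = anneal(f, x0=10.0)
```

Starting far from the minimum, the loop reliably finds a much better point than its starting position, despite the bumps that would trap a pure downhill search.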
Disorder of a cognitive stress response
But what if the person does not find the solution they need? Then, presumably, the loosening of associations becomes more and more pronounced ("reverse annealing") until it begins to interfere with the activities of daily living and thus begins to contribute net stress, making matters worse, not better. Now we have a pernicious positive feedback operating, and it rapidly worsens the state of the sufferer in what is known as a psychotic break, resulting in hospitalization. That these psychotic breaks are associated with tremendous stress is made clear by the fact that post-traumatic stress disorder is a common sequela of a psychotic episode.
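The two regimes described in this section and the last (a solution arrives and the negative feedback renormalizes thought, versus no solution and runaway positive feedback) can be caricatured in a toy dynamical model. Every constant and threshold below is an arbitrary illustrative assumption, not an empirical value:

```python
# Toy model of the two feedback regimes. "loosening" stands for looseness
# of associations, "stress" for stress level; all constants are arbitrary.
def trajectory(solution_found_at=None, steps=50):
    stress, loosening = 1.0, 0.0
    history = []
    for t in range(steps):
        # Stress drives loosening up; loosening also decays on its own.
        loosening += 0.2 * stress - 0.1 * loosening
        if solution_found_at is not None and t >= solution_found_at:
            stress *= 0.8   # solution found: negative feedback renormalizes
        elif loosening > 1.0:
            stress *= 1.15  # loosening impairs daily living: net stress added
        history.append(loosening)
    return history

renormalized = trajectory(solution_found_at=10)  # annealing: tightens up again
runaway = trajectory()                           # "reverse annealing"
```

With a solution found at step 10, loosening decays back toward baseline; without one, the stress-loosening loop runs away, the toy analogue of the psychotic break.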
Stress: molecular aspects
Messenger substances (i.e., hormones and neuromodulators) known to carry the stress signal are CRF, ACTH, cortisol, noradrenaline, adrenaline, dopamine, NGF, and prolactin. The well-known phenomenon of stress sensitization, which may be part of the disease mechanism of schizophrenia, probably inheres in long-term changes in protein expression and will not be apparent in a simple blood test for any of the above substances without a prior standardized stress challenge (the needle stick itself could serve as the challenge). In that case, a catheter would be installed through the needle to permit repeated blood sampling, and the baseline sample would be collected long after the intervention sample, not before, as is customary in research.
Other mental illnesses
Bipolar disorder may result from an analogous positive feedback affecting another problem-solving adaptation of the brain, one that would be modelled by the alternation of brainstorming sessions (mania) with sessions in which the brainstormed productions are soberly critiqued (depression).
Brain mechanisms
How does the loosening of associations of schizophrenia arise? I conjecture that one activated sensory memory represented in the posterior cortex does not activate another directly, but indirectly, via an anatomically lengthy but fast relay through the prefrontal cortex, which has a well-known dopaminergic input from the ventral tegmental area of the midbrain. A higher vertebrate has a free-will spectrum, with machine-like performance and high dopaminergic tone at one end and, at the other, carefully considered performance verging on overthinking, with low dopaminergic tone. Persons with schizophrenia have pushed past the latter end of the spectrum into dysfunction. Dopamine could orchestrate movement along the free-will spectrum by a dual action on the prefrontal cortex: inhibiting associational reflexes passing back to the posterior cortex while facilitating direct outputs to the motor system. Dual actions of neuromodulators are a neuroscientific commonplace (e.g., my PhD thesis), and dopamine is a neuromodulator. The NMDA receptor, which is also strongly implicated in schizophrenia, enters the picture as the source of excitation of the ventral tegmental area.
#55. Gender is Pecking Order [evolutionary psychology]
EP
Red, theory; black, fact.
Gender is pecking order
Gender, social status, and testosterone are clearly interrelated, but exactly how requires clarification when the very nature of gender is in question, as now. One possibility is that the male pecking order sits directly atop the female pecking order, with no barrier between them. Thus, a male who falls low enough in the male pecking order will undergo a reversal in gender identification from male to female (and may keep on going down), and a female who rises high enough in the female pecking order will likewise undergo a reversal in gender identification from female to male (and may keep on going up). The entire structure could be called "the" pecking order, with the statistical median of the status ranks, and possibly of the ranked testosterone levels, always dividing females from males, at least in terms of gendered social signaling. This could be an example of what is called an exact theory replacing its approximate counterpart. In this case, the corresponding approximate theory would be the gender binary ("You are either a man or a woman.").
A limitation of this “median theory” is that no causative mechanism is provided.
Recent history of trans
Since the early sixties, we have seen a trend of increasing media exposure of trans and non-binary individuals, and this was also a period of ever-increasing human population numbers. I conjecture that the latter trend caused the former. The population trend may have produced an upward trend in the average population density at which people live, suburban expansion notwithstanding. This may have caused an increasing incidence of aggressive one-on-one interactions among humans via the Calhoun effect, which is much discussed in these pages. Aggressive one-on-one interactions are well known to change the social status of the combatants, the winner enjoying increased status (i.e., a higher ranking in the pecking order) and the loser suffering reduced status. Overall, population density increases can thus be expected to increase the amount of traffic on the social ladder, both upward and downward, leading to increasing numbers of individuals crossing the median and becoming trans or nonbinary. The increasing numbers of trans and nonbinary individuals in society were then faithfully reflected in the content of the news stories of the day. QED.
Trans not genetically determined
Consistent with this, PLOS blogger R. Lewis, who has a PhD in genetics, found remarkably little evidence of direct genetic causation in transgenderism. Moreover, of 58 studies on "transgender" listed on clinicaltrials.gov, nothing worth mentioning was found about genetics. This could be an instance of the file-drawer effect (negative results left unpublished, languishing in the file drawer).
How pecking-order dynamics may lead to dispersal
I am indebted to Jordan Peterson for turning me on to the pecking-order idea. It can explain aspects of dispersalism, as follows. If people had no emotional memory of their social wins and losses, we would expect their distribution on the social ladder to be Gaussian (a.k.a. the bell curve). However, if a win or loss leaves you with an emotional residue of optimism or pessimism (and, of course, it does), a positive feedback can set in whenever conflicts come faster than the emotional fallout from each can dissipate: the more you lose, the greater your pessimism, and the more likely you are to lose in the future; likewise, the more you win, the greater your optimism, and the more likely you are to win in the future. Following Peterson, this emotional-fallout effect may be due to prolonged up- and down-regulation of serotonin concentrations in the brain. See post 66 for my opinions on neuromodulators, e.g., serotonin. This dynamic then splits the population into a bimodal social distribution of oppressors and oppressed, and the latter soon join some refugee stream, resulting in dispersal. The frequency of conflicts could be measuring population density, and the conflicts would not necessarily be over resources, but over proxies for these, such as land or jobs. With the addition of these ideas, the splitting and separation of overcrowded rodent populations in the behavioral-sink phase of a Calhoun experiment is explained. To connect these ideas with my earlier idea of the sadness cycle, I conjecture that sadness and its attendant social signaling express anger colored by pessimism about winning, whereas contempt and its social signaling express anger colored by optimism about winning.
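The claimed Gaussian-versus-split contrast can be illustrated with a toy simulation: agents meet in random one-on-one contests, and each win or loss leaves an "emotional residue" that biases future outcomes. All parameters (the residue strength, the logistic win rule, the population size) are my own illustrative assumptions, not figures from this post or from Peterson:

```python
import math
import random

# Toy win/loss feedback: each outcome shifts a score that biases
# future contests. residue=0 gives a pure random walk; residue>0
# gives the self-reinforcing dynamic described in the text.
def simulate(residue, agents=200, contests=20000, seed=1):
    rng = random.Random(seed)
    score = [0.0] * agents
    for _ in range(contests):
        a, b = rng.sample(range(agents), 2)
        # Win probability for a: logistic in the residue-weighted score gap.
        p_a = 1.0 / (1.0 + math.exp(-residue * (score[a] - score[b])))
        winner, loser = (a, b) if rng.random() < p_a else (b, a)
        score[winner] += 1.0
        score[loser] -= 1.0
    return score

no_memory = simulate(residue=0.0)    # no emotional residue
with_memory = simulate(residue=0.1)  # residue: winners keep winning
```

With the residue switched on, the spread of scores grows far beyond the random-walk case, heading toward the bimodal oppressor/oppressed split described above.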
The attack on reproduction may be the field mark of the oppressor. This gets back to the beef the ancient Egyptians had with the Hebrews. It's always, "These [fill in the blank] breed like flies," and if you want a group to leave*, preventing them from reproducing would be very effective, as this is the most noxious intervention imaginable. In class-based oppression, the attack would often be structurally entrenched, and nobody sees the structure because they are too close to it. In this paragraph, I am conflating class-based and ethnicity-based oppression, which may or may not warrant future amendment.
A false-flag strategy?
Another idea about trans is that it is a false-flag strategy used by low-status males and females to reproduce without punishment. Pair a gay woman with a trans woman and you have a potentially fertile couple able to fool the oppressors until the deed is done. Likewise a gay man and a trans man.
That said, trans is not fundamentally political, but hedonistic. Still, "the heart [unconditioned stimuli] has its reasons that reason knoweth not." The reason is natural selection, but we can still say that trans is not genetically determined, because what is selected is only the potential for it, which everyone has. Triggers from the environment are still required for expression of this adaptation, which I think of as a long-lasting state, so the idea of an unconditioned stimulus admits some nuance: the state-gated unconditioned stimulus. An example of a trigger from the environment would be your dad losing his temper in front of you to the point where you think you are about to die. The frustration behind this would be due to his experience of oppression. The outburst therefore amounts to an involuntary report to the child on conditions prevailing outside the family home.
I suppose you are thinking, "This false-flag strategy is unnecessary because, for example, no foreman would whack a couple of workers over the head with a pipe wrench if he found them kissing in the lunch room and they appeared to be a binary couple!" Yes, he would, if he were in evolutionary-throwback mode, a terribly real mode people shift into whenever the price of bread rises relative to wages, in which behaviour follows the non-human laws of Homo erectus. Since I have just written something potentially threatening or disempowering, I remind you that we all have this mode, that entering and exiting it are situational, and that genes bias our behaviour but do not determine it.
*Means: have been selected in evolution to act as if you wanted them to leave.
Photo by Jonny Gios on Unsplash
Saturday, December 14, 2019
#54. Disaster Biology [evolution, evolutionary psychology]
EV EP
Red, theory; black, fact.
The habitat may have been a unit of selection in early hominins, leading to group selection, and much of our evolution may have proceeded by an accumulation of founder effects. Opportunities for colonization of recently emptied habitats are ephemeral. Under disaster-prone conditions, this plausibly leads to selection pressure for migrant production and evolvability (i.e., a high rate of evolution, especially founder-effect evolution). Language diversification in humans may be an evolvability adaptation. Language diversity would work by preserving genetic founder effects from dilution by late-coming migrants, whose reproduction would be held back by the difficulties of learning a new language. Xenophobia and persistent ethnicity markers can be explained in the same way. The spread of linguistic and cultural novelties in a hominin population is predicted to be especially fast in newly colonized, previously empty habitats. Alternatively, the linguistic novelties may start as a thick patois developed by an oppressed group in the home habitat prior to becoming refugees, as a way to make plans "under the noses" of the oppressing group. Refugee-producing adaptations subserving dispersal can be called "tough altruism." Populations producing more refugees are more likely to colonize further empty habitats, a selective advantage.
Disaster biology may be what is conceptually missing from theories of the origin of life (abiogenesis). That is, the forerunners of the first cells may have been spores.
Sunday, November 24, 2019
#53. Where are All the Space Aliens? [evolution, evolutionary psychology]
EV EP
Red, theory; black, fact.
KIRK MUST DIE! (cut to commercial.)
Astronomical observations and the Fermi paradox
Contemporary exoplanet research keeps turning up extra-solar-system planets that seem to be promising abodes of life of the Earthly variety (never mind the completely weird biochemistries that may exist on other planets). In the Habitable Exoplanets Catalog (HEC), kept by the Planetary Habitability Laboratory, University of Puerto Rico at Arecibo, the list of planets found orbiting in the conservative habitable zone now has 17 entries, and a 2013 paper by Petigura et al. ("Prevalence of Earth-size planets orbiting Sun-like stars") placed the percentage of stars in our galaxy with potentially habitable planets at 22 ± 8. Accumulating evidence thus suggests that life is common in our galaxy, yet SETI research—the search for extraterrestrial civilizations that send out radio signals bearing some stamp of intelligence—has drawn a complete blank, as far as I know. And if it had found something, it would have made such a sensation in the media that no one could help knowing. So I ask you: where are all the space aliens? This question is generally attributed to the 20th-century physicist Enrico Fermi and has since become known as the Fermi Paradox.
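To give the paradox a rough scale, here is a back-of-envelope count using the 22% figure quoted above. The local stellar density and the fraction of Sun-like stars are my own assumed inputs, not figures from this post:

```python
import math

# Back-of-envelope count of potentially habitable worlds near the Sun.
# frac_habitable is the quoted 22%; the stellar density (~0.14 stars
# per cubic parsec) and Sun-like fraction (~10%) are assumed inputs.
def habitable_within(radius_ly, frac_habitable=0.22,
                     stars_per_pc3=0.14, frac_sunlike=0.1):
    radius_pc = radius_ly / 3.26                       # light-years -> parsecs
    volume_pc3 = (4.0 / 3.0) * math.pi * radius_pc ** 3
    n_stars = stars_per_pc3 * volume_pc3
    return n_stars * frac_sunlike * frac_habitable

estimate = habitable_within(100.0)  # candidate worlds within 100 light-years
```

Even with generous error bars, this sketch suggests a few hundred candidate worlds within 100 light-years alone, which is what makes the radio silence striking.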
My hypothesis is this:
Life is one thing; intelligent life is quite another. This is a form of the Rare Earth hypothesis, which is one of the avenues that has been explored through the years in the search for a resolution of the Fermi Paradox.
Biospheres may not be permanent
No doubt there are many, many planets in our part of the galaxy that have some form of primitive life, and many, many more "graveyard planets" that once had life but are now sterile. Mars may well be an example of this kind of planet in our own solar system.
Biochallenge!
I conjecture that if we seem to be alone in this part of the galaxy, based on the negative SETI evidence, it is because we are, and this is because we have evolved to the level of intelligence first in this galactic neighborhood, because evolution on the Earth is egregiously rapid. It has taken us four billion years to get this far, which doesn't sound so fast, but everything is relative. This rapid evolution is plausibly a response to challenges: all the various natural disasters we are subject to here on Earth, examples being bolide (meteor) crashes, continental glaciations, drifting continents, droughts, earthquakes, floods, hurricanes, long climatic warm spells, tornadoes, tsunamis, volcanism, wild weather, wildfires, and winter.
Sept 23, 2018: Tornadoes knock out primary transformer station in Ottawa.
Case in point: a large bolide strike is believed to have triggered the extinction of the dinosaurs, making way for the rise of the mammals, and we ourselves are the descendants of those mammals. The bolide may have killed the dinosaurs indirectly, by touching off a climate shift in our dangerously unstable world. This would explain the temporary presence of dinosaur fossils above the Cretaceous/Tertiary iridium anomaly, which has been a problem for the bolide hypothesis.
Case in point: the rise of modern humans seems to have coincided with the end of the last continental glaciation. The rigorous, cold-climate conditions prevailing then might have selected our ancestors for high ability in building shelters and sewing protective clothing. These skills might have required the rapid evolution of a high ability to process spatial information, which we then leveraged into the building of civilizations upon the return of temperate climatic conditions.
To contrive a planet that is so challenging and difficult, yet has not succeeded in destroying life altogether in four billion years, may require a very rare combination of parameters (e.g., our distance from the sun, the size and composition of the Earth, the presence of the asteroid belt, the presence of the Oort cloud), and this rarity has led to our emerging into intelligence before it happened anywhere else in this part of the galaxy.
These parameters may well have special values at which critical behavior occurs, such as the onset of positive feedbacks leading to heating or cooling. Earth may be simultaneously close to several of these critical points, a rare circumstance, but one that does not require extreme, atypical values of any given variable.
My take on the Rare Earth hypothesis therefore emphasizes what are called "evolutionary pumps" (e.g., glaciations, bolide crashes, etc.) in discussions of this hypothesis, as well as the anthropic principle.
August 28, 2011: An Ottawa sunset inflamed by a recent hurricane in the USA.
Evil-ution
I further conjecture that the difficulties of our past have left their mark on us, and we call it "evil." Some will deny that this concept has any construct validity, saying, "It's not a thing," but I think that it is an approximate version of something that does, which I term "dispersalism" in this blog. This is because a basic strategy for surviving disasters is dispersal.
Our planet's predilection for disaster has deeply ingrained dispersal tendencies into most species here, by the mechanism of natural selection. Humans now get their food from agriculture. However, agriculture requires a settled existence and is therefore in opposition to dispersal, so the plot thickens.
Because of this, the psychological pressure for dispersal builds relentlessly, pressure-cooker fashion, across time, until a destructive explosion occurs (war or revolution), thereby accomplishing the long-delayed dispersal.
May 21, 2022: Derecho-storm damage in Ottawa.
Wildfire smoke seen in Ottawa, Jun 2023.
Thursday, June 13, 2019
#52. Reality is Virtual but the Host Computer is Analog. [physics]
Red, theory; black, fact.
An idea of what lies behind the veil of space-time: something like a Turing machine, but with a multiplicity of tapes and a multiplicity of processors. Each tape corresponds to one elementary particle in space-time, and the processors implement only Hebb's rule, a rule of learning first discovered in neuroscience that governs correlations among signal sources. The tapes are polymer-like, and their ongoing elongation by polymerization at the ends causes the passage of time. This elongation is associated with a white-noise signal unique to each particle/tape/strand because the monomers are sampled from a population with a Gaussian size distribution.
A still more complex, three-strand scheme. At a given point, multiple strands can adhere to the same processor, and vice versa.
The first illustration is my current best guess as to what a Hebb processor is like, but as we say in research, "Many questions remain." The short, black lines are the catalyzed ligation points between monomers, and these are the points of attraction to the processor. If the rear pull-off point encounters a big gap between ligation points, the processor will advance an unusually great distance in one step, creating an unusually long catalytic pocket at the front, which will have a selectivity for longer monomers, thereby implementing a copying mechanism. Causally, this is backwards, but the alternative explanatory plan seems to involve a messy discussion of chemical equilibria.
This machine is presumed to be naturally occurring and to belong to a particular stage in the physical evolution of the universe. I make no appeal to arguments of intelligent design, even by space aliens. By the anthropic principle, the machine is not necessarily typical of all structures extant at that stage. In other words, we are living in a natural sub-world that is simulation-like in that the view from within is entirely unlike the view from without, but the two views are related by something that could be called a sub-world law, an example of which would be d = K/rxy. This concept is nothing new because it could also serve in outline as a description of human conscious experience, which is most unlike a mass of neurons signaling to each other in the dark. Thus, the theory of everything could turn out to be a series of nested sub-world laws.
The length of a strand relative to some interior point is a Gaussian white-noise time-series signal with an upward trend just steep enough to eliminate negative slopes. I will deal here with the detrended version of this signal because, on the laboratory distance scale, both the observed system and the observer will share the same trend, preventing its direct observation. Moreover, the polymerization process is postulated to preserve a permanent record of the detrended signal. Therefore, while the future does not exist in this model of time, the past is perfectly preserved. A set of distinguishable time series is called panel data and is a Euclidean space by the mathematical definition; it can therefore map onto and explain physical space, at least on the laboratory scale.
Imagine some panel data consisting of two time series, X and Y, representing two elementary particles. Take a slice of this data ten samples long and plot the slices as two ten-dimensional vectors, X and Y. The dot product of these vectors is then easily computed as x₁y₁ + x₂y₂ + … + x₁₀y₁₀. A Euclidean space is defined as an affine space (i.e., one containing no special points like the origin) on which the vector dot product is defined. Recall that this dot product is equal to the length of X times the length of Y times cos θ, where θ is the angle between the vectors. Moreover, for mean-centered (detrended) series, cos θ here equals rxy, a.k.a. Pearson's r, a measure of correlation between signal sources. Pearson's r is commonly used in the sciences, including neuroscience, and ranges from -1 for negative correlations to +1 for positive correlations; zero indicates no correlation.
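The identity just stated (cos θ between the two ten-dimensional vectors equals Pearson's r) can be checked numerically. The two series below are arbitrary illustrative noise; the substantive point is that the equality holds once the series are mean-centered, consistent with the detrending discussed earlier:

```python
import math
import random

# Check: for mean-centered series, the cosine of the angle between the
# two ten-dimensional vectors equals Pearson's r. X and Y are arbitrary
# illustrative noise, partially correlated by construction.
rng = random.Random(42)
X = [rng.gauss(0.0, 1.0) for _ in range(10)]
Y = [0.6 * x + rng.gauss(0.0, 1.0) for x in X]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def centered(v):
    m = sum(v) / len(v)
    return [a - m for a in v]

Xc, Yc = centered(X), centered(Y)
cos_theta = dot(Xc, Yc) / math.sqrt(dot(Xc, Xc) * dot(Yc, Yc))

# Pearson's r from its textbook raw-score formula, for comparison.
n, sx, sy = len(X), sum(X), sum(Y)
sxy, sxx, syy = dot(X, Y), dot(X, X), dot(Y, Y)
pearson_r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx**2) * (n * syy - sy**2))
```

The two quantities agree to floating-point precision, since the raw-score formula is exactly the centered cosine with the factors of n cancelled.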
rxy may represent distance in space-time, and vector length may represent mass-energy in space-time. An rxy of 0 would represent infinite distance, and an rxy of 1 would represent zero distance, a state found only in black-hole singularities or the big-bang singularity.
Processes would be experienced as close together because they are correlated, not correlated because they are close together.
The latter is the usual assumption, usually rationalized as being due to the easy exchange of synchronizing photons or virtual photons at close range. However, we seem to be moving here toward the elimination of the light/photon construct from physics.
(Deprecated, Part 6)
A simpler possibility is d = K/rxy (equivalently, rxy = K/d). This change advantageously limits how small space-time distances can become, thereby eliminating infinities from gravity calculations: K is the minimum length, reached when rxy = 1. With this revision, the dot product of two vectors in the simulation becomes equal to gravitational binding energy.
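A minimal sketch of the proposed map, assuming only the relation d = K/rxy given above (the numerical value of K is arbitrary):

```python
# Sketch of the proposed distance map d = K / rxy, with K the minimum
# length (its numerical value here is arbitrary). As the correlation
# rxy falls from 1 toward 0, the encoded distance grows from K without
# bound, but it can never fall below K.
K = 1.0

def distance(r_xy):
    if r_xy <= 0.0:
        return float("inf")  # zero correlation: infinite separation
    return K / r_xy

d_min = distance(1.0)   # perfect correlation: the minimum length K
d_far = distance(0.01)  # weak correlation: a much larger distance
```

The built-in floor at K is the feature claimed in the text: no finite correlation can encode a distance smaller than the minimum length.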
No Flying Spaghetti Monster would be complete without meatballs, and these would be the Hebb processors. Each strand would have a processor on each end that may catalyze the polymerization process. Each processor would also be connected to a random point within the mass of strands. This connection moves along the associated strand like a phonograph needle, reading the recorded signal while keeping pace with the growing end. The processor also has a front read-point in the present. The two read points may or may not be on the same strand. If a correlation is detected between the two read points, the correlation is permanently enhanced by a modification of the polymerization process, per Hebb's rule. All the orderliness of the universe is supposed to emerge from this one type of calculation.
At this point, we have not eliminated space from physics; we have merely replaced Euclidean space by another kind of space that is less structured, in which the spaghetti machine has its being. The new space is a metric space but has neither norm nor dot product axiomatically, although both can be present in simulation. The metric takes values in the counting (natural) numbers and is equal to the number of primordial time elements (PTEs) in a polymer of same.
Degrees of correlation less than maximal are encoded in the processor as the number of strands adhering to the processor above the minimum for inter-strand copying, namely two. One of these middle strands is illustrated in the sketch of the three-strand scheme, and they putatively degrade the fidelity of the copying process and reduce Pearson's r in proportion to their numbers, while also introducing a time delay into the copying process, due to changes in the length of the processor. A reduction in Pearson's r, which increases the encoded space-time distance, simultaneous with an increase in the copying time delay is responsible for the finite speed of light.
If N is the number of middle strands on a processor, then a reasonable guess as to their effect on Pearson's r would be r = 1/(N + 1). (Deprecated, Part 6) Units analysis requires that the electron binding energy equal the length of the projection of the electron vector on the proton vector. This is not the dot product, and it has the correct units, namely mass-energy. The revised attempt to reproduce Rydberg's formula is:
ΔE = ||e||*(1/n₁ – 1/n₂), n₂ > n₁
Equation (1)
The default interaction implemented by Hebb’s rule would be gravitational attraction. Black hole formation illustrates that gravitation has a built-in positive feedback, and this would derive from the positive feedback built into simple forms of Hebb’s rule.
To provide the hypothetical Hebb processors with such a positive feedback, postulate the following: Middle strands come and go in an energy-dependent excision/reintegration process and have a small loop of processor adhering to them when free, which explains how the processor length changes occur. A high-fidelity copying process releases more energy than does a low-fidelity copying process, and excision requires energy input. These ingredients, together with the fidelity-degrading property of a middle strand, should be sufficient for a positive feedback.
If the two read points are on the same strand, the result will be an oscillation. Electromagnetism could be the set of interactions peculiar to oscillating strands. The variance needed to express the oscillation would be deducted from a fixed total, leaving less variance available to represent distance via correlations. A smaller distance signal will be more rapidly modified by the Hebb processors, resulting in faster responses to forces and a smaller mass, according to Newton's a = F/m. Thus, we expect neutral particles to be more massive than charged particles, and this tendency is indeed found among the mesons and when comparing the proton and neutron.

The relatively great masses of the proton and neutron, and the nuclear strong force itself, may emerge from a cyclic pattern of three strands (the quarks) connected by three processors. The example of the benzene molecule teaches us to expect novel results from cyclization. The metric space may itself be a simulation running on a processor situated in a still simpler space, namely a topological space, in which only a few distinctions matter, such as inside-outside and closed-open. The great mass of the baryons may come from the chaos inevitable in the celestial-mechanics version of the three-body problem, except that the three bodies would be the three quarks. In the present theory, noise amplitude corresponds to mass-energy, and fast chaos looks like noise.
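The variance-budget step in this argument can be checked numerically. In the toy construction below (my own, not from the theory), a timeline of fixed total variance splits between a distance-encoding component and a fast charge-like oscillation; as the oscillation's share f of the variance grows, the Pearson correlation with the distance signal falls as √(1 − f):

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# A slow distance signal and a fast "charge" oscillation,
# orthogonal over the sampled whole periods.
t = [2 * math.pi * k / 1000 for k in range(1000)]
d = [math.sin(u) for u in t]
osc = [math.sin(10 * u) for u in t]

for f in (0.0, 0.25, 0.5, 0.75):
    # Fixed total variance: fraction f goes to the oscillation, 1 - f to distance.
    x = [math.sqrt(1 - f) * a + math.sqrt(f) * b for a, b in zip(d, osc)]
    print(f"oscillation share f = {f:.2f} -> r with distance signal = {pearson(x, d):.3f}")
```

Less correlation means, in the theory's terms, a weaker distance signal and hence, by the argument above, a smaller mass for the charged (oscillating) particle.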
An extension of this idea would be that chaos is the source of all mass, and that the properties of a timeline depend on how many highly correlated others (HCOs) it has and on the degree of this correlation. One HCO produces an oscillation but no mass; two or more produce chaos and some amount of mass. High correlation can be understood as relative, leading to a hierarchy of relationships.
There may be an alternative kind of middle strand or mode of adhesion that enhances copying fidelity upon adhesion rather than degrading it. This amendment to the theory may be required to model interactions that appear repulsive.
Hebb processors with their rear read points in the distant past would open up long-distance communication channels in space-time, giving us the by-now familiar experience of looking millions of years into the past through powerful telescopes, to see galaxies as they were when the universe was young. The communication would be instantaneous but from an old source, rather than slow from a new source.
The big bang
The universe may have begun in a singularity for a trivial reason: it initially had no way to represent information, let alone correlations, because all the incoming monomers were PTEs, identical in size and having the smallest possible size. A slow direct condensation reaction in the monomer pool then gradually built up larger blocks of PTEs, causing the average size of the items adding to the strands by polymerization to increase progressively. The standard deviation of the size distribution would likewise have increased. Space would have expanded rapidly at first, as predicted by the inflationary hypothesis, because the first little bit of polymerization entropy to develop would have had a disproportionate effect on the system's ability to represent information. This predicts that the mass-energy of all particles has also been increasing ever since the big bang. Therefore, by equation (1), we expect that spectral lines from ancient luminous matter will be redder than comparable lines from contemporary matter, as found by Hubble, which explains the cosmological red shift.
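As a hedged arithmetic illustration of that prediction (the growth figure is invented for the example, not derived from the theory): under equation (1), every transition energy scales with ||e||, so a line emitted when ||e|| was smaller arrives with proportionally less energy, giving a redshift z = ||e||_now / ||e||_then − 1:

```python
def redshift(e_norm_then: float, e_norm_now: float) -> float:
    """Under equation (1), transition energies scale with ||e||, so the observed
    wavelength scales with its inverse: z = lambda_obs / lambda_lab - 1."""
    return e_norm_now / e_norm_then - 1.0

# Hypothetical example: ||e|| at the time of emission was 80% of today's value.
print(f"z = {redshift(0.8, 1.0):.3f}")  # 0.250
```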
Another perspective would be that the future is a false vacuum, the past is a true vacuum, and the present is the ever-expanding boundary between them. The false-vacuum decay would be driven by entropy increase, not enthalpy (i.e., heat content) decrease, which is allowed by the second law of thermodynamics, because only the interior of the true-vacuum bubble would be occupied by information in the form of timelines.